For 25 years, DSP processor vendors have benefited from a very powerful secret weapon: extremely talented, hard-working customers. These customers understood their applications and algorithms inside and out. They were experts on processor instruction sets. They studied every nuance of microarchitecture, from pipelines to memory bank structures. And they spent countless weeks applying that knowledge to create dazzlingly efficient implementations of their applications on DSP processors.
Read more...
In 2002, BDTI published the first rigorous, independent benchmarking study comparing the signal processing capabilities of high-end FPGAs to those of high-performance DSP processors. We benchmarked both technologies on the same demanding, highly parallelizable multi-channel wireless application, and we looked at how many channels could be supported on each chip and the corresponding cost per channel. We knew that FPGAs had begun to find use as high-end signal processing engines, but we were
Read more...
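To make the cost-per-channel metric concrete, here is a minimal sketch in C: the device price and channel count are invented for illustration and are not results from the BDTI study.

    /* Hypothetical illustration of the per-channel cost metric: divide the
     * device price by the number of channels the device can sustain on the
     * benchmark application. All numbers here are made up. */
    #include <stdio.h>

    int main(void)
    {
        double device_price_usd = 150.0;  /* assumed chip price */
        int channels_supported  = 12;     /* assumed channels sustained per chip */

        double cost_per_channel = device_price_usd / channels_supported;
        printf("Cost per channel: $%.2f\n", cost_per_channel);
        return 0;
    }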
We’re on the cusp of profound changes in the competitive landscape for embedded processing engines—changes that I believe will “shuffle the deck” with respect to which kinds of architectures are dominant in many applications.
Consider this scenario: You’re a systems engineer working on the next generation of your digital-signal-processing-oriented product. You need to upgrade processors because the one you’ve got won’t cut it. Perhaps it’s not fast enough. Or maybe it doesn’t have
Read more...
Back in the early 1990s, compilers for DSP processors were pretty lame. Even if a compiler generated code that was functionally correct (which, sadly, wasn’t always the case), the code was usually far from efficient. At the time, this wasn’t a big deal: DSP applications were still fairly small (in terms of lines of code), and DSP processor architectures weren’t nearly as complex as they are today. A reasonably skilled DSP software engineer could optimize an application by hand, sometimes
Read more...
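As an aside, the code being hand-optimized was typically a small, regular kernel such as an FIR filter. The sketch below (written for this summary, not taken from the column) shows such a kernel in plain C; a DSP engineer of that era would rewrite the inner loop in assembly to exploit single-cycle multiply-accumulate units, zero-overhead loop hardware, and the chip's memory bank structure. Function and variable names are illustrative.

    /* Illustrative block FIR filter: the sort of tight loop that early-1990s
     * compilers handled poorly and engineers therefore hand-coded in assembly. */
    #include <stddef.h>

    void fir_filter(const float *input,   /* input samples, length n + num_taps - 1 */
                    const float *coeffs,  /* filter coefficients, length num_taps */
                    float *output,        /* output samples, length n */
                    size_t n,
                    size_t num_taps)
    {
        for (size_t i = 0; i < n; i++) {
            float acc = 0.0f;
            /* One multiply-accumulate per tap: the operation DSP hardware is
             * built around, and the first thing a hand-optimizer targets. */
            for (size_t k = 0; k < num_taps; k++) {
                acc += coeffs[k] * input[i + k];
            }
            output[i] = acc;
        }
    }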
It’s a tough world out there for small processor companies. It’s tough to attract new customers, and tough to support the ones you manage to get. A key challenge is the trend towards customers consolidating their purchasing: many system companies prefer to use fewer unique processors in their systems, for both business and technical reasons. From a business standpoint, using fewer different processors (and thus, using fewer vendors) can help streamline procurement and provide negotiating
Read more...
In October of 2007, I wrote a column called “When Worlds Collide,” which was about NVIDIA’s emerging strategy of offering “general-purpose GPUs.” At the time, I thought it was interesting that NVIDIA had begun to move beyond graphics applications to target “high-performance computing” (HPC) applications like financial and seismic analysis, thus competing with processors outside of the GPU space. I also observed that the ubiquity of GPUs in PCs would likely help NVIDIA gain traction in non-
Read more...
In the last decade most companies in the electronics industry have invested significant efforts in streamlining their design, testing, and manufacturing processes. Time-to-market pressures are intensifying; engineers and technical support staff often work overtime to meet product deadlines. But there’s one task that is still typically slower than molasses in winter—and that’s procurement.
Ask nearly anyone who works for a medium- to large-sized tech company about their procurement process
Read more...
The beauty of digital signal processing is that it enables people to convert available processing power into cool new features, better performance, and lower power in their products. There are countless examples, including MP3 players, wireless communications of all kinds, medical imaging, and voice recognition.
Microcontrollers historically haven’t had enough processing power to do much DSP, but that’s changing—today’s high-end microcontrollers offer DSP performance levels that were once
Read more...
In 2004 my friend Nick Tredennick wrote an interesting article in which he made the case that the x86 architecture would ultimately dominate embedded applications. At the time, I thought Nick’s argument was slightly loopy. But I have to admit that I’m having second thoughts. These second thoughts have almost nothing to do with any snazzy new chips introduced by Intel, and everything to do with software development for multicore processors.
This month Intel announced its acquisition of
Read more...
A couple of months ago I wrote a column about the frequently unpleasant “getting-the-box” experience. This month I’d like to rant about a separate but related topic: the frustration of dealing with distributors.
I’ve always thought that the point of a distributor was to get products into the hands of customers. But in my experience, distributors are often more of an impediment than an enabler. For example: A few weeks ago BDTI needed to buy the software development tools for a well-known
Read more...