For 25 years, DSP processor vendors have benefited from a very powerful secret weapon: extremely talented, hard-working customers. These customers understood their applications and algorithms inside and out. They were experts on processor instruction sets. They studied every nuance of microarchitecture, from pipelines to memory bank structures. And they spent countless weeks applying that knowledge to create dazzlingly efficient implementations of their applications on DSP processors.
Unfortunately for DSP processor vendors, today such customers are few and far between. What happened? There are two key underlying factors here. First, highly talented engineers specializing in DSP software optimization are becoming harder to find. Second, those highly talented specialists now have a lot more on their plates than they did 10 or 20 years ago. That’s because, gradually but steadily, applications have become much more complex. When I began programming DSPs in 1986, a complete, sophisticated application might have been implemented in a few thousand lines of assembly code. These days, a complete signal processing subsystem (such as a wireless transceiver) may comprise a few thousand files of C code. That, in turn, is only one component of a complete system.
Along the way, DSP processors have also become much more complex. Although DSPs have always been somewhat exotic, specialized machines, today’s DSP architectures are vastly more complicated than their predecessors. They use very long instruction words, highly specialized instructions, single-instruction-multiple-data operations, deep pipelines, multi-dimensional DMA engines, and more. And that’s before we even begin talking about multi-core architectures.
So, applications have become much larger and more complex. Processors have become more complex. Highly skilled, specialist engineers are harder to find. And engineering teams have, generally speaking, not become larger. In many cases, in fact, they’ve become smaller. As a result, DSP engineers have less and less time to become experts on the inner workings of their algorithms and the details of their target processors. As a colleague of mine said recently, “More customers want to learn less about the processor architecture.” The key consequence of this trend is that the value proposition of DSP processors is now in question.
Why do DSP processors exist? It’s simple: DSP processors exist to deliver superior performance, cost/performance, and energy efficiency on digital signal processing workloads compared to general-purpose processors. Twenty-five years ago—even 10 years ago—that meant superior performance when programmed in assembly language by an expert DSP software engineer. But today, it increasingly means superior performance when programmed in C by a non-expert—or by an overworked expert.
Unfortunately, the C compilers for many DSP processors are not impressive. The net result is that the performance and efficiency realized when using a DSP processor with its C compiler may be no better than that obtained using a general-purpose CPU with its C compiler. Whatever hardware advantage the DSP processor has may be offset by the general-purpose CPU’s superior compiler.
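To make this concrete, consider the kind of inner loop where the gap shows up. The sketch below is purely illustrative; the function names, the Q15 fixed-point scaling, and the specific tweaks are assumptions of mine, not drawn from any particular vendor’s toolchain. The first routine is the plain C a non-specialist would write; the second adds the sort of hints (restrict qualifiers, a small manual unroll) an expert might supply to help a DSP compiler keep the multiply-accumulate units busy.

```c
#include <stddef.h>

/* Plain-C FIR filter: the kind of inner loop a non-expert writes.
 * With no aliasing information, many compilers cannot software-pipeline
 * this loop or keep multiple multiply-accumulate units busy. */
void fir_naive(const short *x, const short *h, short *y,
               size_t n, size_t ntaps)
{
    for (size_t i = 0; i < n; i++) {
        long acc = 0;
        for (size_t k = 0; k < ntaps; k++)
            acc += (long)x[i + k] * h[k];
        y[i] = (short)(acc >> 15);          /* assumes Q15 coefficients */
    }
}

/* The same filter with the hand-holding an expert might add: restrict
 * qualifiers to rule out aliasing, plus a two-tap manual unroll so the
 * compiler can pair multiply-accumulates. Whether any of this actually
 * helps depends entirely on the particular compiler and core. */
void fir_tuned(const short *restrict x, const short *restrict h,
               short *restrict y, size_t n, size_t ntaps)
{
    for (size_t i = 0; i < n; i++) {
        long acc = 0;
        for (size_t k = 0; k < ntaps; k += 2) {   /* assumes ntaps is even */
            acc += (long)x[i + k]     * h[k];
            acc += (long)x[i + k + 1] * h[k + 1];
        }
        y[i] = (short)(acc >> 15);
    }
}
```

The point is not that these particular tweaks are the right ones; it is that extracting the hardware’s rated multiply-accumulate throughput still routinely depends on this kind of hand annotation, or on dropping down to intrinsics or assembly, rather than on the compiler getting there from the naive version.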
The game is certainly not over yet. There are a few good DSP processor C compilers out there. And there are still a few DSP software engineers with the talent and the time to tweak their code to get the maximum out of the hardware. But, by and large, if DSP processors are going to continue to deliver a compelling value proposition, they’re going to need better compilers—and soon.