Let’s face it: Applications are getting more complicated. Chips are getting more complicated. And engineering teams are generally getting smaller, not larger. As a result, it’s incumbent on chip vendors to provide robust, easy-to-use development kits. Design engineers rely on these kits to quickly evaluate chips and prototype key portions of their systems.
Clearly chip manufacturers recognize that development kits are important, and there are hundreds available. But the quality of these kits varies dramatically. The best are a pleasure to use: well documented and easy to install, they let engineers start learning the chip and building an application right away. The worst are a nightmare of confusing documentation, incompatible software, and missing pieces.
Today, few major chip companies routinely ship defective chips. But many chip vendors do ship defective development kits—defective in the sense that they fail to meet the main goal of a development kit, which is to make it easy for the customer to get started using the chip.
BDTI was recently engaged to evaluate a development kit from a major chip manufacturer. The goal was to provide an independent assessment of the user experience, compare the kit to kits from competing vendors, and make recommendations for improvements. BDTI obtained, installed, and used the kit in the ways that a typical customer would, carefully documenting what worked well and what didn’t. The bad news was that the client’s kit had a number of serious problems. The good news was that many of the problems were straightforward to fix.
For example, the kit contained a number of elements (board, tools, sample designs, etc.). Each element came with its own documentation, but there was no overall document explaining how the pieces were supposed to work together. And the documentation itself was a confusing mix of hardcopy manuals, electronic files supplied with the kit, and on-line documents. Furthermore, the supplied demonstration applications were not well aligned with the chip’s target applications, and one of them didn’t work.
BDTI compared the kit to several competing kits and created a “hit list” of areas where the vendor could improve the documentation, tools, hardware, and sample applications. Based on BDTI’s feedback, the client was able to quickly and substantially improve the usability of its kits. And, to confirm it had hit the mark, the client then engaged BDTI to evaluate the improved kit. The result was a vastly improved user experience, one that will attract customers instead of driving them away.
To learn more about BDTI’s evaluation capabilities, contact Jeremy Giddings at BDTI (giddings@BDTI.com) or visit www.BDTI.com.