The recently concluded Advantest VOICE conference in San Diego brought some revelations along with a bit of a stroll down memory lane regarding testing. The conference began in 2006 as a forum for Verigy (now Advantest) ATE users to network, collaborate and share technical solutions.

As expected, there were a lot of "then" and "now" references during the welcome and keynote messages by Doug Lefever, CEO of Advantest America, and Michael Campbell, SVP of engineering at Qualcomm, respectively. Certainly, the end markets identified by these speakers and other presenters as the key drivers for ATE (cloud apps, smartphones, system-level test, automotive and IoT) have changed in the past decade. However, the ATE value proposition precipitated by those key drivers does not appear to have changed much. Even Campbell seemed to agree. At one point in his talk he said, in effect, that we don't care what the IoT endpoint does; we just need to know how to test the microprocessor, wireless transceiver and other components comprising the endpoint module.

Another clue to the foregoing conclusion is that the roadmap presentations shown looked eerily similar to the overly detailed and largely unreadable presentations I created ten years ago. Like the ones I made back then, these presentations promised lower cost-of-test and faster time-to-market for all who would join their camp. While lowering cost and improving time-to-market are still important, some of the common approaches to providing such value (higher-density channels, parallel and/or concurrent test, multi-purpose instrumentation and others) could be reaching their limits after more than 10 years.

Figure 1: Cloud-based test capacity management capabilities offer something new under the ATE sun.

Even though it may seem there is not much new under the ATE sun, all is not vanity. We in the semiconductor test industry are instead tasked to discover new and innovative ways to increase the value provided by test. For example, along with the now obligatory higher density instrument introduction, I was definitely pleased to see at VOICE this year such innovative efforts as open-sourced test IP integration software, subscription-based test IP downloadable from the cloud (with free desktop test hardware) and various system-level test solutions.

More sophisticated test capacity planning is another relatively unexplored area that can offer a new approach to lowering cost-of-test and even improving time-to-market. As it has always been under the ATE sun, about half of the cost to test a given device is still attributable to the capital cost of the underlying test-cell equipment. As equipment utilization improves, so does the cost-of-test. Planning your test capacity with more accuracy (e.g., responsive to dynamic device forecasts) and more precision (e.g., down to the board, channel and license levels) can provide a rarely used cost-of-test reduction mechanism. A good test capacity management solution will also facilitate capacity planning collaboration across the test ecosystem, further improving time-to-market.
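To make the utilization argument concrete, here is a minimal back-of-the-envelope sketch. All of the numbers (tester capital cost, depreciation period, throughput) are hypothetical assumptions for illustration only, not figures from the article; the point is simply that the capital component of cost-of-test scales inversely with utilization.

```python
# Hypothetical illustration: how test-cell utilization drives the capital
# component of cost-of-test. All inputs below are assumed example values.

HOURS_PER_YEAR = 24 * 365  # 8760

def capital_cost_per_device(capital_cost_usd, depreciation_years,
                            utilization, devices_per_hour):
    """Capital component of cost-of-test for one device.

    Spreads the tester's capital cost over the device tests it actually
    performs during its utilized hours across the depreciation period.
    """
    utilized_hours = depreciation_years * HOURS_PER_YEAR * utilization
    return capital_cost_usd / (utilized_hours * devices_per_hour)

# Example: a $2M test cell depreciated over 5 years, testing 400 devices/hour.
low = capital_cost_per_device(2_000_000, 5, utilization=0.40,
                              devices_per_hour=400)
high = capital_cost_per_device(2_000_000, 5, utilization=0.80,
                               devices_per_hour=400)

# Doubling utilization exactly halves the capital component of cost-of-test.
print(f"40% utilization: ${low:.4f}/device")
print(f"80% utilization: ${high:.4f}/device")
```

Since roughly half of total cost-of-test is capital-driven, better capacity planning that raises utilization attacks the largest single cost component directly, without any change to the test program itself.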

What do you see as the latest innovation in test that will bring the most value?