The software industry is special in many ways. Its capital requirements are so low that producing a prototype, or even a product, takes no large sums of money. This preserves the notion that a single person working in a garage can come up with an idea and implement it with no more equipment than I am using to write this article.
The electronic design automation (EDA) industry is an even more interesting case. Creating a generic piece of software is only the first piece of the puzzle. You also have to decide how to market and sell it.
With EDA, if you have solved the right problem, the customers are often sitting there waiting for a viable solution. An additional interesting facet of the EDA industry is that it is constantly consolidating. Almost all successful companies (and some not-so-successful ones) are bought up by the three large EDA companies. Entrepreneurs soon get itchy feet and want to start the next company. And on it goes with the next generation of startups.
G-Analog Design Automation, a new analog company headquartered in Taiwan, used the 50th Design Automation Conference (DAC) as its launch pad. The company's founder, Dr. Jeff Tuan, was also a founder of Nassda, which Synopsys bought in 2004 as part of the settlement of an ugly patent and trade theft case.
After the Synopsys purchase, Tuan spent time at Chartered Semi and GlobalFoundries and saw some of the problems, and potential solutions, from the other side of the fence. One area that seemed ripe for exploitation was cell characterization, especially at small geometries, where process variation becomes much more important.
Few digital designers think about analog, and yet the cells in a library have to be designed as analog circuits if they are to be robust enough for use in digital designs. They go through extensive analysis to show that they continue to operate correctly over a wide range of operating conditions, and their operating parameters must be evaluated to ensure that synthesis tools can use them correctly. This is a time-consuming process involving Monte Carlo simulation to create on-chip variation (OCV) tables.
G-Analog says this is a problem with a small amount of data: each cell is only a few transistors. What is needed is a highly replicated process that can be mapped onto graphics processing units (GPUs), rather than run serially on a workstation or a server farm. Tuan told me at DAC that the trick is to make all the GPUs use the same copy of the cell in memory, because even a GPU is constrained by memory bandwidth. Having them all share the same image provides a significant part of the speedup.
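The pattern Tuan describes can be sketched in miniature: many Monte Carlo trials share one read-only copy of the cell, and only the per-trial variation samples differ. The toy NumPy version below stands in for the GPU threads with vectorized arrays; the single-RC delay model and all names are illustrative assumptions, not G-Analog's actual code, which runs full GPU-based SPICE.

```python
import numpy as np

# One shared, read-only "image" of the cell: its nominal device
# parameters. A real tool holds a transistor-level netlist here;
# a single RC delay model stands in for it (illustrative values).
cell = {"r_nom": 1.0e3, "c_nom": 2.0e-14}  # ohms, farads

def delays_for_trials(cell, n_trials, sigma=0.05, seed=0):
    """Evaluate many Monte Carlo trials against one shared cell copy.

    On a GPU, each trial would run in its own thread, all reading the
    same cell image from memory; NumPy vectorization plays that role
    here. Only the per-trial variation samples are private state.
    """
    rng = np.random.default_rng(seed)
    # Per-trial process variation: independent Gaussian perturbations.
    r = cell["r_nom"] * (1 + sigma * rng.standard_normal(n_trials))
    c = cell["c_nom"] * (1 + sigma * rng.standard_normal(n_trials))
    # Step-response delay to the 50% point of an RC stage.
    return 0.69 * r * c

delays = delays_for_trials(cell, n_trials=100_000)
```

Because every trial reads the same cell data, the memory traffic per trial is limited to its own small sample vector, which is the bandwidth saving the shared-image scheme is after.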
The company says it gets about a 50X speedup using four GPUs with no loss of accuracy, because it has built a complete GPU-based version of SPICE. The cells and device models are first analyzed to determine the process sensitivity and the worst-case timing arc for each cell. Monte Carlo simulations are then performed to determine the de-rate values for each cell.
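Once the Monte Carlo delay samples exist, turning them into a de-rate value is simple statistics. A minimal sketch, assuming the common recipe of taking the ratio of a statistical worst case (mean plus some number of sigmas, for a setup/max check) to the nominal delay; the function name and the exact statistic are assumptions, as a commercial flow may use a different corner definition:

```python
import numpy as np

def derate_from_mc(delays, nominal, n_sigma=3.0):
    """Compute a simple de-rate factor from Monte Carlo delay samples.

    De-rate = statistical worst case / nominal delay. Here the worst
    case is mean + n_sigma * std, the usual max-delay convention for
    OCV tables (illustrative; real tools may differ).
    """
    mu, sd = delays.mean(), delays.std()
    return (mu + n_sigma * sd) / nominal

# Illustrative data: samples around a 100 ps nominal with 5% sigma.
rng = np.random.default_rng(1)
nominal = 100e-12
samples = rng.normal(nominal, 0.05 * nominal, size=50_000)
factor = derate_from_mc(samples, nominal)  # roughly 1.15 here
```

A 3-sigma corner on a 5% distribution lands near a 15% margin, which is why the resulting factor comes out close to 1.15 in this toy case.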
The product targets cell characterization today, but it seems that GPUs could be applied to a number of analog analysis problems, though they may require a different memory solution. Which analog applications seem most suitable for porting to GPUs?