The Tour
While this trip to Centaur wasn't my first visit to Austin, it was the first time I had ever visited VIA's x86 CPU division. VIA acquired Centaur 12 years ago; Glenn Henry has been there since the beginning, back when the company was made up of only four people.
Centaur's Glenn Henry in front of a giant dual-core Nano die shot
Glenn took us all on a tour of Centaur. The company itself has 101 employees, a number that grew from 70 at the time of the VIA acquisition. Glenn insists that in order to compete in this market against much larger competitors (AMD is to Centaur as Intel is to AMD), his operation has to be lean and efficient. Everyone likes to say that, but Glenn actually showed me proof.
When designing a microprocessor you don't just come up with an architecture and hope it works. You do tons of simulation. First you simulate in software: you build a C model of parts of your architecture, or the entire architecture if possible, and run it against datasets. This is how you determine things like cache sizes, the balance of execution resources, and even more fundamental architectural decisions. Further along you'll actually simulate the hardware on large FPGAs or systems with similar functionality; at that point the goal is less performance validation and more functional validation. The road to manufacturing is expensive (silicon masks cost a lot of money) and time intensive (going from tapeout to first silicon takes at least two months), so you want to learn as much as you can about your chip's performance, functionality and bugs ahead of time.
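To make that sizing exercise concrete, here's a minimal, hypothetical sketch of the sort of C model the description implies: a toy direct-mapped cache run against a synthetic address trace to compare hit rates at different capacities. The line size, working-set size and trace generator here are all illustrative assumptions; real pre-silicon models are vastly more detailed.

```c
/*
 * Toy cache-sizing model: replay a synthetic address trace through a
 * direct-mapped cache and report the hit rate at several capacities.
 * All parameters below are illustrative assumptions, not Centaur's.
 */
#include <stdio.h>
#include <stdlib.h>

#define LINE_SIZE 64                    /* bytes per cache line (assumed) */

/* Hit rate of a direct-mapped cache with `lines` entries over a trace. */
static double hit_rate(const unsigned long *trace, size_t n, size_t lines)
{
    unsigned long *tags = calloc(lines, sizeof *tags);
    size_t hits = 0;

    for (size_t i = 0; i < n; i++) {
        unsigned long block = trace[i] / LINE_SIZE;
        size_t idx = block % lines;
        if (tags[idx] == block + 1)     /* +1 so that 0 means "empty" */
            hits++;
        else
            tags[idx] = block + 1;      /* miss: fill the line */
    }
    free(tags);
    return (double)hits / (double)n;
}

int main(void)
{
    enum { N = 1000000, WORKING_SET = 1 << 20 };   /* 1M accesses, 1 MB set */
    unsigned long *trace = malloc(N * sizeof *trace);

    /* Synthetic trace: uniform random accesses within the working set. */
    unsigned long seed = 1;
    for (size_t i = 0; i < N; i++) {
        seed = seed * 6364136223846793005UL + 1442695040888963407UL;
        trace[i] = (seed >> 24) % WORKING_SET;
    }

    /* Sweep cache capacity from 16 KB to 1 MB. */
    for (size_t lines = 256; lines <= 16384; lines *= 4)
        printf("%5zu KB cache: %5.1f%% hit rate\n",
               lines * LINE_SIZE / 1024,
               100.0 * hit_rate(trace, N, lines));

    free(trace);
    return 0;
}
```

A model like this answers one narrow question (how hit rate scales with capacity for a given access pattern); the real work is in building traces and models faithful enough that the answer still holds in silicon.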
Visit any chip company and you'll find a server farm: dozens if not hundreds of networked PCs, all dedicated to running simulation and validation tests on chip designs. Intel, AMD, NVIDIA, they've all got them. I remember visiting NVIDIA's validation labs and being told that they're limited by how much power can physically be delivered to the lab, so each server upgrade has to improve power efficiency.
Glenn took me on a tour of Centaur's simulation lab. To say it was a different experience would be an understatement. While some machines were racked, there were a lot of desktop motherboards with Core i5s and Core i7s running outside of any case:
The systems that were in cases were water-cooled Core i7s overclocked to 5GHz. There are two folks at Centaur who build and overclock each and every one of these machines. You and I know that overclocking both Nehalem and Sandy Bridge delivers much better performance for the same dollar amount, but this is the first time I've seen overclocking used to speed up the simulation of microprocessors.
There are similar efforts all over Centaur. If something can be built more cheaply than it can be bought, Centaur takes the more affordable route. Even the ovens Centaur uses for thermal stress testing incorporate a lot of Centaur-built components, cutting their total cost to a fifth of what they would otherwise be.
While Centaur didn't have a wafer saw on hand, it can solder and package its own die. This station was used to package a dual-core Nano while a number of journalists watched:
I do wonder about Centaur's future, especially as its message of low-power operation is now front and center thanks to the smartphone revolution currently taking place. In the early days Centaur had to convince users that power was important and that performance was good enough. These days the convincing isn't necessary; it's more about execution, vendor relationships and all of the other pieces of the integrated puzzle. Can VIA and Centaur play a more significant role in the future? The ingredients are there; the question is whether VIA is willing to take the risk and give it a try.