How Simulation Started a Billion-Dollar Company

For this blog post, we will go back in time to the early 1990s. At that time, “PC graphics” was almost an oxymoron. If you wanted to do real graphics, you bought a “real machine”, most likely a Silicon Graphics* MIPS*-based workstation. At the PC price point, fast hardware-accelerated 3D graphics wasn’t doable… until it was. Moore’s law made it inevitable, and a few people left Silicon Graphics and started 3dfx, the first company to create fast 3D graphics for the PC. 3dfx had to prove that their ideas were workable – and that proof came in the shape of a simulator.

Computer History Museum Panel

This story is based on a panel organized by the Computer History Museum (http://www.computerhistory.org/) in Mountain View, California, US, in July of 2013. The video can be found at http://www.computerhistory.org/collections/catalog/102746832, and a full transcript at http://www.computerhistory.org/collections/catalog/102746834. The panel features the four founders of 3dfx: Gordon Campbell, Scott Sellers, Ross Smith, and Gary Tarolli.

Keeping it Simple

The founders of 3dfx had an idea: find a way to build a good-enough graphics solution that would enable a PC to produce live 3D graphics at a price point several orders of magnitude below the then-current workstations. To do this, they had to simplify the problem and be smart about where to spend their limited silicon budget.

The first insight was that they could do some of the work on the PC main processor instead of on the graphics card. The standard design at the time was to have the graphics card do all the processing needed to draw an image, which resulted in a very large and expensive system. In the 3dfx case, they realized that the Intel® Pentium® 90 processor had become fast enough to take care of the geometry phase of the rendering. This meant that it was possible to build a graphics card that only did rasterization – which cut down the size and cost quite a bit.
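To make the division of labor concrete, here is a minimal sketch in C (my illustration, not 3dfx's actual code – all names are invented) of what the CPU-side geometry phase produces: transformed, projected, screen-space triangles that a rasterizer-only card can draw directly.

```c
#include <stdio.h>

/* Minimal sketch of the CPU/card split: the host CPU runs the geometry
 * phase; the card sees nothing but ready-to-draw screen-space triangles. */

typedef struct { float x, y, z; } Vec3;
typedef struct { float m[4][4]; } Mat4;
typedef struct { float sx, sy; } ScreenVertex;

/* Geometry phase on the host CPU: transform by a combined
 * model-view-projection matrix, perspective-divide, map to pixels. */
static ScreenVertex transform(const Mat4 *mvp, Vec3 v, float w_px, float h_px)
{
    const float (*m)[4] = mvp->m;
    float x = m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3];
    float y = m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3];
    float w = m[3][0]*v.x + m[3][1]*v.y + m[3][2]*v.z + m[3][3];

    ScreenVertex s;
    s.sx = (x / w * 0.5f + 0.5f) * w_px;   /* clip space -> pixels */
    s.sy = (y / w * 0.5f + 0.5f) * h_px;
    return s;
}

/* Stand-in for the rasterizer-only graphics card. */
static void submit_triangle(ScreenVertex a, ScreenVertex b, ScreenVertex c)
{
    printf("triangle (%.0f,%.0f) (%.0f,%.0f) (%.0f,%.0f)\n",
           a.sx, a.sy, b.sx, b.sy, c.sx, c.sy);
}

int main(void)
{
    /* Trivial projection: w = z, so farther vertices shrink toward center. */
    Mat4 mvp = {{{1,0,0,0},{0,1,0,0},{0,0,1,0},{0,0,1,0}}};
    Vec3 t[3] = {{-0.5f,-0.5f,1}, {0.5f,-0.5f,1}, {0,0.5f,2}};
    submit_triangle(transform(&mvp, t[0], 640, 480),
                    transform(&mvp, t[1], 640, 480),
                    transform(&mvp, t[2], 640, 480));
    return 0;
}
```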

The second insight was that good enough was indeed good enough. They targeted games and games only – not CAD applications, where quality is paramount and you really want a line engine that can draw anti-aliased lines, nor general-purpose Windows desktops, where 2D drawing was necessary.

There was no room for such features given the targeted cost and price. Games are nice to optimize for, since in a game, it is more important that things move smoothly than that they are perfectly rendered in every detail.  This made it possible to dial back on bit depths, saving memory and bandwidth. 
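As a concrete illustration of what dialing back bit depth buys (my example; the actual 3dfx pixel formats may have differed), compare a 16-bit RGB565 framebuffer to a 32-bit one:

```c
#include <stdint.h>
#include <stdio.h>

/* Illustrative only: packing a color into 16-bit RGB565 instead of
 * 32-bit RGBA halves framebuffer memory and bandwidth, at the cost
 * of color precision -- barely visible in a fast-moving game scene. */
static uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

int main(void)
{
    const int width = 640, height = 480;
    printf("32-bit framebuffer: %d KiB\n", width * height * 4 / 1024); /* 1200 */
    printf("16-bit framebuffer: %d KiB\n", width * height * 2 / 1024); /*  600 */
    printf("mid gray packs to 0x%04x\n", pack_rgb565(128, 128, 128));
    return 0;
}
```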

The ideas sound good – but how do you prove that they work?

Simulating to Prove and Demonstrate the Idea

In the world of graphics, seeing is believing. If you just told a potential investor about the idea, they would be rather skeptical. Indeed, the design team itself wanted to see if their insights and ideas were correct before committing to silicon design. The solution was to build a simulator of the chip to be designed. 

The simulator was coded in C and ran on the same Pentium 90 machine as the geometry phase. In this way, a single PC could also serve as the demo system – no extra hardware needed. They recall lugging a PC around quite a bit for live demonstrations, using their simulator.

The simulator was a lot slower than the hardware would be, for obvious reasons, so in order to show the graphics in motion they would often use it offline – render a sequence of frames from a demo, and then play it all back at full speed as a video. For a demo to a set of potential investors, the recorded video was the right solution.  It showed what the product would be able to do, and 3dfx got funded. 
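The offline workflow is simple to picture. Here is a hypothetical sketch (my construction, not 3dfx's code): render each frame at whatever speed the simulator manages, dump the frames to disk as numbered images, and assemble them into a video for full-speed playback.

```c
#include <stdio.h>
#include <stdint.h>

#define W 640
#define H 480

static uint8_t framebuffer[H][W][3];

/* Stand-in for the slow, simulated rendering of one frame. */
static void simulate_frame(int n)
{
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++) {
            framebuffer[y][x][0] = (uint8_t)(x + n);  /* animated gradient */
            framebuffer[y][x][1] = (uint8_t)(y + n);
            framebuffer[y][x][2] = (uint8_t)n;
        }
}

/* Write the frame as a binary PPM image; the numbered files can later
 * be assembled into a video and played back at full speed. */
static void dump_frame(int n)
{
    char name[64];
    snprintf(name, sizeof name, "frame%04d.ppm", n);
    FILE *f = fopen(name, "wb");
    if (!f) return;
    fprintf(f, "P6 %d %d 255\n", W, H);
    fwrite(framebuffer, 1, sizeof framebuffer, f);
    fclose(f);
}

int main(void)
{
    for (int n = 0; n < 100; n++) {   /* 100 frames: ~4 s at 25 fps */
        simulate_frame(n);
        dump_frame(n);
    }
    return 0;
}
```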

Using a simulator as a demonstration tool to prove an idea is a great way to get ideas in front of people quickly. A simulator can be built with far fewer resources than the real thing, and in much less time.

Simulation as an Architecture Tool

The simulator let the team experiment with different levels of quality and different ways to implement things. They could dial the precision of the computations up to match the existing 3D workstations, establishing a baseline for comparison. Given that baseline, they could then lower the data representation precision, use “cheating” algorithms and other simplifications, and compare the outcomes.
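A hypothetical sketch of what such a flag-controlled comparison might look like in C (the flag name and the operation are invented for illustration): the same color interpolation computed the full-float “right way” and the cheap fixed-point way, so the two outputs can be compared value by value.

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical what-if analysis: compute the same shading interpolation
 * the "right way" (full float) or the "cheap way" (8.8 fixed point),
 * then compare outputs to judge whether the cheap path looks good enough. */
static int use_full_precision = 0;   /* the kind of flag Sellers describes */

static uint8_t lerp_color(uint8_t a, uint8_t b, float t)
{
    if (use_full_precision) {
        /* Reference path: full floating point, workstation style. */
        return (uint8_t)(a + (b - a) * t + 0.5f);
    } else {
        /* Cheap path: 8.8 fixed point, as reduced-cost hardware might. */
        uint16_t ft = (uint16_t)(t * 256.0f);
        return (uint8_t)(a + (((b - a) * ft) >> 8));
    }
}

int main(void)
{
    for (float t = 0.0f; t <= 1.0f; t += 0.25f) {
        use_full_precision = 1;
        uint8_t ref = lerp_color(10, 250, t);
        use_full_precision = 0;
        uint8_t cheap = lerp_color(10, 250, t);
        printf("t=%.2f  reference=%3u  cheap=%3u  diff=%d\n",
               t, ref, cheap, (int)ref - (int)cheap);
    }
    return 0;
}
```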

From the transcript:

Sellers: I'd say that was the simulator we mentioned before. This thing was just all software that Gary created was the sort of the research part of how we would develop all of it. And Gary would map an algorithm the right way. 

Tarolli: Right. 

Sellers: I remember it had all these different– 

Tarolli: Oh, yeah. Do it the right way. 

Sellers: –flags, where you could do full floating point calculations and do everything kind of the SGI right way. And then Gary would use that as a kind of apples to apples comparison against OK, here's the cheap way. And since this was all about gaming and consumer use, there wasn't a perfect answer, right? Because it ultimately comes down to does it look good enough? And that's very subjective. And so you could really go to the extreme of when you can start visually seeing artifacts and visually seeing something that's not quite right. And then you just kind of come back slightly from that.

This is a very nice example of doing “what-if” analysis with a human judge to determine what is good enough. In this way, the design could be tuned to yield the best possible final result within the given hardware budget.  It let the design team quickly try different solutions and evaluate them. Such architecture work was and is a mainstay of simulation technology. 

Simulation Driving Chip Testing

Another beneficial effect of the simulator was that it made it possible to create software ahead of silicon availability. As mentioned above, in order to evaluate the design and demonstrate it to venture capitalists, 3dfx had to have software that used their technology and 3D API. At this point in time, there was no 3D software available for PCs, since there was no 3D hardware on the market and no standard APIs for games. Thus, the design team ended up coding their own software for demonstration and design purposes.

That design and demonstration software came in handy once the actual chips started to show up! It was basically repurposed as a test and validation vehicle. From the interview:

Sellers: And we keep talking about this simulator that Gary had written. So Gary had this ability to instead of doing all the rendering in pure software simulator, he could nest and strip all that out and send the real commands down to the actual graphics. So when we got the hardware back pretty quickly, we could actually do something interesting with it. 

What this meant was that you could pull traffic out of the simulator not just at the point where the software talked to the hardware, but also within the hardware. In this first test, they only had the framebuffer chip, the last step in the pipeline. Using the simulator, they could drive the chip with the commands it would normally get from the first chip in the Voodoo two-chip pipeline, and validate that it worked before they had a complete hardware platform.

The real chip was given a set of inputs from a real program, via a simulation of the missing parts of the hardware. This proved that the hardware did indeed work as intended – and it was an amazing sense of validation to see real pixels being drawn on a real screen. 
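A hypothetical sketch of such a dual-target back end (the register layout and function names are invented): the rendering code issues the same commands regardless of whether they land in the software model of the chip or in the real chip's memory-mapped registers.

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical dual-target back end: the same command stream either
 * drives the software model of the chip or is written to the real
 * device's memory-mapped registers. */

typedef void (*write_reg_fn)(uint32_t addr, uint32_t value);

/* Software model: interpret the register write inside the simulator. */
static void model_write(uint32_t addr, uint32_t value)
{
    printf("model: reg 0x%04x <= 0x%08x\n", (unsigned)addr, (unsigned)value);
}

/* Real hardware: poke the memory-mapped register directly, once the
 * device exists and has been mapped into our address space. */
static volatile uint32_t *hw_base;  /* = mapped device memory */
static void hw_write(uint32_t addr, uint32_t value)
{
    hw_base[addr / 4] = value;
}

/* Rendering code above this layer neither knows nor cares which
 * back end receives its commands. */
static void draw_span(write_reg_fn wr, uint32_t x0, uint32_t x1, uint32_t y)
{
    wr(0x0010, x0);   /* span start  (register map invented) */
    wr(0x0014, x1);   /* span end    */
    wr(0x0018, y);    /* scanline    */
    wr(0x001c, 1);    /* kick: draw! */
}

int main(void)
{
    draw_span(model_write, 100, 200, 50);  /* today: software model   */
    (void)hw_write;                        /* later: swap in hw_write */
    return 0;
}
```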

Today, this kind of work is often done with part of the hardware design running as RTL on an emulator or FPGA prototyping rig rather than as a real chip. It is still the same idea, though: use a simulator to provide the ability to run full software stacks, and then siphon out traffic to the hardware under test, without having to have the full system available in RTL or hardware form.

Simulation Enabling the Ecosystem

The final part of any hardware launch is to get software to support the hardware. Without software, the hardware is rather useless. For the 3dfx launch, this meant they had to get programmers to use their Glide* API – since there was no standard API for 3D graphics on the PC at all.

This had two parts. One was developing a set of feature demos and examples to show developers how to use the API and what they could do with the hardware. The other was making the API available to a few selected developers before the hardware existed. The solution to this was a bit ironic – 3dfx bought a few high-end graphics workstations and throttled them down to match the performance of their upcoming graphics cards. This let the game developers tune the performance and graphics quality of their games to the specific capabilities of the eventual hardware.

Today, we often use fast simulators for this purpose. Simulation technology has advanced quite a bit, and software-only simulators offer a great way to enable the ecosystem ahead of hardware availability. I must admit that there are certainly cases where software simulation will be too slow, but most of the time it is good enough. The value of having something as opposed to having nothing to run on should not be underestimated. 

Summary

The case of 3dfx nicely illustrates how simulation is used to develop and design hardware to this day. Simulation is the technology of choice for computer and system architecture work; it is used to demonstrate new features, seed the ecosystem, do pre-silicon enablement, and validate the eventual hardware. The 3dfx story provides a very concrete example of all of this, wrapped in the fascinating story of the rise and fall of a Silicon Valley legend.

Watching the whole panel is highly recommended, as it mixes technology history with business insights. In the end, 3dfx basically created the PC graphics model as we know it today, with a GPU sitting alongside the CPU (usually on the same SoC). They took the PC into the professional flight simulator market and dethroned the old workstations in that space. Apparently, they even got their chips into cockpit avionics displays! 

For more on Modern Code and tools from Intel, please visit the Intel® Modern Code site.

Source:https://software.intel.com/en-us/blogs/2017/04/20/how-simulation-started-a-billion-dollar-company
