Aw Shucks, Nvidia's April Fool's Prank Tugs at 3dfx Fans' Heart Strings

Oh Nvidia, why must you play with our emotions like this? The GPU maker today introduced the 3dfx Voodoo 590, saying "a group of ex-3dfx veterans at Nvidia have been leading a secret double life...working on a revolutionary graphics card based on 3dfx technology." If you're an old school PC gamer, you're thinking this is too good to be true, right?

Sadly, you're correct. Today is April 1, 2011, otherwise known as April Fool's Day, and in the technology industry, that means you need to be on the lookout for goofy press releases and bogus announcements. This is one of them, no matter how much we wish it were true, but let's play along for a moment anyway.

"We named it the 590 because it matches the GeForce GTX 590 in fillrate--an essential metric in graphics performance. Both cards saturate at 77.7 Gigatexels per second, the only difference being that the Voodoo 590 is built using 3dfx technology from ten years ago," says Gary Tarolli, former Chief Scientist of 3dfx.

Nvidia's awesomely fictitious card crams 233 unaltered VSA-100 chips onto a single board, each one still clocked at 166MHz with two pixel pipelines, plus embedded memory to both save space and provide "huge on-chip bandwidth."

It's an obvious farce, and we're happy to see Nvidia play around with the 3dfx brand, even if only as a gag, but not all enthusiasts feel that way.

"Still waiting for the official drivers that you have promised ten years ago...After this joke I'm going to buy only AMD videocards," a poster on Nvidia's forums wrote. "Although VSA-100 was late to the market and didn't survive in competition, this joke is realy stupid. Let 3dfx rest in peace. Some day you'll be there, too," another poster wrote.

Our advice? Lighten up. 3dfx may have died long before its time (Nvidia acquired the company in late 2000, when it was deep in debt and facing bankruptcy proceedings), but its legacy lives on in 3D gaming as a whole, which it helped to advance. Kudos to Nvidia for the trip down memory lane.

Via: Nvidia