But then again, everything is relative, right? The actual severity of the price depends on the overall experience, correct? If you pay $300 for a graphics card and barely gain an extra 5 fps in your favorite game, then the card wasn't really worth $300 to you in the first place, now was it?
Then again, if you purchase a $300 card and, the moment you fire up your favorite game, it feels as though you've never even seen the title before, the card quickly begins to earn its value.
For many users, the feelings they were left with after seeing what NVIDIA's GeForce could do fell somewhere in limbo between the "eh, whatever" and "I have seen the light" reactions. Even in our own tests, the conclusion after reviewing the GeForce was that it was the fastest chip in its class, but nothing more. While the GeForce boasts a fairly powerful hardware transform & lighting engine, which NVIDIA cutely dubbed their GPU, the overall experience amounted to nothing more than faster gameplay at 1024 x 768; the card still couldn't drive higher resolutions at 32-bit color.
With cards based on the GeForce starting at $249, many TNT2 Ultra owners found themselves truly uninterested in throwing away around $300 on a card that wouldn't improve the gaming experience significantly. While it is true that games with high polygon counts would take advantage of the GeForce's hardware T&L, and could potentially perform much better on a GeForce than on any other card, it is also true that we had yet to see a game of that nature, and by the time one became available (it's inevitable), the GeForce would also carry a noticeably lower price tag (this is also inevitable). So for the NVIDIA loyalists who spent $300 on a TNT2 Ultra the minute it became available, was the GeForce going to be another $300 "well spent"? Not this time around, not unless NVIDIA could offer a little more for the money.
It's alright to produce a product with flaws; just don't plan on it selling as well as one that comes closer to the flawless state that engineers strive for. NVIDIA's GeForce had an inherent flaw: its relatively low memory clock (in comparison to the TNT2 Ultra) and single 128-bit memory bus left it with a noticeable bottleneck from the start, namely memory bandwidth.
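Just how tight was that bottleneck? As a rough sketch (the memory clocks below are commonly cited reference figures for these boards, assumed here rather than taken from our own measurements), peak theoretical bandwidth is simply the bus width in bytes multiplied by the memory clock, which a few lines of Python make concrete:

# Peak theoretical bandwidth = bus width (bytes) x memory clock (MHz)
# Clock figures below are assumed reference values, not measured numbers.
def bandwidth_gb_s(bus_bits, clock_mhz, transfers_per_clock=1):
    # transfers_per_clock=2 would model DDR memory (two transfers per clock)
    return (bus_bits / 8) * clock_mhz * transfers_per_clock / 1000

print(bandwidth_gb_s(128, 166))  # GeForce 256 (SDR): ~2.7 GB/s
print(bandwidth_gb_s(128, 183))  # TNT2 Ultra:        ~2.9 GB/s

In other words, despite being a far more powerful chip, the GeForce actually shipped with slightly less raw memory bandwidth to feed it than the card it was meant to replace.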