geforce3 retro upgrade blog post: typos ed

This commit is contained in:
Wouter Groeneveld 2021-01-28 12:33:43 +01:00
parent 13af751f34
commit 73175e8f0c
1 changed files with 4 additions and 4 deletions


![](../geforce3.jpg "The MSI/Medion GeForce 3 Ti200 card.")
The third generation of Nvidia's GeForce graphics processing units, the GeForce 3, was released at the beginning of 2001. I'm pretty sure most people by then had moved on from Windows 98 to (hopefully) Windows 2000 - thereby safely sidestepping the disaster that was Windows Me (or Millennium). Compared to the Riva TNT2, a chip also by Nvidia from early 1999, the GeForce 3 cards were beasts - yet only two years passed since the TNT2 release!
The TNT2 was never intended to be a big performer: the 1998 3Dfx Voodoo 2 and Voodoo 3 outperformed it in certain games. That is because the graphics APIs had not yet fully matured: you had your **Glide** drivers, which worked well with 3Dfx cards, your **OpenGL** ones, and Microsoft's **Direct3D**. Some games could be patched to work better with a particular API, but most were pretty hard-wired. As a gamer with a Voodoo card, you'd still be forced to periodically upgrade the card in your AGP slot if you wanted access to all games. The Voodoo cards slotted into PCI ports, meaning you could mix and match, or even run them in SLI, a popular configuration in the Retro PC scene at [VOGONS](https://vogons.org). That is, if you're prepared to drop more than `€140` apiece - [even nowadays](https://www.benl.ebay.be/sch/i.html?_from=R40&_trksid=m570.l1313&_nkw=voodoo+2&_sacat=0).
I have fond memories of the floating 3Dfx logo. I remember my dad flashing our Voodoo card in order to overclock it, an attempt that ended with smoke coming out of a capacitor. We brought it back to the store, and when the guy behind the counter asked whether or not we had flashed the thing, my dad said "of course not!". We promptly got a new one. However, until I can get hold of a Voodoo card at a fair price, I decided to resort to Nvidia's budget version of the GeForce 3 instead, which I found for only `€30`. The Ti 200 was basically a pumped-up GeForce 2 with `64MB` of memory and a clock rate of `200MHz`. To me, it felt a bit more historically accurate than a high-end Ti 500 - although you can [overclock the card](https://www.philscomputerlab.com/geforce3-ti-200.html) by adding active cooling. As a fan of _silent_ PCs, I was keen on keeping the cooling as passive as possible. The result is a massive (well, for that time) black block on top of the GPU:
![](../geforcevsriva.jpg "The MSI/Medion GeForce 3 Ti200 card.")
![](../wiz8.jpg "Wizardry 8: stranded on a beach, dangerous crabs nearby!")
Wiz8 was never the prettiest game of them all, and its development was riddled with more than a few hiccups and tumbles. Sadly, Sir-Tech Canada eventually closed its doors, so an official sequel is out of the question. There aren't a lot of graphics options to play with, and after choosing the OpenGL API drivers, all I can say is that both cards pull off rendering scenes in Wizardry 8 quite well. In and around the monastery, the party's starting location, the TNT2 pushes frames to `19` FPS, while the GeForce 3 almost quadruples this to `75` FPS. Since this is not an action-packed, frenetic shooting gallery like Quake 3 or Unreal Tournament, the semi-low frame rate is something you don't even notice. It played fine on the TNT2, and it plays fine on the GeForce 3.
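If you want to sanity-check that "almost quadruples" claim, a one-liner does it - the frame rates are the ones measured above, and the script itself is just an illustrative sketch:

```python
# Frame rates measured in and around the monastery (figures from the text above)
tnt2_fps = 19
geforce3_fps = 75

# Relative speedup of the GeForce 3 Ti 200 over the Riva TNT2
speedup = geforce3_fps / tnt2_fps
print(f"GeForce 3 vs TNT2: {speedup:.2f}x")  # 3.95x - just shy of a fourfold jump
```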
## So, was it worth it?