Just when you thought it was safe to go back into your fave computer parts store, you are once again in danger of an upgrade. NVIDIA is on its way to hammering the store shelves with new product. While this is not totally new, it will kind of remind you of a GeForce2 GTS...on steroids. Yeppers, the GeForce2 Ultra is here and we take it for a spin.
The Features:
While we have certainly had our differences with NVIDIA in the past, we cannot ignore the fact that they produce what are arguably the best mass-production Vid Cards on the planet. NVIDIA committed to shorter product cycles over a year ago and they have kept their word, churning out newer and better technology along the way. Instead of sitting here going over specs you are most likely already familiar with, I will just let NVIDIA say it for us...
"NVIDIA's latest addition to the GeForce2 family of GPUs is quite simply the best graphics solution available to gaming and multimedia enthusiasts. GeForce2 UltraTM delivers the high visual quality you've come to expect from NVIDIA -- complete with second generation transform and lighting, real-time per-pixel shading, and stellar high-definition video processing capabilities. It is also the fastest GPU available -- processing 1 gigapixel (1 billion pixels) per second, 2 gigatexels (2 billion texels) per second, and 31 million triangles per second. Be sure to take advantage of the Detonator 3 drivers to maximize the performance of your GeForce2 Ultra."
More TnL, more PPS, more Hi Def Video. That is all well and good, but what exactly do these things mean to you RIGHT NOW? Let's elaborate.
There has been much speculation and discussion over Hardware Transform and Lighting, referred to here as "TnL". In a nutshell, if the Hardware TnL on the NVIDIA GPU is taken advantage of, the Vid Card does some of the work that is normally done by the CPU. When the GPU takes over the load that is dedicated to "showing you what you see on your monitor", the CPU has more time to calculate Artificial Intelligence and other needed functions that make the game work properly. So if the programmers of the games you play code with TnL in mind, it is highly possible that your game will run better and faster on a GeForce or GeForce2 based card. If TnL was not taken into account when the game engine was put together, it is possible that no performance increase will be seen. To see the kind of work we are talking about, check the little sketch below.
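For the curious, here is a rough sketch of our own (not NVIDIA's, and every number in it is made up) showing the per-vertex grunt work in question. When a game leans on hardware TnL, math like this moves off the CPU and onto the GPU:
[code]
/* Rough illustration of the per-vertex work hardware TnL can take off the
 * CPU: transform each vertex by a 4x4 matrix, then do a simple directional
 * diffuse lighting calculation. All values are made up for illustration. */
#include <stdio.h>

typedef struct { float x, y, z; } Vec3;

/* Transform a point by a column-major 4x4 matrix (w assumed to be 1). */
static Vec3 transform(const float m[16], Vec3 v)
{
    Vec3 out;
    out.x = m[0]*v.x + m[4]*v.y + m[8]*v.z  + m[12];
    out.y = m[1]*v.x + m[5]*v.y + m[9]*v.z  + m[13];
    out.z = m[2]*v.x + m[6]*v.y + m[10]*v.z + m[14];
    return out;
}

/* Simple per-vertex diffuse term: N dot L, clamped at zero. */
static float diffuse(Vec3 n, Vec3 l)
{
    float d = n.x*l.x + n.y*l.y + n.z*l.z;
    return d > 0.0f ? d : 0.0f;
}

int main(void)
{
    /* Modelview matrix: identity plus a small translation down -Z. */
    float mv[16] = { 1,0,0,0,  0,1,0,0,  0,0,1,0,  0.5f,0.0f,-2.0f,1 };
    Vec3 verts[3]   = { {0,0,0}, {1,0,0}, {0,1,0} };  /* one lonely triangle */
    Vec3 normals[3] = { {0,0,1}, {0,0,1}, {0,0,1} };  /* facing the viewer   */
    Vec3 light      = { 0.0f, 0.0f, 1.0f };           /* directional light   */

    for (int i = 0; i < 3; i++) {
        Vec3 p = transform(mv, verts[i]);
        printf("vertex %d -> (%.2f, %.2f, %.2f), diffuse %.2f\n",
               i, p.x, p.y, p.z, diffuse(normals[i], light));
    }
    return 0;
}
[/code]
Multiply that by every vertex in every frame and you can see why handing it all to the GPU frees up a nice chunk of CPU time.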
Per Pixel Shading is much the same story, if not more so. It is really a "future-feature" that NVIDIA is hoping will be taken advantage of by game developers. We have witnessed the demos and there is NOT A DOUBT in my mind that the feature can help render close-to-lifelike textures and bump maps. To my knowledge at the time of writing this, there are no fully completed games available through retail channels that take advantage of PPS. Taking into account that the NVIDIA GeForce and GeForce2 cards are a dominating force in the high-end retail market, we are seeing more and more TnL-enabled games come to fruition. It is highly likely that we will see games that take advantage of Per Pixel Shading gracing the store shelves soon, but until then Per Pixel Shading is a dormant feature waiting to be utilized. To get a feel for what "per-pixel" actually means, have a look at the toy example below.
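This is another sketch of our own (the bump map and light are completely made up) just to show the idea: instead of computing one lighting value per vertex and smearing it across the triangle, the N dot L math gets repeated for every single pixel using a per-pixel normal, which is exactly the sort of thing a bump map feeds it:
[code]
/* Toy illustration of per-pixel (dot3 style) lighting: every pixel gets its
 * own normal from a tiny "bump map" and its own N dot L calculation, instead
 * of one lighting value per vertex. All data is made up for illustration. */
#include <stdio.h>
#include <math.h>

#define W 4
#define H 4

int main(void)
{
    /* Fake height field standing in for a bump map. */
    float height[H][W] = {
        {0.0f, 0.1f, 0.2f, 0.1f},
        {0.1f, 0.3f, 0.4f, 0.2f},
        {0.2f, 0.4f, 0.5f, 0.3f},
        {0.1f, 0.2f, 0.3f, 0.1f},
    };
    float light[3] = { 0.3f, 0.3f, 0.9f };   /* directional light, roughly overhead */

    /* Normalize the light vector once. */
    float len = sqrtf(light[0]*light[0] + light[1]*light[1] + light[2]*light[2]);
    light[0] /= len; light[1] /= len; light[2] /= len;

    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) {
            /* Derive a per-pixel normal from the height differences. */
            float dx = (x + 1 < W ? height[y][x + 1] : height[y][x]) - height[y][x];
            float dy = (y + 1 < H ? height[y + 1][x] : height[y][x]) - height[y][x];
            float n[3] = { -dx, -dy, 1.0f };
            float nl = sqrtf(n[0]*n[0] + n[1]*n[1] + n[2]*n[2]);
            n[0] /= nl; n[1] /= nl; n[2] /= nl;

            /* Per-pixel N dot L, clamped at zero -- the "dot3" part. */
            float d = n[0]*light[0] + n[1]*light[1] + n[2]*light[2];
            if (d < 0.0f) d = 0.0f;
            printf("%4.2f ", d);
        }
        printf("\n");
    }
    return 0;
}
[/code]
The hardware does this per pixel, per frame, at fill-rate speeds; the point of the sketch is only to show what gets computed, not how fast.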
So it leaves one with the old "Chicken or the Egg" argument. Surely, if manufacturers don't add these features to their cards, game developers are NOT going to add the features to their games in the hope that someday, someone will make a card that can use them. Game developers and chipset manufacturers have been working much more closely over the last couple of years to ensure the "best" for the consumer. The "best" in this case being features and eye candy that can actually be utilized.
Please don't take this as negative criticism of NVIDIA; I am simply explaining how I happen to see it. Without companies like NVIDIA pushing the envelope all the time, it is highly likely that we would all still be playing in software mode.
The Hardware:
NVIDIA was kind enough to send us a GeForce2 Ultra engineering sample to play with. Please understand that this card does not totally represent any specific card you are likely to find for sale at Best Buy or CompUSA. Surely though, many of the GeForce2 Ultra cards for sale WILL BE BASED on the NVIDIA reference design, so our card should be a close representation. One place that our card may differ greatly from yours is its OverClocking abilities. You might get one that will not come close to our levels of OCing; then again, you might get one that makes us look like a bunch of wussies...and I actually hope you do!
We have not gone to the extremes we might have wanted to with this card due to the fact that NVIDIA wants it back. I just did not feel quite right tearing their card up and sending it back to Derek in a box with a thank you note when it was obvious that the card had been abused. (They would have just seen it here anyway.) Also, we don't want to weaken our newfound relationship with NVIDIA by damaging a card I know they need for other reviewers. What would those NVIDIA fan sites say if they knew they had a card that had been through the [H]ard|OCP wringer?


Ain't it purty? All green with a big green heatsink, and they even slapped on some heatsinks for the RAM. Notice the back is conspicuously devoid of any RAM; do we smell a 128Meg version down the line somewhere? I don't know, but I am pretty happy with the 64Megs on the card now. Like I said, I did not want to damage the card, so I asked NVIDIA what speed RAM was on the card instead of prying off the heatsinks. This was their response: "GeForce2 Ultra uses 4ns (aka "-4") memory but we set the memory clock speed at 230MHz (-4 implies 250MHz) which doubles to 460MHz effective bandwidth because the memory is DDR."
The core came clocked stock at a healthy 250MHz. So we got 250MHz on the core and 460MHz (DDR) on the memory. Shucks Vern, that there just ain't fast enough, we need to OverClock this mutha...
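If you want to see where those numbers come from, the quick napkin math below works through it: 4ns memory rates out at 250MHz (1 divided by 4ns), NVIDIA runs it at 230MHz, DDR moves data on both edges of the clock for a 460MHz effective rate, and assuming the 128-bit memory bus NVIDIA specs for the GeForce2 family (we did not measure it ourselves), that lands at roughly 7.4GB/sec of raw bandwidth:
[code]
/* Back-of-the-envelope memory math for the GeForce2 Ultra, assuming the
 * GeForce2 family's 128-bit DDR memory interface. */
#include <stdio.h>

int main(void)
{
    double cycle_ns   = 4.0;                  /* "-4" rated memory          */
    double rated_mhz  = 1000.0 / cycle_ns;    /* 4ns -> 250MHz rating       */
    double actual_mhz = 230.0;                /* NVIDIA's chosen clock      */
    double ddr_mhz    = actual_mhz * 2.0;     /* DDR: both clock edges      */
    double bus_bits   = 128.0;                /* assumed memory bus width   */

    /* Bytes per second = effective clock * bus width in bytes. */
    double bandwidth_gb = ddr_mhz * 1e6 * (bus_bits / 8.0) / 1e9;

    printf("rated:     %.0f MHz\n", rated_mhz);
    printf("effective: %.0f MHz (DDR)\n", ddr_mhz);
    printf("bandwidth: %.2f GB/sec\n", bandwidth_gb);
    return 0;
}
[/code]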
Overclocking:
I figure we just might as well get right to the point. With a little bit of cooling, and by that I mean a 30CFM blower (thanks to Outside Loop) blowing on the back of the card, we were able to reach the 315MHz core mark and 500MHz Memory mark.


[img]http://members.home.nl/grooten/kast.jpg[/img]
Notice the third picture shows the temperature of the back of the PCB where the GeForce2 Ultra chipset is attached. The simple act of placing the blower where it could cool the area dropped the surface temp by 20°F. Note below that we could successfully run the card at a core speed of 315MHz, but this produced some very bad artifacting. Bringing the core back to 305MHz allowed us to run our benchmarks without any glaringly obvious problems. (Here is an old TNT2 review we did that uses an aluminum riser to put a heatsink on the back of a Vid Card; we thought this might give you some ideas. Also, here is the link to the temp gun, because many of you will want one of these soon.)
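Just to put that clock bump in perspective, here is the theoretical fill-rate arithmetic, assuming the GeForce2's four pixel pipelines with two texture units each (which is exactly what produces the 1 gigapixel / 2 gigatexel figures NVIDIA quotes at 250MHz). Napkin math, not a benchmark:
[code]
/* Napkin math: theoretical fill rate at stock and overclocked core speeds,
 * assuming the GeForce2's 4 pixel pipelines with 2 texture units each. */
#include <stdio.h>

static void fillrate(const char *label, double core_mhz)
{
    double pipes = 4.0;                        /* assumed pixel pipelines    */
    double tmus_per_pipe = 2.0;                /* assumed texture units/pipe */
    double mpixels = core_mhz * pipes;                 /* Mpixels/sec        */
    double mtexels = mpixels * tmus_per_pipe;          /* Mtexels/sec        */
    printf("%s: %.0f MHz core -> %.2f Gpixels/sec, %.2f Gtexels/sec\n",
           label, core_mhz, mpixels / 1000.0, mtexels / 1000.0);
}

int main(void)
{
    fillrate("stock      ", 250.0);   /* matches NVIDIA's quoted 1 Gpix / 2 Gtex */
    fillrate("overclocked", 305.0);   /* the stable speed we settled on          */
    return 0;
}
[/code]
So the 305MHz core we settled on is good for roughly 1.22 gigapixels and 2.44 gigatexels per second on paper, whether or not the memory can keep up with it.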
[img]http://members.home.nl/grooten/kast.jpg[/img]
This is a screenshot from the NVIDIA 6.18 Detonator 3 drivers; by default, this OC panel is NOT activated. Grab this file, called GeForce_OC.reg, and simply run it from your desktop. It will make the needed registry entries for you; then reboot the system and you should be able to access the OC panel. If you hose your entire system doing this, it is NOT our fault. Use at your own risk! Thanks to NVIDIA for the Easter Egg; we know you guys deep down WANT us to OC the cards.
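We have not torn the .reg file apart, but tweaks like this one generally just flip the well-known "CoolBits" value in the driver's registry branch. Purely as an illustration of what GeForce_OC.reg most likely does (the exact key path and value here are our assumption, NOT an official NVIDIA recipe), here is a little Win32 sketch that sets it from C. The same warning applies: registry edits are at your own risk.
[code]
/* Illustrative only: sets the widely-reported "CoolBits" value said to
 * unlock the Detonator clock panel. The key path and value are assumptions
 * about what GeForce_OC.reg does, not an official NVIDIA recipe.
 * Registry edits are at your own risk. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HKEY  key;
    DWORD coolbits = 3;   /* assumed value that enables the clock controls */
    LONG  rc = RegCreateKeyExA(HKEY_LOCAL_MACHINE,
                               "SOFTWARE\\NVIDIA Corporation\\Global\\NVTweak",
                               0, NULL, REG_OPTION_NON_VOLATILE,
                               KEY_SET_VALUE, NULL, &key, NULL);
    if (rc != ERROR_SUCCESS) {
        printf("could not open/create the NVTweak key (error %ld)\n", rc);
        return 1;
    }
    rc = RegSetValueExA(key, "CoolBits", 0, REG_DWORD,
                        (const BYTE *)&coolbits, sizeof(coolbits));
    RegCloseKey(key);
    printf(rc == ERROR_SUCCESS ? "CoolBits set -- reboot and check the panel\n"
                               : "failed to write CoolBits\n");
    return rc == ERROR_SUCCESS ? 0 : 1;
}
[/code]
Either way you do it, the point is the same: one registry value and a reboot, and the clock sliders show up.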
We were very impressed with the card's OC ability, but did it really give us anything in return?
Benchmarks:
All these benchmarks were run using Det 3 ver 6.18 drivers, Win98SE, an ABIT KT7-RAID, a 1.1GHz TBird or a 750MHz TBird, 128Megs of Infineon SDRAM from AZZO.Com, and of course an Engineering Sample of a GeForce2 Ultra supplied by NVIDIA.
Gotta say I about fell out of my chair when I saw that default 750MHz benchmark finish. WOW! 7K+ out of it. Our OCing of the card gave us an extra 300 marks, which is surely NOT shabby. The 1GHz mark is equally impressive, and OCing the card there gave us an extra 400+ 3D Marks. Certainly worth the effort considering all we did was point a blower at the card to keep it a bit cooler.
Now take a look at the FSAA marks; yep, they are the ones with the little "AA" in the line. No Alcoholics Anonymous jokes please. (Those bastards get feisty once they get a few drinks in them!) This is where I was most impressed by the GeForce2 Ultra AND the Detonator 3 drivers. The GeForce2 Ultra at 750MHz outscored our Voodoo 5 5500 AGP at 1GHz in the FSAA tests. Keep in mind this is on an AMD board and I think the Voodoo is still having some 3DNow optimization issues, but that is just my opinion. I still don't think 3DMark is a good bench to use ACROSS platforms, and we usually would not even mention it, but seeing the AA scores as high as they are got us seriously thinking that NVIDIA is finally going to be a player in the FSAA market and not just talk about it.

.................................................
ChukE ( http://www.cultdeadcow.com )