GeForce2 Ultra


Anoniem: 5295

Topicstarter
Just when you thought it was safe to go back into your fave computer parts store, you are once again in danger of an upgrade. NVIDIA is on the way to hammering the store shelves with new product once again. While this is not totally new, it will kind of remind you of a GeForce2 GTS...on steroids. Yeppers, the GeForce2 Ultra is here and we take it for a spin.

The Features:

While we have certainly had our differences with NVIDIA in the past, we cannot ignore the fact that they produce what are arguably the best mass-production Vid Cards on the planet. NVIDIA committed to shorter product cycles over a year ago and they have kept their word, churning out newer and better technology along the way. Instead of sitting here going over specs you are most likely already familiar with, I will just let NVIDIA say it for us...

"NVIDIA's latest addition to the GeForce2 family of GPUs is quite simply the best graphics solution available to gaming and multimedia enthusiasts. GeForce2 UltraTM delivers the high visual quality you've come to expect from NVIDIA -- complete with second generation transform and lighting, real-time per-pixel shading, and stellar high-definition video processing capabilities. It is also the fastest GPU available -- processing 1 gigapixel (1 billion pixels) per second, 2 gigatexels (2 billion texels) per second, and 31 million triangles per second. Be sure to take advantage of the Detonator 3 drivers to maximize the performance of your GeForce2 Ultra."


More TnL, more PPS, more Hi Def Video. That is all well and good, but what exactly do these things mean to you RIGHT NOW? Let's elaborate.

There has been much speculation and discussion over Hardware Transform and Lighting, referred to here as "TnL". In a nutshell, if the Hardware TnL on the NVIDIA GPU is taken advantage of, the Vid Card does some of the work that is normally done by the CPU. When the GPU takes over the load dedicated to "showing you what you see on your monitor", the CPU has more time to calculate Artificial Intelligence and other functions needed to make the game work properly. So if the programmers of the games you play code with TnL in mind, it is highly possible that your game will run better and faster on a GeForce or GeForce2 based card. If TnL was not taken into account when the game engine was put together, it is possible that no performance increase will be seen.
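To make that concrete, here is a minimal, purely illustrative OpenGL 1.x sketch in C (not taken from the review; function names are standard OpenGL 1.1): an engine that submits raw, untransformed vertices and lets the fixed-function pipeline handle the modelview transform and lighting can have that math done on-chip by a T&L part like the GeForce, while an engine that transforms and lights every vertex on the CPU gets no benefit from the hardware.

/* Hypothetical sketch: let the fixed-function pipeline (and thus a hardware
 * T&L GPU) do the transform and lighting instead of the CPU.
 * Assumes an OpenGL 1.x context is already current. */
#include <GL/gl.h>

void draw_mesh(const float *verts, const float *normals, int vertex_count,
               float angle_deg)
{
    /* Describe the scene transform; the driver/GPU applies it per vertex. */
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(0.0f, 0.0f, -5.0f);
    glRotatef(angle_deg, 0.0f, 1.0f, 0.0f);

    /* Enable fixed-function lighting; on T&L hardware the per-vertex
     * lighting is then computed by the GPU rather than by the CPU. */
    GLfloat light_pos[4] = { 1.0f, 1.0f, 1.0f, 0.0f };
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glLightfv(GL_LIGHT0, GL_POSITION, light_pos);

    /* Submit raw, untransformed object-space vertices and normals. */
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_NORMAL_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, verts);
    glNormalPointer(GL_FLOAT, 0, normals);
    glDrawArrays(GL_TRIANGLES, 0, vertex_count);
    glDisableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_NORMAL_ARRAY);
}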

Per-Pixel Shading is much the same, if not more so. It is really a "future feature" that NVIDIA is hoping will be taken advantage of by game developers. We have witnessed the demos and there is NOT A DOUBT in my mind that the feature can help render close-to-lifelike textures and bump maps. To my knowledge at the time of writing, there are no fully completed games available through retail channels that take advantage of PPS. Now, taking into account that the NVIDIA GeForce and GeForce2 cards are a dominating force in the high-end retail market, we are seeing more and more TnL-enabled games come to fruition. It is highly likely that we will see games that take advantage of Per-Pixel Shading gracing the store shelves soon, but until then Per-Pixel Shading is a dormant feature waiting to be utilized.
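For a feel of what "per-pixel" means in practice, here is a small, purely illustrative C sketch of dot3 bump-mapped lighting done in software. The same dot product between a per-pixel normal (from a normal map) and the light direction is the kind of math the card's per-pixel shading hardware is meant to evaluate for every pixel it draws, instead of interpolating a single per-vertex result; this is not the card's actual API, just the idea.

/* Illustrative software version of dot3 per-pixel lighting. */
typedef struct { float x, y, z; } vec3;

static float dot3(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* normal_map holds one unit-length normal per texel, width*height entries. */
void shade_per_pixel(const vec3 *normal_map, float *out_intensity,
                     int width, int height, vec3 light_dir)
{
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            vec3 n = normal_map[y * width + x];
            float ndotl = dot3(n, light_dir);
            if (ndotl < 0.0f) ndotl = 0.0f;       /* clamp back-facing light */
            out_intensity[y * width + x] = ndotl; /* diffuse term per pixel  */
        }
    }
}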

So it leaves one with the old "chicken or the egg" argument. If manufacturers don't add these features to the cards, it is highly likely that game developers will NOT add the features to their games in the hope that someday, someone will make a card that uses them. Game developers and chipset manufacturers have been working much more closely in the last couple of years to ensure the "best" for the consumer -- the "best" in this case being features and eyecandy that can actually be utilized.

Please don't take this as negative criticism of NVIDIA; I am simply explaining how I happen to see it. Without companies like NVIDIA pushing that envelope all the time, it is highly likely that we would all still be playing in software mode.

The Hardware:

NVIDIA was kind enough to send us a GeForce2 Ultra engineering sample to play with. Please understand that this card does not totally represent any specific card you are likely to find for sale at Best Buy or CompUSA. Surely though, many of the GeForce2 Ultra cards for sale WILL BE BASED on the NVIDIA reference design, so our card should be a close representation. One place where our card may differ greatly from yours is its OverClocking abilities. You might get one that will not come close to our levels of OCing; then again, you might get one that makes us look like a bunch of wussies... and I actually hope you do! :)

We have not gone to any extremes that we might have wanted to with this card, due to the fact that NVIDIA wants it back. I just did not feel quite right tearing their card up and sending it back to Derek in a box with a thank-you note when it was obvious that the card had been abused. (They would have just seen it here anyway.) Also, we do not want to weaken our newly found relationship with NVIDIA by damaging the card when I know they need it for other reviewers. What would those NVIDIA fan sites say if they knew they had a card that had been through the [H]ard|OCP wringer? :)


Afbeeldingslocatie: http://www.hardocp.com/reviews/vidcards/gf2u/gf2cardensamp-sm.jpg
Afbeeldingslocatie: http://www.hardocp.com/reviews/vidcards/gf2u/gf2back-sm.jpg
Ain't it purty? All green with a big green heatsink, and they even slapped on some heatsinks for the RAM. Notice the back is conspicuously void of any RAM; do we smell a 128Meg version down the line somewhere? I don't know, but I am pretty happy with the 64Megs on the card now. Like I said, I did not want to damage the card, so instead of prying off the heatsinks I asked NVIDIA what speed RAM is on the card. This was their response: "GeForce2 Ultra uses 4ns (aka "-4") memory but we set the memory clock speed at 230MHz (-4 implies 250MHz) which doubles to 460MHz effective bandwidth because the memory is DDR."
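As a quick sanity check on those numbers, here is a small sketch of the arithmetic. The 128-bit memory bus width is our assumption about the board, not something stated in NVIDIA's quote: 230MHz DDR gives a 460MHz effective data rate, and at 16 bytes per effective clock that works out to roughly 7.4GB/s of peak memory bandwidth.

/* Back-of-the-envelope memory bandwidth check for the quoted clocks.
 * Assumption (not from NVIDIA's quote): a 128-bit (16-byte) memory bus. */
#include <stdio.h>

int main(void)
{
    double mem_clock_mhz   = 230.0;               /* physical DDR clock       */
    double effective_mhz   = mem_clock_mhz * 2;   /* DDR: two transfers/clock */
    double bus_width_bytes = 128.0 / 8.0;         /* assumed 128-bit bus      */

    double bandwidth_gbs = effective_mhz * 1e6 * bus_width_bytes / 1e9;

    printf("Effective data rate : %.0f MHz\n", effective_mhz);   /* 460 MHz  */
    printf("Peak bandwidth      : %.2f GB/s\n", bandwidth_gbs);  /* ~7.36    */
    return 0;
}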

The core came clocked stock at a healthy 250MHz. So we got 250MHz on the core and 460MHz (DDR) on the memory. Shucks Vern, that there just ain't fast enough, we need to OverClock this mutha...

Overclocking:

I figure we just might as well get right to the point. With a little bit of cooling, and by that I mean a 30CFM blower (thanks to Outside Loop) blowing on the back of the card, we were able to reach the 315MHz core mark and 500MHz Memory mark.

Afbeeldingslocatie: http://www.hardocp.com/reviews/vidcards/gf2u/ccooling1-sm.jpg

Afbeeldingslocatie: http://www.hardocp.com/reviews/vidcards/gf2u/ccooling2-sm.jpg

Afbeeldingslocatie: http://members.home.nl/grooten/kast.jpg

Notice the third picture shows the temperature of the back of the PCB where the GeForce2 Ultra chipset is attached. The simple act of placing the blower where it could cool the area dropped the surface temp by 20ºF (about 11ºC). Note that while we could run the card at a core speed of 315MHz, this produced some very bad artifacting. Bringing the core back to 305MHz allowed us to run our benchmarks without any glaringly obvious problems. (Here is an old TNT2 review we did that uses an aluminum riser that allows us to put a heatsink on the back of a Vid Card; this might give you some ideas. Also, here is the link to the temp gun, because many of you will want one of these soon.)

This is a screenshot from the NVIDIA 6.18 Detonator 3 drivers; by default, this OC panel is NOT activated. Grab this file, called GeForce_OC.reg, and simply run it from your desktop. It will make the needed registry entries for you and then reboot the system. You should then be able to access the OC panel. If you hose your entire system doing this, it is NOT our fault. Use at your own risk! Thanks to NVIDIA for the Easter Egg; we know you guys deep down WANT us to OC the cards. :)
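We can't see inside GeForce_OC.reg from here, but the tweak that usually unlocks this panel in the Detonator drivers is the widely circulated "Coolbits" registry value. Below is a hedged Win32/C sketch of the change such a file most likely makes; the key path and value are assumptions based on that common tweak, not taken from the review, and editing the registry is at your own risk.

/* Hedged sketch of the common "Coolbits" tweak believed to unlock the
 * Detonator overclocking panel. Key path and value are the widely
 * circulated ones, not confirmed by the review. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HKEY  key;
    DWORD coolbits = 3;  /* assumed value that enables the clock controls */

    LONG rc = RegCreateKeyExA(HKEY_LOCAL_MACHINE,
                              "SOFTWARE\\NVIDIA Corporation\\Global\\NVTweak",
                              0, NULL, REG_OPTION_NON_VOLATILE,
                              KEY_SET_VALUE, NULL, &key, NULL);
    if (rc != ERROR_SUCCESS) {
        printf("Could not open/create the NVTweak key (error %ld)\n", rc);
        return 1;
    }
    rc = RegSetValueExA(key, "Coolbits", 0, REG_DWORD,
                        (const BYTE *)&coolbits, sizeof(coolbits));
    RegCloseKey(key);
    printf(rc == ERROR_SUCCESS ? "Coolbits set; reboot to see the OC panel\n"
                               : "Failed to set Coolbits\n");
    return rc == ERROR_SUCCESS ? 0 : 1;
}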

We were very impressed with the card's OC ability, but did it really give us anything in return?

Benchmarks:

All these benchmarks were run using Det 3 ver 6.18 drivers, Win98SE, an ABIT KT7-RAID, a 1.1GHz TBird or 750MHz TBird, 128Megs of Infineon SDRAM from AZZO.Com, and of course an Engineering Sample of a GeForce2 Ultra supplied by NVIDIA.


Gotta say I about fell out of my chair when I saw that default 750MHz benchmark finish. WOW! 7K+ out of it. Our OCing of the card did give us an extra 300 marks, which is surely NOT shabby. The 1GHz mark is equally impressive, and OCing the card gave us an extra 400+ 3DMarks. Certainly worth the effort considering all we did was point a blower at the card to keep it a bit cooler.

Now take a look at the FSAA marks; yep, they are the ones with the little "AA" in the line. No Alcoholics Anonymous jokes please. (Those bastards get feisty once they get a few drinks in them!) This is where I was most impressed by the GeForce2 Ultra AND the Detonator 3 drivers. The GeForce2 Ultra at 750MHz outscored our Voodoo 5 5500 AGP at 1GHz in the FSAA tests. Keep in mind this is on an AMD board, and I think Voodoo is still having some 3DNow optimization issues, but this is just my opinion. I still don't think 3DMark is a good bench to use ACROSS platforms, and we usually would not even mention it, but seeing the AA scores as high as they are got us seriously thinking/knowing that NVIDIA is finally going to be a player in the FSAA market and not just talk about it.

Afbeeldingslocatie: http://www.hardocp.com/reviews/vidcards/gf2u/3dmarks.gif

.................................................
ChukE ( http://www.cultdeadcow.com )


  • GarBaGe
  • Registered: December 1999
  • Last online: 18:36
F**k, just look at the difference between (OC'd vs. non-OC'd at 750 MHz) and (OC'd vs. non-OC'd at 1.1 GHz). There is only one conclusion I can draw from that:
a gigahertz-class CPU gives an Xtreme AA improvement, especially with non-OC'd cards. Then again, who hasn't overclocked their GF by now? :)

Ryzen9 5900X; 16GB DDR4-3200 ; RTX-4080S ; 7TB SSD



Anoniem: 5295

Topicstarter
Yeah, clever huh... haha. Ah well, this little card isn't too bad after all.

.................................................
ChukE ( http://www.cultdeadcow.com )


  • smokey
  • Registered: October 1999
  • Last online: 03-07 20:28
8648 marks, mmm, nice score :p


Anoniem: 5295

Topicstarter
Afbeeldingslocatie: http://www.tweak3d.net/reviews/nvidia/geforce2ultra/1.gif
Introduction

I knew there would be a GeForce 2 Ultra! Maybe it was obvious, but just like with the TNT brand, the old saying comes to mind... "if it ain't broke, don't fix it!"
Nvidia's TNT chip did great. So later, Nvidia made a TNT2. They probably didn't want to design a whole new chip after the TNT2, but they didn't want to make a TNT3 either, so they made a TNT2 Ultra. We saw the same success from the GeForce chip as we did with the TNT, so a GeForce 2 was obvious. And of course, there would be a GeForce 2 Ultra. Okay, okay... I'm rambling now. I've got 16 hours to test and write about this new chip from Nvidia, so I had better get started.

Do we really need a faster video card than a GeForce 2 GTS right now? I mean, the damn thing screams in nearly everything you can throw at it. It is still leaps and bounds beyond what the average gamer needs, and it's still faster than most of (if not all of) the competition. But have you played Deus Ex on the GeForce 2? What about Quake 3 at 1600x1200x32bpp? These games, with extreme details and/or at high resolutions, are the reason the GeForce 2 Ultra was created. This card is obviously meant to fulfill the needs of the true die-hard gamer. So what makes the GeForce 2 Ultra different from its little brother?
Afbeeldingslocatie: http://www.tweak3d.net/reviews/nvidia/geforce2ultra/2.jpg
Features: GeForce 2 GTS vs. GeForce 2 Ultra

Like the TNT2 vs. TNT2 Ultra, there is not a huge difference (feature-wise) between the GeForce 2 GTS and the GeForce 2 Ultra. Anyway:

                       GeForce 256      GeForce 2 GTS    GeForce 2 Ultra
Process Technology     .22 micron (µ)   .18 micron (µ)   Advanced .18 micron
Clocks (core/memory)   120/300*         200/333*         250/460*
Texels/clock           4                8                8
Million Pixels/second  480              800              1000
Million Texels/second  480              1600             2000
Polygons/sec           15 million       25 million       31 million
Framebuffer            16 MB            32 MB/64 MB      64 MB
Shading Processor      --               NSR              NSR

* Effective memory frequency via DDR memory

As you can see, the biggest difference between the GeForce 2 GTS and the GeForce 2 Ultra is the memory clock speed. Using extremely high-performance DDR memory, the GeForce 2 Ultra manages an impressive 460 MHz (230 MHz * 2) effective rate. The core clock is a more modest 25% faster than the GeForce 2 GTS, while the memory is the big improvement at over 38% faster. That core clock increase is still notable, however, because it puts the GeForce 2 Ultra at the one gigapixel/sec mark and the two gigatexel/sec mark.
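Those gigapixel and gigatexel figures fall straight out of the table above; as a quick check, here is a small sketch that recomputes them from the 250 MHz core clock, the 8 texels per clock in the table, and the 4 pixels per clock implied by the 1000 Mpixel/s figure.

/* Recompute the headline fillrate numbers from the feature table. */
#include <stdio.h>

int main(void)
{
    double core_mhz         = 250.0;
    double pixels_per_clock = 4.0;   /* implied: 1000 Mpix/s / 250 MHz */
    double texels_per_clock = 8.0;   /* from the feature table         */

    printf("Pixel fillrate : %.0f Mpixels/s\n", core_mhz * pixels_per_clock);
    printf("Texel fillrate : %.0f Mtexels/s\n", core_mhz * texels_per_clock);
    return 0;   /* expect 1000 Mpixels/s and 2000 Mtexels/s */
}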


So, if the memory is 38% faster than the GeForce 2 GTS and the core speed is 25% faster than the GeForce 2 GTS, how is this card going to perform? Well, before we begin with benchmarks, consider the following:

The biggest performance bottleneck on the GeForce 2 GTS was the memory. Now that the memory is significantly faster and the Detonator 3 drivers have been released (which also optimize the memory interface substantially), performance of the GeForce 2 Ultra can be nearly double that of the GeForce 2 GTS as it was at launch. This was Nvidia's goal: to match or break Moore's Law every six months instead of every two years. Even compared to the GeForce 2 GTS using the same drivers, the GeForce 2 Ultra should still be significantly faster due to the optimized memory and higher clock speeds. Like I said, this card was built for gamers. Did I mention that the card also comes with a 64 MB framebuffer, standard? Who needs this kind of speed? Gamers. Who else would need enough power to crank out 55 FPS in Quake 3 at 1600x1200x32bpp with all settings turned up? (More on that later!)

Subtle Differences

There are other small differences that can be seen on the face of the GeForce 2 Ultra that separate it from the GeForce 2 GTS. Here's a pic of the shiny green card:

Afbeeldingslocatie: http://www.tweak3d.net/reviews/nvidia/geforce2ultra/1s.jpg
There is no memory on the back of the card. All 64 MB is placed as eight chips on the front of the card. Another interesting feature that Nvidia decided to include in the reference design is the memory heatsinks. On my card these are black; in any case, since Nvidia added memory heatsinks to the reference design, we can expect to see the OEM versions with memory heatsinks as well.

On my card, the GPU's heatsink is bright Nvidia green, and the fan is rather small and dinky looking. Heat issues seem to be present: at a LAN party, with an open case, we noticed the card was very hot after only about an hour of play. The so-called "advanced .18 micron" process made 250 MHz stable, but we didn't even attempt to overclock the card with the default cooling since it was already running very hot.

Now, you're probably drooling for benchmarks right? Only 15 more pages of nonsense before we get to them!
Test System

We would have had a very large range of systems to test the GeForce 2 Ultra with, but unfortunately, we were restricted by a deadline and the fact that we only had one GeForce 2 Ultra to use for benchmarking. For these reasons I used my personal machine to test the GeForce 2 GTS and GeForce 2 Ultra:

Celeron 566 CPU @ 850 MHz (100 MHz FSB)
128 MB PC133 RAM @ true CAS2
Abit BF6 motherboard (BX)
Sound Blaster Live! Retail
Adaptec 29160 UWSCSI card
Windows 98
Quake3 fresh install/3DMark 2000 v1.1 fresh install/Creature demo fresh install.
Quake3 Timedemo 1 / Demo 1 was used for testing.
6.16 Detonator 3 drivers were used in all testing unless otherwise noted. Vertical Sync (VSYNC) was disabled.

Benchmarks: GeForce 2 vs. GeForce 2 Ultra

Since the time I had to write this preview was extremely limited, we used 3DMark 2000 to test Direct3D, and we used Quake 3 Arena to test OpenGL. We also used Nvidia's Creature demo to test OpenGL w/ T&L.


Direct3D Performance: 3DMark 2000

Due to the lack of a better, more consistent Direct3D benchmark, 3DMark 2000 v1.1 was used. The settings were at 1024x768x16bpp with 32-bit textures enabled. If we had more time for testing we could've tried 1600x1200 or the same benchmarks with FSAA -- maybe we'll do this at a later time.

GeForce 2 GTS using 5.33 drivers: 5227 3DMarks
GeForce 2 GTS using 6.16 drivers: 5387 3DMarks

GeForce 2 Ultra w/ 5.33 drivers*: 5268 3DMarks
GeForce 2 Ultra w/ 6.16 drivers: 5642 3DMarks

* -- The GeForce 2 Ultra with 5.33 drivers (pre-GeForce 2 Ultra release) was for a later article comparing Detonator 3 and older Detonators.

As the numbers show, 3DMark 2000 saw a small improvement from the GeForce 2 Ultra. That isn't to say Direct3D won't show major improvements over the GeForce 2. It's just that with this benchmark, the difference is hard to see.

I also had the time to play with the GeForce 2 Ultra in Unreal Tournament and in Motocross Madness 2. I am happy to report that in both games, the performance increase was substantial if using the 6.16 Detonator 3 drivers. MCM2 was fully playable at 4x4 FSAA (full-scene anti-aliasing) 1024x768 -- which is very good considering this game is a hog on some systems.


OpenGL Performance (1): Nvidia Creature Demo

One of Nvidia's latest tech demos, Creature, is an underwater scene with very impressive visuals. The scene features several schools of fish in the background and, of course, the creature. The demo runs at 1024x768 and has roughly 130,000 to 160,000 triangles in the scene at a time, with plenty of eye candy.

Afbeeldingslocatie: http://www.tweak3d.net/reviews/nvidia/geforce2ultra/5s.jpg
The nice thing about this tech demo, other than the visuals, is the frame rate and polygon/triangle count, which helps to test T&L performance. Here's how the GeForce 2 GTS stacked against the GeForce 2 Ultra in this test:
Afbeeldingslocatie: http://www.tweak3d.net/reviews/nvidia/geforce2ultra/2.gif
As you can tell from the graphs, the GeForce 2 Ultra is quite a bit faster than the GeForce 2 GTS in this OpenGL/T&L benchmark. Not only that, but the GeForce 2 Ultra's extra memory and even higher speed allow it to handle an increasing number of polygons with little performance hit, whereas the GeForce 2 GTS's performance was considerably lower with higher polygon counts. Perhaps the sudden drop in performance around 140,000 triangles is due to the limit of the GeForce 2 GTS's fillrate.
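To put those triangle numbers in context, here is a quick hypothetical budget (the 60 fps target is our assumption): at the quoted 31 million triangles per second, a frame drawn in a sixtieth of a second can hold roughly half a million triangles on paper, so the Creature demo's 130,000-160,000 triangles per frame still leave headroom.

/* Hypothetical per-frame triangle budget from the quoted peak setup rate. */
#include <stdio.h>

int main(void)
{
    double tris_per_second = 31e6;   /* GeForce 2 Ultra's quoted peak       */
    double fps             = 60.0;   /* assumed target frame rate           */
    double demo_tris       = 150e3;  /* roughly the Creature demo's scene   */

    double budget = tris_per_second / fps;
    printf("Triangle budget per frame at %.0f fps: %.0f\n", fps, budget);
    printf("Creature demo scene uses about %.0f (%.0f%% of that budget)\n",
           demo_tris, 100.0 * demo_tris / budget);
    return 0;
}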





Posted: August 14, 2000
Written by: Dan "Tweak Monkey" Kennedy


Benchmarks: GeForce 2 vs. GeForce 2 Ultra (cont.)

OpenGL Performance(2): Quake 3 Arena @ Normal Quality

Quake 3 Arena is the standard for testing OpenGL performance these days, so we decided to watch the GeForce 2 Ultra beat the pulp out of it. The initial step was to set the quality in the system options to "Normal" and then change the resolution and color depth to their respective settings. In this test, 16-bit color with 16-bit textures was used, at 1024x768 and 1600x1200. 1600x1200 was used because it's basically the highest playable resolution supported by a wide range of games, and 1024x768 was used for lower-end systems. There is no logical reason (from a gamer's standpoint) to test this card (or any current card) at anything lower, because the frame rates are so high that it doesn't prove a thing! -- that had to be said. Here's the benchmark graph/chart:

Afbeeldingslocatie: http://www.tweak3d.net/reviews/nvidia/geforce2ultra/3.gif
As you can see, the difference at 16-bit color is only significant at 1600x1200. But even the GeForce 2 runs this resolution with basically no slowdowns. Still, it's interesting to see how much faster the GeForce 2 Ultra is at this less-intensive test (less than high quality, that is). Bragging rights? Indeed.


OpenGL Performance(3): Quake 3 Arena @ High Quality

When Quake 3 first came out, it seemed like the high quality tests were a waste of time because on most systems they were very intensive, and rarely did a system reach an acceptable frame rate. I think Nvidia used the Quake 3 1600x1200x32bpp test to develop the Detonator 3s and the GeForce 2 Ultra, because together, they scream. Here's the graph:

Afbeeldingslocatie: http://www.tweak3d.net/reviews/nvidia/geforce2ultra/4.gif


The performance increase at 1024x768 wasn't too great, but the 1600x1200 benchmark was obviously much better. A 46% performance increase over the GeForce 2 GTS was seen -- and this was with the new drivers. With the 5.33 drivers, the GeForce 2 GTS scored only 27.8 FPS on my system. So yes, Nvidia did double the performance of the GeForce 2 GTS, if you look at it from that perspective. Remember folks, this card is geared toward gamers, and it sure shows it.


OpenGL Performance(4): Quake 3 Arena @ High Quality w/ FSAA

I had problems with FSAA up until the last minutes before I realized I had better start writing this preview, so I only had time to grab a couple quick benchmarks.

Afbeeldingslocatie: http://www.tweak3d.net/reviews/nvidia/geforce2ultra/5.gif
1600x1200x32bpp with 2x2 FSAA will automatically disable FSAA (I'm guessing because the 64 MB framebuffer is not adequate for such high settings). The frame rate is fair with 1.5x1.5 FSAA enabled, but 2x2 requires 800x600 or lower (on this system) before it runs smoothly.
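A rough memory count makes that guess plausible. The sketch below assumes 2x2 supersampling renders internally at 3200x2400 with a 32-bit color back buffer and a 32-bit depth buffer; the exact buffer layout is our assumption, not something documented by the driver.

/* Rough framebuffer budget for 1600x1200x32 with 2x2 supersampled FSAA. */
#include <stdio.h>

int main(void)
{
    const double MB = 1024.0 * 1024.0;
    double w = 1600 * 2, h = 1200 * 2;          /* 2x2 supersampled resolution */

    double back_color = w * h * 4 / MB;         /* 32-bit color back buffer    */
    double depth      = w * h * 4 / MB;         /* assumed 32-bit depth buffer */
    double front      = 1600.0 * 1200 * 4 / MB; /* displayed front buffer      */

    printf("Back buffer : %.1f MB\n", back_color);   /* ~29.3 MB */
    printf("Depth buffer: %.1f MB\n", depth);        /* ~29.3 MB */
    printf("Front buffer: %.1f MB\n", front);        /* ~7.3 MB  */
    printf("Total       : %.1f MB of 64 MB, before any textures\n",
           back_color + depth + front);              /* ~65.9 MB */
    return 0;
}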
Evaluation/Analysis

At $499 (estimated retail price), the GeForce 2 Ultra is a chip that's definitely targeted at hardcore gamers. Since the GeForce 2 GTS is already fast enough for most gamers, the GeForce 2 Ultra may seem useless... but it's not. Because it is so fast, you can now run a game at a higher resolution and detail level than before, with the same or better frame rate than you could get with a GeForce 2 GTS.

Afbeeldingslocatie: http://www.tweak3d.net/reviews/nvidia/geforce2ultra/3s.jpg
So, is NV16 (GeForce 2 Ultra) the ticket? Well, that all depends on if you're willing to spend the bucks to get the performance. For the price, a GeForce 2 GTS is a much better deal. But if you want full bragging rights or you've got the money to spare, by all means, indulge.

NV20

If you're waiting for NV20, take into consideration this e-mail, sent out by Nvidia PR rep, Diane Vanasse:
"Hi guys, I know many of you are curious to know what is going on with NV20 and we wanted to let you have some insight into our strategy. Our strategy is to rollout a new product every six months. Through these rollouts we maintain a steady moores law cubed performance gain. (2x every six months) We haven't missed a season yet.

This season, with NV16 we are following Moores Law cubed precisely. It is four times faster than NV10.

Regarding NV20, we time our new architectures based on the the availability of new process technology (.15 micron), new API's, (DX8) and the market's readiness to digest the new features. This all comes together next season.


Conclusion

Nvidia prides itself on being ahead of the competition. And in the case of the GeForce 2 Ultra, there is no doubt that it is currently the fastest chip available for 3D gaming. Expect to see the GeForce 2 Ultra in stores everywhere within a month or two. As always, thanks for reading.

.................................................
POWERED BY: ChukE ( http://www.cultdeadcow.com )


  • Speedy
  • Registered: June 2000
  • Last online: 20:22
So now a GeForce 2 GTS gets written off as a budget card :(
But those fps differences only show up at 1600x1200... and who actually plays their FPS games at that resolution?? And anyone playing other games will just get a Radeon. In short, this looks to me like a card that won't sell much (also because of its price).

Flickr | Lego collection



Anoniem: 5295

Topicstarter
The price of this card will drop within a very short time, because of the competition and its successors, the GeForce 3 NV20 and GeForce 3 NV25.

-------------------------------------------------
ChukE ( http://www.cultdeadcow.com )


  • smokey
  • Registered: October 1999
  • Last online: 03-07 20:28
Of course the resolution where it makes a difference (in this case 1600x1200) will come down over time. How silly is it to say that the card is only suited to 1600x1200? At that resolution you simply have a higher fillrate, or whatever other metric it is. In the future (with more demanding games) the Ultra only gets more interesting.


  • Speedy
  • Registered: June 2000
  • Last online: 20:22
That's what it's suited for right now, I mean. And looking ahead? In this era of PC stuff? A different video card every year, and within a year I don't really see the Ultra adding much value at the lower resolutions.

Flickr | Lego collection



Anoniem: 5646

I actually DO play (a lot) at 1600x1200x32...

and the real resource hogs at 1280x1024x32


  • Speedy
  • Registered: June 2000
  • Last online: 20:22
Yes, OK, but that's exactly what that 64MB card is for :), but the mainstream will never play at that resolution; I'm still gaming at 640x480 myself.

Eh, "never" in the sense of the coming year :)

Flickr | Lego collection



Anoniem: 5295

Topicstarter
In 2D the GeForce 2 Ultra won't come across much better than its brother, the GeForce 2.
Beyond that, this card will meet your requirements. On average it will come out ahead of the competition. Only price-wise will this card still be too expensive for most of you; that won't last longer than two months, and then this card will be at least 35% cheaper. But by then it will be two months older.

On Monday we're getting every brand of GeForce 2 Ultra in, and I'll put a nice review of them online.

-------------------------------------------------
ChukE ( http://www.cultdeadcow.com )


  • edward2
  • Registered: May 2000
  • Last online: 20-06 00:26

edward2

Shooting at the beasts

And where is that, ChukE?

Not in stock.



  • Halo_Master
  • Registered: May 2000
  • Last online: 02-12-2024

Halo_Master

can it get any cooler, please?

ChukE, are you being paid by Nvidia or something?!
If you want to post a review, just do that at tweakers.net; that's not what this forum is for, and I'm getting a bit sick of all this Nvidia this and Nvidia that. It's as if they have a whole network! We might as well call this the Nvidia billboard forum if it keeps going like this. I find this a thread that adds absolutely nothing! The other day someone asks for the best card for UT and Deus Ex, and again a few of those people come along with the advice "buy a GeForce, that's the best card!" Yes, but that's not what was asked: the best card for UT and Deus Ex! Well, 3dfx (V5 5500) is actually faster than the Ultra in those! But everyone keeps trying to shove a GeForce down the guy's throat. It's like mass psychosis around here! Something like: I have a GeForce, so everyone must have one. (grey mice)
I have a GeForce 2 myself and am reasonably satisfied, but I'm still going to look at something else next. People who just cling to one brand and defend it to the death, so to speak, I find stupid, really stupid.

Intel PIV 3.0@3600Mhz,2xiwaki md-15R en 1xswiftech mcp600, 4x xlr120 rad in externe boxes,dfi lanparty pro 875B,geil 512 mb pc4200, Ati Radeon 9800 pro,Audigy2,Silicon Graphics F180 (18,1 inch TFT)



Anoniem: 5295

Topicstarter
Hey, hello.
I'm just answering a question this way; if people looked at tweakers.net they wouldn't need to ask such a question. Oh well... :(

And uh, you're turning that into an opinion again; there won't be many people choosing 3dfx's cards.

-------------------------------------------------
ChukE ( http://www.cultdeadcow.com )


  • Halo_Master
  • Registered: May 2000
  • Last online: 02-12-2024

Halo_Master

can it get any cooler, please?

Well, I know plenty of them, and if that's the case it's because people ask a question and get a narrow-minded little answer of: buy a GF2 for UT or NFS 5 or Deus Ex, because a GeForce is the best! Not for these games it isn't, only the current Nvidia freaks won't acknowledge that and keep giving bad advice. And I didn't see a question anywhere in the thread you started. It's just a test of a GF2 Ultra with no question attached; it looks like a commercial!

Intel PIV 3.0@3600Mhz,2xiwaki md-15R en 1xswiftech mcp600, 4x xlr120 rad in externe boxes,dfi lanparty pro 875B,geil 512 mb pc4200, Ati Radeon 9800 pro,Audigy2,Silicon Graphics F180 (18,1 inch TFT)



Anoniem: 5295

Topicstarter
Hey buddy, if it bothers you, just don't open them.

-------------------------------------------------
ChukE ( http://www.cultdeadcow.com )


  • Halo_Master
  • Registered: May 2000
  • Last online: 02-12-2024

Halo_Master

can it get any cooler, please?

Just don't post them, period. Go find a job with STER or IP or something.................... If you do a review, just put up the URL; the whole article really isn't necessary.

Intel PIV 3.0@3600Mhz,2xiwaki md-15R en 1xswiftech mcp600, 4x xlr120 rad in externe boxes,dfi lanparty pro 875B,geil 512 mb pc4200, Ati Radeon 9800 pro,Audigy2,Silicon Graphics F180 (18,1 inch TFT)
