
GeForce 2 GTS Ultra

  • 14-08-2000 1:10pm
    #1
    Closed Accounts Posts: 6,601 ✭✭✭Kali


    my god it's the holy grail, this will blow any other graphics card out of the water

    http://www.tomshardware.com/graphic/00q3/000814/index.html

    this is coming out at $500 yet the voodoo5 6000 is coming out at $600 (if it ever comes out) and still won't support T&L.
    goddamn 3dfx suck ass, why don't they realise that?

    http://www.filmsoc.com


Comments

  • Registered Users Posts: 20,099 ✭✭✭✭WhiteWashMan



    hurrah!
    i can now get an extra 2 fps in quake and 3 new shades of brown.


  • Registered Users Posts: 2,518 ✭✭✭Hecate


    well all that matters is that I can run Deus Ex flawlessly @ 1024x768, 32-bit color... who needs an Ultra? :)


  • Registered Users Posts: 20,099 ✭✭✭✭WhiteWashMan


    ahh cmon even my voodoo 3 can do that


  • Closed Accounts Posts: 353 ✭✭Yossarian


    and the NV20 is just round the corner.
    even faster and even more fun!

    Stephen.


  • Registered Users Posts: 4,162 ✭✭✭_CreeD_


    Nope, the September release was expected to be the NV20, but this is it - a faster GeForce 2.
    The NV20 won't be out until Spring 2001.....

    Kali, if you read most of the reviews you'll know that HW T&L is still a no-no. It's the MMX of graphics cards. Someone who has the dedication and cash to buy the Ultra most likely has a superfast CPU. You see no real difference. Now, this I know will change, but when? The only game Anandtech admitted to previewing that actually should show a difference with HW T&L is the next Unreal game, and that's years away.
    The best thing about this new one is the memory bandwidth; for that I think it might well be worth getting.
    The V6000 still has over 50% more RAM bandwidth. Though some of this will be lost in replication between the 4 chips, nobody really knows how much.
    Anyway, the Ultra won't be released 'til October and we 'should' have a valid V6000 to compare it to by then.


  • Closed Accounts Posts: 353 ✭✭Yossarian


    "Creative Technology Ltd. today announced the 3D Blaster® AnnihilatorTM 2
    Ultra graphics accelerator, slated to ship to online and retail outlets in
    October, 2000. The Annihilator 2 Ultra will hold an estimated street price
    (ESP) of US$499."

    It's now August and they're announcing something that will be on sale in October, fup sake...

    When i said the NV20 was just around the corner i was of course referring to the proverbial corner, around which is also DDR support, dual Athlons and the next big thing (TM).

    Stephen.


  • Closed Accounts Posts: 6,601 ✭✭✭Kali


    Creed, what difference are you talking about?
    do you mean if you already have a superfast CPU you won't see any difference in speed by getting this card?
    which depends whether you're talking about low or high resolutions.. at low resolutions (640x480-800x600) the CPU itself is the limiting factor, in which case if you're using a **** fast CPU you won't see a huge jump in speed (there will still be an increase though), however at higher resolutions memory bandwidth on the card and fill rate are the limitations.. so the GeForce Ultra will increase speed by a very large margin regardless of the CPU or platform it's running on.
    and since very few ppl who go and buy this card or the potential buyers will actually be using those lower resolutions i can't see how your point is relevant.. unless you meant something else :)
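
    (a rough sketch of that bottleneck argument in Python - the 1 Gpixel/s fill rate and overdraw of ~3 below are ballpark assumptions, not anyone's spec:)

        # fps ceiling from fill rate alone: absurdly high at low res, binding at high res
        fill_rate = 1.0e9                      # pixels/s, GeForce2-class ballpark
        overdraw = 3                           # each screen pixel gets drawn ~3 times
        for w, h in ((640, 480), (800, 600), (1024, 768), (1600, 1200)):
            print(f"{w}x{h}: card tops out near {fill_rate / (w * h * overdraw):.0f} fps")
        # ~1085 fps at 640x480 (no CPU of the day can feed that -> CPU-limited),
        # ~174 fps at 1600x1200 (the card itself becomes the limit)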

    the v6000 will be ****. guaranteed. its memory bandwidth will be hogged by the 4 chips as you mentioned. Is it coming with 128MB RAM? which means only 32MB per chip, all of which need to hold the same textures, so you're still only dealing with 32MB of available memory to fill.. i reckon if that's the case it will still struggle.
    3dfx imo need to rethink their whole strategy rather than just adding more chips and multiplying numbers to come up with polygon fill rates which their cards are never going to achieve.
    (just to be sure of my facts i checked 3dfx's official page.. they claim 1.47 gigapixels per second as a fill rate.. unless they have 12GB/s of memory bandwidth that claim will never be achieved; they don't mention the memory type, only that it's at 350MHz)
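
    (checking that sum: assuming each pixel written costs roughly 4 bytes of colour plus ~4 bytes of z-buffer traffic - a simplification - the quoted fill rate does line up with a 12GB/s figure:)

        fill_rate = 1.47e9               # pixels/s, 3dfx's quoted figure
        bytes_per_pixel = 4 + 4          # 32-bit colour write + ~4B z read/write, simplified
        print(f"~{fill_rate * bytes_per_pixel / 1e9:.1f} GB/s needed")   # ~11.8 GB/s
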
    now don't get me wrong i hope this card can do 2048x2048 with 4X FSAA and still churn out 70+ fps.. (as is claimed on many sites)
    .. it would be goddamn stunning and i'd probably be tempted to buy one.. but i seriously doubt it.

    now i'm off to start some work :)


  • Registered Users Posts: 4,484 ✭✭✭Gerry


    Creed, the hardware T&L does not make as much difference as Nvidia claimed, but in Quake3 it makes quite a big difference. Nvidia were previously slower on low-end CPUs; now they are faster. At low res, the GeForce is 30-40% faster than a TNT2. It's the Voodoo that needs the superfast CPU. How can you explain that? Also, the GeForce2 Ultra is overkill. I pretty much think my GeForce DDR is overkill, and with the detail I play with in Q3, at 1024*768, even with a 1GHz P3, I am CPU-limited.


  • Closed Accounts Posts: 102 ✭✭OctaviaN


    Guys, why don't ye buy what's here now?
    I had a 486 for 9 years and upgrading it was pointless
    why not buy a PC for 3 years and just renew it or overhaul it with loadsa bits? besides ye can all prolly run Q3 at decent fps if ye don't upgrade, but in a year you won't run ****!!
    yer talkin about **** not being released for a year?
    yer talkin about **** that doesn't matter.... i'm talkin about **** that doesn't matter????? i'm bored! did anyone d/l that battlemail program??? does any1 wanna have a game??? and u only need a 133 and 800x600 :)



  • Registered Users Posts: 4,162 ✭✭✭_CreeD_


    Kali,
    "this is coming out at $500 yet the voodoo5 6000 is coming out at $600 (if it ever comes out) and still wont support T&L.
    goddamn 3d/fx suck ass, why dont they realise that?"

    Soooo, I focused on your HW T&L argument, the fact that the V6000 doesn't have it now isn't a problem, IMHO.


    Gerry:
    I specifically mentioned high-end CPUs; anyone who can pay ~400 quid for a video card will have a top-end system.

    A Voodoo "Needs" a fast CPU, because the slow ones can't keep up with its fill rate, as would be the case with a geforce on anything that doesn't use HW T&L (which is pretty much anything based on the Quake1/2/3 engines). If you do have a low end, then as you said just get a Geforce one, the rest of your system will be too much of a bottleneck for the other operations like sound/Hard drive/ram access etc. to warrant more.


    The V6000 doesn't have to equally share bandwidth, and yes, its 12GB/s is with all 4 chips, but not all data is replicated - hence my comment about nobody being able to accurately guess at its effective bandwidth, since we don't know the overhead needed for synchronisation. Also, it's not as simple as 32MB per chip. Since the frame buffer is unified for all 4, it will get split between them. Say your nice shiny new high-res game needs (simplified) 8MB for the frame & Z buffers. On a 32MB card that leaves 24MB (on a GeForce this is slightly less, since it also needs to store the vertex info for the T&L engine onboard; the ATI Radeon has a separate vertex buffer and doesn't have this problem). On a Voodoo 6000 it would be 8/4 = 2MB of buffer per chip, so each chip would have 30MB in the same situation. Granted, it's only a 25% improvement and not true 128MB capacity, but it is not as bad as you think.
    The biggest bottleneck with RAM bandwidth is generally not textures (on a 32MB+ card anyway) - with the exception of some Q3 levels, which was solved with texture compression anyway - it's actually the frame buffer.
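
    (the sum above as a toy Python sketch - same assumptions as the post: buffers split evenly across the four chips, textures fully replicated:)

        buffers_mb = 8                         # frame + Z buffers, simplified
        single_chip = 32 - buffers_mb          # 24MB left for textures on a 32MB card
        v6000_chip = 32 - buffers_mb / 4       # 30MB per chip, buffers shared 4 ways
        print(single_chip, v6000_chip)         # 24 30.0
        print(f"{(v6000_chip - single_chip) / single_chip:.0%} more texture room")   # 25%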

    Get a game that allows you to set the texture bit depth independently of res (there are a few out there). Set the overall bit depth to 32, and see if you notice a big difference with 16-bit or 32-bit textures enabled... Not much.
    Now try the same res at 16-bit overall, and at 32-bit. BIG difference, yet the only thing you've really changed here is the frame buffer settings, and the amount of RAM it needs per frame.
    This is also why 64MB GeForce/GeForce2s only show decent FPS improvements over their 32MB brethren in very rare, select situations.
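
    (to put rough numbers on that, here's the frame buffer cost per frame at 1024x768 - assuming one colour write plus one z access per pixel and ignoring overdraw, so illustrative only:)

        w, h = 1024, 768
        for depth_bits in (16, 32):
            per_frame_mb = w * h * (depth_bits // 8) * 2 / 2**20   # colour + z, roughly
            print(f"{depth_bits}-bit: ~{per_frame_mb:.1f} MB/frame, "
                  f"~{per_frame_mb * 60:.0f} MB/s at 60fps")
        # doubling the frame buffer depth doubles this traffic on every single frame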

    So, while the V6000 will indeed replicate textures, it does not need to replicate its frame buffer; this is unified between all 4 chips and thus is not limited to the bandwidth of any one.
    Again simplifying, say it's set up so each chip does 1 line at a time (this will most likely be user-configurable for best visual quality). This means that at 1600x1200, each chip is effectively doing 1600x300, which is the same strain in both fill rate and RAM bandwidth as doing 800x600 - which the VSA-100 can do in its sleep.
    It's a bit muddy, but it means it's still a wildcard with high potential.
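
    (the line-interleave sum, sketched - assuming a perfectly even 4-way split and zero sync overhead:)

        w, h, chips = 1600, 1200, 4
        per_chip_pixels = w * (h // chips)     # every 4th line: 1600 x 300
        print(per_chip_pixels == 800 * 600)    # True - same pixel load as 800x600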


    Pretty Land


    [This message has been edited by _CreeD_ (edited 15-08-2000).]


  • Closed Accounts Posts: 6,601 ✭✭✭Kali



    -->Soooo, I focused on your HW T&L argument; the fact that the V6000 doesn't have it now isn't a problem, IMHO.

    okay, i thought you were referring to the fast-CPU point, which was what i replied to.

    but i think you're dismissing T&L a bit too much.. OpenGL can make very good use of it - the Quake engine took advantage of that well before it was even realised in hardware - and i'm sure we'll see DirectX games getting a much bigger performance increase with it, what with Nvidia and MS basically working together releasing driver sets now.. (both DirectX and Detonator)
    on a related note.. anyone have any actual performance figures for those latest drivers?
    oops, just checked.. tomshardware has an article up.

    some nice info there on the v6000.. any URLs?
    would like to read more info that's not available on 3dfx's page and therefore isn't full of T-buffer and other fancy media words.. (i didn't look too far) you don't know the speed each chip is operating at?


  • Registered Users Posts: 6,660 ✭✭✭Blitzkrieger


    Originally posted by _CreeD_:
    Say your nice shiny new high-res game needs (simplified) 8MB for the frame & Z buffers. On a 32MB card that leaves 24MB (on a GeForce this is slightly less, since it also needs to store the vertex info for the T&L engine onboard; the ATI Radeon has a separate vertex buffer and doesn't have this problem). On a Voodoo 6000 it would be 8/4 = 2MB of buffer per chip, so each chip would have 30MB in the same situation. Granted, it's only a 25% improvement and not true 128MB capacity, but it is not as bad as you think.
    The biggest bottleneck with RAM bandwidth is generally not textures (on a 32MB+ card anyway) - with the exception of some Q3 levels, which was solved with texture compression anyway - it's actually the frame buffer.


    Chewbacca is a Wookie. Why would a Wookie choose to live on a planet with the Ewoks - it makes no sense......

    Jays! Getting technical here!

    "2 gts ultra" With a name like that it's got to be good!


    Personally, taking all the different factors into account - the different chip architecture, different memory, different number of chips - gives me a headache. I saw a review in PCZ where they didn't go into any detail at all. They compared real-world performance of a Voodoo 5 5000 to a GeForce 2 GTS. They found that with most games the GeForce nudged it, tho there was little noticeable difference, but in some games the Voodoo out-performed it.

    With the Voodoo 5 6000, I'd guess that they'll out-perform it even more, but I'm going to wait n see. At the end of the day all most people will do is compare real-world performance. Most people won't care about T-buffers, 4X FSAA or anything. They just want to load up a game and say: "Wow!"


  • Registered Users Posts: 4,162 ✭✭✭_CreeD_


    "Chewbacca is a Wookie. Why would a Wookie choose to live on a planet with the Ewoks - it makes no sense......"

    Have you ever seen a Wookie brand of toilet paper?..No....Now you know.

    I know T&L is nice, and that it will be great in the future. I just don't see it being even near necessary for another year. And then, if it ever appears, the Glaze3D thingy will be here to keep our toes warm.


    The V6000 stuff is from a lot of different articles. Mainly the few Anandtech came out with over the last 6 months. Can't remember the rest.
    3dfx are being very tight-lipped about it, which doesn't bode well.
    And to go out on a limb, I'd say it will have about 30% over the GeForce2 Ultra.

    It's amazing the things you'll write waiting for Planetarion to get its ass in gear... :)


  • Closed Accounts Posts: 6,601 ✭✭✭Kali


    Originally posted by Blitzkrieger:

    I saw a review in PCZ where they didn't go into any detail at all. They compared real-world performance of a Voodoo 5 5000 to a GeForce 2 GTS. They found that with most games the GeForce nudged it, tho there was little noticeable difference, but in some games the Voodoo out-performed it.

    haha "nudged" .. if by nudged you mean an extra 15-40% performance in the majority of games.

    unless "real-world performance" equates to throwing your card out the window to see if the fans will make it fly then theres no way the v5000 will outperform the geforce 2.

    http://www.tomshardware.com/graphic/00q3/000721/radeon256-04.html

    imo the v5500 (let alone the 5000) is not even a competitor for the GeForce 1 DDR (which will easily outperform it at resolutions from 1024 to 1280)


  • Registered Users Posts: 20,099 ✭✭✭✭WhiteWashMan


    so... who managed to get an extra 3 fps and 4 shades of brown in quake?


  • Closed Accounts Posts: 577 ✭✭✭Chubby


    Originally posted by WhiteWashMan:
    so... who managed to get an extra 3 fps and 4 shades of brown in quake?
    Errr, who still plays Quake?? :)


  • Closed Accounts Posts: 6,601 ✭✭✭Kali


    just to mention that i just have a bog-standard TNT2 so amn't really pushed either way :)
    (now to go d/l the new Detonator drivers to make me drive faster in NFS5 and GP3)


  • Closed Accounts Posts: 6,601 ✭✭✭Kali


    quick update..
    voodoo 6000 has 350MHz RAM and 3dfx claim a memory bandwidth of 12GB/s.

    the GF2 GTS Ultra uses 460MHz RAM and has an actual bandwidth of 6.9GB/s, which can be boosted up to 8GB/s at 500MHz, and that's only feeding one chip (and it's still limited by memory bandwidth)

    i think 3dfx are quoting the total memory bandwidth of all 4 chips.. i.e. 3GB/s per chip; wish there were some proper figures available for it, number of rendering pipelines per chip etc., rather than a load of garbage about "T-buffer" etc. etc., so people can actually come up with proper comparisons.
    still, can you imagine if a manufacturer decided to throw a few GeForce 2 chips on the same board with 128MB DDR RAM? whoa.
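
    (that theory in one line - assuming the 12GB/s headline really is just the four per-chip buses added up:)

        claimed_total, chips = 12.0, 4
        print(claimed_total / chips)   # 3.0 GB/s per chip - under half the Ultra's 6.9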


  • Closed Accounts Posts: 27 iar_xeno


    Well I have a GeForce 2, and DL'd the drivers, and they sent my monitor into what must have been a >100Hz refresh rate, as I couldn't see anything apart from "monitor working outside scan range". Be wary: if your monitor tries to set itself to whatever refresh rate those drivers suggest, you may find yourself with a very expensive paperweight where your monitor was. Not sure what happened. I had only installed the retail drivers shipped by Creative (which were rubbish - failing to work with the demos that came with the card, and certain Q3 levels, which I blamed on the textures being above 32MB - wrongly), then I installed the TNT2 drivers - great performance, but some texture screwups in Q3 (sky was v. low quality and some weird colouring on certain textures) - but after TNT3 I am back on TNT2.

    Xenophobic


  • Users Awaiting Email Confirmation Posts: 285 ✭✭sam


    Creed, you're talking about reviews..
    well as far as Quake3 is concerned, T&L makes a big difference.
    i used to have a TNT2 running at 165/180, on a P3-600.. now i have a GeForce SDR on the same system running at default core speed/180 memory speed, and it gives me 30% more fps than the TNT2

    increasing the core to 240 on the TNT2 would not give me a 30% increase, it probably wouldn't even have given me a 15% increase,
    so T&L does work.. and Quake3 supports it, so it IS "supported"
    T&L is nothing like MMX. MMX could have been useful if the instructions were different, and if it didn't take so many CPU cycles per instruction, but it did, so MMX wasn't useful for games.. so MMX was crap.
    T&L is not an added instruction set, it's hardware accelerating portions of the CPU's work.. unless you have a CPU which is faster than the T&L unit (which will always cost more than the current equivalent T&L unit at the time, the same way as it would cost you much more to get a CPU that could do 1024*768 at 150 fps in Quake3 than just getting a GeForce2 right now... in fact i don't think any processor on the market right now would be able to go anywhere near 1024*768 at 150 fps)
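
    (for the curious, a minimal sketch of the 'T' being offloaded - a 4x4 matrix applied to every vertex, every frame; numpy is used purely for illustration:)

        import numpy as np

        # without HW T&L the CPU multiplies each vertex by the model-view-projection
        # matrix itself; a GeForce does this stage in dedicated hardware
        verts = np.random.rand(10_000, 4)      # x, y, z, w for 10k vertices
        mvp = np.eye(4)                        # identity stands in for a real MVP matrix
        transformed = verts @ mvp.T            # the per-frame work being offloaded
        print(transformed.shape)               # (10000, 4)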

    as for Chubby: loads of people still play Quake, and some of the good duellers are going back to QW on Barrysworld.. as for me, i play both QW and Q3 whenever i get the chance, for now

    actually, hang on.. you used to MUD, didn't you, hahaha, you can't complain about **** graphics!


  • Users Awaiting Email Confirmation Posts: 285 ✭✭sam


    lol@thetroll


  • Users Awaiting Email Confirmation Posts: 285 ✭✭sam


    so eh anyway i doused him with the holy water and he turned into an elf

    weird eh?

    then i was slain by a troll :(

    [This message has been edited by sam (edited 16-08-2000).]


  • Registered Users Posts: 4,162 ✭✭✭_CreeD_


    Sam, as I said, it is good for very few (read: one really popular) games, and under select conditions. It will help you if you have a lower-end CPU, but the original argument about folks who would buy the Ultra stands. If you shell out for it, you can shell out for a faster CPU.
    And no, you're right, a 1.1GHz CPU will not match the full graphics performance of the Ultra, but a 1.1GHz CPU with an accelerator of equal fill rate to the Ultra (humour me) would. T&L is not as effective on the GeForce series as Nvidia have made out; in a number of situations the GeForce1 was proven to be slower at T&L than an Athlon 700, even though on paper it had 3x the polygon capabilities.

    Don't get me wrong, they're good cards. I have a GeForce DDR and it's great - one of the reasons I haven't felt the need to jump on the upgrade bandwagon in a while.
    This Ultra would be worth it (whereas the first batch of GeForce2s wasn't). But if I buy it, it'll be because it has the fill rate and features to play the games I like today.
    Also, as a side note, the faster CPU allows you to do so much more than just FPS. I play a lot of strategy games, encode MP3s etc. So I think it's a better investment.


    On the TNT2... I would expect any card with double the RAM and a greater fill rate to outperform its predecessor. You quoted overclocked TNT2 numbers that, from your phrasing, are theoretical; you can't match those against existing real-world facts and numbers.
    Also, from what you're saying, if T&L was mostly responsible for the big performance boost in Q3, then it doesn't give you an increase in anything else (since T&L won't be a factor)? Then take it back, my son, you've got a dud GeForce.

    PS. MMX was actually supposed to be quite good, just never used properly. There was a post-mortem on VE, I think, a year ago. A lot of programmers from Tim Sweeney to Brian Hook felt it was wasted by most programmers. That was why I drew the comparison.


  • Registered Users Posts: 6,660 ✭✭✭Blitzkrieger


    What's the point in having your monitor working at 100Hz? Don't your eyes only work at 40Hz?


    Peeps who don't know might find this interesting - TV works at 50Hz while most VDUs work at 40Hz. That's why you get that nasty flickering when you see 'puters on the telly. ;)


  • Closed Accounts Posts: 6,601 ✭✭✭Kali


    oh please god no.
    please god let's not get into the refresh/fps debate..
    anyway, anyone can tell the difference between 60 and 80Hz, let alone 100Hz.
    just maximise a mainly white window and look slightly to the left or right of the monitor.. there ya go.. flicker, lots of it.


  • Science, Health & Environment Moderators Posts: 17,066 Mod ✭✭✭✭Gonzo


    I replaced my Dell GeForce drivers with those new Detonator ones and they have certainly made my Q3 etc. run a lot faster than before, but as was said earlier in this forum, the new drivers have made the sky in Q3 look very lo-res, with odd sky colouring. The new drivers have also fixed a problem i had with flickering in many games, particularly in Q2 - when you fire the hyperblaster etc. the screen used to flicker a lot, but now it's perfect. The only real downside to these drivers is my Unreal Tournament seems to be running a bit slower than before, unlike all the other games, which are now faster...


  • Registered Users Posts: 3,308 ✭✭✭quozl


    It's a message from the gods. Stop playing UT.


  • Closed Accounts Posts: 6,601 ✭✭✭Kali


    jesus, ye must all be crap if you just go around looking at the sky anyway like ;)
    p.s. go buy Warlords Battlecry. it rules.


  • Registered Users Posts: 4,484 ✭✭✭Gerry


    Creed:
    Look, the bottom line is, the GeForce is faster even when it is not fill-rate limited. If you put it in a resolution where it is CPU-limited, like 512*384, the GeForce will be 20-30% faster than a TNT2. Why is this? Because T&L is having some effect. End of story. I can't understand why you are desperately defending 3dfx. The FSAA is great, but I can do that on my GeForce now, and the 3dfx is significantly slower almost all of the time.


  • Registered Users Posts: 4,162 ✭✭✭_CreeD_


    Eh.... I'm not defending 3dfx, just presenting some facts (as I see them). I buy the best card I can at the time, regardless of brand. Like I said, I have a GeForce1, and I haven't jumped from it to the V5500 - there was no point.

    And on FSAA - the GeForce is software only. The GeForce2 handles it in software for all D3D games (until DirectX 8, and games that use it, are released), and in hardware for OpenGL.
    While this is nice to have, it has been proven time and again that the V5500's FSAA quality is much higher.


    Are you seeing the same performance increase on anything other than a QuakeX engine (or Test Drive 6/Messiah, AFAIK)? If so, then you've just disproved your own point.
    There's more to graphics card enhancements than fill rate. Probably in a 1000 little ways that were never discussed, the GeForce is more advanced than a TNT2, and that'll contribute to a speed increase. Memory handling for one comes to mind (no, not pipelines or bus sizes, but how efficiently it handles these resources); Nvidia's cards used to be crap at handling textures quickly, and 3dfx were much better at it (it's one of the reasons their PCI cards didn't suffer as much as was predicted without the extra bandwidth of AGP)

    [This message has been edited by _CreeD_ (edited 17-08-2000).]


  • Users Awaiting Email Confirmation Posts: 285 ✭✭sam


    by the way, most VDUs don't work at 40Hz

    they might have in the 1980s, but most VDUs (monitors) can work up to 100Hz at 1024*768 nowadays


    as for TVs, PAL TVs do 50Hz interlaced (which means every second line on the tube gets updated per refresh, then next refresh the other lines get refreshed, then back to the first set, and so on)

    the reason you get a flicker off VDU screens when you are taking a picture of them is because your video camera only updates at about 24 frames per second, so if your monitor/TV refresh rate is low enough, you end up missing a lot of frames.. because your video camera does not take each frame at the same time as the monitor refreshes it (obviously), they are not in sync.

    TVs will flicker in video images the same as monitors, and the monitor will flicker a lot less, even in the video image, if its refresh rate is at 100Hz as opposed to 60
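
    (sam's PAL numbers, sketched with the standard figures - 50 fields/s and 576 visible lines assumed:)

        fields_per_second, visible_lines = 50, 576
        print(fields_per_second // 2, "full frames per second")   # 25
        print(visible_lines // 2, "lines per field")              # 288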


  • Users Awaiting Email Confirmation Posts: 285 ✭✭sam


    one last message.. for Creed

    you're right, my GeForce isn't that much faster than the overclocked TNT2 in other games; it's just like having a TNT2 running at 240MHz core and 180MHz memory, as i said before.. in some games it makes a difference over 165 core and 180 memory, in some it doesn't.


  • Closed Accounts Posts: 1,484 ✭✭✭El_Presidente


    So.... You're saying there are OTHER games besides Quake?


  • Users Awaiting Email Confirmation Posts: 285 ✭✭sam


    hahahahha 40 fps

    this has happened before


    i'll just ask one thing: who told you that your eyes can only see up to 40 frames per second, and why did you believe them?
    i'm actually interested in this


    i can tell the difference between an 85Hz monitor refresh rate in Quake and a 120Hz refresh rate.. EASILY
    and i think probably 75% of the population can as well, and the rest are either over 70 or blind


  • Registered Users Posts: 4,484 ✭✭✭Gerry


    I heard that the GeForce was two TNT2 chips and a T&L unit stuck together.


  • Registered Users Posts: 4,162 ✭✭✭_CreeD_


    I heard it was in fact the IC replicant of Elvis' left foot (the TNT2 was the right ankle).


  • Users Awaiting Email Confirmation Posts: 285 ✭✭sam


    gerry (if that is your real name..),
    a GeForce is 2 TNT2 chips stuck together with T&L and some minor tweaks; the TNT2 was a faster TNT1 with some tweaks; the TNT1 was 2 Riva 128 chips stuck together with some enhancements. chip companies rarely design a new chip from scratch - they usually take old chips and shrink the die size (so they go faster), or put two or more of them together on the same die, or change the memory bus, or stuff like that. obviously with every new graphics card released they will put some tweaks in as well, e.g. the TNT1 had a bug which caused poor image quality at 1024*768 res - the image wasn't as sharp as it should have been.. so they fixed that in the TNT2.. etc


  • Closed Accounts Posts: 577 ✭✭✭Chubby


    Originally posted by sam:
    one last message.. for Creed

    you're right, my GeForce isn't that much faster than the overclocked TNT2 in other games; it's just like having a TNT2 running at 240MHz core and 180MHz memory, as i said before.. in some games it makes a difference over 165 core and 180 memory, in some it doesn't.
    The GeForce isn't that much faster than the overclocked TNT2 because of memory limitations. The chip itself is a lot more powerful on the GeForce, which is why, when they started using DDR RAM, the GeForce got so much faster. Same story with the GTS and the GTS Ultra.
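
    (the SDR-to-DDR jump in rough numbers - 128-bit bus, and the clocks are the commonly quoted GeForce 256 figures, taken here as assumptions:)

        bus_bytes = 16                            # 128-bit bus = 16 bytes per transfer
        sdr_gb = 166e6 * bus_bytes / 1e9          # ~2.7 GB/s at 166MHz SDR
        ddr_gb = 150e6 * 2 * bus_bytes / 1e9      # ~4.8 GB/s at 150MHz DDR (300 effective)
        print(f"{sdr_gb:.1f} GB/s SDR vs {ddr_gb:.1f} GB/s DDR")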



    [This message has been edited by Chubby (edited 18-08-2000).]


  • Closed Accounts Posts: 6,601 ✭✭✭Kali


    wow sam, aren't you glad you bought your GeForce then? 4 Riva 128 chips stuck together with a few enhancements, eh?


  • Registered Users Posts: 3,537 ✭✭✭SickBoy


    Now just to be a stick in the mud :) were those Riva 128 chips the vanilla or the ZX flavour?


  • Users Awaiting Email Confirmation Posts: 285 ✭✭sam


    damn straight i'm glad i bought a GeForce

    i play at 1024*768 now in Q3, instead of 800*600, and still get slightly higher fps, so there

