
Doom 3 - brought to you by... Nvidia

  • 03-03-2004 6:42pm
    #1
    Closed Accounts Posts: 2,918 ✭✭✭Deadwing


    http://www.megagames.com/news/html/pc/idrecommendsnvidiafordoom3.shtml

    Well great, so you can play Doom 3 and Half-Life 2, but not on the same card?!
    *mutters angrily*

    I'm sure Doom 3 will run on an ATI card... but after shelling out on a 9800 Pro, now I find out it won't get the best out of Doom 3!? ARGHHH!


Comments

  • Closed Accounts Posts: 2,951 ✭✭✭L5


    I wouldn't worry about that. id are pissed off with ATI over the leaked Doom 3 alpha, and Nvidia just paid them a load of money to endorse the game. It'll run fine on an ATI card.


  • Registered Users, Registered Users 2 Posts: 8,070 ✭✭✭Placebo


    They promised it would run on my laptop's ATI 9600!
    Bastards.


  • Closed Accounts Posts: 3,357 ✭✭✭secret_squirrel


    On a similar subject, X2: The Threat has a big Nvidia splash screen come up during loading... it makes me smile wryly as my 9800SE kicks into gear :D, although my motherboard is an nForce2.


  • Registered Users, Registered Users 2 Posts: 26,584 ✭✭✭✭Creamy Goodness


    It will still run on ATI cards.

    Nvidia cards will just give something like a 5fps increase.

    In all fairness, it's the same with Unreal Tournament; they have that logo:

    'Nvidia: the way it's meant to be played.'

    It plays fine on my 9800 Pro, so I don't see why Doom 3 won't.

    Plus, this is rather old news; id have favoured Nvidia for quite a while now.


  • Registered Users, Registered Users 2 Posts: 3,312 ✭✭✭mr_angry


    nVidia propaganda.

    There's nothing in the article that says it will be better than on an ATi card - they just use generic phrases like "lightning-fast frame-rates". The chances are it will be 5% faster on your current ATi card, as compared to any currently available nVidia card. Of course, the NV40 might be better than the R420, but nobody knows. Yet.

    Don't be the victim of poxy marketing.


  • Registered Users, Registered Users 2 Posts: 5,463 ✭✭✭shinzon


    I don't get why people get so het up about media manipulation; of course they're gonna put 'made for Nvidia' on the box when they're in partnership with them.

    I have several games with the 'way they're meant to be played' title screen on them, and I can turn everything up to full with AA and all the bells and whistles on, no probs.

    Don't read into the hype; all us ATI owners will be well catered for when Doom 3 arrives, and we can snigger derisively when HL2 appears and the shoe is on the other foot, so to speak.

    Shin


  • Closed Accounts Posts: 29,130 ✭✭✭✭Karl Hungus


    It's utter horse****, tbh.

    I mean, the UT2k4 demo has that Nvidia tripe at startup, and I run it on my Radeon 8500 at 1280x960 resolution with high details and shadows, and I don't get a single bit of slowdown whatsoever.

    I also get nearly 30fps in the Doom 3 beta,
    although that does drop often enough.

    So, all in all, I'm pretty sure that anyone here will run the game perfectly fine on an ATI card.


  • Registered Users, Registered Users 2 Posts: 6,905 ✭✭✭User45701


    Originally posted by Cr3m0
    It will still run on ATI cards.

    Nvidia cards will just give something like a 5fps increase.

    In all fairness, it's the same with Unreal Tournament; they have that logo:

    'Nvidia: the way it's meant to be played.'

    It plays fine on my 9800 Pro, so I don't see why Doom 3 won't.

    Plus, this is rather old news; id have favoured Nvidia for quite a while now.
    Originally posted by L5
    I wouldn't worry about that. id are pissed off with ATI over the leaked Doom 3 alpha, and Nvidia just paid them a load of money to endorse the game. It'll run fine on an ATI card.

    So I don't have to go on a mass killing spree?


  • Closed Accounts Posts: 7,563 ✭✭✭leeroybrown


    The performance won't be an issue on the ATI card. The ATI card may even perform better if you tweak the settings to suit its strengths, and any Doom 3-specific weakness that may exist will no doubt be tweaked out in a subsequent driver release.

    I thought the buggy leaked alpha looked fabulous on my ancient GeForce2 Ti (albeit at between 5 and 10 fps), so I'd be surprised if a player will notice the difference in-game without 100 benchmark suites to tell them that one card runs it 0.5% better.


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by AngelWhore
    It's utter horse****, tbh.

    I mean, the UT2k4 demo has that Nvidia tripe at startup, and I run it on my Radeon 8500 at 1280x960 resolution with high details and shadows, and I don't get a single bit of slowdown whatsoever.

    I also get nearly 30fps in the Doom 3 beta,
    although that does drop often enough.

    So, all in all, I'm pretty sure that anyone here will run the game perfectly fine on an ATI card.

    So it wouldn't be "utter horse $hit" if it refused to play on your ATI card?? What are you saying here... you hate that nVidia back developers with support, guidance and a big bag of cash? WTF, if you were NV, what would you want in return? I know I'd want more than a 3-second logo, but that's seemingly all nVidia *requires*. They merely remain hopeful that some benefit to their customers comes of the close cooperation with game developers.

    You should be thanking nVidia for advancing the PC games business and bringing some much-needed media attention; instead you are all getting your knickers in a knot over a poxy animation.



    Matt


  • Closed Accounts Posts: 16,339 ✭✭✭✭tman


    Originally posted by leeroybrown
    The performance won't be an issue on the ATI card. The ATI card may even perform better if you tweak the settings to suit its strengths, and any Doom 3-specific weakness that may exist will no doubt be tweaked out in a subsequent driver release.
    Yep, just have a look at the Nvidia Dawn demo if you need proof. Designed explicitly for the FX series of cards, it'll run perfectly on a Radeon once you've tricked it into thinking you're using Nvidia hardware...
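
    (That trick works because such demos typically gate themselves on the driver's vendor string rather than on any real capability, so a wrapper that intercepts the query and reports NVIDIA is enough to unlock them. A minimal sketch of that kind of check, assumed for illustration rather than taken from the demo's actual source:)

        // Hypothetical vendor gate of the kind the Dawn demo is said to use.
        // A wrapper DLL that intercepts glGetString() and returns an NVIDIA
        // vendor string defeats it, since the Radeon can run the shaders fine.
        #include <GL/gl.h>
        #include <cstring>
        #include <cstdio>

        bool VendorIsNvidia()
        {
            // Requires a current OpenGL context; typical values are
            // "NVIDIA Corporation" or "ATI Technologies Inc."
            const char* vendor = reinterpret_cast<const char*>(glGetString(GL_VENDOR));
            return vendor && std::strstr(vendor, "NVIDIA") != nullptr;
        }

        int main()
        {
            if (!VendorIsNvidia())
            {
                std::printf("This demo requires NVIDIA hardware.\n");
                return 1;  // bail out, even though the card could cope
            }
            // ... run the demo
        }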


  • Closed Accounts Posts: 25,848 ✭✭✭✭Zombrex


    Both Valve and id have made their beds with the graphics card makers, but neither of them is stupid enough to release a game that runs significantly slower on the "other" vendor's cards.

    At the end of the day Valve and id want to sell as many copies as they can, so they are going to optimise their games for both.

    All this is just marketing crap.


  • Registered Users, Registered Users 2 Posts: 2,694 ✭✭✭Dingatron


    Yep, just have a look at the Nvidia Dawn demo if you need proof. Designed explicitly for the FX series of cards, it'll run perfectly on a Radeon once you've tricked it into thinking you're using Nvidia hardware...

    How'd you do that then? Played it on the 9800 Pro the other night, and while good, once I put everything on full it was a bit jerky. I'd be interested to see the difference once tweaked.


  • Closed Accounts Posts: 1,049 ✭✭✭superfly


    I have a 9700 Pro and I get that Nvidia splash screen as well, but I thought that was due to the nForce2 on my motherboard.
    When I had a 3dfx card I used to get the 3dfx splash screen, so I just assumed the games are picking up the nForce.


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Dawn demo and hacks are old hat. Vulcan is a much more impressive demo.


    Matt


  • Closed Accounts Posts: 7,488 ✭✭✭SantaHoe


    ^ Yeah, what the guy a few posts ago said...
    It all runs through a standard D3D/OpenGL API anyway; it's not like they're going to sabotage the performance for ATI users.
    My Radeon 9600 Pro performs very nicely in the UT2004 demo... the nVidia splash screen is nothing more than an advertisement.


  • Closed Accounts Posts: 29,130 ✭✭✭✭Karl Hungus


    Originally posted by Matt Simis
    So it wouldn't be "utter horse $hit" if it refused to play on your ATI card?? What are you saying here... you hate that nVidia back developers with support, guidance and a big bag of cash? WTF, if you were NV, what would you want in return? I know I'd want more than a 3-second logo, but that's seemingly all nVidia *requires*. They merely remain hopeful that some benefit to their customers comes of the close cooperation with game developers.

    You should be thanking nVidia for advancing the PC games business and bringing some much-needed media attention; instead you are all getting your knickers in a knot over a poxy animation.

    Matt

    No, you git... I'm stating that the notion of the game running on NVidia cards only is horse****, and I'm giving the UT2k4 comment as an example. I'm pretty sure that it was rather clear what I was saying, and the point I was making.

    So please, STFU and stop putting words in my mouth.


  • Registered Users, Registered Users 2 Posts: 15,817 ✭✭✭✭po0k


    Ken, calm the fuck down.
    Matt knows what he's on about.
    Originally posted by SantaHoe
    ^ Yeah, what the guy a few posts ago said...
    It all runs through a standard D3D/OpenGL API anyway; it's not like they're going to sabotage the performance for ATI users.
    My Radeon 9600 Pro performs very nicely in the UT2004 demo... the nVidia splash screen is nothing more than an advertisement.

    Well, that's actually a consideration.
    A game's D3D engine might include specific code paths to take advantage of a particular family/generation of cards.
    HL2 is an example: apparently it performed really well on ATi hardware whether it used the standard D3D code or the ATi-optimised path, yet nVidia hardware ran it sub-par with either the standard or the nVidia-optimised codepath. (There's a sketch of this kind of path selection below.)

    OpenGL is a different kettle of fish (being an open standard, for one), but there are nVidia OpenGL extensions, and I don't remember ATi OpenGL performance (under Windows) being anything to write home about.

    Tbh, I'm currently running a GF4 Ti4200; I had a GF2 Pro before that and a GeForce 256 before that, with a Voodoo2 and Voodoo1 before them.
    My next card will more than likely be an ATi, simply cos I like the way they're going, especially their approach to PCI Express support, i.e. native instead of bridged.
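
    (A sketch of what such a vendor codepath switch might look like in a D3D9 engine. The PCI vendor IDs and D3D calls are real; the surrounding function is hypothetical:)

        // Illustrative only: pick a render path from the adapter's PCI vendor ID.
        #include <d3d9.h>

        enum RenderPath { PATH_GENERIC, PATH_ATI, PATH_NVIDIA };

        RenderPath ChooseRenderPath(IDirect3D9* d3d)
        {
            D3DADAPTER_IDENTIFIER9 id;
            if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
                return PATH_GENERIC;

            switch (id.VendorId)               // PCI vendor IDs
            {
                case 0x1002: return PATH_ATI;      // ATI
                case 0x10DE: return PATH_NVIDIA;   // NVIDIA
                default:     return PATH_GENERIC;  // everything else
            }
        }

    (In practice an engine more often keys paths off capability bits and shader versions than raw vendor IDs, but the idea is the same.)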


  • Registered Users, Registered Users 2 Posts: 11,446 ✭✭✭✭amp


    Yellow cards for Matt Simis and AngelWhore for flaming. Calm down, lads.


  • Registered Users, Registered Users 2 Posts: 6,892 ✭✭✭bizmark


    A game's D3D engine might include specific code paths to take advantage of a particular family/generation of cards

    That's more than likely true; X2: The Threat has an 'Nvidia' jobby as well, and it runs around 10-20% better on Nvidia cards than ATI.

    It's all marketing, lads. Big deal, you're only losing out on a few fps either way.


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by AngelWhore
    No, you git... I'm stating that the notion of the game running on NVidia cards only is horse****, and I'm giving the UT2k4 comment as an example. I'm pretty sure that it was rather clear what I was saying, and the point I was making.

    So please, STFU and stop putting words in my mouth.

    So where did the notion of "running on NVidia cards only" come from then? Everything I've read used vague marketing language like "plays best on" or "the way it's meant to be played"... The animation is gone faster than most can read it anyway; as already stated, it's merely an advertisement.


    Matt


  • Registered Users, Registered Users 2 Posts: 11,446 ✭✭✭✭amp


    Paul88, while your post made me laugh, please put away your flamethrower. You're new* to Games, so I'll let it slide this time, but next time you'll earn yourself a temp ban.

    *Well, your nick is new anyway ;)


  • Closed Accounts Posts: 867 ✭✭✭l3rian


    I hope this relationship doesn't get too serious; we don't want a situation where certain games will only play on certain gfx cards.


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by l3rian
    I hope this relationship doesn't get too serious; we don't want a situation where certain games will only play on certain gfx cards.


    There is no sane business model that supports such an approach. These companies aren't idiots, so you don't need to worry about that happening any time soon.


    Matt


  • Registered Users, Registered Users 2 Posts: 10,304 ✭✭✭✭koneko


    :rolleyes:

    That's never going to happen. Game manufacturers are in it to make money as well; they're hardly going to want to alienate half their potential customer base.
    All this hoopla over a little image at the start of the game.


  • Registered Users, Registered Users 2 Posts: 11,446 ✭✭✭✭amp


    Counter-Strike plays alright on my old Voodoo3 3500 TV. Have you ever tried CS, Paul88?


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    Originally posted by amp
    Counter-Strike plays alright on my old Voodoo3 3500 TV. Have you ever tried CS, Paul88?
    1.6 or the older versions?


  • Registered Users, Registered Users 2 Posts: 11,446 ✭✭✭✭amp


    Originally posted by tuxy
    1.6 or the older versions?

    Well, the 3500 is in my server now, which is my precious, so there's no way in hell that Steam is ever going to go near it. When it was in my main machine I played Half-Life and Q3 with it, and while the frame rate was below average it was playable.

    (hey you did ask)


  • Closed Accounts Posts: 29,130 ✭✭✭✭Karl Hungus


    Originally posted by Matt Simis
    So where did the notion of "running on NVidia cards only" come from then? Everything I've read used vague marketing language like "plays best on" or "the way it's meant to be played"... The animation is gone faster than most can read it anyway; as already stated, it's merely an advertisement.


    Matt

    THIS is where it came from:
    Originally posted by Deadwing
    Well great, so you can play Doom 3 and Half-Life 2, but not on the same card?!
    *mutters angrily*

    That is what I am replying to.

    I have no bone to pick with NVidia whatsoever.
    OK?


  • Closed Accounts Posts: 29,130 ✭✭✭✭Karl Hungus


    Aye Deadwing, I'm well aware of that.

    Which is why I gave an example of a game that has the Nvidia splash screen running perfectly fine on an ATI card, hence reassuring you that you will indeed get the best out of your card, despite the plugs by Nvidia. Personally, I shall be shelling out on a new GFX card myself, more than likely a 9800 Pro, and I have no worries that it'll run everything perfectly fine.

    Anyway, sorry for quoting you in my own defence, but Matt Simis was taking me up completely wrong.


  • Closed Accounts Posts: 2,918 ✭✭✭Deadwing


    I just noticed Deus Ex 2 has a 'the way it's meant to be played' logo on the box. Runs absolutely perfectly on the 9800 Pro :D


  • Registered Users, Registered Users 2 Posts: 15,817 ✭✭✭✭po0k


    Originally posted by Matt Simis
    There is no sane business model that supports such an approach. These companies aren't idiots, so you don't need to worry about that happening any time soon.


    Matt

    3dfx, anyone??

    If some company comes up with some killer tech patent which makes coding lovely, fast, smooth, hi-res graphics engines easy as pie for developers, and markets it to the audience, do you think another monopoly could start?

    Unlikely, but "possible" :)


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Well, 3dfx initially provided Glide because there was no generic API available at the time. They also had a hardware monopoly (due to lack of competition) at the same time as the software one they stumbled into. By the time the "3Dfx Enhanced" (or whatever they called it) campaign started, DirectX was established and Glide had become a marketing tool. Mind you, it was good, real good. They never stopped anyone making games that supported both DX and Glide, however; they had no need to, as the Glide version genuinely looked and played better.

    Sure, anything is possible, but I would say it's far more likely we'll see an escalation of the Doom 3 "NV30" codepath approach: the game runs on any card, but has special ways of dealing with certain hardware. Funnily enough, the D3 NV30 codepath is as much a kludge to handle nVidia's (mis-)interpretation of a standard as it is a performance enhancement.

    Let's not forget, NV30+ nVidia hardware has accelerated methods of dealing with shadows; to use those, the game needs to be written with NV in mind (afaik). There's a sketch of that sort of hardware-aware path selection below.


    Matt
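
    (A sketch of the path selection being described: checking for the NVIDIA-authored two-sided stencil extension before deciding how to draw shadow volumes. This is illustrative and assumed, not id's actual code:)

        // With GL_EXT_stencil_two_side, shadow volumes can be stencilled for
        // front and back faces in a single pass; older hardware needs two.
        #include <GL/gl.h>
        #include <cstring>

        bool HasExtension(const char* name)
        {
            // Requires a current OpenGL context
            const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
            return exts && std::strstr(exts, name) != nullptr;
        }

        void ChooseShadowPath()
        {
            if (HasExtension("GL_EXT_stencil_two_side"))
            {
                // one-pass shadow volumes (NV30-class hardware)
            }
            else
            {
                // fall back to two passes: cull front faces, then back faces
            }
        }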


  • Registered Users, Registered Users 2 Posts: 3,312 ✭✭✭mr_angry


    The fact that ATi's current generation of cards is so much better than nVidia's means that the game will most likely run better on any comparable ATi card than on any nVidia card, and there's nothing that nVidia's propaganda division, or John Carmack's choice of code path, can do about that (unless he went totally mad).

    However, the game won't be released until the next generation of cards is in full swing, so it's hard to tell whose will run it best. Carmack publicly stated that he was having trouble coding the NV3x code path, though.

    As we know, the ATi R420 will be out at the end of this month, and the NV40 will be out sometime around the middle of May. However, there was a rumour recently that nVidia had scrapped the NV40 and rushed the NV45 (rebadging it as the NV40 so nobody would notice) because they knew the R420 was gonna kick its ass. Hence the reason it's coming out so far after the R420.

    If you want to play any DX9 game on a current generation card, you want to be buying an ATi. 6 months from now? Who knows...


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by mr_angry


    If you want to play any DX9 game on a current generation card, you want to be buying an ATi. 6 months from now? Who knows...


    Admittedly I currently have an FX 5950 Ultra, so I could be accused of bias, but I switched from a Radeon 9800 Pro. I've had Radeons for the last two years (and I switch cards a lot, so many Radeons) and could have got the 9800XT at a good price, but it just got boring using ATI for so long...

    The fact remains that NV HW is usually faster in DX8 games and is faster in Doom 3 (as per the Anandtech sneak peek), while being on par with its ATI counterparts in Stalker (Doom 3 and Stalker being "DX9"-generation games). It tanks in Tomb Raider and, to a lesser degree, Half-Life 2. It's a bit hit and miss, and generally ATI cards seem the "safer" bet for DX9, but it's far from as black and white as you portray it.

    nVidia has other advantages too: better SW vendor support; better drivers under both Windows and *nix (switching back to NV drivers after so long really reminded me; ATI drivers aren't bad, but they aren't as polished); vastly better 3D stereo support; sometimes better image quality (although both of them can be accused of playing with quality to "fix" scores); better DX7/8 support; more robust HW (it can run at higher AGP speeds and overclocks well, so expect good early PCI Express implementations); and the ingenious OTES cooling system alone on the NV reference cards is worth $60+.

    What's happening to nVidia now among the digerati is a lot like the OTT critique that plagued 3dfx in their later days. People latched onto their lack of 32-bit colour support and dogged them on every mishap afterwards, even when they were on the verge of releasing a vastly superior product. People like to see market leaders go down in flames, deservedly or not. There are still people who claim, as if it were a matter of fact, that NV cards are "hot and noisy", despite the fact that more recent NV cards compare favourably to ATI cards. When the Voodoo3 3500 was released it was the fastest and most feature-packed product on the market, but nobody cared: NV marketing did such a good job convincing them they needed HW T&L that it never received the accolades and commendation it deserved.

    Sure, NV cards have problems, but compare them for a second to their other competitors' "attempts": Intel, S3 or VIA cards. Nvidia are also doing a lot of things right.


    Matt


  • Closed Accounts Posts: 975 ✭✭✭j0e9o


    From what I have read, the best card to have for this game is the Ti 4800; a bit rippin' since I have a 9800 Pro.


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by thomasmckinless
    From what I have read, the best card to have for this game is the Ti 4800; a bit rippin' since I have a 9800 Pro.

    Umm... Doom 3? Ti 4800? Where did you see that?

    Here's the Anand article I mentioned:
    http://www.anandtech.com/video/showdoc.html?i=1821&p=22


    Matt


  • Registered Users, Registered Users 2 Posts: 3,312 ✭✭✭mr_angry


    I appreciate that you're trying to give balance to the argument, Matt, and the last thing I want to see is nVidia bullied out of the marketplace (which is a pretty unlikely scenario). The competition in the GPU market at the moment is the single greatest driving force giving consumers fabulous choice and high-quality products. If the NV40 turns out to be better than the R420, I'll be quite happy to promote their superior products to people.

    However, from a purely consumer-oriented perspective, the Radeon 9600 and 9800 series almost universally outperformed the comparable nVidia cards in DX9 games. I agree, the nVidia cards do have slightly better performance in DX8 games, and they have a nice compact set of utilities (I myself was an nVidia customer for more than three years), but this is not a substitute for real performance, especially when we're talking about a DX9 game. The fact that nVidia hadn't properly implemented the Shader Model 2.0 standard on the FX series left them at a serious disadvantage, and put both developers and consumers in a difficult position: developers have to probe for PS2.0 at runtime (see the sketch below) and write fallbacks. It is entirely a hardware issue, and thus could not be resolved by later driver releases. For most, it was easier to switch to supporting ATi.
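
    (A sketch of that runtime check in D3D9; the helper name is hypothetical, the API calls are real:)

        // Does the default adapter expose Pixel Shader 2.0? Engines use a test
        // like this to decide between the PS2.0 path and a PS1.x fallback.
        #include <d3d9.h>

        bool SupportsPS20(IDirect3D9* d3d)
        {
            D3DCAPS9 caps;
            if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
                return false;
            return caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
        }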

    As for the performance indicators you mentioned: the Doom 3 leak was a thoroughly unoptimised alpha (the article in question doesn't seem to indicate what kind of Doom 3 sneak peek they had, so I'm assuming it was last year's leak), and it's impossible to determine any of these issues before release, let alone before the near-compulsory 1.01 patch and driver updates from the vendors. As for Stalker, I'm not aware of any near-playable version of the game which could be described as fit for deciding this argument. In all of the market-release DX9 gaming benchmarks I've seen, the ATi cards have come out on top. The 5950 Ultra made serious inroads into that, but the majority of the ATi series cards won out over their respective competitors. That the 5950 has to be such a pricey card in order to achieve those results is a downside, though.

    Overclocking is an interesting comparison. I myself have a 9800 non-pro which I have overclocked from:

    Core: 324MHz
    Memory: 290.3MHz

    to:

    Core: 411.8MHz (XT speeds)
    Memory: 330.8MHz

    One of my housemates did even better, overclocking from the same initial values to:

    Core: 490MHz
    Memory: 382MHz

    Nobody can deny that those are some pretty impressive results, although they seem somehow limited to Sapphire cards: another of my housemates has a Hercules 9800 Pro and achieved almost zero increase. Nevertheless, I have yet to see an nVidia card achieve similar increases. (The percentage gains are worked out below.)
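
    (In percentage terms, straightforward arithmetic on the figures above:)

        Core:   324   -> 411.8 MHz  = +27%   (411.8 / 324   - 1)
        Memory: 290.3 -> 330.8 MHz  = +14%   (330.8 / 290.3 - 1)
        Core:   324   -> 490 MHz    = +51%   (housemate's card)
        Memory: 290.3 -> 382 MHz    = +32%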

    I really hope nVidia come out with a great card in the next product cycle, but the NV3x has not been good for them. When the time comes, I will be happy to praise nVidia for the good work they've done and suggest that people purchase their cards. However, I will not advise people to buy an extremely high-priced nVidia card on the basis of a thoroughly unoptimised demo benchmark, nor am I happy to see people proclaiming nVidia will be superior simply because they have a sticker on the Doom 3 box. In most realistic tests so far ATi have come out on top, and until that changes I will keep advising people to buy their product.

    Long live healthy competition!


  • Registered Users, Registered Users 2 Posts: 15,817 ✭✭✭✭po0k


    ATi have been a slow juggernaut of a graphics company; the only thing that kept them going through the pre-Radeon years was the OEM contracts. (You'd have a very high chance of finding a Rage 128 in most OEM servers these days, and looking back at Gateway, Dell etc., I remember seeing people get new machines with nice fast CPUs, then finding an onboard ATi chip inside, sighing and saying "**** for games".)

    Glad to see that has changed, as it gave nVidia a bit of a kick up the arse.
    I was afraid of seeing them become complacent and sit on their laurels, concentrating on "fluff" cinematic effects while trading off performance, as (imho) was the case with the Voodoo4/5/6 series and their T-buffer etc.
    And not having 32-bit support in the Voodoo3 was a bit of a failing on their part :)
    Things like shadows and fine gradients look terrible in 16-bit.


  • Registered Users, Registered Users 2 Posts: 6,560 ✭✭✭Woden


    Originally posted by mr_angry

    Nobody can deny that those are some pretty impressive results, although they seem somehow limited to Sapphire cards: another of my housemates has a Hercules 9800 Pro and achieved almost zero increase. Nevertheless, I have yet to see an nVidia card achieve similar increases.


    That's a nice overclock, especially on your housemate's card. However, I find that nVidia cards will overclock just as well, especially the 5900LX/XT range, which starts off at something like 400/350 and can be brought up to 5900 Ultra frequencies and, in a number of cases, 5950 speeds at 475/475, which is quite impressive for a circa €200 gfx card.

    Personally, my 9800 Pro will do 452/392, and that's a PowerColor one.


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Given that Doom 3 and Stalker are both NV promo games, I would bet that Stalker will match ATI performance and Doom 3 will beat it. As regards Doom 3, you have to remember that nVidia has always had (and continues to have) the best OpenGL support on the market, combined with the fact that they (and ATI) write their own extensions to the OpenGL spec, providing a better-fitting API than DirectX ever could be (see the sketch below). The "sneak peek" Doom 3 bench was sanctioned by id, so I would assume it was a little bit further along than the leaked alpha. The final product will probably be heavily optimised for nVidia (tbh, the FX will need it), but that really doesn't matter to the end user. Let's not forget that nVidia cards support HW-accelerated stencil shadowing, a major part of Doom 3! :p

    As regards those overclocking results, impressive stuff. As I have the top-end-clocked FX, I can't match those figures percentage-wise (perhaps a 5900XT could?), but I've managed

    Core: 475MHz -> 580MHz
    Memory: 950MHz -> 1.04GHz

    with stock cooling and no mods, on an already heavily overclocked system. I assume that with another cooler and voltage mods it would push the core over 600MHz.

    All FX cards support PS2.0, and the NV30 range is just a handful of instructions from supporting PS3.0 (the spec wasn't final when the HW was completed). I agree, tho, there is no getting away from the fact that NV's PS2.0 is much slower than ATI's.




    Matt
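
    (On "write their own extensions": under Windows, vendor OpenGL extension entry points are fetched from the driver by name at runtime, which is how NV or ATI can expose features beyond the core spec without waiting for a new API revision. A minimal sketch, using the real entry point from GL_EXT_stencil_two_side:)

        #include <windows.h>
        #include <GL/gl.h>

        // Function pointer type for the extension's entry point
        typedef void (APIENTRY *PFNGLACTIVESTENCILFACEEXTPROC)(GLenum face);
        PFNGLACTIVESTENCILFACEEXTPROC glActiveStencilFaceEXT = nullptr;

        bool LoadTwoSidedStencil()
        {
            // Requires a current OpenGL context; wglGetProcAddress returns
            // null if the driver doesn't expose the extension
            glActiveStencilFaceEXT = reinterpret_cast<PFNGLACTIVESTENCILFACEEXTPROC>(
                wglGetProcAddress("glActiveStencilFaceEXT"));
            return glActiveStencilFaceEXT != nullptr;
        }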


  • Closed Accounts Posts: 2,918 ✭✭✭Deadwing


    Originally posted by mr_angry

    Nobody can deny that those are some pretty impressive results, although they seem somehow limited to Sapphire cards: another of my housemates has a Hercules 9800 Pro and achieved almost zero increase. Nevertheless, I have yet to see an nVidia card achieve similar increases.

    I really hope nVidia come out with a great card in the next product cycle, but the NV3x has not been good for them. When the time comes, I will be happy to praise nVidia for the good work they've done and suggest that people purchase their cards. However, I will not advise people to buy an extremely high-priced nVidia card on the basis of a thoroughly unoptimised demo benchmark, nor am I happy to see people proclaiming nVidia will be superior simply because they have a sticker on the Doom 3 box. In most realistic tests so far ATi have come out on top, and until that changes I will keep advising people to buy their product.

    Long live healthy competition!
    Well said indeed :) I've used both ATI and Nvidia cards, so I'm not really biased in either one's favour. (Tho I do love my 9800 Pro.)
    As I said, Deus Ex 2 has the Nvidia sticker on the box, yet runs without a hitch whatsoever, with a fairly blistering framerate on mine (haven't had ANY slowdown yet). So I highly doubt Doom 3 will be a jerky, jaggy crapfest on ATI cards.


  • Registered Users, Registered Users 2 Posts: 3,312 ✭✭✭mr_angry


    Fair enough; I accept those points. But the Doom 3 benchmark in question was run on the 12th of May last year, and the alpha released not long before that wasn't even capable of running on most ATi cards without modification. To say it's a fair benchmark is... well, dubious.

    The only real test is when the game actually comes out. I'm not trying to say ATi cards will definitely run it better; in fact, this agreement suggests that nVidia cards will perform better. But that is marketing talk, not actual results, and I just wanted to point out that nobody should jump to conclusions on the basis of this story, as some people were doing earlier in the thread.

