
Article: Doom3 vs HL2 aka nVidia vs Ati

  • 23-08-2004 01:02PM
    #1
    Registered Users, Registered Users 2 Posts: 8,574 ✭✭✭


    http://www.sudhian.com/showdocs.cfm?aid=581


    Interesting how the press is now turning against ATI and (more so) Valve:

    "Because after the HL2 benchmarks were posted, general opinion of the GeForce FX series went in the toilet. Even in cases where reviewers remained neutral and cautioned readers to wait for shipping product, forums and commentary exploded, most of it damning NVIDIA and vindicating ATI as “the” hardware to purchase. The entire process did tremendous PR damage to the GeForce FX line and influenced consumer buying decisions all based on benchmark results using a game engine whose status as a finished product had been substantially mis-represented."



    Matt



Comments

  • Registered Users, Registered Users 2 Posts: 2,614 ✭✭✭BadCharlie


    It's like the English soccer players. The English press builds them up into heroes just to knock them down when the time is right.


  • Registered Users, Registered Users 2 Posts: 17,958 ✭✭✭✭RuggieBear


    Scary to think how the graphics card companies are practically turning the PC into separate consoles based on the gfx card.....

    ATi games only working on Ati cards and vice versa in a few short years!??? :eek:


  • Registered Users, Registered Users 2 Posts: 1,864 ✭✭✭uberpixie


    "Corporate “sponsorship” could end up stifling the industry if partnership with one company or the other is seen as essential to building a successful product."

    Anyone remember the days when a 3dfx card was the only card worth getting because it was the only card developers properly supported, and of course the games that could only be played using 3dfx...


    This sort of stuff has been going on for ages. It is just different now that there are two dominant players in the gfx market, not just one.

    To be honest most developers will try to make a game work just as well on both sets of graphics cards in order to get the most people to buy their game.


  • Moderators, Entertainment Moderators Posts: 18,084 Mod ✭✭✭✭ixoy


    Excuse my ignorance here, but I thought the DirectX packages were developed to enable developers to make games for any system and not worry about what graphics cards they have, right? So surely all games should work almost equally well regardless of the GFX card. Is the only difference that they'll have programmed in a few tweaks for their preferred card but nothing hugely different?


  • Registered Users, Registered Users 2 Posts: 3,738 ✭✭✭BigEejit


    Well, if you look at http://www.theinquirer.net/?article=18029 you'll see a whole different spin going on now .... Nvidia is on the up and it's going to take off ....

    Basically, the boys in the know (OEMs) are going for Nvidia because it has features that Ati won't even have in its next-gen cards .... and maybe more telling: a lot of new games about to be released or in development use the Doom3 engine == Nvidia higher FPS ..... it's all downhill for a couple of years for Ati ....

    And that's not including SLI ;)


  • Registered Users, Registered Users 2 Posts: 17,958 ✭✭✭✭RuggieBear


    ixoy wrote:
    Excuse my ignorance here, but I thought the DirectX packages were developed to enable developers to make games for any system and not worry about what graphics cards they have, right? So surely all games should work almost equally well regardless of the GFX card. Is the only difference that they'll have programmed in a few tweaks for their preferred card but nothing hugely different?


    That was the idea, but now it's such a big business that the "few tweaks" can mean big differences in framerates.

    Here's another article comparing higher-end gfx cards.

    Who the hell do you trust with all these different results emerging? :rolleyes:


  • Closed Accounts Posts: 2,808 ✭✭✭Dooom


    I don't think there'll be games that'll only run on certain hardware. Well, it is a possibility, but I think despite its immaturity, the games industry is more sensible than that. I mean, unless the hardware manufacturer gives the developers a huge load of cash, they might. But I think as time goes on, the games industry might actually realise that they want people to enjoy their games, not just aim them at a certain market. Maybe.

    Well, it's either that or EA will take over the entire industry and everything will be released on all formats. Everywhere people go they'll be hearing "Challenge Everything", except considering that EA will probably begin owning the world - it'll be more like "Challenge Nothing". Followed by an onset of global oppression and communist dictatorship run by EA. Damn publishers.


  • Registered Users, Registered Users 2 Posts: 17,958 ✭✭✭✭RuggieBear


    Spike wrote:
    Well, it's either that or EA will take over the entire industry and everything will be released on all formats. Everywhere people go they'll be hearing "Challenge Everything", except considering that EA will probably begin owning the world - it'll be more like "Challenge Nothing". Followed by an onset of global oppression and communist dictatorship run by EA. Damn publishers.

    ...and i thought i was a pessimist! :D


  • Registered Users, Registered Users 2 Posts: 10,283 ✭✭✭✭BloodBath


    That's a good write-up, but I don't think the Half-Life 2 benchmarks were the main reason that Ati took the lead in the graphics card market over Nvidia's FX, as it seems to indicate.

    The FX range was simply worse than Ati's 9xxx cards, and they still are. Nvidia messed up with the FX, and no one else is to blame.


    BloodBath


  • Registered Users, Registered Users 2 Posts: 8,574 ✭✭✭Matt Simis


    BloodBath wrote:
    That's a good write-up, but I don't think the Half-Life 2 benchmarks were the main reason that Ati took the lead in the graphics card market over Nvidia's FX, as it seems to indicate.

    The FX range was simply worse than Ati's 9xxx cards, and they still are. Nvidia messed up with the FX, and no one else is to blame.


    BloodBath

    Ah, but that's the crux of it, isn't it?
    I don't think they were the only reason, but I do think they were the main reason. The HL2 benchmarks were the embodiment of "all that was wrong with FX". I had both cards at the time, and I recall little to no difference in the games available then; the majority were DX8 based. Some of them (eg BF1942) played better on nV.

    Only when Farcry came along (a year later) did an actually available game play noticeably better on R300+... and I wouldn't really call FarCry a "killer app".


    Matt


  • Advertisement
  • Registered Users, Registered Users 2 Posts: 10,283 ✭✭✭✭BloodBath


    Yeah, but that's it exactly. The FX was just as good as, if not better than, the Ati cards at DirectX 8, but when it came to DX9 it fell significantly behind.

    This is where the FX line died, and anyone who had spent 400 quid on one was kicking themselves for buying the card based on DX8 benchmarks.

    The same thing may be happening with the 6800s and the X800s, except in Nvidia's favour this time.


    BloodBath


  • Registered Users, Registered Users 2, Paid Member Posts: 5,671 ✭✭✭Slutmonkey57b


    If the spin is that ATI is on the downward spiral because of lack of SLI, I have to disagree. SLI is the exact opposite of a "killer app" - or rather it is a killer app because it killed 3dfx (Nvidia seemed to have forgotten this). The whole concept of SLI is only useful in the low-end or workstation market - nobody in their right mind, even the must-have instant adopters is going to buy two cards to run in SLI mode - it's just too expensive. Whatever the press write up about X800 and 6800, the fundamental purchasing decision for the consumer is price/performance ratio. Fundamentally, the average punter regards €200 on a card as the upper limit. €400 for the latest tech is all well and good but the market (as a whole) doesn't bite at that price, for any performance.

    The OEM market relies almost entirely on low-end parts - the "checkboxes" The Inq. refers to are all well and good but all OEMs list their graphics almost exclusively by memory size, not features, as this is what Joe Bloggs has been told to look at by his savvy friends - "Bigger is better". That's why all Dell ads have "128MB Nvidia graphics" or suchlike, rather than "Crippled 128MB 5200 part that's worse than a 64MB GF3". I remain to be convinced that PS 3.0 will become an issue before the R500 comes out. Anything shown at a trade show now will just be on the shelves for Chrimbo. Maybe.
    Far Cry is a perfect example of PS 3.0 making no difference to the game.


  • Registered Users, Registered Users 2 Posts: 6,984 ✭✭✭Venom


    Some unbiased testing can be seen here. Seems Nvidia are really trying to make sure that 6800 users are not left in the lurch with HL2 :D


  • Registered Users, Registered Users 2 Posts: 8,574 ✭✭✭Matt Simis


    Slutmonkey57b wrote:
    If the spin is that ATI is on the downward spiral because of lack of SLI, I have to disagree. SLI is the exact opposite of a "killer app" - or rather it is a killer app because it killed 3dfx (Nvidia seemed to have forgotten this). The whole concept of SLI is only useful in the low-end or workstation market - nobody in their right mind, even the must-have instant adopters is going to buy two cards to run in SLI mode - it's just too expensive. Whatever the press write up about X800 and 6800, the fundamental purchasing decision for the consumer is price/performance ratio. Fundamentally, the average punter regards €200 on a card as the upper limit. €400 for the latest tech is all well and good but the market (as a whole) doesn't bite at that price, for any performance.

    snip

    I remain to be convinced that PS 3.0 will become an issue before the R500 comes out. Anything shown at a trade show now will just be on the shelves for Chrimbo. Maybe.
    Far Cry is a perfect example of PS 3.0 making no difference to the game.

    SLI is not the only issue at hand here, it's merely one of them. I too don't count it as of major interest to the OEM market; however, it is to the retail market. Let's not forget that retail is the bread and butter of many other hardware companies (Creative, Logitech etc), and while that may not be quite the case with nVidia, it's still important.

    Your statement that "SLI killed 3DFX" is flawed. SLI made 3DFX the darling of the enthusiast community and gave retailers a great selling point (buy one now and, if in a year you find it slow, increase the speed 100%), and it had nothing to do with 3DFX's downfall. I suggest you read up on 3DFX, but in a nutshell: the switch from a vertical business model (making chips and selling them on) to a horizontal one (purchasing STB, making both chips and boards), thereby competing with their own distributors, coupled with grossly underqualified management (they were visionaries and super-geeks, I salute them!) and poor yields/luck, destroyed the company. Even on the most basic level, 3DFX dissolved years after they dropped SLI support, which was well after the intro of the Voodoo3, before "Rampage". SLI was a Voodoo2 technology. One last tidbit: nVidia will never forget the mistakes of 3DFX, as many 3DFX employees work there now; they vowed to use the best elements of both technologies in future products.

    I had 2 separate SLI setups; back then the cards initially cost more than the current top end do, and there was a much smaller market. SLI in 2004 is 2x $199-399 cards, not 2x $699 cards as it was then.

    Regarding PS3.0, I'll just say that Farcry is an "old" engine at this point.. the Inq article specifically mentioned titles that were better on nVidia HW; you would be better off reserving judgement until you see them.


    Matt


  • Registered Users, Registered Users 2 Posts: 1,725 ✭✭✭kmb


    Well, all I know from experience is that I always had Nvidia graphics cards, and I decided a few months ago to get the 9700 Pro from Connect3D. Even though the graphics are very good, there are a lot of unhappy 9700 Pro users out there because of problems and fussy BIOS settings like 64MB AGP memory etc etc.
    I have had problems with it playing Doom3 and Call of Duty, and after searching, Call of Duty has a *fix*.
    Now, I never had these problems with Nvidia cards, and I think the Voodoo 3 was the last time I had to download a fix for certain games.
    I think I will be going back to Nvidia sometime soon ...very soon.
    Anyone wanna buy a 9700 Pro?


  • Registered Users, Registered Users 2 Posts: 698 ✭✭✭vishal


    I also believe Carmack did not optimise preferentially for one architecture over the other. It can be seen that the 9800 Pro performs better than the equivalent card, i.e. the 5900 Ultra. The FX series clearly did suck; it performed worse in just about every DX9 game. However, the 6800 series has specific technologies that Doom 3 wanted. As one executive already said, they put in the stuff Carmack wants.

    Gabe Newell is full of s***. He knew the game was nowhere near finished before it was leaked. What is interesting is that the 6800 GT performs on par with the X800 Pro in HL2 benchmarks. I really think that they have optimised much more for the Ati hardware than they have for Nvidia. What really annoys me is that they have partnerships in the first place. In a HL video I watched, Gabe Newell said that he would like to thank Ati. For what?
    None of the technologies that the 6800 has help HL2, so that probably helps explain the benchmarks.

    What is important is mainstream graphics cards, and clearly the winner is the 9800 Pro. As newer games are released that require faster GPUs, the companies release newer cards based on the newer architecture; they just scale down the higher-end card, as will be the case with the 6600 and X700, and I am sure they will perform about equal. That is where the real contest will be.


  • Registered Users, Registered Users 2 Posts: 8,574 ✭✭✭Matt Simis


    vishal wrote:
    What is important is mainstream graphics cards, and clearly the winner is the 9800 Pro. As newer games are released that require faster GPUs, the companies release newer cards based on the newer architecture; they just scale down the higher-end card, as will be the case with the 6600 and X700, and I am sure they will perform about equal. That is where the real contest will be.


    But you have to remember the 9800 Pro is out of production now; there are no more cards, therefore it cannot realistically be considered a valid mainstream card. From the benchmarks posted above, they recommend the bog-standard "6800" (not the GT or Ultra) as a good mainstream card right now. Personally, I think the 6600 PCI Express part will make a nicer all-round mainstream card (cheap, with SLI support :D).


    Matt


  • Registered Users, Registered Users 2 Posts: 2,760 ✭✭✭Col_Loki


    Great article, and good reading. Interesting stuff.......


  • Registered Users, Registered Users 2 Posts: 6,360 ✭✭✭OfflerCrocGod


    Slutmonkey57b wrote:
    If the spin is that ATI is on the downward spiral because of lack of SLI, I have to disagree. SLI is the exact opposite of a "killer app" - or rather it is a killer app because it killed 3dfx (Nvidia seemed to have forgotten this). The whole concept of SLI is only useful in the low-end or workstation market - nobody in their right mind, even the must-have instant adopters is going to buy two cards to run in SLI mode - it's just too expensive. Whatever the press write up about X800 and 6800, the fundamental purchasing decision for the consumer is price/performance ratio. Fundamentally, the average punter regards €200 on a card as the upper limit. €400 for the latest tech is all well and good but the market (as a whole) doesn't bite at that price, for any performance.
    You just don't get it.... in one or two months 6600GTs will be available for around €200. They will beat the 9800XTs in benchmarks/games at nearly the same price, but you can get two of them and get more power than a 6800 Ultra, again for about the same price. That kind of GPU power is plenty to play most new games for the next year on very-high/good quality settings. By the end of that year (give or take 3-4 months) you will be seeing €200 6800 Ultras, just like the 9800 Pros now; you grab two of those if you need them. That's the beauty: you can give yourself a massive upgrade by buying last-generation gfx cards and still have a huge amount of power left over for playing new games with all the eye candy. That's the beauty, that's sweetness itself :D. And what about the mid-range next-generation gfx cards; who's to say they won't support SLI? You could keep going for 3 more years with the same computer, just replacing the gfx cards, and still have a top-of-the-range gaming machine :eek: ATI don't have THAT.

    That is a big that ;)
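The upgrade-path arithmetic in the post above can be put in a few lines. This is purely an illustration of the argument: the €200 mid-range and €400 top-end figures are the round numbers the thread itself throws around, not prices for any actual card.

```python
# Toy cost comparison for the "buy one card now, add an identical one later
# under SLI" route versus buying a fresh top-end card each generation.
# All prices are illustrative round numbers from the thread, not real quotes.

def staggered_sli_cost(card_price: float) -> float:
    """One mid-range card now plus an identical one later (same price assumed)."""
    return 2 * card_price

def fresh_topend_cost(topend_price: float, generations: int) -> float:
    """A brand-new top-end card bought every generation."""
    return topend_price * generations

sli_route = staggered_sli_cost(200)        # 400 total, spread over time
topend_route = fresh_topend_cost(400, 2)   # 800 over the same two generations
```

Of course this ignores the counter-argument made elsewhere in the thread: by the time the second card is worth buying, next-gen hardware may outperform the pair anyway.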


  • Registered Users, Registered Users 2 Posts: 1,419 ✭✭✭nadir


    I've always run Nvidia cards, but just because of the superb Linux drivers.
    Big surprise: I don't care much for DirectX. I saw someone above saying that it enables developers to develop for different systems, but in the end it's MS software. GLSL is the way to go; it's arguably superior now to DX9 :D, and certainly easier to develop with, for all different types of platforms. Unfortunately MS decided not to ship it with Windows, I wonder why; so much for supporting all systems.
    But to answer the question: Nvidia pwn ATI.
    Shame on you kmb for switching :D


  • Moderators Posts: 5,617 ✭✭✭Azza


    I'm still undecided as to what to do. ATI have clearly been superior for the last while and are probably still just about ahead overall. But the majority of games I'm going to be playing are OpenGL based. Still, I'm looking forward to Half-Life 2 more than to playing Doom III. I also wish to try a 64-bit Linux distribution, but there are no 64-bit ATI drivers out, and besides, Linux doesn't support DirectX. Gosh darn it, what to do, what to do.


  • Registered Users, Registered Users 2 Posts: 1,419 ✭✭✭nadir


    Well man, I've got to say, GLSL looks amazing. Remember, OpenGL and Nvidia have a much better history than either DirectX or ATI, and tbh I don't see ATI and/or DX keeping their advantage for long; in fact it may be slipping already. I'd say give it a few more months and see what's going down then. Also, if like you say most of the games you will be playing are OpenGL based and you want to use Linux ;) it's a simple choice really.


  • Registered Users, Registered Users 2, Paid Member Posts: 5,671 ✭✭✭Slutmonkey57b


    In order to take advantage of SLI you're going to have to buy current top-of-the-line hardware. The current top-gen cards will drop to €200 sooner or later. Maybe, but 2 x €200 = ?? Too much! By then the next-gen cards like the R500 will be out, and what's your choice then? Take two bottles into the shower? Also, there's no evidence to suggest that twice the cards = twice the benchmarks. Having multiple cores on single cards doesn't produce that result, so single cores on separate cards with an additional "fake a single card" software overhead is hardly going to do so. I'd expect 50 - 75% but not more than that.
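The "twice the cards doesn't mean twice the benchmarks" point can be sketched in rough Amdahl's-law terms: only the fraction of frame work that splits cleanly across the two GPUs speeds up, while driver and synchronisation overhead does not. The parallel fraction used below is an illustrative guess, not a measurement of any real SLI setup.

```python
# Amdahl's-law-style estimate of SLI scaling: the parallelisable share of the
# work is divided across the cards; the serial share (driver/sync overhead)
# runs at the same speed regardless of how many cards there are.

def sli_speedup(parallel_fraction: float, n_cards: int = 2) -> float:
    """Estimated overall speedup for n_cards given the parallelisable fraction."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cards)

# If ~80% of the work parallelises, two cards give roughly a 1.67x speedup;
# a gain in the 50-75% range guessed above, not a full 2x.
estimate = sli_speedup(0.8)
```

Only at a parallel fraction of 1.0 (no overhead at all) does this model reach the full 2x, which is why perfect doubling was never a realistic expectation.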

    Matt Simis might well be prepared to fork out €800 on two graphics cards, but he is the exception that proves the rule (or a lunatic who needs to get out more, depending on your point of view). Ditto, what the Linux users want is irrelevant in sales terms, so we can ignore that argument. Games developers ship what works on Windows, so if MS is not shipping OGL as standard, tell me what incentive developers have to waste time coding for it? Nothing? OGL has not been the driving standard in games development from the moment that DirectX started to incorporate a comparable feature set - i.e. from DX6 on. If developers have a choice between coding for a technically superior platform (OpenGL 2), or the vastly numerically dominant one that just about does the job they want (DX), we already know what the answer is, don't we?

    SLI is irrelevant in retail terms. There are not enough high-end buyers to make a market out of it. J. Bloggs buying his equipment in PC World or Game is not going to be swayed by the argument "Don't buy that card! If you buy this card, then you get the fantastic opportunity to buy another one just like it from me in six month's time, stick them together and make a decent card!" Ehh, J. Bloggs is going to think "Rip off!" and he's right. And that assumes that you will be able to mix and match different generations of cards under SLI. If that's as smooth as Nvidia would have you believe I will only say:
    "Lying Fuc.ks"

    By the time the mobos and cards have occupied the affordable bracket, next gen hardware will be out, and will wind up being the same price (ultimately) as an SLI rig based on current gen hardware. If you accept that only the enthusiast market is going to drive SLI sales, and the enthusiast always goes for the latest gear, what is the choice going to be? The consumer on a budget does not think "hey, I've got a budget of €200 here, I could buy a single middle of the road card, or two cheapo cards and string em together for a slight performance boost that won't make up the gap between the cheap card and the midrange one!"

    I'm not saying ATI is going to continue stomping all over Nvidia over the next year as it has in previous years, but the reasons the battle is won are not those listed in the Inq article.

    Price vs Performance + Availability + Reputation = Sales

    Nvidia lost the plot with the FX because it destroyed its reputation with weaselly bull, failed to provide availability, and over-priced its cards. Currently ATI's main problem with the X800 is availability - the same problem Nvidia has.
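The "Price vs Performance + Availability + Reputation = Sales" formula above can be caricatured as a tiny scoring function. The card names, framerates and prices below are invented purely to illustrate the shape of the argument, not benchmark results.

```python
# Toy buying heuristic: frames-per-second per euro, zeroed out when the card
# cannot actually be bought (availability). All figures here are made up.

def buying_score(fps: float, price_eur: float, in_stock: bool) -> float:
    """FPS per euro, or 0.0 when the card is unavailable at retail."""
    if not in_stock:
        return 0.0
    return fps / price_eur

cards = {
    "hypothetical mid-range": buying_score(fps=60, price_eur=200, in_stock=True),
    "hypothetical top-end": buying_score(fps=90, price_eur=400, in_stock=False),
}
best = max(cards, key=cards.get)  # availability trumps raw speed here
```

Reputation isn't modelled here; as the post argues, it shifts the whole buying decision rather than any single number.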


  • Registered Users, Registered Users 2 Posts: 8,574 ✭✭✭Matt Simis


    Slutmonkey57b wrote:
    In order to take advantage of SLI you're going to have to buy current top-of-the-line hardware. The current top-gen cards will drop to €200 sooner or later. Maybe, but 2 x €200 = ?? Too much! By then the next-gen cards like the R500 will be out, and what's your choice then? Take two bottles into the shower?
    The 6600 is the first SLI card; it's a mid-range card, not top-end.

    Games developers ship what works on windows, so if MS is not shipping OGL as standard, tell me what incentive developers have to waste time coding for it? Nothing? OGL has not been the driving standard in games development from the moment that DirectX started to incorporate a comparable feature set - ie from DX6 on.
    You are aware all Quake3 and Doom3 game engines are and will be OpenGL, right? OGL makes up a sizeable chunk of the market thanks to id Software. OpenGL doesn't need or benefit from MS support; it's supported directly by HW vendors and has its own review board.

    SLI is irrelevant in retail terms. There are not enough high-end buyers to make a market out of it. J. Bloggs buying his equipment in PC World or Game is not going to be swayed by the argument "Don't buy that card! If you buy this card, then you get the fantastic opportunity to buy another one just like it from me in six month's time!" Ehh, J. Bloggs is going to think "Rip off!" and he's right.
    I disagree. J Bloggs has the choice of a mid-range, EUR200 card from nVidia or ATI. Both play games the same. What's the difference? Well, "the nVidia offers this great new "upgrade" technology, it won't go obsolete as you can add on another card to make it faster".. You are thinking too much about the technical side of this. You underestimate the power, to the consumer, of seeing that one product doesn't support a feature a similar product does for the same price. Even if it's a useless feature, why buy something for the exact same price that doesn't have X feature.. it seemingly makes no sense to them. I fail to see how anyone could spin having SLI vs not having it as a "rip off".
    Nvidia lost the plot with the fx because it destroyed its reputation with weaselly bull, failed to provide availability, and over-priced. Currently ATI's main problem with the X800 is availability - the same problem Nvidia has.

    I've been shopping for the X800 Pro and 6800GT, and generally it's easier to find 6800GTs...



    Matt


  • Registered Users, Registered Users 2 Posts: 3,312 ✭✭✭mr_angry


    I was under the impression that you had to have 2 similar, if not identical, cards to make use of SLI, no? If that's the case, then your single purchased card is just as likely to go obsolete as it is without SLI.


  • Registered Users, Registered Users 2 Posts: 6,984 ✭✭✭Venom


    Matt Simis wrote:
    The 6600 is the first SLI card; it's a mid-range card, not top-end.

    You are aware all Quake3 and Doom3 game engines are and will be OpenGL, right? OGL makes up a sizeable chunk of the market thanks to id Software. OpenGL doesn't need or benefit from MS support; it's supported directly by HW vendors and has its own review board.

    I disagree. J Bloggs has the choice of a mid-range, EUR200 card from nVidia or ATI. Both play games the same. What's the difference? Well, "the nVidia offers this great new "upgrade" technology, it won't go obsolete as you can add on another card to make it faster".. You are thinking too much about the technical side of this. You underestimate the power, to the consumer, of seeing that one product doesn't support a feature a similar product does for the same price. Even if it's a useless feature, why buy something for the exact same price that doesn't have X feature.. it seemingly makes no sense to them. I fail to see how anyone could spin having SLI vs not having it as a "rip off".

    Matt

    You're wasting your breath, Matt. Arguing with an ATI fanboi is pointless. If the 6800 cards were 100fps faster in every game at any resolution and detail settings, they would still bitch about something. Nvidia screwed up with the FX series of cards but learned from those mistakes and came back with a killer chipset in the 6800 series.

    Just imagine the power of a P4 4.0GHz chip running two 6800 Ultra Extreme cards in SLI someday :D


  • Registered Users, Registered Users 2 Posts: 8,574 ✭✭✭Matt Simis


    Venom wrote:
    You're wasting your breath, Matt. Arguing with an ATI fanboi is pointless. If the 6800 cards were 100fps faster in every game at any resolution and detail settings, they would still bitch about something. Nvidia screwed up with the FX series of cards but learned from those mistakes and came back with a killer chipset in the 6800 series.

    Just imagine the power of a P4 4.0GHz chip running two 6800 Ultra Extreme cards in SLI someday :D


    Ahh well, thought it was worth a shot! :D


    Matt


  • Registered Users, Registered Users 2 Posts: 698 ✭✭✭vishal


    Unless you are a performance freak with more money than you can count, SLI is a waste of time. I would never consider it because 1 year down the line there will be a better card supporting various new features, i.e. dx9d or something like that, and it will probably perform better than 2 6800GTs.


  • Registered Users, Registered Users 2 Posts: 6,984 ✭✭✭Venom


    vishal wrote:
    Unless you are a performance freak with more money than you can count, SLI is a waste of time. I would never consider it because 1 year down the line there will be a better card supporting various new features, i.e. dx9d or something like that, and it will probably perform better than 2 6800GTs.

    Well, anything is possible, but the 9800 card is getting on in years and is still able to hold its own. Imagine two of those linked together and what they could do.


  • Registered Users, Registered Users 2 Posts: 4,228 ✭✭✭Scruff


    vishal wrote:
    Unless you are a performance freak with more money than you can count, SLI is a waste of time. I would never consider it because 1 year down the line there will be a better card supporting various new features, i.e. dx9d or something like that, and it will probably perform better than 2 6800GTs.

    Methinks you underestimate the number of performance freaks with more money than you can count. There are more than enough of them out there who would happily snap up 2 6800GTs if they could run them in SLI mode.
    True, it's not a big incentive for casual users, but even for mainstream gamers it does give pause for thought.

