
X800 review up at THG


Comments

  • Registered Users, Registered Users 2, Paid Member Posts: 5,671 ✭✭✭Slutmonkey57b


    Both long reviews so here's a summary:

    In Far Cry
    <GF6800U> *SLAP* "How'd ya like it bitch? Huh?"
    <R420> "That's some ugly **** you're bringing to the table. Bring it up to where the big boys play, eh?"
    <3d Forum poster> "Oi! GF6800U! U R CHEATOR!!"
    <GF6800U> "Huh? Me? No no no PS 1.1 here... don't look at me! DONT LOOK AT ME!!"

    All other tests:
    <R420> *SLAP* "How'd you like me now fat boy, eh?" *SLAP* "Thought you'd need 2 molex connectors, eh?" *SLAP* "Have to buy a new PSU, do they?" *SLAP*
    <GF6800U> *WHIMPER*


    And stuff.


  • Registered Users, Registered Users 2 Posts: 6,360 ✭✭✭OfflerCrocGod


    I never realised gfx cards were so abusive! :eek: I like the commentary :D. We'll see what happens once Nvidia start "optimising" their drivers.


  • Registered Users, Registered Users 2 Posts: 3,754 ✭✭✭Big Chief


    Originally posted by Slutmonkey57b
    Both long reviews so here's a summary:

    In Far Cry
    <GF6800U> *SLAP* "How'd ya like it bitch? Huh?"
    <R420> "That's some ugly **** you're bringing to the table. Bring it up to where the big boys play, eh?"
    <3d Forum poster> "Oi! GF6800U! U R CHEATOR!!"
    <GF6800U> "Huh? Me? No no no PS 1.1 here... don't look at me! DONT LOOK AT ME!!"

    All other tests:
    <R420> *SLAP* "How'd you like me now fat boy, eh?" *SLAP* "Thought you'd need 2 molex connectors, eh?" *SLAP* "Have to buy a new PSU, do they?" *SLAP*
    <GF6800U> *WHIMPER*


    And stuff.

    lol pld ;)

    EAGERLY awaiting this, I knew I was doing the right thing waiting for ATI's response. Gonna get this at the end of the month (I spoke to ATI's PR for Europe and the UK about 3 hours ago about getting hold of one of these for a review, and he told me they'll be on shelves 3-4 weeks from now..)

    CANT WAIT YEH BAAAAAAABY!:D


  • Registered Users, Registered Users 2 Posts: 2,760 ✭✭✭Col_Loki


    Nice review and mini-review :) !!

    Very interesting that the X800 Pro will be coming in at $399, and the X800 XT for $100 more. I hope they don't put a saddle on us Europeans and hike up the price.


  • Registered Users, Registered Users 2 Posts: 6,762 ✭✭✭WizZard


    Big Chief, any idea of what prices they will be at when they hit our shelves? I think I will be treating myself and putting my life on hold for a couple of... weeks/months(?)


  • Registered Users, Registered Users 2 Posts: 10,283 ✭✭✭✭BloodBath


    I'm still not sure which to go for. ATI made a big mistake not incorporating Shader 3.0 technology. Also, they upped those clocks at the last minute, so the cards are coming pre-overclocked, which means there's probably not a lot of headroom to go higher.


    I'm waiting for a direct comparison with overclocking to see which has the edge.


    BloodBath


  • Registered Users, Registered Users 2 Posts: 1,864 ✭✭✭uberpixie


    Sod Pixel Shader 3.0 this round. 'Twill be an age before any developers get around to using it. By the time Shader 3.0 is a big deal you'll be on a new card, probably two gens from now. Not much of a worry at the moment.

    I also doubt there's much headroom for o'clocking on either set of cards.

    ATI are using a refined version of an architecture that's been around for the last two gens; they have to be reaching its upper limit by now.

    Nvidia's cards are running very hot with a lot of juice through em already, and that's before you even get to overclock!

    To be honest both sets of cards run fairly similar in D3D games, unless you consider 5 or 6 frames a huge gap:-)

    It will come down to image quality at the end of the day, and to how much dicking around both companies do with their drivers.

    ATI only taking up one slot and drawing less power is a nice advantage, especially for the small form factor fans.

    At the end of the day either will put a smile on your face. Be interesting to see if prices drop on the current high end.


  • Closed Accounts Posts: 1,321 ✭✭✭neokenzo


    Originally posted by Col_Loki
    Nice review and mini-review :) !!

    Very interesting that the X800 Pro will be coming in at $399, and the X800 XT for $100 more. I hope they don't put a saddle on us Europeans and hike up the price.

    We'll probably get screwed over as usual and pay a lot more than that. Nevertheless, still can't wait for it to hit the shelves.


  • Registered Users, Registered Users 2 Posts: 4,317 ✭✭✭CombatCow


    I'll prolly just wait for the PCI Express version of the 6800 Ultra Super Max Extreme Pro Edition or whatever it's gonna be called, and then wait to get a waterblock for it :D


    ** If you don't take into account the 6800 Ultra Extreme and XT Platinum Edition, the 6800 Ultra and X800 Pro are very close. Nothing between them really, they're both fcuking monster cards. **

    CombatCow


  • Registered Users, Registered Users 2 Posts: 17,446 ✭✭✭✭jesus_thats_gre


    Originally posted by CombatCow

    ** If you don't take into account the 6800 Ultra Extreme and XT Platinum Edition, the 6800 Ultra and X800 Pro are very close. Nothing between them really, they're both fcuking monster cards. **

    CombatCow

    Is there not 100 quid in it?


  • Registered Users, Registered Users 2 Posts: 6,560 ✭✭✭Woden


    Interesting stuff alright; depending on the review you look at, either one is coming out on top.

    That's enough to tell me there isn't much in it, and that it won't really come down to performance when ya go to buy these cards.

    The Shader 3.0 thing I don't think is a big deal; I don't think it will come in till the next generation really. What I do think is important is that ATI have this card running at less power than the 6800. It's single slot, and that makes it SFF compatible as someone mentioned. I think the power thing may be important to a number of people.

    Driver optimisations will be important, but I still don't really trust Nvidia on image quality and stuff. Though the new card seems alright, some of the stuff on the 5950 looked crap.

    Nvidia may have the lower high end covered though, with the GT version possibly being somewhat better than the standard X800 Pro. Nvidia also have the 6850, or the Extreme version, that will run at 450MHz I think; it's included in some of the reviews and is better than the X800 XT I think, but apparently it will be in massively limited supply and most of the chips capable of 450MHz will go to Gainward.

    With regards to overclocking, the ATI card is made on a 130nm process which, although refined and giving them clock speeds over 500MHz, apparently tops out at 600MHz, so there ain't much room for overclocking alright.

    It will be interesting, however I don't think I will be upgrading until the gen after this; the 9800 Pro will have to do till then.

    data
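
A quick back-of-the-envelope check of that overclocking-headroom claim. The stock clock assumed below (~520MHz for the X800 XT) is not stated in the thread, so treat this as an illustrative sketch only:

```python
# Rough overclocking headroom for the X800 XT.
# Assumptions (not given above): ~520 MHz stock core clock;
# ~600 MHz is the reported ceiling of the 130nm process.
stock_mhz = 520
ceiling_mhz = 600
headroom_pct = (ceiling_mhz - stock_mhz) / stock_mhz * 100
print(f"~{headroom_pct:.0f}% headroom")
```

If the 600MHz ceiling is right, that leaves only around 15% over stock, which squares with the "not much room" verdict.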


  • Closed Accounts Posts: 4,943 ✭✭✭Mutant_Fruit


    What kind of processor would you need to take full advantage of those cards?


  • Closed Accounts Posts: 1,321 ✭✭✭neokenzo


    Here are more reviews, from Anandtech and Driver Heaven.


  • Registered Users, Registered Users 2 Posts: 10,283 ✭✭✭✭BloodBath


    The final versions of the NV 6800 will only use one slot; they are changing the fan apparently, and Nvidia seem to have sorted out their picture quality.

    The only real advantage ATI have is slightly better AA and lower power draw.

    Pixel Shader 3 is being used in games already. Far Cry uses an early version of it. In fact, most games coming out in the near future will incorporate it. Even Half-Life 2, whose developer is an ATI partner, will incorporate it, although they won't advertise that it uses it. Not to mention all the partners of Nvidia, and they have a lot.

    I'm sure id will incorporate it into Doom 3. It is a big thing.


    The war is only starting. It will be a while yet before we know which card really is better.



    BloodBath


  • Registered Users, Registered Users 2 Posts: 4,317 ✭✭✭CombatCow


    True BloodBath, there's still a lot to be seen from both cards with driver updates and optimisations. I'd say either card would match up nicely with a 2.0GHz A64 CPU and a gig o' ram ;)

    Also, a recent review in a mag said the 6800 Ultra worked perfectly with other high-spec components and an Antec 430W PSU, and what uber nerd (that's going to buy the 6800 card) is not going to have at least a 450W PSU??

    Some more reviews :

    http://www.beyond3d.com/reviews/ati/r420_x800/

    http://www.bjorn3d.com/_preview.php?articleID=457&51391

    http://www.gamers-depot.com/hardware/video_cards/ati/x800/001.htm

    http://www.lostcircuits.com/video/ati_x800/

    http://www.neoseeker.com/Articles/Hardware/Reviews/r420preview/

    http://www.pcper.com/article.php?aid=38

    http://www.trustedreviews.com/article.aspx?art=418


    CombatCow


  • Registered Users, Registered Users 2 Posts: 12,036 ✭✭✭✭Giblet


    Doom III is a very big OpenGL game, and that's where Nvidia seem to have excelled.

    It's very close; it will come down to whichever is the cheapest and most overclockable, I'd say.


  • Registered Users, Registered Users 2, Paid Member Posts: 5,671 ✭✭✭Slutmonkey57b


    Nvidia are going to have to convince a lot of people (like me) that they're not still lying slippery weasels after everything that happened with their last gen cards. What makes me suspicious is the whole PS 1.1 Far Cry bit: Nvidia are again claiming that they support x, y and z while in actual operation they aren't. The whole "look at how much better PS 3 is" demo, where they were actually showing PS2 v PS1.1, doesn't inspire confidence either.

    What is also apparent is that they haven't learned from the 5800's intro - they introduced a new card which made a noise like a jumbo jet and took up two slots, thinking "Eh, they'll get used to it, **** em, we can't be bothered fixing it." This time they've introduced a card which takes up two slots (but likes to pretend it doesn't, as though it were a fat granny from Barnsley wearing her daughter's velour tank top) and requires its own electricity generator, with the thinking "Eh, **** em, they can go out and spend 500 quid on the card and then another 150 on a new PSU. They'll get used to it and we can't be bothered fixing it."

    ATI have produced a card that gives the same performance for less overhead and less weaselly bullsh!t. A lot of people may already have a fat PSU, but if you're planning an upgrade, the extra outlay is going to be a drawback for the 6800. Doom 3 will NOT be a PS3 game - Carmack has said it uses DX7 tech that the original GeForce could do (assuming it had the throughput).
    I suspect by the time the X800 reaches its ceiling, the R500 will be making its move.


  • Registered Users, Registered Users 2 Posts: 17,958 ✭✭✭✭RuggieBear


    Originally posted by Slutmonkey57b
    Nvidia are going to have to convince a lot of people (like me) that they're not still lying slippery weasels after everything that happened with their last gen cards. What makes me suspicious is the whole PS 1.1 Far Cry bit: Nvidia are again claiming that they support x, y and z while in actual operation they aren't. The whole "look at how much better PS 3 is" demo, where they were actually showing PS2 v PS1.1, doesn't inspire confidence either.

    What is also apparent is that they haven't learned from the 5800's intro - they introduced a new card which made a noise like a jumbo jet and took up two slots, thinking "Eh, they'll get used to it, **** em, we can't be bothered fixing it." This time they've introduced a card which takes up two slots (but likes to pretend it doesn't, as though it were a fat granny from Barnsley wearing her daughter's velour tank top) and requires its own electricity generator, with the thinking "Eh, **** em, they can go out and spend 500 quid on the card and then another 150 on a new PSU. They'll get used to it and we can't be bothered fixing it."

    ATI have produced a card that gives the same performance for less overhead and less weaselly bullsh!t. A lot of people may already have a fat PSU, but if you're planning an upgrade, the extra outlay is going to be a drawback for the 6800. Doom 3 will NOT be a PS3 game - Carmack has said it uses DX7 tech that the original GeForce could do (assuming it had the throughput).
    I suspect by the time the X800 reaches its ceiling, the R500 will be making its move.


    Exactly! That's it down to a tee.....


    P


  • Registered Users, Registered Users 2 Posts: 759 ✭✭✭El_MUERkO


    After Nvidia's muppetry with the 5800 > 5900 > 5950, it's hard for me not to be biased toward the X800 and scream 'nVidia sux balls!' at all the fanbois who've been going nuts over the 6800 Ultra lately.

    However, that's what I'm going to do till the retail release of both cards with optimised drivers.

    I'm still thinking ATI are going to blow Nvidia out of the water, but if the big 'n' can reduce the size, noise and power needs of the card while increasing its performance, at least we'll have a level playing field so the two giants of the market can duke it out, with a nice price war for us consumers to benefit from :D


  • Registered Users, Registered Users 2 Posts: 10,283 ✭✭✭✭BloodBath


    They are changing the fan on the card to one that won't take up a PCI slot.

    What's up with these companies in the first place, when they can't even make a decent cooler for their own card? The Radeons, for example: Asus were the only ones that made an effort, and it still wasn't great. The 8 euro Arctic VGA cooler keeps the 98xx series 30°C cooler and is no louder.

    I'm not an Nvidia fan at all. Never bought their cards coz they were muck. The FX range anyway. They used to be the leaders though; they just ****ed up and underestimated ATI.

    They do seem to have a really good product on their hands this time, one that gives big boosts over current technology. Don't say you weren't impressed by the early benchmarks. With some optimisation they seem to be on a fairly even level.


    Ati is looking good apart from the whole pixel shader 3 thing.


    BloodBath


  • Closed Accounts Posts: 4,943 ✭✭✭Mutant_Fruit


    ATI are ahead with this card in everything except OpenGL performance. In every OpenGL game, Nvidia are ahead, and ATI trail by up to 20fps in the higher resolutions.

    Still, the ATI would more than likely be my next upgrade, cos I don't want to have to go out and replace my brand new 350W power supply, and I need the 3 PCI slots in my mobo; I can't have a new card taking over my AGP AND my PCI.

    Looks like ATI have aced it again.


  • Closed Accounts Posts: 296 ✭✭M@lice


    With the cooling demands of modern GPUs, I reckon motherboards in the future are going to allow for huge coolers, so the loss of a PCI slot will no longer be an issue. I didn't see it anywhere on the BTX layout tho.

    What are your thoughts? I think it would be a good idea.


  • Registered Users, Registered Users 2 Posts: 10,283 ✭✭✭✭BloodBath


    If the companies invested a little more R&D into the cooling, or even hired a third party to do it for them, there wouldn't be a problem.

    It's bad design more than anything else.


    BloodBath


  • Registered Users, Registered Users 2 Posts: 4,317 ✭✭✭CombatCow


    You would think Nvidia could use single slot cooling on the 6800 Ultra if they changed that fat alu heatsink to a thinner copper one, no???

    With pretty much everything integrated into the mobo these days, I'm surprised ppl still use up all their PCI slots :confused:

    Interesting times ahead now with Socket 939, DDR2, PCI Express, faster 64-bit CPUs, and now these 2 beasts. When will it end??


    CombatCow


  • Registered Users, Registered Users 2 Posts: 10,283 ✭✭✭✭BloodBath


    Hopefully never. The slump in the market is gone and they are pumping money back into development.


    I use up nearly all my PCI slots. Onboard sound doesn't cut it; mine has an Audigy 2 LS soundcard. A fan card. A control switch for the lights, a USB/FireWire extra port connection, and an extra network card.

    Not that half of that is needed of course, but it's handy.


    BloodBath


  • Closed Accounts Posts: 3,354 ✭✭✭secret_squirrel


    The interesting thing I saw in a review on The Register was that they were testing with a P4 3.2GHz and still reckoned both cards were CPU limited. That means they might scale well as CPUs get faster too!

    Can't help feeling that from a physical point of view the ATI is much better. Especially for the SFF boys out there like me.

    Aren't Nvidia underestimating the size of this market a bit?
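
The CPU-limited point can be made concrete with a toy model (all the millisecond figures below are made up for illustration): per-frame time is set by whichever of the CPU or GPU is slower, so a faster card changes nothing while the CPU is the bottleneck.

```python
# Toy model of a CPU-limited game: frame time is bounded by the
# slower of CPU work and GPU work per frame (illustrative numbers).
def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    return 1000 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# With the CPU at 12 ms/frame, upgrading the GPU from 10 ms to 7 ms
# does nothing -- the card is waiting on the CPU either way:
print(fps(12, 10))  # ~83 fps
print(fps(12, 7))   # ~83 fps
# A faster CPU (8 ms/frame) lets the better card pull ahead:
print(fps(8, 10))   # 100 fps
print(fps(8, 7))    # 125 fps
```

That is why both cards might "scale well as CPUs get faster": the headroom is already there, hidden behind the CPU.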


  • Moderators Posts: 5,617 ✭✭✭Azza


    Looks like ATI for me at the moment, but I am worried that's the roof performance-wise, and not overclockable. I don't think pixel shader matters much; Nvidia have apparently arsed it up anyway and need drivers to fix it shortly, but it's actually slower than 2.0 as it stands. Wait till I see the Ultra Extreme that Nvidia produces, with lower power requirements. Overall though, a lot of games and benchmarks are split but seem to be leaning marginally towards ATI. Even if the Ultra Extreme (or Extreme Ultra, or whatever it is) is slightly better, I think it's meant to be basically the same card just at a higher clock; good cooling on the standard model may bring it to that level anyway.

    Darn, it's just too close to call.


  • Registered Users, Registered Users 2 Posts: 2,005 ✭✭✭CivilServant


    At least you can trust ATI with their benchmarks, and you hear it straight from the horse's mouth. With Nvidia you really don't know what you're getting yourself in for: driver optimisations will fix this game and that. All we know for certain is that it performs well on the benchmarks we've seen, but how about the up-and-coming new games? Will there be more driver fixes, bugs and whatnot?

    ATI has the performance crown concerning anti-aliasing and anisotropic filtering. 1024x768 is last year's 800x600; performing above 1024 is where it's at now. 9800 XTs can play at this res with AA + AF, so what's the point in benching it?

    As for overclocking, I wouldn't worry about that now. ATI might do the locked-pipes feature again, like the 9500 non-Pro and 9800SE, so there's more value to be had there. The top overclockers are burning the 9800XT out at 600+ MHz, so once production gets into full swing, we should see some good overclocks.


  • Registered Users, Registered Users 2 Posts: 2,760 ✭✭✭Col_Loki


    I wouldn't be totally trusting of ATI either TBH; didn't they do similar to Nvidia a while back?


  • Closed Accounts Posts: 1,321 ✭✭✭neokenzo


    I don't think I'll be jumping too quickly on this. Will wait and see till both Nvidia and ATI have sussed out the glitches, if any.
    What's the big deal with the PCI slots anyway?

