
X800 review up at THG


Comments

  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭Slutmonkey57b


    Both long reviews so here's a summary:

    In Far Cry
    <GF6800U> *SLAP* "How'd ya like it bitch? Huh?"
    <R420> "That's some ugly **** you're bringing to the table. Bring it up to where the big boys play, eh?"
    <3d Forum poster> "Oi! GF6800U! U R CHEATOR!!"
    <GF6800U> "Huh? Me? No no no PS 1.1 here... don't look at me! DONT LOOK AT ME!!"

    All other tests:
    <R420> *SLAP* "How'd you like me now fat boy, eh?" *SLAP* "Thought you'd need 2 molex connectors, eh?" *SLAP* "Have to buy a new PSU, do they?" *SLAP*
    <GF6800U> *WHIMPER*


    And stuff.


  • Registered Users, Registered Users 2 Posts: 6,334 ✭✭✭OfflerCrocGod


    I never realised gfx cards were so abusive! :eek: I like the commentary :D. We'll see what happens once Nvidia start "optimising" their drivers.


  • Registered Users, Registered Users 2 Posts: 3,754 ✭✭✭Big Chief


    Originally posted by Slutmonkey57b
    Both long reviews so here's a summary:

    In Far Cry
    <GF6800U> *SLAP* "How'd ya like it bitch? Huh?"
    <R420> "That's some ugly **** you're bringing to the table. Bring it up to where the big boys play, eh?"
    <3d Forum poster> "Oi! GF6800U! U R CHEATOR!!"
    <GF6800U> "Huh? Me? No no no PS 1.1 here... don't look at me! DONT LOOK AT ME!!"

    All other tests:
    <R420> *SLAP* "How'd you like me now fat boy, eh?" *SLAP* "Thought you'd need 2 molex connectors, eh?" *SLAP* "Have to buy a new PSU, do they?" *SLAP*
    <GF6800U> *WHIMPER*


    And stuff.

    lol pld ;)

    EAGERLY awaiting this, i knew i was doing the right thing waiting for ati's response, gonna get this at end of month (i spoke to Europe's and the UK's PR from ati about 3 hours ago about getting hold of one of these for a review and he told me this.. that they will be on shelves 3/4 weeks from now..)

    CANT WAIT YEH BAAAAAAABY!:D


  • Registered Users, Registered Users 2 Posts: 2,761 ✭✭✭Col_Loki


    Nice review and mini-review :) !!

    Very interesting that the X800pro will be coming in at $399, and the X800xt for $100 more. I hope they don't put a saddle on us Europeans and hike up the price.


  • Registered Users, Registered Users 2 Posts: 6,762 ✭✭✭WizZard


    Big Chief, any idea of what prices they will be at when they hit our shelves? I think I will be treating myself and putting my life on hold for a couple of... weeks/months(?)


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    I'm still not sure which to go for. Ati made a big mistake not incorporating the shader 3.0 technology. Also, they upped the clocks at the last minute, so the cards are coming pre-overclocked, which means there is probably not a lot of headroom to go higher.


    I'm waiting for a direct comparison with overclocking to see which has the edge.


    BloodBath


  • Registered Users, Registered Users 2 Posts: 1,864 ✭✭✭uberpixie


    Sod pixel shader 3.0 this round. 'Twill be an age before any developers get around to using it. By the time shader 3.0 is a big deal you will be on a new card, probably 2 gens from now. Not much of a worry at the moment.

    I also doubt there is much headroom for o'clocking on either set of cards.

    Ati are using a refined version of the architecture they've had around for the last two gens; they have to be reaching its upper limit by now.

    Nvidia's are running very hot with a lot of juice through 'em already, and that's before you even get to overclock!

    To be honest both sets of cards perform fairly similarly in D3D games, unless you consider 5 or 6 frames a huge gap :-)

    It will be down to image quality at the end of the day and how much dicking around both companies do with their drivers.

    Ati only taking up one slot and drawing less power is a nice advantage, especially for the small form factor fans.

    At the end of the day either will put a smile on your face. Be interesting to see if prices drop on the current high end.


  • Closed Accounts Posts: 1,321 ✭✭✭neokenzo


    Originally posted by COL_LOKI
    Nice review and mini-review :) !!

    Very interesting that the X800pro will be coming in at $399, and the X800xt for $100 more. I hope they don't put a saddle on us Europeans and hike up the price.

    We'll probably get screwed over as usual and pay a lot more than that. Nevertheless, still can't wait for it to hit the shelves.


  • Registered Users, Registered Users 2 Posts: 4,317 ✭✭✭CombatCow


    I'll prolly just wait for the pci express version of the 6800 ultra super max extreme pro edition or whatever it's gonna be called and then wait to get a waterblock for it :D


    ** If you don't take into account the 6800 Ultra Extreme and XT Platinum Edition, the 6800 Ultra and X800 Pro are very close, nothing between them really, they're both fcuking monster cards. **

    CombatCow


  • Registered Users, Registered Users 2 Posts: 17,441 ✭✭✭✭jesus_thats_gre


    Originally posted by Combatcow

    ** If you don't take into account the 6800 Ultra Extreme and XT Platinum Edition, the 6800 Ultra and X800 Pro are very close, nothing between them really, they're both fcuking monster cards. **

    CombatCow

    Is there not 100 quid in it?


  • Registered Users, Registered Users 2 Posts: 6,560 ✭✭✭Woden


    interesting stuff alright, depending on the review you look at, a different one is coming out on top.

    that's enough to tell me there isn't much in it and that it won't really come down to performance when ya go to buy these cards.

    the shader 3.0 i don't think is a big thing, i don't think it will come in til next generation really. what i do think is important is that ati have this card running at less power than the XT. it's single slot and that makes it sff compatible as someone mentioned. i think the power thing may be important to a number of people.

    driver optimisations will be important but i still don't really trust nvidia on image quality and stuff. though the new card seems alright, some of the stuff on the 5950 looked crap.

    nvidia may have the lower high end market though, with the GT version possibly being somewhat better than the standard X800 pro. nvidia also have the 6850 or the extreme version that will run at 450mhz i think, and it's included in some of the reviews and is better than the X800 XT i think, but apparently will be in massively limited supply and most of 'em capable of 450mhz will go to gainward.

    with regards to overclocking, the ati card is made on a 130nm process which, although refined and giving them clock speeds over 500mhz, apparently tops out at 600mhz, so there ain't much room for overclocking alright.

    it will be interesting, however i don't think i will be upgrading until the next gen after this, the 9800pro will have to do til then

    data


  • Closed Accounts Posts: 4,943 ✭✭✭Mutant_Fruit


    What kind of processor would you need to take full advantage of those cards?


  • Closed Accounts Posts: 1,321 ✭✭✭neokenzo


    Here are more reviews from Anandtech and Driver Heaven.


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    The final versions of the nv6800 will only use one slot, they are changing the fan apparently, and nv seem to have sorted out their picture quality.

    The only real advantage Ati have is that they have slightly better AA and use less power.

    Pixel shader 3 is being used in games already. Far cry uses an early version of it. In fact most games coming out in the near future will incorporate it. Even half life 2, whose developer is an Ati partner, will incorporate it, although they will not advertise that it uses it. Not to mention all the partners of Nvidia, and they have a lot.

    I'm sure id will incorporate it into doom3. It is a big thing.


    The war is only starting. It will be a while yet before we know which card really is better.



    BloodBath


  • Registered Users, Registered Users 2 Posts: 4,317 ✭✭✭CombatCow


    True bloodbath, there's still a lot to be seen from both cards with driver updates and optimizations. I'd say either card would match up nicely with a 2.0GHz A64 cpu and a gig o' ram ;)

    Also a recent review in a mag said the 6800 ultra worked perfectly with other hi-spec components and an Antec 430W psu, and what uber nerd ''that's going to buy the 6800 card'' is not going to have at least a 450W psu ??

    Some more reviews :

    http://www.beyond3d.com/reviews/ati/r420_x800/

    http://www.bjorn3d.com/_preview.php?articleID=457&51391

    http://www.gamers-depot.com/hardware/video_cards/ati/x800/001.htm

    http://www.lostcircuits.com/video/ati_x800/

    http://www.neoseeker.com/Articles/Hardware/Reviews/r420preview/

    http://www.pcper.com/article.php?aid=38

    http://www.trustedreviews.com/article.aspx?art=418


    CombatCow


  • Registered Users, Registered Users 2 Posts: 11,989 ✭✭✭✭Giblet


    Doom III is a very big OpenGL game, and that's where Nvidia seem to have excelled.

    It's very close, it will be down to whichever is the cheapest and most overclockable, I'd say.


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭Slutmonkey57b


    Nvidia are going to have to convince a lot of people (like me) that they're not still lying slippery weasels after everything that happened with their last gen cards. What makes me suspicious is a) the whole PS 1.1 Far-Cry bit - Nvidia are again claiming that they support x y and z while in actual operation they aren't. The whole "look at how much better PS 3 is" demo where they were actually showing PS2 v PS1.1 doesn't inspire confidence either.

    What is also apparent is that they haven't learned from the 5800's intro - they introduced a new card which made a noise like a jumbo jet and took up two slots, thinking "Eh, they'll get used to it, **** em, we can't be bothered fixing it." This time they've introduced a card which takes up two slots (but likes to pretend it doesn't, as though it were a fat granny from Barnsley who's wearing her daughter's velour tank top) and requires its own electricity generator with the thinking "Eh, **** em, they can go out and spend 500 quid on the card and then another 150 on a new PSU. They'll get used to it and we can't be bothered fixing it."

    ATI have produced a card that gives the same performance for less overhead and less weaselly bullsh!t. And a lot of people may already have a fat PSU but if you're planning an upgrade the extra outlay is going to be a drawback for the 6800. Doom 3 will NOT be a PS3 game - Carmack has said it uses DX7 tech that the original GF could do (assuming it had the throughput).
    I suspect by the time x800 reaches its ceiling the r500 will be making its move.


  • Registered Users, Registered Users 2 Posts: 17,958 ✭✭✭✭RuggieBear


    Originally posted by Slutmonkey57b
    Nvidia are going to have to convince a lot of people (like me) that they're not still lying slippery weasels after everything that happened with their last gen cards. What makes me suspicious is a) the whole PS 1.1 Far-Cry bit - Nvidia are again claiming that they support x y and z while in actual operation they aren't. The whole "look at how much better PS 3 is" demo where they were actually showing PS2 v PS1.1 doesn't inspire confidence either.

    What is also apparent is that they haven't learned from the 5800's intro - they introduced a new card which made a noise like a jumbo jet and took up two slots, thinking "Eh, they'll get used to it, **** em, we can't be bothered fixing it." This time they've introduced a card which takes up two slots (but likes to pretend it doesn't, as though it were a fat granny from Barnsley who's wearing her daughter's velour tank top) and requires its own electricity generator with the thinking "Eh, **** em, they can go out and spend 500 quid on the card and then another 150 on a new PSU. They'll get used to it and we can't be bothered fixing it."

    ATI have produced a card that gives the same performance for less overhead and less weaselly bullsh!t. And a lot of people may already have a fat PSU but if you're planning an upgrade the extra outlay is going to be a drawback for the 6800. Doom 3 will NOT be a PS3 game - Carmack has said it uses DX7 tech that the original GF could do (assuming it had the throughput).
    I suspect by the time x800 reaches its ceiling the r500 will be making its move.


    Exactly! That's it down to a tee.....


    P


  • Registered Users, Registered Users 2 Posts: 759 ✭✭✭El_MUERkO


    After nVidia's muppetry with the 5800>5900>5950 it's hard for me not to be biased toward the X800 and scream 'nVidia sux balls!' at all the fanbois who've been going nuts over the 6800ultra lately.

    However that's what I'm going to do till the retail release of both cards with optimised drivers.

    I'm still thinking ATi are going to blow nVidia out of the water, but if the big 'n' can reduce the size, noise and power needs of the card while increasing its performance, at least we'll have a level playing field so the two giants of the market can duke it out with a nice price war for us the consumer to benefit from :D


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    They are changing the fan on the card to one that won't take up a pci slot.

    What's up with the companies in the first place when they can't even make a decent cooler for their own card? The Radeons for example. Asus were the only ones that made an effort and it still wasn't great. The 8 euro Arctic vga cooler keeps the 98xx series 30c cooler and is no louder.

    I'm not an Nvidia fan at all. Never bought their cards coz they were muck. The FX range anyway. They used to be the leaders though, they just ****ed up and underestimated ati.

    They do seem to have a really good product on their hands this time that gives big boosts over current technology. Don't say you weren't impressed by the early benchmarks. With some optimisation they seem to be on a fairly even level.


    Ati is looking good apart from the whole pixel shader 3 thing.


    BloodBath


  • Closed Accounts Posts: 4,943 ✭✭✭Mutant_Fruit


    ATI are ahead with this card in everything except OpenGL performance. In every OpenGL game, Nvidia are ahead, and ATI trail by up to 20fps in the higher resolutions.

    Still, that ATI would more than likely be my next upgrade, cos i don't want to have to go out and replace my brand new 350watt power supply, and i need the 3 PCI slots in my mobo; i can't have a new card taking over my AGP AND my PCI.

    Looks like ATI have aced it again.


  • Closed Accounts Posts: 296 ✭✭M@lice


    With the cooling demands of modern GPUs I reckon motherboards in the future are going to allow for huge coolers, so the loss of a pci slot will no longer be an issue. I didn't see it anywhere on the BTX layout tho.

    What are your thoughts? I think it would be a good idea


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    If the companies invested a little more R&D into the cooling, or even hired a third party to do it for them, there wouldn't be a problem.

    It's bad design more than anything else.


    BloodBath


  • Registered Users, Registered Users 2 Posts: 4,317 ✭✭✭CombatCow


    You would think nvidia could use single slot cooling on the 6800 ultra if they changed that fat alu heatsink to a thinner copper one, no ???

    With pretty much everything integrated into the mobo these days I'm surprised ppl still use up all their pci slots :confused:

    Interesting times ahead now with socket 939, ddr2, pci-express, faster 64 bit cpus, and now these 2 beasts. When will it end ??


    CombatCow


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    Hopefully never. The slump in the market is gone and they are pumping money back into development.


    I use up nearly all my pci slots. Onboard sound doesn't cut it. Mine has an Audigy 2 LS soundcard. A fan card. A control switch for the lights, a usb/firewire extra port connection and an extra network card.

    Not that half of that is needed of course but it's handy.


    BloodBath


  • Closed Accounts Posts: 3,357 ✭✭✭secret_squirrel


    The interesting thing I saw in a review on The Register was that they were testing with a P4 3.2GHz and still reckoned both cards were being CPU limited. That means they might scale well as CPUs get faster too!

    Can't help feeling that from a physical point of view the ATI is much better. Especially for the SFF boys out there like me.

    Aren't Nvidia underestimating the size of this market a bit?


  • Moderators Posts: 5,580 ✭✭✭Azza


    Looks like ATI for me at the moment, but am worried that's the roof performance wise...and not overclockable......I don't think pixel shader 3.0 matters much....nvidia have apparently arsed it up anyway, need drivers to fix it shortly, but it's actually less powerful than 2.0 as it stands. Wait till I see the Ultra Extreme that nvidia produces with less power requirements. Overall though a lot of games and benchmarks are split but seem to be leaning marginally towards ati...even if the Ultra Extreme or Extreme Ultra or whatever it is is slightly better, I think it's meant to be basically the same card just at a higher clock....good cooling on the standard model may bring it to that level anyway.....

    Darn, it's just too close to call


  • Registered Users, Registered Users 2 Posts: 2,005 ✭✭✭CivilServant


    At least you can trust Ati with their benchmarks, and you hear it straight from the horse's mouth. With nvidia you really don't know what you're getting yourself in for. Driver optimisations will fix this game and that. All we know for certain is that it performs well on the benchmarks we've seen, but how about the up-and-coming new games? Will there be more driver fixes, bugs and whatnot?

    Ati has the performance crown concerning anti-aliasing and anisotropic filtering. 1024x768 is last year's 800x600. Performing above 1024 is where it's at now. 9800XTs can play at this res with aa + af, so what's the point in benching this?

    As for overclocking, I wouldn't worry about that now. Ati might do the locked pipes feature again, like the 9500 non pro and 9800SE. So there's more value to be had there. The top overclockers are burning the 9800XT out at 600+ MHz, so once production gets up to full speed, we should see some good overclocks.


  • Registered Users, Registered Users 2 Posts: 2,761 ✭✭✭Col_Loki


    I wouldn't be totally trusting of ATI either TBH, didn't they do similar to Nvidia a while back?


  • Closed Accounts Posts: 1,321 ✭✭✭neokenzo


    I don't think I'll be jumping too quickly on this. Will wait and see till both nvidia and ati have sussed out the glitches, if any.
    What's the big deal on the pci slots anyway?


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    Exactly people. You shouldn't be so loyal to companies. That's how a lot of people got screwed over in the past.

    "Oh the lovely nvidia fx range is out, I love my ti440, I think i'll run out and buy it. **** ATI"

    Originally posted by CivilServant
    At least you can trust Ati with their benchmarks and you hear it straight from the horses mouth. With nvidia you really don't know what you're getting yourself in for. Driver optimisations will fix this game and that. All we know for certain is that it performs well on the benchmarks we've seen but how about the upandcoming new games. Will there be more driver fixes, bugs and whatnot.


    Nvidia aren't stupid enough to make that kind of mistake again. They have let ati become the market leaders with their fx **** up.

    That will not be a problem with the new card.

    Wait and see, this war is only getting started.

    BTW 600MHz core on a 9800xt is only possible with massive cooling.

    500 is easy to get with an 8 euro Arctic vga cooler, though going any higher has barely any effect on games. The card is limited by its memory, which no matter what you do will not go much higher than 800MHz.


    BloodBath.


  • Closed Accounts Posts: 4,943 ✭✭✭Mutant_Fruit


    The big deal with PCI slots is that my computer only has three, and i can't have a bigass graphics card wasting one on me!


  • Registered Users, Registered Users 2 Posts: 2,005 ✭✭✭CivilServant


    Originally posted by BloodBath
    Nvidia aren't stupid enough to make that kind of mistake again. They have let ati become the market leaders with their fx **** up.

    That will not be a problem with the new card.

    Wait and see, this war is only getting started.

    BTW 600MHz core on a 9800xt is only possible with massive cooling.

    500 is easy to get with an 8 euro Arctic vga cooler, though going any higher has barely any effect on games. The card is limited by its memory, which no matter what you do will not go much higher than 800MHz.

    Oh but they have, you see. This screenshot shows that nvidia have been claiming to use full ps2.0 on Far Cry. The way it's meant to be played ... yeah right! :)

    You can see the blockiness on the floor in the nvidia sample. This is caused by lower precision, ie. the 6800U is using a combination of PS1.1 and PS2.0 to boost benchmarks, whereas the Ati products are all using full PS2.0. How's that for honesty? In fact you can force PS2.0 on nvidia, but of course the benchmarks will suffer as a result. Why have nvidia done this?

    Because their latest graphics card is being whupped by Ati's offering. Now imagine the benchmark with forced PS2.0, not looking so good.

    @COL_LOKI

    Yeah Ati did something similar with the 8500 benchmarks. People frowned on them too, but they paid the price and developed a superior card. You don't have to cheat when you've got the better technology.
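    (Editor's aside, not from any poster: the "blockiness from lower precision" point above is just quantisation. PS1.1-era pixel shaders worked in low-precision fixed point, while PS2.0 mandates floating point; snapping a smooth gradient to a handful of levels merges neighbouring pixels into visible bands. A minimal Python sketch of the effect, with the level counts picked purely for illustration:)

```python
# Illustrative only: why lower shader precision shows up as "blockiness".
# Quantising a smooth 0..1 gradient to a few discrete steps collapses
# neighbouring samples to the same value, producing visible bands.

def quantize(value, levels):
    """Snap a 0..1 value to one of `levels` evenly spaced steps."""
    step = 1.0 / (levels - 1)
    return round(value / step) * step

# A smooth 0..1 gradient across 9 samples.
gradient = [i / 8 for i in range(9)]

low_precision  = [quantize(v, 4)   for v in gradient]   # crude, PS1.1-ish
high_precision = [quantize(v, 256) for v in gradient]   # fine, PS2.0-ish

# The low-precision version merges adjacent samples into identical "blocks".
print(len(set(low_precision)), "distinct values at low precision")    # 4
print(len(set(high_precision)), "distinct values at high precision")  # 9
```

    (The actual hardware paths were more involved than this, of course, but the banding mechanism is the same.)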


  • Registered Users, Registered Users 2 Posts: 2,761 ✭✭✭Col_Loki


    Yea I agree, when ATI are on top cheating isn't necessary. But how do we tell which card is on top performance wise? Benchmarks and reviews.......

    Nvidia have definitely messed up, but ATI did the same...... they're both trying to sell us their product, to be loyal is very silly as they have none for us. Definitely ATI look the better choice, the more trustworthy; would like to see some overclocking reviews and see what some reviewers find (if there is dodgy play involved :)).


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    Why did they compare 1600x1200 ati pics to 1280x1024 nvidia pics? Surely this isn't a fair comparison.

    Also in the overclocking test they said the ati was better, but they never showed any nvidia overclocking results.

    Me thinks they are a little biased towards nvidia. I will be going on what anandtech says.


    The x800pro looks like the card to get for the best overclocking and price it seems. Hopefully an Arctic vga cooler will be released for it allowing higher overclocks, but that's a very impressive memory overclock. I'd say the heat is all that is limiting the core from going higher. Design a better fan ati, jesus it's not that hard.

    The Arctic's on the 98xx series drop the temps by 30c, are no louder and only cost 8.50.

    I wouldn't rule out the nvidia yet though. It outperforms the ati in some of those benchmarks and it renders some scenes better, like this one:
    http://www.hardocp.com/image.html?image=MTA4MzU2NDE4OTg4OEFkazcwdGVfNV80X2wuanBn

    Notice the shadows are only on the nvidia card and none of the ati's.
    The shading performance of the nvidia card is second to none.

    BloodBath


  • Registered Users, Registered Users 2 Posts: 2,005 ✭✭✭CivilServant


    The 6800U is pretty similar in most of the benchmarks except Far Cry and any time you use more than 4xAA + 8xAF.

    That's pretty weird that Ati cards weren't rendering that Splinter Cell scene properly. Looks like an actual bug. The reviewers say the Ati took a similar performance hit when the shadows were turned on, which suggests the card was doing the work but just wasn't rendering it properly to the screen. I'll forgive Ati for that one.

    Still it's not as bad as nvidia showing off PS3.0 by comparing Far Cry screenshots of scenes that were actually rendered with PS1.1 and calling it PS2.0, when in fact it was more like PS1.1 vs PS2.0.


  • Closed Accounts Posts: 3,357 ✭✭✭secret_squirrel


    Originally posted by CivilServant

    Ati might do the locked pipes feature again, like the 9500 non pro and 9800SE.

    The Reg review says ATI are burning out the unused pipes this time after grading the chip as 8/12/16 pipes. So that means no softmodding for us if it's true :(


  • Registered Users, Registered Users 2 Posts: 4,317 ✭✭✭CombatCow


    Do you think the layout of the x800 board is the exact same as the 9800 ??? It looks really similar...so could you use the current radeon waterblock (http://www.hitide.ie/catalog/product_info.php/cPath/21_22_39/products_id/200) on the x800? Interesting :ninja:


    CombatCow


  • Registered Users, Registered Users 2 Posts: 5,578 ✭✭✭Slutmonkey57b


    If there's any doubt as to Nvidia's honesty, look up Gabe Newell's *lashing* of them at a developers conference in September. He basically devoted half a talk to how Nvidia were taking the piss, lying about their hardware, cutting image quality or cutting out imaging effects totally in order to make the framerate, rigging drivers to run with certain games, writing drivers for use in hardware/developer reviews that would never be released to the end user, and not being up to the job as a whole. When a major developer (who relies on a good relationship with a hardware vendor to be able to sell products to that vendor's users) is prepared to go all out like that you *know* something's wrong. Even Carmack said that he was having to code an entirely different path for Nvidia cards because they couldn't do what they were supposed to do (he's since dropped this having managed to get most of it fixed.)

    The silence from Nvidia over the whole PS3 lies isn't heartening. I smell a repeat.


  • Registered Users, Registered Users 2 Posts: 4,317 ✭✭✭CombatCow


    :eek: Sweet jesus check out the price of the gfx cards on komplett.

    Gainward GeForce 6800 Ultra 256MB DDR3 AGP, "CoolFX Ultra/2600 GS" 450Plus mhz,
    Komplett #: 300416 / 471846200-5995
    Availability: Backordered (lead time: 5 days) €919.60


    Gainward GeForce 6800 Ultra 256MB DDR3 AGP, "Ultra/2600 GS" 430Plus mhz, Retail
    Komplett #: 300417 / 471846200-6015
    Availability: Backordered (lead time: 5 days) €617.10


    PowerColor Radeon X800PRO 256MB DDR3 AGP, ATI X800PRO, Backbag Limited Ed, Hi
    Komplett #: 300425 / R42-PD3
    Availability: Backordered (lead time: 10 days) €471.90


    Sapphire Radeon X800PRO Atlantis 256MB D AGP,ATI X800PRO,DVI-I,TV-Out, Full-Retai
    Komplett #: 300413 / 21034-00-40
    Availability: 144 pcs 2004-05-26 (unconfirmed) €441.65


    http://www.komplett.ie/k/kl.asp?bn=10413&sortBy=p&minprice=&maxPrice=&sortOrder=d



    Fookin 920 blips for the 6800 Ultra, are they rippin the pi$$ :eek:
    ''I think the gainward CoolFX is a watercooling block'' but still !!


    CombatCow


  • Registered Users, Registered Users 2 Posts: 6,892 ✭✭✭bizmark


    Gainward GeForce 6800 Ultra 256MB DDR3 AGP, "CoolFX Ultra/2600 GS" 450Plus mhz,
    Komplett #: 300416 / 471846200-5995
    Availability: Backordered (lead time: 5 days) €919.60

    What the fook, 919 euro!! jesus christ you could build a perfectly good pc for that money

    That's bang out of order, christ almighty, 919 euro


  • Closed Accounts Posts: 4,943 ✭✭✭Mutant_Fruit


    I built my brand spanking new PC, including new monitor and box, for 2/3 of that price! Holy fook!


  • Registered Users, Registered Users 2 Posts: 6,984 ✭✭✭Venom


    "Ati only taking up one slot and drawing less power is a nice advantage especially for the small form factor fans."

    Agreed. While I have plenty of room slot wise, using 2 power connectors would be a right pain. Have to say the X800XT from Sapphire will be my next card. Shader 3.0 is not gonna be used in games for a long time. Nvidia have been caught cheating too many times in benchmarks for me ever to trust their cards again, not to mention the Fook off insane price tag lol.

    One question I haven't seen answered is what's the noise level like on these cards, anyone know?


  • Registered Users, Registered Users 2 Posts: 759 ✭✭✭El_MUERkO


    The X800 is supposed to retail for $399, so I won't be paying €440+ for mine; tired of getting ripped off on this ****, I'll get a relative in America to Fed-Ex me one.

    As to the nvidia, as I understand it from posts here and elsewhere the retail version will only take up 1 slot and 1 power connector.


  • Registered Users, Registered Users 2 Posts: 4,317 ✭✭✭CombatCow


    Originally posted by Venom
    One question I haven't seen answered is what's the noise level like on these cards, anyone know?

    LOL, yeah that (€920) gainward one is silent, as it uses a waterblock instead of a HSF :rolleyes:


    CombatCow


  • Closed Accounts Posts: 1,321 ✭✭✭neokenzo


    Originally posted by Combatcow
    Gainward GeForce 6800 Ultra 256MB DDR3 AGP, "CoolFX Ultra/2600 GS" 450Plus mhz,
    Komplett #: 300416 / 471846200-5995
    Availability: Backordered (lead time: 5 days) €919.60

    Holy crap. Just imagine: if someone buys that card, it could cost about the same as or more than their whole pc put together. I expected it to be expensive, but never thought it was going to be that expensive. I wonder what the X800XT is going to cost now.


  • Closed Accounts Posts: 1,321 ✭✭✭neokenzo


    Originally posted by Combatcow
    LOL, yeah that (€920) gainward one is silent, as it uses a waterblock instead of a HSF :rolleyes:


    CombatCow

    Probably have wings and can fly too! :p


  • Registered Users, Registered Users 2 Posts: 4,317 ✭✭✭CombatCow


    It would want to for that price ;)


    CombatCow


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    "Ati only taking up one slot and drawing less power is a nice advantage especially for the small form factor fans."

    Agreed. While I have plenty of room slot wise, using 2 power connectors would be a right pain. Have to say the X800XT from Sapphire will be my next card. Shader 3.0 is not gonna be used in games for a long time.

    Jesus lads, do ye read the posts? I have said already that Nvidia have changed the fan to one that doesn't take up a pci slot. Look it up. It's in a few mags as well. Pc extreme shows the new one. That was only on the prototype for testing.


    Also I guarantee shader 3.0 will be used in a lot of games. Just look at how many companies and games Nvidia sponsor. You can bet that they will be using shader 3.0 in their new titles.

    I'm not saying the card is great, but some of the points ye are making just are not valid. Do some research, will ye.


    BloodBath


  • Closed Accounts Posts: 1,321 ✭✭✭neokenzo


    Originally posted by BloodBath
    Also I guarantee shader 3.0 will be used in a lot of games. Just look at how many companies and games Nvidia sponsor. You can bet that they will be using shader 3.0 in their new titles.

    At €920 a pop, Nvidia can keep their shader 3.0.

