
9800GX2 or GTX280?

  • 26-06-2008 12:52pm
    #1
    Registered Users, Registered Users 2 Posts: 8,067 ✭✭✭


    Going to be getting a new graphics card and wondering which would be the best buy. I'm not going to have this card installed on an SLI mobo, so that's out the window. It will also need to run games at 1920x1200.

    I was set on the GTX280 until I read this review, which says the 9800GX2 is pretty much better at everything.

    http://www.overclock3d.net/reviews.php?/gpu_displays/nvidia_gtx_280_performance_revealed_-_msi_n280gtx/1

    Without SLI support, is there any point in getting the GTX280 over the GX2? Considering the GX2 is actually cheaper :confused:



Comments

  • Registered Users, Registered Users 2 Posts: 2,924 ✭✭✭Nforce


    9800GX2;)


  • Registered Users, Registered Users 2 Posts: 8,067 ✭✭✭L31mr0d


    Nforce wrote: »
    9800GX2;)

    But I'm torn... I mean, the GX2 will perform great in games with dual-card support, but what about games that don't support it? Would the GTX280 be better in that case?

    I don't understand, though, why Nvidia are charging more for a card less powerful than the GX2.


  • Registered Users, Registered Users 2 Posts: 1,525 ✭✭✭DanGerMus


    You should read Flaccus's post in the other GTX280 thread (sorry, it's in the 4850 thread). It basically says that at very high resolutions the GTX pulls away from the GX2. I don't think any reviews have said that the GX2 is 'way' better. I'd also 'hope' that the new GTX will get better with drivers. 'hope'. Do you need to change right away? If you can wait a little for new drivers, it'd be worth it just to know for sure.

    Man, if they had just stuck some GDDR5 on the GTX there'd be no question. Oh well.

    Post number ten here.
    http://www.boards.ie/vbulletin/showthread.php?p=56323132#post56323132


  • Closed Accounts Posts: 140 ✭✭great


    L31mr0d wrote: »
    don't really want crossfire or SLI.

    It performs well without Crossfire; in some games it beats the GTX280.


  • Registered Users, Registered Users 2 Posts: 8,067 ✭✭✭L31mr0d


    Well, no, I don't need it right away as I've got a spare PC and most of my gaming is on the PS3 for the moment *cough*MGS4 and GTA4*cough*

    But I'd like to get it up and running again, so next month I'll be buying a new motherboard for it and wanted to throw a graphics card onto the order. The GTX280 just looks too expensive for the performance.


  • Registered Users, Registered Users 2 Posts: 195 ✭✭WEST


    great wrote: »
    It performs well without Crossfire; in some games it beats the GTX280.

    Too true. Going by the thread title, I thought this thread was started before the Radeon HD 4870 release. Once the ATI card was available I never thought anyone would want to buy the GTX280, especially going by the review below and others:

    http://techreport.com/articles.x/14990/1

    In a lot of cases the 4870 beats the GTX280 at your resolution, and when you consider the price difference there is no contest. However, if you need to buy an Nvidia card I would go with the 9800GX2; it's still a good deal. And if you want the best performance and you are patient, I would wait for the 4870 X2.

    God, I hate to see people spend that much on a GTX280; you could get a whole new subsystem instead. Just think of the starving children in Africa ;)


  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    I just ordered a HD4870 today from hdv.de for €235, available cheaper elsewhere if ya look. Best bang-for-buck GPU ATM; I'll be buying another in a few weeks when the price drops.

    Edit: I'd just like to add that if you can wait, the HD4870X2 will be out soon and is looking like it's going to be a beast.


  • Registered Users, Registered Users 2 Posts: 2,528 ✭✭✭TomCo


    WEST wrote: »
    Just think of the starving children in Africa ;)

    Send them your old graphics cards.


  • Registered Users, Registered Users 2 Posts: 1,864 ✭✭✭uberpixie


    L31mr0d wrote: »
    Going to be getting a new graphics card and wondering which would be the best buy. I'm not going to have this card installed on an SLI mobo, so that's out the window. It will also need to run games at 1920x1200.

    I was set on the GTX280 until I read this review which says 9800GX2 is pretty much better at everything.

    http://www.overclock3d.net/reviews.php?/gpu_displays/nvidia_gtx_280_performance_revealed_-_msi_n280gtx/1

    Without SLI support is there any point getting the GTX280 over the GX2? Considering the GX2 is actually cheaper :confused:

    Bear in mind Flaccus posted this a while back...
    Flaccus wrote: »

    Having said all that, benches are not everything, and I read with interest what this person had to say about the GTX 280's superior bandwidth and extra memory against a GX2, which he also owns:
    http://forums.overclockers.co.uk/sho...php?t=17888181

    Although lots of sites are showing the GTX 280 being beaten out by the GX2, this person found:

    > Certain games now have quicker loading times, especially Unreal Engine 3 based games (like Mass Effect, about 50% faster loading)

    > More complex games with high-res textures have less random pausing from things like hard drive paging (Oblivion, Mass Effect, Crysis and so on...)

    Personally I would go with the 4870 or a cheap GTX260 (€320).


  • Closed Accounts Posts: 29,930 ✭✭✭✭TerrorFirmer


    The GTX definitely pulls away at very high resolutions with AA enabled, but only really at incredibly high ones, beyond even 1920x1200, which is itself a minority resolution. It's not appreciable by the vast majority of people planning on even a high-end card. In some reviews, at 1680x1050 with AA the GX2 and GTX280 perform neck and neck, yet push that up to 2560x1600 with AA and the GTX280 can, in some games, absolutely dominate. I remember in one of the early magazine reviews, in one test, the GTX280 was 14-15 times faster than the GX2 at 2560x1600 with full AA. But how many people game at that resolution...

    Though it would be far better than a GX2, seeing as it's a single card and not relying on multi-GPU support in a given game...

    But it really seems at the moment that even as a single card, the HD4870 is the best option available by a long shot.
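To put those resolutions in perspective, here is a quick sketch of the raw pixel counts involved; nothing here is assumed beyond the resolutions named in the post above.

```python
# Raw pixel counts for the resolutions discussed above. More pixels per
# frame means more shading work and framebuffer traffic, which is where
# memory bandwidth and VRAM limits start to separate the cards.

resolutions = {
    "1680x1050": (1680, 1050),
    "1920x1200": (1920, 1200),
    "2560x1600": (2560, 1600),
}

base = 1680 * 1050
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the 1680x1050 load)")
```

2560x1600 pushes roughly 2.3x the pixels of 1680x1050 per frame, which is why two cards can be neck and neck at the lower resolution and far apart at the higher one.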


  • Closed Accounts Posts: 495 ✭✭Tony Broke


    Either get HD4870 or wait for HD4870x2

    All the reviews I have seen show the €240 HD4870 being 15% slower than the GTX 280. Can you live with that?



    http://www.xtremesystems.org/forums/showthread.php?t=192421

    Ordered one myself yesterday, plus an E8400 and 2GB of PC6400, for €100 less than a GTX 280

    http://www.pixmania.com/ie/uk/1344930/art/sapphire-technology/radeon-hd-4870-512-mo-gdd.html
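A back-of-envelope value comparison using the figures in the post above: the HD4870 at ~€240 and ~15% slower than a GTX 280. The €450 GTX 280 price below is an assumption for illustration, not a figure from the thread.

```python
# Rough price/performance check. HD 4870: ~240 EUR, ~15% slower than a
# GTX 280 (figures from the post). GTX 280 price of 450 EUR is assumed.

def perf_per_euro(relative_perf: float, price_eur: float) -> float:
    """Relative performance units per euro spent."""
    return relative_perf / price_eur

gtx280 = perf_per_euro(1.00, 450.0)   # baseline card, assumed price
hd4870 = perf_per_euro(0.85, 240.0)   # 15% slower, much cheaper

print(f"GTX 280: {gtx280:.4f} perf/EUR")
print(f"HD 4870: {hd4870:.4f} perf/EUR ({hd4870 / gtx280:.1f}x the value)")
```

Under those assumptions the 4870 delivers roughly 1.6x the performance per euro, which is the "no contest" point people keep making.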


  • Registered Users, Registered Users 2 Posts: 8,067 ✭✭✭L31mr0d


    what's the release date of the HD4870x2?


  • Closed Accounts Posts: 495 ✭✭Tony Broke


    L31mr0d wrote: »
    what's the release date of the HD4870x2?

    Late Aug/ early Sept


  • Registered Users, Registered Users 2 Posts: 3,357 ✭✭✭papu


    Tony Broke wrote: »
    Late Aug/ early Sept

    Yeah, they may have a way of sharing memory across the two cores with a new thing called the CrossFire Sideport or something; it will be interesting. Anyway, the 4870 really is the way to go. With less heat, noise and power consumption than the 280, it's a safe bet, and with the money saved you can buy a nice HR-03, which really seems to do the trick; the HR-03 from the 3870s will fit it like a sock.

    PowerPlay isn't working just yet on the new cards, and that's why they seem to run hot and use more power: they don't auto-clock down when not in 3D. They're working on new drivers/BIOSes to correct this.


  • Registered Users, Registered Users 2 Posts: 1,525 ✭✭✭DanGerMus


    l31m, what's your current setup? I just went for the 4870, but I really need a new card as I have an X1950 Pro, so I'm looking forward to seeing some kick-ass improvements. If you've got an 8800GTX or similar, I'd really wait for the 4870X2.


  • Registered Users, Registered Users 2 Posts: 3,971 ✭✭✭Flaccus


    I think the 4870 is a good bet too. Maybe wait a little while for the 1GB one?

    I have spoken to people that have them, and even at idle the fan is a bit whiny. The GTX 280 wins out in this regard and has some superb power management: very quiet at idle. Both cards in my opinion are noisy when gaming, but at least with the 4870 that can be quickly sorted with an HR-03 GT. That may then limit your ability to go CrossFire, though, due to the sheer size of some of these aftermarket coolers.

    Another minus on the GTX 280 is that it is suffering from downclock issues in some games, thanks to the new power-saving modes they added; this is believed to be caused by the drivers. The card can clock down to 300MHz during periods in a game where there is not much going on, only for it to stutter when trying to clock back up. Only a few games have this issue, and Nvidia/EVGA are working on a fix.


  • Registered Users, Registered Users 2 Posts: 3,357 ✭✭✭papu


    Flaccus wrote: »
    I think the 4870 is a good bet too. Maybe wait a little while for the 1GB one?

    I have spoken to people that have them, and even at idle the fan is a bit whiny. The GTX 280 wins out in this regard and has some superb power management: very quiet at idle. Both cards in my opinion are noisy when gaming, but at least with the 4870 that can be quickly sorted with an HR-03 GT. That may then limit your ability to go CrossFire, though, due to the sheer size of some of these aftermarket coolers.

    Another minus on the GTX 280 is that it is suffering from downclock issues in some games, thanks to the new power-saving modes they added; this is believed to be caused by the drivers. The card can clock down to 300MHz during periods in a game where there is not much going on, only for it to stutter when trying to clock back up. Only a few games have this issue, and Nvidia/EVGA are working on a fix.

    The 1GB version is coming early in July; one's been spotted on pre-order for the 8th of July. You can wrap the HR-03 around one of the cards, so CrossFire isn't a problem.


  • Registered Users, Registered Users 2 Posts: 8,067 ✭✭✭L31mr0d


    I don't know if I can wait till the end of August though; at the latest I want to have my main system up and running by the end of July.

    So for a system that's going to be watercooled and pretty much left alone for the next ~6 months at least, should I go for the 4870, GX2 or 280?

    Also, which one overclocks the best? Whichever card I choose is going to have a PE120.2 rad all to itself.


  • Registered Users, Registered Users 2 Posts: 3,971 ✭✭✭Flaccus


    I just bought one of each, so I'll let you know next week :D


  • Registered Users, Registered Users 2 Posts: 8,067 ✭✭✭L31mr0d


    Flaccus wrote: »
    I just bought 1 of each so will let you know next week :D

    why did you buy a 4870, GX2 and a 280?


  • Registered Users, Registered Users 2 Posts: 3,971 ✭✭✭Flaccus


    4870 and GTX.

    Already have GX2s, and anyone on the EVGA forums who has had both the GX2 and a GTX has reported that the GTX feels much smoother in gaming despite benches showing the GX2 ahead in some areas. In fact, if you look at reviews like Anandtech's, they show 2x 8800GT cards in SLI (current setup) matching and even beating a GX2 and a GTX. But most of these sites use canned benches and don't talk about microstuttering in SLI, how the framerate varies during the game, or what happens when a card with GDDR3 memory and a 256-bit bus/512MB of RAM reaches its limit at 1920x1200 with lots of AA.

    HardOCP, on the other hand, use real-world benches and demonstrate this perfectly. Their benches monitor in-game frame rate fluctuations to get an idea of what is playable and what is not, and they have noticed that the GTX even outperforms 2x 9800GTXs in SLI at high res with AA.

    http://enthusiast.hardocp.com/article.html?art=MTUxOCw1LCxoZW50aHVzaWFzdA==

    The GTX therefore leads the field as the fastest single card. It also consumes less power than the 8800 GTX at idle (over 30 watts less), and lies in between the 8800GTX and GX2/3870X2 for power consumption under load. However, there remain two issues. Some people have reported their 280s climbing to 105 degrees within seconds and the core clocks backing down to 300MHz, or in some cases artifacting and even hanging; a minority, though. The more common issue is that the card can sometimes throttle down to 2D speeds in games (similar to what happened with ATI PowerPlay). But again it depends on the game, and not everyone is seeing this issue. It's clearly tied to the new power management in this card, whose drivers can dynamically change clock and voltage levels to minimise power draw (hence the low power usage at idle). But this is a known problem that Nvidia/EVGA said they are working on, and already people have reported success with the latest WHQL driver.

    Power consumption on GTX 280
    http://www.legitreviews.com/article/726/20/

    GTX Downclocking and/or Throttling
    http://www.evga.com/forums/tm.asp?m=421769


    Having said that, there is also a consensus that the new ATI card, through using GDDR5 and more efficient texture compression, can make up for having only 512MB of RAM and a 256-bit bus; canned benches have shown it outperforming the GTX 260 and coming within 15% of the GTX 280. HardOCP have backed this up as well, with the 4870 offering higher playable framerates than the GTX 260, though still behind the GTX 280. Also, the ATI is very loud under load, and it's like the fan does nothing until the GPU gets very, very hot. It doesn't seem to have the intelligent power-save modes Nvidia has. And according to users on the OcUK forums, the fan even at idle is irritating, requiring an aftermarket cooler. Still a hell of a lot cheaper than the GTX 280, and if it offers 85% of the performance it could very well be the card to have.

    http://enthusiast.hardocp.com/article.html?art=MTUyNCw1LCxoZW50aHVzaWFzdA==

    So the question for me is which card do I keep, and which card do I return. That is why I bought both.
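Flaccus's point about canned benches versus HardOCP-style monitoring can be sketched numerically: two runs with the same average FPS can feel very different if one alternates between fast and slow frames (the SLI microstutter problem). The frame times below are invented for illustration, not measurements.

```python
# Why average FPS can hide microstutter: a steady single card and an
# alternating dual-card setup with identical mean frame times.

def fps_stats(frame_times_ms):
    """Return (average FPS, worst-case instantaneous FPS)."""
    avg = sum(frame_times_ms) / len(frame_times_ms)
    worst = max(frame_times_ms)
    return 1000 / avg, 1000 / worst

single_card = [20.0] * 8        # steady 20 ms per frame
dual_card = [12.0, 28.0] * 4    # same 20 ms average, but alternating

for name, times in [("single card", single_card), ("dual card", dual_card)]:
    avg_fps, worst_fps = fps_stats(times)
    print(f"{name}: avg {avg_fps:.0f} fps, worst frame {worst_fps:.1f} fps")
```

Both report an average of 50 fps, but the alternating run dips to about 36 fps on every other frame, which is exactly the fluctuation an averaged canned bench never shows.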


  • Registered Users, Registered Users 2 Posts: 1,525 ✭✭✭DanGerMus


    i think you should keep the gtx and sell the 4870 at a considerable discount to....eh... someone....:cool:


  • Registered Users, Registered Users 2 Posts: 3,971 ✭✭✭Flaccus


    Days of ripping myself off are over..from today that is :D


  • Registered Users, Registered Users 2 Posts: 8,067 ✭✭✭L31mr0d


    will you be overclocking these cards though or leaving them at stock?


  • Registered Users, Registered Users 2 Posts: 3,971 ✭✭✭Flaccus


    http://www.adverts.ie/showproduct.php?product=61084

    :D

    They all have good overclocking potential so far. The EVGA pre-overclocked ones seem to do a bit better, as they are cherry-picked cores.


  • Closed Accounts Posts: 5,111 ✭✭✭MooseJam


    Why are ATI releasing the HD4870X2? Weren't they supposed to have abandoned the high end?


  • Registered Users, Registered Users 2 Posts: 2,044 ✭✭✭Sqaull20


    Flaccus wrote: »
    4870 and GTX.

    Already have GX2s, and anyone on the EVGA forums who has had both the GX2 and a GTX has reported that the GTX feels much smoother in gaming despite benches showing the GX2 ahead in some areas. In fact, if you look at reviews like Anandtech's, they show 2x 8800GT cards in SLI (current setup) matching and even beating a GX2 and a GTX. But most of these sites use canned benches and don't talk about microstuttering in SLI, how the framerate varies during the game, or what happens when a card with GDDR3 memory and a 256-bit bus/512MB of RAM reaches its limit at 1920x1200 with lots of AA.

    So the question for me is which card do I keep, and which card do I return. That is why I bought both.

    From my own experience I found the GX2 to be a very disappointing card. Reviews showed (and still show) it to be a much better card than the 8800GTX/Ultra, but I found the opposite to be true. Only two games I play needed that power (Crysis, DiRT) and both played much smoother on my 8800GTX (which I sold, and then went on to trade the GX2 for 2x 8800GTX).


    I like the look of the GTX260/HD4870, but they're not much of a step up over a single 8800GTX.

    My 8800GTX can be overclocked well past Ultra frequencies, while the HD4870 apparently can't be overclocked without dying. I may get a GTX 260 when they drop near the 4870's price and overclock it to GTX 280 frequencies.

    Why did you sell your GTX 280 without testing it, Flaccus? Anyone that has used one says they are much better than the bad rep suggests.

    A GTX260 for €267 should be your next purchase imo. I'm going to get one when they drop nearer €200 and overclock the **** out of it.

    http://www3.hardwareversand.de/3/articledetail.jsp?aid=22296&agid=554


  • Registered Users, Registered Users 2 Posts: 1,525 ✭✭✭DanGerMus


    Moose, I think what they've done is abandon the high-end single-core cards in favour of developing cheaper mid-range cards and then just slapping two together for the high end. Saves on development, I suppose.

    Squall, with the GTX260 at that price you've made me think twice about my 4870 purchase. It hasn't been shipped yet, though. With the 4870 apparently being such a bad clocker, how far past it do you think the 260 will clock?


  • Registered Users, Registered Users 2 Posts: 3,357 ✭✭✭papu


    DanGerMus wrote: »
    Moose, I think what they've done is abandon the high-end single-core cards in favour of developing cheaper mid-range cards and then just slapping two together for the high end. Saves on development, I suppose.

    Squall, with the GTX260 at that price you've made me think twice about my 4870 purchase. It hasn't been shipped yet, though. With the 4870 apparently being such a bad clocker, how far past it do you think the 260 will clock?

    The 4870 has no proper clocking tool right now because of the GDDR5; give it a few weeks, CCC will only take it so far. There's an overclocked edition coming soon at 800/1200 which will be taking on the 280, I believe. ATI cards usually have a lot of headroom left for clocking :D


  • Closed Accounts Posts: 852 ✭✭✭blackgold>>


    Squall, we all know you're a die-hard Nvidia fan, but why would you buy a card for €267 when a €140 card can do exactly the same thing?

    And as far as overclocking goes, why the hell do you buy a card that's more expensive, overclock it 50-100MHz and talk about it like you're actually making a difference to your gameplay? Overclocking is pointless and totally blown out of proportion, to the point of being ridiculous.


  • Registered Users, Registered Users 2 Posts: 2,044 ✭✭✭Sqaull20


    Squall, we all know you're a die-hard Nvidia fan, but why would you buy a card for €267 when a €140 card can do exactly the same thing?

    And as far as overclocking goes, why the hell do you buy a card that's more expensive, overclock it 50-100MHz and talk about it like you're actually making a difference to your gameplay? Overclocking is pointless and totally blown out of proportion, to the point of being ridiculous.

    Yeah, you probably wouldn't even notice the difference with the overclock, but it's nice to say you got a GTX 280 for €267, minus 132MB of memory :D

    The HD4850 is a fine card, but it's not as fast as a GTX260, at least in the games I play anyway.


  • Closed Accounts Posts: 852 ✭✭✭blackgold>>


    But you don't have a 280 for €267; you have a 260 for €267, when a 4850 is on par with it, better in some games and vice versa.
    Your point of view is distorted by your support for one company over another. I fail to see your point about bragging that you overclocked anything; would you brag about overclocking a 1.7GHz Core Duo to 1.8?


  • Closed Accounts Posts: 5,111 ✭✭✭MooseJam


    Why do graphics cores run at ~600MHz while CPUs run at GHz speeds? Why are no GPUs clocked at a GHz?


  • Registered Users, Registered Users 2 Posts: 3,357 ✭✭✭papu


    Whoooole different architecture. CPUs have two or four cores now, but in a GPU each stream processor is basically a small CPU.
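A very rough way to see papu's point: what matters is aggregate throughput, roughly cores times clock, not the clock alone. The GTX 280's 240 stream processors run at a 1296MHz shader clock; the quad-core 3GHz CPU figure below is a generic assumption, and the model deliberately ignores IPC, SIMD width and everything else.

```python
# Crude "aggregate clock" comparison: many slow cores vs few fast ones.
# This ignores per-core work done per cycle, so treat it as illustration.

def aggregate_ghz(cores: int, clock_ghz: float) -> float:
    return cores * clock_ghz

cpu = aggregate_ghz(4, 3.0)        # assumed quad-core CPU at 3 GHz
gpu = aggregate_ghz(240, 1.296)    # GTX 280: 240 SPs at 1296 MHz shader clock

print(f"CPU: {cpu:.0f} core-GHz, GPU: {gpu:.0f} core-GHz "
      f"(~{gpu / cpu:.0f}x the aggregate throughput)")
```

So even though each GPU core is clocked far below a CPU core, the sheer number of them is what wins on parallel workloads like shading.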


  • Closed Accounts Posts: 852 ✭✭✭blackgold>>


    They're a totally different kettle of fish, as papu said. I would read up on oscillator crystals, clock rates, buses and generally how CPUs work; your question can't be answered sufficiently by us farts.
    This is a good read if you're really interested:
    http://en.wikipedia.org/wiki/Clock_rate


  • Registered Users, Registered Users 2 Posts: 3,357 ✭✭✭papu


    1337mhz.


  • Registered Users, Registered Users 2 Posts: 195 ✭✭WEST


    Squall, we all know you're a die-hard Nvidia fan, but why would you buy a card for €267 when a €140 card can do exactly the same thing?

    And as far as overclocking goes, why the hell do you buy a card that's more expensive, overclock it 50-100MHz and talk about it like you're actually making a difference to your gameplay? Overclocking is pointless and totally blown out of proportion, to the point of being ridiculous.

    I can never understand the brand loyalty myself. Why would someone spend over €100 more for a card that does not provide better performance? It's not like a car, where everyone can see it; a graphics card is hidden away where no one can see it.

    I can see the marketing departments rubbing their hands in delight when they see the fanboys (aka nutjobs) reaching for their credit cards.


  • Closed Accounts Posts: 852 ✭✭✭blackgold>>


    I think this picture encompasses it quite nicely.
    [image: fanboy-anatomy.jpg]


  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    LMFAO :D


  • Registered Users, Registered Users 2 Posts: 2,850 ✭✭✭Fnz


    WEST wrote: »
    I can never understand the brand loyalty myself. Why someone would spend over €100 more for a card that does not provide better performance?

    I'm not too up to date on all things GPU-related but I have been told that Nvidia are more 'prompt' when it comes to driver updates. Things like that inspire loyalty.


  • Registered Users, Registered Users 2 Posts: 3,357 ✭✭✭papu


    Fnz wrote: »
    I'm not too up to date on all things GPU-related but I have been told that Nvidia are more 'prompt' when it comes to driver updates. Things like that inspire loyalty.

    They release A LOT of beta drivers, but more often than not they cause instability. There was a study that showed about 30% of Vista crashes were caused by Nvidia drivers, while ATI had only 9.3% (link).


  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    Fnz wrote: »
    I'm not too up to date on all things GPU-related but I have been told that Nvidia are more 'prompt' when it comes to driver updates. Things like that inspire loyalty.

    Nvidia use tactics similar to Creative: they drop support for certain devices at driver level. My mate's GeForce 5300 wouldn't run under Vista, as Nvidia decided it'd be best to force customers to buy a new GPU. The older ATI X1300 has a Vista driver and is supported.


  • Registered Users, Registered Users 2 Posts: 2,850 ✭✭✭Fnz


    papu wrote: »
    They release A LOT of beta drivers, but more often than not they cause instability. There was a study that showed about 30% of Vista crashes were caused by Nvidia drivers, while ATI had only 9.3% (link).
    Yeah, I did hear about the Vista crash stats.

    It was a friend who told me about Nvidia "being better for drivers". I assume he knows what he's talking about (he's really into his games). My impression was that ATI users would be waiting longer for (sometimes vital) bug fixes, for new games in particular.
    PogMoThoin wrote: »
    Nvidia use tactics similar to Creative: they drop support for certain devices at driver level. My mate's GeForce 5300 wouldn't run under Vista, as Nvidia decided it'd be best to force customers to buy a new GPU. The older ATI X1300 has a Vista driver and is supported.

    That does sound bastardish, alright. I can see why people might become ATI 'fanboys' as a result of such treatment.


  • Registered Users, Registered Users 2 Posts: 3,357 ✭✭✭papu


    Fnz wrote: »
    Yeah, I did hear about the Vista crash stats.

    It was a friend who told me about Nvidia "being better for drivers". I assume he knows what he's talking about (he's really into his games). My impression was that ATI users would be waiting longer for (sometimes vital) bug fixes, for new games in particular.

    That does sound bastardish, alright. I can see why people might become ATI 'fanboys' as a result of such treatment.
    Yeah, but ATI do release hotfixes which fix a lot of problems with new games; BioShock and the HD2900XT was one I can remember.


  • Registered Users, Registered Users 2 Posts: 17,164 ✭✭✭✭astrofool


    ATI release drivers monthly, nVidia, at most, quarterly.

    The 5300 was the same generation as the Radeon 9700, two before the x1300. ATI still support the 9700 however :)


  • Registered Users, Registered Users 2 Posts: 8,067 ✭✭✭L31mr0d


    Anyone think it's worth waiting for the 1GB HD4870?


  • Registered Users, Registered Users 2 Posts: 195 ✭✭WEST


    L31mr0d wrote: »
    Anyone think it's worth waiting for the 1GB HD4870?

    That will depend on what resolution you play at and whether you use AA. If you have a 24" monitor or less, 512MB should be enough. Best check the HardOCP review of the HD4870; it shows the highest playable settings for the card in a few games. After a quick glance at the review, it seems 512MB is enough.

    Plus it will depend on the games you play. Crysis uses a lot of memory with AA, then again that game is not really playable with AA.
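Some rough arithmetic behind the "512MB is enough" point: a simplified render-target estimate (colour plus depth buffers, scaled by MSAA samples). It ignores textures entirely, so it is a lower bound under those assumptions, but it shows the framebuffers themselves are nowhere near 512MB at 1920x1200.

```python
# Simplified framebuffer-size estimate: colour (4 bytes/px) plus
# depth/stencil (4 bytes/px), multiplied by MSAA samples. Textures and
# other allocations come on top, so real usage is higher.

def framebuffer_mb(width, height, msaa_samples=1, buffers=2, bytes_per_px=4):
    return width * height * bytes_per_px * buffers * msaa_samples / 2**20

for aa in (1, 4, 8):
    mb = framebuffer_mb(1920, 1200, msaa_samples=aa)
    print(f"1920x1200, {aa}x MSAA: ~{mb:.0f} MB of render targets")
```

Even at 8x MSAA the render targets come to roughly 140MB, so at this resolution it is really the textures (Crysis being the usual offender) that decide whether 512MB runs out.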


  • Closed Accounts Posts: 13,874 ✭✭✭✭PogMoThoin


    L31mr0d wrote: »
    Anyone think it's worth waiting for the 1GB HD4870?

    Apparently not. I read on XS that there's no need for 1GB as it's GDDR5; there's enough bandwidth with 512MB.
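The bandwidth arithmetic behind that GDDR5 point: peak bandwidth is the effective transfer rate times the bus width in bytes. Using the commonly quoted launch clocks (900MHz base GDDR5 on the 4870, quad-pumped to 3600MT/s; 1107MHz GDDR3 on the GTX 280, double-pumped to 2214MT/s), the narrow 256-bit bus gets close to the 512-bit one.

```python
# Peak memory bandwidth: transfers/sec * bytes transferred per cycle.

def bandwidth_gbs(effective_mts: float, bus_bits: int) -> float:
    """effective_mts: effective transfer rate in MT/s; bus_bits: bus width."""
    return effective_mts * 1e6 * (bus_bits / 8) / 1e9

hd4870 = bandwidth_gbs(3600, 256)   # GDDR5: 4 transfers per clock
gtx280 = bandwidth_gbs(2214, 512)   # GDDR3: 2 transfers per clock

print(f"HD 4870: {hd4870:.1f} GB/s on a 256-bit bus")
print(f"GTX 280: {gtx280:.1f} GB/s on a 512-bit bus")
```

A fast, narrow GDDR5 bus landing within ~20% of a wide GDDR3 one is why the 512MB card isn't as bandwidth-starved as the bus width alone would suggest.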

