
Next generation graphics cards just around the corner

  • 14-04-2004 8:04pm
    #1
    Registered Users, Registered Users 2 Posts: 6,007 ✭✭✭


    The Register:

    After the arguably disappointing GeForce FX family of products, Nvidia has launched its latest graphics chip, which has been the subject of rumours from all corners of the world over the past few weeks. But now it's finally here and it's officially called the GeForce 6800. That's it, no fancy moniker in front of the numbers this time, just plain old 6800. But this time Nvidia has a product that will blow your socks off and no, I don't just mean a noisy fan this time around, I'm talking about the quality of the graphics and the outlandish performance, writes Lars-Göran Nilsson

    ...

    If you've saved up all your hard earned cash and planned to remortgage your house to get one of these new wonder cards, the good news is that as with recent top-end nVidia products, the GeForce 6800 Ultra will be launched at £399 inc VAT. And even if you don't have the fastest computer in the world, the GeForce 6800 Ultra will still do it justice as long as you're willing to play all your games at 1600 x 1200 resolution with 8x anti-aliasing and 8x anisotropic filtering.

    ...

    But what Nvidia should be supplying in the box is a discount voucher for a new power supply, as the GeForce 6800 Ultra needs two power lines to itself from the power supply. But worse than this is the fact that Nvidia recommends a minimum 480W power supply to make sure that the graphics card is supplied with enough juice. In return, you do get the most powerful graphics card on the market, which should convince enough users to fork out the extra money for a new power supply. Just don't expect to put one of these babies in small form factor box.

    ...
    Tom's Hardware:

    12,000 points in 3DMark 2003. A score of over 60,000 in AquaMark 3. Over 60fps in Halo at 1600x1200 and more than 50fps in FarCry with high FSAA and 4-tap anisotropic filtering at 1024x768 - these are numbers that will bring tears of joy to PC enthusiasts everywhere.

    You'd have to go back quite a bit in the history of graphics cards to find a performance leap of similar magnitude. Maybe the transition from 3dfx's Voodoo1 to the Voodoo 2 comes close. Or the jump from NVIDIA's TNT2 to the GeForce 256 DDR, or perhaps the transition from ATi's Radeon 8500 to the 9700 Pro... These transitions might come close, if, for the moment, we leave aside the technological quantum leaps of the past. But let's start at the beginning.

    ...

    Interesting, to say the least.



    Re prices:

    The Register lists £399 (€597) at release.

    Tom's Hardware lists:
    6800 Ultra, $499 (€417), 16 pipes, two Molex connectors, two slots, 400/550
    6800, $299 (€249), 12 pipes, one Molex connector, one slot, TBD
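As a quick sanity check on those dollar-to-euro figures, they line up with a rate of roughly $1 ≈ €0.835 (the rate is my assumption for illustration, not a number from either article):

```python
def usd_to_eur(usd, rate=0.835):
    """Convert a USD price to EUR at an assumed 2004-era exchange rate."""
    return round(usd * rate)

print(usd_to_eur(499))  # 417 - matches the 6800 Ultra figure above
print(usd_to_eur(299))  # 250 - within a euro of the quoted 249
```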


Comments

  • Registered Users, Registered Users 2 Posts: 6,560 ✭✭✭Woden


    Hell yeah it is. Come on ATI, show us what ya got


  • Moderators, Computer Games Moderators Posts: 23,282 Mod ✭✭✭✭Kiith


    It's just a shame you'll have to sell your computer to afford it.


  • Registered Users, Registered Users 2 Posts: 7,136 ✭✭✭Pugsley


    Don't need new gfx cards atm, but they'll help in a few years' time, and drop the prices of current gfx cards, so no harm in it :)


  • Registered Users, Registered Users 2 Posts: 6,560 ✭✭✭Woden


    Ah, they'll be no more expensive than any other high-end card when they're released, I reckon


  • Closed Accounts Posts: 944 ✭✭✭Captain Trips


    I'd find it hard to justify this; a 9700 Pro and 2800XP+ get FarCry running smoothly at the standard Very High settings. Maybe depends on HL2 and D3, if they ever turn up.


  • Closed Accounts Posts: 2,918 ✭✭✭Deadwing


    It'll be cool to see what they can do in the long run, but right now I'll keep my money and my 9800 Pro. There are no games at the moment that really need that kind of raw power, and shelling out 400 or so odd euro for an FPS boost of 10 or 20 in Far Cry is going a bit overboard, methinks.


  • Moderators, Category Moderators, Computer Games Moderators Posts: 52,410 CMod ✭✭✭✭Retr0gamer


    The power supply business is a bit dodgy, to say the least. It will be interesting to see if the ATI card has the same inconvenience and, if not, how that affects its performance and sales.


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    The power supply recommendation is high because Intel's CPUs are already soaking up stupid amounts of power. If they got things under control on their end, the recommended PSU would prolly be 350W to 430W. Besides, people neglect their PSUs; 'bout time they upgraded. ;)

    Regarding "only a bit faster"... I have to wonder what you play. Sure, if you play one of the few games that only shows a 33%+ performance increase while improving image quality (and only then at low res), then perhaps you personally can be satisfied with the current gen. I play at 1920x1080, which is too high a res to put AA or AF on with current-gen cards, not to mention enabling stereoscopic 3D, which causes a huge performance hit.

    Some games just don't run well on current cards at even "standard" settings... look at Halo and LockOn; jumps from 23fps to 50fps will make them different games.



    Matt


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by Deadwing
    It'll be cool to see what they can do in the long run, but right now I'll keep my money and my 9800 Pro. There are no games at the moment that really need that kind of raw power, and shelling out 400 or so odd euro for an FPS boost of 10 or 20 in Far Cry is going a bit overboard, methinks.


    That 20fps equates to an 80% increase in FPS:

    http://www.firingsquad.com/hardware/nvidia_geforce_6800_ultra/page23.asp

    Unless you don't put AA on your existing €400 card?
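For what it's worth, an 80% gain from a 20fps jump implies a baseline of around 25fps (the 25fps baseline is an inference, not a number re-checked against the FiringSquad page). The arithmetic, as a sketch:

```python
def pct_increase(old_fps, new_fps):
    """Percentage frame-rate gain going from old_fps to new_fps."""
    return 100 * (new_fps - old_fps) / old_fps

print(pct_increase(25, 45))  # 80.0 - a 20fps jump off a 25fps base
```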



    Matt


  • Closed Accounts Posts: 2,918 ✭✭✭Deadwing


    Actually, I don't put AA on my card because I find the difference is negligible in most cases, and it's not worth it for the performance hit.
    I'm sure the NV40 (or 6800) will be a great card, but buying another 400 euro (if it's even that cheap) card when my current one does the same job? No thanks. I'll wait until the next-gen cards really come into their own with Doom 4 or Half-Life 3 or something before they warrant a purchase.


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by Deadwing
    Actually, I don't put AA on my card because I find the difference is negligible in most cases, and it's not worth it for the performance hit.
    I'm sure the NV40 (or 6800) will be a great card, but buying another 400 euro (if it's even that cheap) card when my current one does the same job? No thanks. I'll wait until the next-gen cards really come into their own with Doom 4 or Half-Life 3 or something before they warrant a purchase.


    Well, in that case I envy your poor eyesight and/or small monitor! :p



    Matt


  • Registered Users, Registered Users 2 Posts: 2,614 ✭✭✭BadCharlie


    You play at such a high res, how on earth do you see anything on the screen??

    You have a 21-inch monitor or something?


  • Registered Users, Registered Users 2 Posts: 15,817 ✭✭✭✭po0k


    Originally posted by Matt Simis
    Well, in that case I envy your poor eyesight and/or small monitor! :p



    Matt

    I play games at resolutions from 800x600 (Far Cry) to 1280x1024, and tbqh, once I go past 1024x768 I would never consider enabling AA.
    I simply don't see the difference and it's of no benefit to me, especially in fast-moving games (which is where the majority of my gaming time is spent).


  • Registered Users, Registered Users 2 Posts: 1,272 ✭✭✭i_am_dogboy


    Here's a link to a review of a reference model:
    http://www.beyond3d.com/previews/nvidia/nv40/index.php?p=2
    To be honest, I think nvidia have either ****ed up royally somewhere or they are scared ****less of ATI's next line of cards. If you look back over the last few years, they haven't exactly released any cards with such ridiculously high specs; not even the GeForce 2 (which was an absolute beast) had specs that compare to this. Anyway, I won't be buying one... I'd much rather be able to buy a new Voodoo :(

    It would be wise to wait and see what S3 and XGI have up their sleeves before buying a new card; hell, even Matrox might do something on a par with the G400

    edit
    I forgot to mention, as the article points out, nvidia have been known to bull**** a bit about their pipelines. The GeForce FX 5950 was said to have 8x1 pixel pipelines, which would suggest 8 pixel pipelines, but it had 4, with 2 texture units per pipe... which is a completely pointless addition to my post


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by SyxPak
    I play games at resolutions from 800x600 (Far Cry) to 1280x1024, and tbqh, once I go past 1024x768 I would never consider enabling AA.
    I simply don't see the difference and it's of no benefit to me, especially in fast-moving games (which is where the majority of my gaming time is spent).

    I appreciate we all see things differently (literally).. but seriously, what size monitors do you use??

    I have a 24" CRT; even at 1280x1024 the pixels are close to 1mm in size, more than big enough to see. Also, without AA you suffer "pixel popup": objects in the distance that are just too far away to render correctly at low res, so they pop in and out of the visual gameworld (obviously this happens on any monitor; it's a problem with low res). This is all ignoring AF, which at full whack turns the slimy, blurred textures of the standard filtering into sharp, detailed ones. Without it, it's clear you are walking in an "orb" of sharper textures that ends about 2.5m in front of you. And just like AA, the absence of AF means certain details will not be rendered correctly (or even be visible). See FiringSquad's LockOn pictures.
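Pixel size on a tube can be roughly estimated from the diagonal and the horizontal resolution. A back-of-the-envelope sketch, assuming the full advertised diagonal is visible and a 16:10 tube (real visible areas are a bit smaller, so this slightly understates pixel size either way):

```python
import math

def pixel_pitch_mm(diagonal_in, horizontal_res, aspect_w=16, aspect_h=10):
    """Approximate horizontal pixel size (mm) on a CRT.

    Assumes the whole advertised diagonal is visible, which
    slightly overstates the usable width on real tubes.
    """
    # Screen width from the diagonal and the aspect ratio
    width_mm = diagonal_in * 25.4 * aspect_w / math.hypot(aspect_w, aspect_h)
    return width_mm / horizontal_res

# A 24" 16:10 tube running 1280 pixels across:
print(round(pixel_pitch_mm(24, 1280), 2))  # 0.4
```

By this estimate the pixels come out nearer 0.4mm than 1mm, so the 1mm figure reads as an eyeballed one; the point that they are individually visible at desk distance stands either way.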

    Virtually all games played on such high-end cards are "fast moving". If they weren't fast moving, it's unlikely you would need such fast cards. I play BF:Vietnam at 1600x1000.. and frankly it looks like a different game when 2xAA and 8xAF are turned on. I simply couldn't play at lower settings.

    I'm amazed that anyone could play at lower settings in this day and age, especially with meaty cards like the 9800s around.



    Matt


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by i_am_dogboy
    Here's a link to a review of a reference model:
    http://www.beyond3d.com/previews/nvidia/nv40/index.php?p=2
    To be honest, I think nvidia have either ****ed up royally somewhere or they are scared ****less of ATI's next line of cards. If you look back over the last few years, they haven't exactly released any cards with such ridiculously high specs; not even the GeForce 2 (which was an absolute beast) had specs that compare to this. Anyway, I won't be buying one... I'd much rather be able to buy a new Voodoo :(

    It would be wise to wait and see what S3 and XGI have up their sleeves before buying a new card; hell, even Matrox might do something on a par with the G400


    Look at what they were competing against back in the GF2 days.. They are merely responding to the competition; there is nothing shocking or "iffy" about the GF6800, it's merely the competitor to the incredibly fast ATI cards. I wouldn't even list the GF2 as groundbreaking; GF1 or GF3, yeah, but GF2 was just a small evolutionary step. Remember the overpriced GF2 Ultra? All it had was a higher clock speed and green heatsinks.

    If you want Voodoo cards, buy the GeForce 6800: many of the 3dfx people worked on this card, in fact one of them is a team leader on the design team at nVidia.


    Matt


  • Registered Users, Registered Users 2 Posts: 1,272 ✭✭✭i_am_dogboy


    I wasn't saying the GeForce 2 was impressive, but the specs for the time were pretty amazing; the GeForce 1 couldn't even handle the features it was meant to provide. The GeForce 3, however, was a great card. And when I say I want a Voodoo, I want a Voodoo, by 3dfx... it just doesn't seem the same without them. I remember the excitement the day I bought my Voodoo 3.

    The 6800 is looking good, but is the top-range model just going to be a mythical piece of hardware like the Voodoo 5 6000?

    Still, I'm gonna wait and see the retail models and what happens with the other companies before I form a proper opinion.


  • Closed Accounts Posts: 944 ✭✭✭Captain Trips


    Originally posted by i_am_dogboy


    It would be wise to wait and see what S3 and XGI have up their sleeves before buying a new card; hell, even Matrox might do something on a par with the G400


    Sure guy, right as soon as my Amiga order arrives, I'll order a video card from Matrox.


  • Registered Users, Registered Users 2 Posts: 3,754 ✭✭✭Big Chief


    Availability
    The first GPUs based on the NVIDIA GeForce 6 Series, the GeForce 6800 Ultra and GeForce 6800 models, are manufactured using IBM’s high-volume 0.13-micron process technology and are currently shipping to leading add-in-card partners, OEMs, system builders, and game developers.

    Retail graphics boards based on the GeForce 6800 models are slated for release in the next 45 days.

    Does this mean they will be on a shelf near us soon?

    I am awaiting ATI's reply to this...


  • Registered Users, Registered Users 2 Posts: 15,817 ✭✭✭✭po0k


    0.13µ is a bit shit though, compared to IBM's mastery of the 90nm process (970FX: 2GHz, 64-bit @ 24W :)).
    S'pose it does allow them to punch out more chips using a tried and tested fab process, though not as many as 90nm on 300mm wafers would....


  • Advertisement
  • Registered Users, Registered Users 2 Posts: 6,560 ✭✭✭Woden


    I'd say it will be 2 months before cards hit the shelves.

    @dogboy: you can be sure that if the retail cards don't perform like that, nvidia will be laughed at. I'm sure the retail cards will be better, with variability in the cooling setups and, if they desire, somewhat higher frequencies.


  • Registered Users, Registered Users 2 Posts: 6,892 ✭✭✭bizmark


    Originally posted by Captain Trips
    Sure guy, right as soon as my Amiga order arrives, I'll order a video card from Matrox.

    LOL :D

    What are you talking about, dogboy? The 6800 is a masterpiece, double the performance of the 9800XT. How could nvidia have "****ed up royally" by producing a card that kicks the **** out of the last generation's best card??

    Methinks you're stuck in a time warp or something


  • Registered Users, Registered Users 2 Posts: 1,272 ✭✭✭i_am_dogboy


    Originally posted by bizmark
    LOL :D

    What are you talking about, dogboy? The 6800 is a masterpiece, double the performance of the 9800XT. How could nvidia have "****ed up royally" by producing a card that kicks the **** out of the last generation's best card??

    Methinks you're stuck in a time warp or something
    Yeah... I'm kind of stuck in the past when it comes to games... I wanna play Sonic 1 again...

    What I meant by the "****ed up royally" comment is that they might have screwed something up to require such specs (benchmarks don't prove every aspect of performance); in most cases the new generation of cards kicks the ass of the previous generation anyway. I'm thinking of the Radeon 9500 Pro taking on the Ti 4400 and 4600 for a far better price than both.

    And data, I wasn't saying the retail model would perform worse; I just said I wanted to see the retail models. I meant that I wanted to see the different cards available and how they compare to each other.

    And on another note, Matrox made great cards a few years ago. Did anyone see Slave Zero with bump mapping? At the time there was nothing like it out there.


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by i_am_dogboy
    Yeah... I'm kind of stuck in the past when it comes to games... I wanna play Sonic 1 again...

    What I meant by the "****ed up royally" comment is that they might have screwed something up to require such specs (benchmarks don't prove every aspect of performance); in most cases the new generation of cards kicks the ass of the previous generation anyway. I'm thinking of the Radeon 9500 Pro taking on the Ti 4400 and 4600 for a far better price than both.

    And data, I wasn't saying the retail model would perform worse; I just said I wanted to see the retail models. I meant that I wanted to see the different cards available and how they compare to each other.

    And on another note, Matrox made great cards a few years ago. Did anyone see Slave Zero with bump mapping? At the time there was nothing like it out there.

    1) The Radeon 9700 was introduced when the Ti4600 reigned, not the 9500 (it came later). By your extremely unusual reasoning, it too must have "****ed up", because it had a vastly better spec than the many incarnations of DX8 cards it trumped.
    2) The Matrox G400 had good bump mapping, tho the GF hardware had a similar method as well as the superior dot-product BM technique. Matrox did surprisingly well getting vendors to support them.. but the days of small companies competing on the high-end front are over; it's simply not financially viable. Any of these minnows (Matrox, VIA, S3, PowerVR etc.) could come out with a low-end, very competitive card, but the staggering R&D and production costs of high-end cards (which sell less than 5% of the volume of low-to-mid-range cards, with no chance to recoup costs unless you have a "top to bottom" chip family ready to go) mean only the powerhouses of the industry need apply. Forget about your white knight.


    Matt


  • Registered Users, Registered Users 2 Posts: 17,170 ✭✭✭✭astrofool


    dogboy's just trolling.

    Well, I hope so, 'cos the other option is that he's extremely stupid. Either way, it's probably safe to ignore :)

    3dfx powa! ;)


  • Registered Users, Registered Users 2 Posts: 5,463 ✭✭✭shinzon


    I will laugh heartily when these cards come out, when everyone rushes out to get one and install it in their machines and the inevitable happens:

    1) I'm getting artifacts in such and such

    2) this game won't work

    3) drivers are incompatible ETC ETC

    There's an old saying that I believe is applicable in this instance: "don't believe the hype". Until one of these cards is in your machine, up and running with no problems whatsoever, then believe it. Don't take some tech bod's word for it at some hardware expo; he's there to sell the card and rubbish everything else.


    Shin


  • Registered Users, Registered Users 2 Posts: 1,272 ✭✭✭i_am_dogboy


    Originally posted by Matt Simis
    1) The Radeon 9700 was introduced when the Ti4600 reigned, not the 9500 (it came later). By your extremely unusual reasoning, it too must have "****ed up", because it had a vastly better spec than the many incarnations of DX8 cards it trumped.
    2) The Matrox G400 had good bump mapping, tho the GF hardware had a similar method as well as the superior dot-product BM technique. Matrox did surprisingly well getting vendors to support them.. but the days of small companies competing on the high-end front are over; it's simply not financially viable. Any of these minnows (Matrox, VIA, S3, PowerVR etc.) could come out with a low-end, very competitive card, but the staggering R&D and production costs of high-end cards (which sell less than 5% of the volume of low-to-mid-range cards, with no chance to recoup costs unless you have a "top to bottom" chip family ready to go) mean only the powerhouses of the industry need apply. Forget about your white knight.


    Matt
    Ok, my original comment didn't really come out the way I meant it. What I was trying to say was that it strikes me as odd that nvidia, a company who usually pioneer new technologies, have produced such a powerhouse of a card; it's not really like them. I just thought that maybe they screwed up some aspect of performance and are using the massive fill rate to compensate, kind of like covering up an Achilles' heel. I wasn't in any way implying that they ****ed up by designing and producing a very good, fast card. It is a bit extreme, I know, but it could happen, with a reputation like theirs to protect. I didn't say that they definitely did **** up either... I also suggested that they may be competing with ATI.

    And the mention of the 9500 instead of the 9700 was just to prove the point of one generation being better than the previous; I thought the 9500 was a better example because it was a mid-range card taking on the high-end offerings from nvidia.

    You are right on the matter of Matrox and the other companies, though... It's a shame really, I was a big fan of theirs.

    Shinzon, I gotta agree with you completely; the amount of problems I had with my GeForce was unreal, and the performance just wasn't what I was expecting either. It didn't live up to the hype for me.


  • Registered Users, Registered Users 2 Posts: 6,421 ✭✭✭Doodee


    It is quite possible that Matrox or PowerVR could come out with a card to compete. For one, PowerVR backed the whole tile-based rendering approach, which was far more efficient; they ****ed up though by not providing hardware T&L (or was it texture compression?) on the cards.

    If I recall, when those cards originally came out, everyone went on about how their performance was better than the GF3-based cards in favoured games like Counter-Strike or Quake 3. I also remember suggestions about a new type of tile-based card, with significant performance increases etc.
    Then again, don't believe what you read.

    This is simply a jump in the generations. The GF4 cards and FXs belong to the 2.0GHz+ bracket of CPUs. It's now up to 3.4GHz and over a year since they arrived, so it's no wonder this card is out. Also, cards go hand in hand with the games industry; that's sort of obvious. So what with HL2 meant to be here by now, etc., nvidia are only supplying to the market what was needed. Nobody seems to want to settle for games just running anymore; they wanna show off their flash cards and big CPUs because they're overcompensating :D

    I'm just curious as to whether ATI will back the newer generations of processors or do as they did with the 9500 and increase the performance of an older and more reliable one. (nvidia backed their newer processors in the FX series.)

    Gah, it's been so long since I read a 15-page article on the latest and greatest GPUs *sniffle*


  • Closed Accounts Posts: 947 ✭✭✭neXus9


    Originally posted by Matt Simis
    I appreciate we all see things differently (literally).. but seriously, what size monitors do you use??

    I have a 24" CRT; even at 1280x1024 the pixels are close to 1mm in size, more than big enough to see. Also, without AA you suffer "pixel popup": objects in the distance that are just too far away to render correctly at low res, so they pop in and out of the visual gameworld (obviously this happens on any monitor; it's a problem with low res). This is all ignoring AF, which at full whack turns the slimy, blurred textures of the standard filtering into sharp, detailed ones. Without it, it's clear you are walking in an "orb" of sharper textures that ends about 2.5m in front of you. And just like AA, the absence of AF means certain details will not be rendered correctly (or even be visible). See FiringSquad's LockOn pictures.

    Virtually all games played on such high-end cards are "fast moving". If they weren't fast moving, it's unlikely you would need such fast cards. I play BF:Vietnam at 1600x1000.. and frankly it looks like a different game when 2xAA and 8xAF are turned on. I simply couldn't play at lower settings.

    I'm amazed that anyone could play at lower settings in this day and age, especially with meaty cards like the 9800s around.



    Matt



    What monitor have you got, and how much was it? It would be damn expensive to get a monitor that big, at that kind of resolution, with a decent refresh rate. What's your refresh rate anyway, at 1600x1000??


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by neXus9
    What monitor have you got, and how much was it? It would be damn expensive to get a monitor that big, at that kind of resolution, with a decent refresh rate. What's your refresh rate anyway, at 1600x1000??

    Sony W900 24" widescreen; cost me €600, but they appear in the Buy and Sell and the Boards For Sale forum (Dubseller selling them afaik) for around the €400 mark from time to time. This model is old, but was originally very expensive (€2.5K). As long as you can get over the curvature (it's not a flat screen; its successor, the FW900, is), it's a great monitor.

    Resolutions of note:
    1920x1200, 75Hz
    1920x1080, 85Hz
    1600x1024, 85Hz
    1600x900, 100Hz
    800x600, 140Hz

    The top 4 are all widescreen; the top two are the "ideal" modes, as you can fit two A4 pages side by side. I run at 1920x1080 on the desktop, great for web browsing. I'd need something as fast as the GeForce 6800 to play games at that res tho. Any decent 21" monitor will do (at least) 1600x1200 nicely, however, and why not run at high res if you have cards fast enough.. why retard your games to a lower res needlessly?
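Those modes also show why big CRTs need a fast RAMDAC: the pixel clock scales with width × height × refresh rate, plus blanking overhead. A rough sketch (the ~32% blanking factor is an assumed GTF-style typical value, not a figure from the W900's spec sheet):

```python
def pixel_clock_mhz(width, height, refresh_hz, blanking=1.32):
    """Rough analogue pixel clock (MHz) needed to drive a CRT mode.

    The 32% blanking overhead is an assumed typical figure for
    horizontal/vertical retrace, not a measured one.
    """
    return width * height * refresh_hz * blanking / 1e6

# The 1920x1200@75Hz mode from the list above needs a ~230MHz RAMDAC:
print(round(pixel_clock_mhz(1920, 1200, 75)))  # 228
```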


    Matt


  • Closed Accounts Posts: 947 ✭✭✭neXus9


    That's pretty cool. I have a 19-inch Samsung SyncMaster FST at 1152x864 at 85Hz, with a dot pitch of 0.20mm. That resolution does me grand; if I bump it up any more I'll get a crappy refresh rate. What happens if you override the refresh rate for your set resolution??? :eek: I'm guessing it'll totally f*ck up.

    Checked out the FW900. It's about €2000!!


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by neXus9
    That's pretty cool. I have a 19-inch Samsung SyncMaster FST at 1152x864 at 85Hz, with a dot pitch of 0.20mm. That resolution does me grand; if I bump it up any more I'll get a crappy refresh rate. What happens if you override the refresh rate for your set resolution??? :eek: I'm guessing it'll totally f*ck up.

    Checked out the FW900. It's about €2000!!

    It comes up with a message saying "Signal out of Spec" (doesn't even let it try). I've always bought monitors second-hand (ok, except my LCD); when 17" CRTs were expensive I had a 20" that only cost me £100. I think size really matters with monitors, and new ones just cost too much!


    Matt


  • Closed Accounts Posts: 947 ✭✭✭neXus9


    Originally posted by Matt Simis
    It comes up with a message saying "Signal out of Spec" (doesn't even let it try). I've always bought monitors second-hand (ok, except my LCD); when 17" CRTs were expensive I had a 20" that only cost me £100. I think size really matters with monitors, and new ones just cost too much!


    Matt

    Thanks for the info. I'll look into getting a second hand one, because my dad needs one for his laptop.


  • Registered Users, Registered Users 2 Posts: 1,823 ✭✭✭Horsefumbler


    What's the story with the 2nd power source thing? What does that mean you'll need, two plugs? :confused:


  • Moderators, Computer Games Moderators Posts: 23,282 Mod ✭✭✭✭Kiith


    Could you imagine the kind of games that would be available on the PC if games developers had to really work on the engines to perform on available hardware, instead of constantly making us buy new graphics cards? Look at the consoles: the Xbox has Halo 2, which looks amazing; the PS2 has Killzone (I think it's called that), which also looks fantastic. Both those consoles have less than a 700MHz processor, yet every time a new engine comes out, like Source or the Unreal 3 engine, we've all got to upgrade. It's really annoying.


  • Advertisement
  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by Kiith
    Could you imagine the kind of games that would be available on the PC if games developers had to really work on the engines to perform on available hardware, instead of constantly making us buy new graphics cards? Look at the consoles: the Xbox has Halo 2, which looks amazing; the PS2 has Killzone (I think it's called that), which also looks fantastic. Both those consoles have less than a 700MHz processor, yet every time a new engine comes out, like Source or the Unreal 3 engine, we've all got to upgrade. It's really annoying.

    That's a very simplistic view of the matter.

    For one, consoles run at resolutions people on the PC would laugh at (640x480-ish) and, thanks to the common low-def TV, textures need not be anywhere near as sharp. They skimp on filtering techniques and generally feature smaller game worlds (look at Deus Ex 2 on the XB vs DE1 on PC). They have the advantage that the hardware and feature set are static, so games can be designed to get the max out of the machine.

    I don't think the fact that the PC market is progressive is "annoying"; it keeps things interesting and constantly pushes boundaries. If you don't want to upgrade, then don't: you can play all the new engines on old cards, just set the detail to low and the res to blocky, just like the developers on consoles do.



    Matt



    PS: "CPUs" in the traditional sense are meaningless on consoles; it's the graphics engines that matter (much like PCs soon). However, the XB has a 733MHz CPU.


  • Moderators, Computer Games Moderators Posts: 23,282 Mod ✭✭✭✭Kiith


    Ok, that was worded badly. What I meant was, at the speed that PC graphics are advancing, there's very little time to fully optimise an engine. With new engines out quite often, wouldn't it be better (financially at least) to use an engine to its maximum? Look at the Quake 3 engine now compared to when it was released: Call of Duty uses the Quake engine, and that looks great. Granted, it's a highly modified version of the engine, but it shows that new ones aren't needed as often as they are released.

    I will admit, though, that I like seeing technology jump forward. I only wish it didn't cost as much.


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by Kiith
    Ok, that was worded badly. What I meant was, at the speed that PC graphics are advancing, there's very little time to fully optimise an engine. With new engines out quite often, wouldn't it be better (financially at least) to use an engine to its maximum? Look at the Quake 3 engine now compared to when it was released: Call of Duty uses the Quake engine, and that looks great. Granted, it's a highly modified version of the engine, but it shows that new ones aren't needed as often as they are released.

    I will admit, though, that I like seeing technology jump forward. I only wish it didn't cost as much.

    That's still all highly subjective. I thought Call of Duty looked awful. Granted, it looked good for a Quake 3-based game, but compared to engines available at the time it's a mod for Q3 or, at best, MoH 1.5. If they had used a decent engine, the game could have been so much better (it still would have been mindless and arcadey tho..).

    That's another thing: new engines don't come out all the time; new gfx cards come out faster than engines do. The two most popular engines at the moment are Q3 and Unreal Warfare, both several years old.



    Matt


  • Registered Users, Registered Users 2 Posts: 1,169 ✭✭✭dangerman


    I don't understand everyone's obsession with how a game looks.

    Granted, graphics are pretty, and they really make a difference...

    ...for the first 5 minutes of the game. After that you're involved in the game and you are no longer concentrating on the bump mapping on the textures as you kill your mates in UT2004 or whatever.

    My point is, apart from when your friends are over and you want to impress them, it's only gameplay that counts.

    I've a 733MHz P3, 384MB of RAM and a GeForce 3, which can run games such as Desert Combat, Splinter Cell: Pandora Tomorrow and UT2004 just fine. Who cares if my graphics are bad? What makes the game more enjoyable: the fact that you're winning, or the fact that you're losing but the water is super-reflective and the gun is all shiny?

    I've had the cash to upgrade on and off for the last 12 months; my main reason for upgrading now is simply that it's at critical mass: my CPU ain't fast enough to play the new breed that's coming out at the mo (Far Cry etc.).

    I'm not saying anyone is a loser 'cause they get a kick out of getting the most from their rig; I'm just saying I don't understand it.


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by dangerman
    I don't understand everyone's obsession with how a game looks.

    I've a 733MHz P3, 384MB of RAM and a GeForce 3, which can run games such as Desert Combat, Splinter Cell: Pandora Tomorrow and UT2004 just fine. Who cares if my graphics are bad? What makes the game more enjoyable: the fact that you're winning, or the fact that you're losing but the water is super-reflective and the gun is all shiny?

    Some of us like to have our cake and eat it: win and have it look good. You could take your line of thought even further.. why bother with sound, it's not required; why use a DX8 card at all, for that matter, just put in a TNT2? 384MB of RAM? Pah, the game could work on 64MB... It's all about the standard of quality one can accept. You just happen to have low standards. :D


    Matt


  • Advertisement
  • Registered Users, Registered Users 2 Posts: 1,169 ✭✭✭dangerman


    why bother with sound.. it's not required,

    But then I wouldn't hear all your bullets missing me. Sound is required.

    4000x3000 true-colour per-pixel bump-mapping super-hardcore ****e is not.

    I guess I do have low standards when it comes to graphics, because until photo-realism takes hold (anyone want to guess? I say 10-15 years), graphics will always be not-quite-good-enough.

    Gameplay is the only important factor. Gameplay gameplay gameplay.


  • Closed Accounts Posts: 144 ✭✭andrew12g


    Dangerman has a point. I mean, look at Counter-Strike or MOHAA: the graphics are not that great but the game is still playable. Good graphics aren't essential.


  • Registered Users, Registered Users 2 Posts: 8,718 ✭✭✭Matt Simis


    Originally posted by andrew12g
    Dangerman has a point. I mean, look at Counter-Strike or MOHAA: the graphics are not that great but the game is still playable. Good graphics aren't essential.


    No one said they were essential; good graphics are simply part of the package. I like to have all the parts at their best and some don't; that's fine. Certainly cheaper to have a different mindset.



    Matt

