
The PC Gaming Hardware Thread

  • 03-06-2014 1:17pm
    #1
    Closed Accounts Posts: 537 ✭✭✭


    I know we have a PC building and upgrading forum, but I thought it would be a good idea to have a thread here covering news and reviews of hardware that pertains specifically to PC gaming. And with Computex already under way, there should be no shortage of news over the coming days.

    For example, Corsair have announced the HG10. Basically it's a bracket/adapter thingy that will allow you to cool your GPU with one of their Hydro series AIO coolers. I can see some disadvantages to it myself, but it could be beneficial to some users.



    Staying with Corsair, they've launched their Cherry MX RGB keyboards. Gorgeous, but dayum, they expensive.

    Speaking of expensive, the long-awaited Asus ROG Swift PG278Q has been priced and given a release date. 1440p, 144Hz with G-Sync, it's going to appeal to a lot of people, but it will be $800 when it's released in July. Europeans will of course pay more. :(


Comments

  • Closed Accounts Posts: 537 ✭✭✭Yeti Beast


    AMD have been showing off FreeSync (their alternative to Nvidia's G-Sync) at Computex. They're not revealing the make or model of the monitor they're using, but it's a pre-existing 1440p display. That means a firmware update might be able to make an existing monitor (with DisplayPort 1.2) compatible with FreeSync, whereas G-Sync requires the addition of hardware.

    AMD do say:
    "...this does not guarantee that firmware alone can enable the feature, it does reveal that some scalar/LCD combinations are already sufficiently advanced that they can support some degree of DRR (dynamic refresh rate) and the full DPAS (DisplayPort Adaptive Sync) specification through software changes."

    Either way, it still looks like variable refresh rate tech is a good few months away from being commonplace, but surely it's the way to go.


  • Registered Users, Registered Users 2 Posts: 11,749 ✭✭✭✭wes


    Much prefer the open approach from AMD, and it would be interesting to see them add this technology to televisions, as I currently run PC games on a 42-inch TV and don't want to use a smaller monitor.


  • Closed Accounts Posts: 537 ✭✭✭Yeti Beast


    wes wrote: »
    Much prefer the open approach from AMD, and it would be interesting to see them add this technology to televisions, as I currently run PC games on a 42-inch TV and don't want to use a smaller monitor.

    Agreed - I'd like to see Nvidia adopt Free-Sync as well (assuming it's as good as G-Sync). If they don't, I hope there'll be monitors that support both techs.


  • Registered Users, Registered Users 2 Posts: 12,301 ✭✭✭✭MadYaker


    wes wrote: »
    Much prefer the open approach from AMD, and it would be interesting to see them add this technology to televisions, as I currently run PC games on a 42-inch TV and don't want to use a smaller monitor.

    A 42 inch? Jaysus! How far away do you sit?? I use a 24 inch I think, any bigger and I'd have to move my head to see the whole screen.


  • Registered Users, Registered Users 2 Posts: 11,749 ✭✭✭✭wes


    MadYaker wrote: »
    A 42 inch? Jaysus! How far away do you sit?? I use a 24 inch I think, any bigger and I'd have to move my head to see the whole screen.

    I sit between 4 and 6 feet away depending on where I put my chair. Use the TV for console and PC gaming.


  • Registered Users, Registered Users 2 Posts: 86,729 ✭✭✭✭Overheal


    wes wrote: »
    Much prefer the open approach from AMD, and it would be interesting to see them add this technology to televisions, as I currently run PC games on a 42-inch TV and don't want to use a smaller monitor.

    Open is the way to go. Same with Mantle. And their OpenCL Accelerated Parallel Processing (APP) SDK. The Nvidia cards, to my knowledge, don't provide as much benefit outside of gaming as the AMDs when it comes to improving the performance of non-gaming software.
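
    (For what it's worth, the appeal of OpenCL is that the same code runs on either vendor's cards. A rough sketch, assuming the third-party pyopencl package and a working OpenCL runtime are installed, that just lists whatever devices your drivers expose - AMD, Nvidia or Intel alike:)

    import pyopencl as cl

    # Enumerate every OpenCL platform/device the installed drivers expose,
    # regardless of which vendor made the GPU.
    for platform in cl.get_platforms():
        print(f"{platform.name} ({platform.version})")
        for device in platform.get_devices():
            mem_mb = device.global_mem_size // (1024 * 1024)
            print(f"  {device.name.strip()} - {mem_mb} MB global memory")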

    edit: lol yeah, here's AMD:
    Software
    AMD develops the AMD CodeXL tool suite, which includes a GPU debugger, a GPU profiler, a CPU profiler and an OpenCL static kernel analyzer. CodeXL is freely available at the AMD developer tools website.
    AMD has also taken an active part in developing coreboot, an open-source project aimed at replacing proprietary BIOS firmware.
    Other AMD software includes the AMD Core Math Library, and open-source software including the AMD Performance Library and the CodeAnalyst performance profiler.
    AMD contributes to open-source projects, including working with Sun Microsystems to enhance OpenSolaris and Sun xVM on the AMD platform.[50] AMD also maintains its own Open64 compiler distribution and contributes its changes back to the community.[51]
    In 2008, AMD released the low-level programming specifications for its GPUs, and works with the X.Org Foundation to develop drivers for AMD graphics cards.[52][53]

    here's Nvidia:
    Until 23 September 2013, Nvidia had not published any documentation for its hardware,[35] meaning that programmers could not write appropriate and effective free and open-source device drivers for Nvidia's products without resorting to (clean-room) reverse engineering.

    Instead, Nvidia provides its own binary GeForce graphics drivers for X.Org and a thin open-source library that interfaces with the Linux, FreeBSD or Solaris kernels and the proprietary graphics software. Nvidia also provided, but stopped supporting, an obfuscated open-source driver that only supports two-dimensional hardware acceleration and ships with the X.Org distribution.[36]

    The proprietary nature of Nvidia's drivers has generated dissatisfaction within free-software communities.[37] Some Linux and BSD users insist on using only open-source drivers, and regard Nvidia's insistence on providing nothing more than a binary-only driver as wholly inadequate, given that competing manufacturers (like Intel) offer support and documentation for open-source developers, and that others (like AMD) release partial documentation and provide some active development.[38][39]

    Because of the closed nature of the drivers, Nvidia video cards cannot deliver adequate features on some platforms and architectures, given that Nvidia only provides x86/x64 driver builds.[40] As a result, support for 3D graphics acceleration in Linux on PowerPC does not exist, nor does support for Linux on the hypervisor-restricted PlayStation 3 console.

    Some users claim that Nvidia's Linux drivers impose artificial restrictions, like limiting the number of monitors that can be used at the same time, but the company has not commented on these accusations.[41]

    As of 31 January 2014, Nvidia's Alexandre Courbot has committed an extensive patch set which adds initial support for the GK20A (Tegra K1) to nouveau.[42]

    There's a reason all the consoles are AMD now.


  • Registered Users, Registered Users 2 Posts: 22,924 ✭✭✭✭ShadowHearth


    If AMD can pull it off with software, then more power to them. No way am I paying that much extra for G-Sync. It's the same as just buying two GPUs instead of one and keeping V-Sync on. To make it worse for Nvidia, this way I am not tied to one manufacturer or the other. I can change my monitor to whatever I want and buy any brand of GPU.


  • Registered Users, Registered Users 2 Posts: 8,405 ✭✭✭gizmo


    Overheal wrote: »
    Open is the way to go. Same with Mantle. And their OpenCL Accelerated Parallel Processing (APP) SDK. The Nvidia cards, to my knowledge, don't provide as much benefit outside of gaming as the AMDs when it comes to improving the performance of non-gaming software.
    Not that I disagree with the open-is-better approach, but it's worth pointing out that AMD have yet to publish the Mantle spec or any form of SDK. They've said early this year, or maybe next year, but there's been no word on it since the original announcement.

    On Nvidia's end, they also provide a range of development tools, from full frame and shader debugging tools to GPU profilers for both DirectX and OpenGL, all for free. I've not touched the *Works suite, but it looks pretty nifty; however, if there's a lock-in clause for devs, as AMD are claiming, then they can shove it.

    As for why the consoles are all AMD this time around, well the answer to that is their APU lineup more so than anything else.


  • Closed Accounts Posts: 537 ✭✭✭Yeti Beast


    It's the same as just buying two GPUs instead of one and keeping V-Sync on.

    It really really isn't the same.


  • Registered Users, Registered Users 2 Posts: 86,729 ✭✭✭✭Overheal


    gizmo wrote: »
    Not that I disagree with the open being better approach but it's worth pointing out that AMD have yet to publish the Mantle spec or any form of SDK. They've said early this year or maybe next year but there's been no word on it since the original announcement.
    It's making it to devs somehow. Crytek and Firaxis both have their hands on the documentation and are working with it. Battlefield 4 was also apparently designed with the technology in mind, so it will be ready to fire when it does release. I'd wager it's more about launching it right than launching it 'on time'.
    gizmo wrote: »
    As for why the consoles are all AMD this time around, well the answer to that is their APU lineup more so than anything else.

    If that were true, Nvidia would hold the torch through its Tegra SoC platform, which now sports chips that are reported to rival the performance of the PS3 or Xbox 360.

    More to the point, though, AMD is serious and smart about maximizing the combined potential of CPU and GPU performance.

    AMD reports easy portability from Mantle to the DirectX 12 API: http://news.softpedia.com/news/Game-Breaker-AMD-Mantle-Can-Be-Ported-to-DirectX-12-Easily-444625.shtml


  • Closed Accounts Posts: 537 ✭✭✭Yeti Beast




  • Registered Users, Registered Users 2 Posts: 22,924 ✭✭✭✭ShadowHearth


    Yeti Beast wrote: »
    It really really isn't the same.

    Two GPUs, or a more expensive GPU, will have enough power to keep V-Sync on, so why would I bother paying that money for G-Sync, putting myself under a huge restriction and pretty much signing myself over to the mercy of Nvidia?
    I am not a fanboy of AMD or Nvidia; I always just go with the best option on the market at the time of purchase. No way would I put myself in a stupid situation where I am limited to one manufacturer.
    The monitor they plan on launching there is 144Hz with G-Sync. I am no expert, but isn't that a bit of a useless combination? No way will you be pushing that FPS in the majority of games, so you won't be having screen-tearing issues. If you do pull 144 frames, then you might as well have V-Sync on a standard monitor.


    If Nvidia really wants to wow me, then make a good 60Hz monitor with G-Sync that is only a little bit more expensive, or the same price, and I might buy it. Not twice the price of a normal non-G-Sync monitor.


  • Registered Users, Registered Users 2 Posts: 22,924 ✭✭✭✭ShadowHearth


    Yeti Beast wrote: »

    I would not mind getting that for PC.


  • Registered Users, Registered Users 2 Posts: 8,405 ✭✭✭gizmo


    Overheal wrote: »
    It's making it to devs somehow. Crytek and Firaxis both have their hands on the documentation and are working with it. Battlefield 4 was also apparently designed with the technology in mind, so it will be ready to fire when it does release. I'd wager it's more about launching it right than launching it 'on time'.
    I'm referring to the open-source side of their offerings. I've no doubt some developers have their hands on it already. :)
    Overheal wrote: »
    If that were true, Nvidia would hold the torch through its Tegra SoC platform, which now sports chips that are reported to rival the performance of the PS3 or Xbox 360.
    Not when their Tegra line to date has been built around the ARM architecture and aimed primarily at the mobile space. It's only with their recently revealed K1 series that they seem to be competing with AMD at the higher end of that line. I've never really looked into how different they are, but there's an article here based on the latest Nvidia offering and AMD's Jaguar-based offerings. It's probably the closest comparison you're going to get.


  • Closed Accounts Posts: 537 ✭✭✭Yeti Beast


    Two GPUs, or a more expensive GPU, will have enough power to keep V-Sync on, so why would I bother paying that money for G-Sync, putting myself under a huge restriction and pretty much signing myself over to the mercy of Nvidia?
    I am not a fanboy of AMD or Nvidia; I always just go with the best option on the market at the time of purchase. No way would I put myself in a stupid situation where I am limited to one manufacturer.
    The monitor they plan on launching there is 144Hz with G-Sync. I am no expert, but isn't that a bit of a useless combination? No way will you be pushing that FPS in the majority of games, so you won't be having screen-tearing issues. If you do pull 144 frames, then you might as well have V-Sync on a standard monitor.

    If Nvidia really wants to wow me, then make a good 60Hz monitor with G-Sync that is only a little bit more expensive, or the same price, and I might buy it. Not twice the price of a normal non-G-Sync monitor.

    We're arguing different things here - you're giving out that G-Sync comes with a price premium, whereas Free-Sync doesn't. I agree with this. Open source is the way to go and in this regard Free-Sync wins hands down.

    What I'm arguing is that permanent V-Sync is not the same as adaptive refresh rate technology (from either camp), and it comes with its own huge problems. It's these problems that adaptive sync is trying to eliminate. You can have cards in CrossFire/SLI and you're still going to get lag, micro-stutter etc. V-Sync is one of the causes of those problems, not the solution.

    As for the last part, I assume you're talking about the ROG Swift. It's expensive because it's a 27" 1440p 8bit TN panel that can operate at 144Hz. Take G-Sync out of it and you're still looking at megabucks. Oh, and Asus make and price the unit, not Nvidia.


  • Registered Users, Registered Users 2 Posts: 22,924 ✭✭✭✭ShadowHearth


    Yeti Beast wrote: »
    We're arguing different things here - you're giving out that G-Sync comes with a price premium, whereas Free-Sync doesn't. I agree with this. Open source is the way to go and in this regard Free-Sync wins hands down.

    What I'm arguing is that permanent V-Sync is not the same as adaptive refresh rate technology (from either camp), and it comes with its own huge problems. It's these problems that adaptive sync is trying to eliminate. You can have cards in CrossFire/SLI and you're still going to get lag, micro-stutter etc. V-Sync is one of the causes of those problems, not the solution.

    As for the last part, I assume you're talking about the ROG Swift. It's expensive because it's a 27" 1440p 8bit TN panel that can operate at 144Hz. Take G-Sync out of it and you're still looking at megabucks. Oh, and Asus make and price the unit, not Nvidia.

    Getting two cards is not a direct solution, but extra horsepower to keep V-Sync on. You can get one faster card and turn V-Sync on too.
    G-Sync is a good technology, but it's not worth the premium charged for it, needing extra hardware and being limited to Nvidia GPUs only.


  • Registered Users, Registered Users 2 Posts: 86,729 ✭✭✭✭Overheal


    What he's saying is that V-Sync is old and G-Sync is a progressive, albeit proprietary, new tech.

    That and FreeSync (I prefer the latter) should be encouraged as a means to kill V-Sync. V-Sync is a very resource-heavy GPU operation, and if I could be rid of it I'd be all for it, as V-Sync necessarily bastardizes my framerate to eliminate eye-twitch-inducing screen tearing.

    It's new and looks like an extra hardware cost now, but in 5 years I expect variable monitor refresh rates to be the norm. Yes, let's offload that operation to the monitor, for serious.
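
    To put rough numbers on the "bastardizes my framerate" bit, here's a quick back-of-the-envelope sketch (my own illustration only, assuming a 60Hz panel and plain double-buffered V-Sync, not any particular driver's frame pacing):

    import math

    REFRESH_HZ = 60.0
    refresh_interval = 1000.0 / REFRESH_HZ  # ~16.7 ms between scan-outs

    for render_ms in (10.0, 17.0, 25.0):
        # V-Sync: a finished frame waits for the next scan-out, so the displayed
        # interval is rounded up to a whole multiple of the refresh interval.
        vsync_ms = math.ceil(render_ms / refresh_interval) * refresh_interval
        # Adaptive sync: the monitor refreshes when the frame is ready,
        # capped at the panel's maximum refresh rate.
        adaptive_fps = min(1000.0 / render_ms, REFRESH_HZ)
        print(f"render {render_ms:4.1f} ms -> v-sync {1000.0 / vsync_ms:4.1f} fps, "
              f"adaptive {adaptive_fps:4.1f} fps")

    A 17ms frame gets knocked all the way down to 30fps with V-Sync, while an adaptive panel would just show it at roughly 59fps - that's the stutter being talked about.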


  • Registered Users, Registered Users 2 Posts: 86,729 ✭✭✭✭Overheal


    To underline the pointlessness of proprietary: the Nvidia Shield, the device that let you stream your games "from GTX 650 or above"? Now the concept is openly available to try on any existing hardware via Steam.


  • Registered Users, Registered Users 2 Posts: 5,325 ✭✭✭smileyj1987


    I see Alienware have decided to release their Steam Machine this year (the Alienware Alpha). They are bundling an Xbox 360 joypad with it. It will run a copy of Windows with a custom GUI so you can use the joypad with Windows 8.

    The machine is priced at 549 dollars for the base model. Everything other than the GPU is upgradable in the machine.


  • Registered Users, Registered Users 2 Posts: 8,405 ✭✭✭gizmo


    I see Alienware have decided to release their Steam Machine this year (the Alienware Alpha). They are bundling an Xbox 360 joypad with it. It will run a copy of Windows with a custom GUI so you can use the joypad with Windows 8.

    The machine is priced at 549 dollars for the base model. Everything other than the GPU is upgradable in the machine.
    Interesting. Looks like a smaller version of the X51 with a lower price tag to boot. Not that bad when compared to a similarly spec'd Intel NUC and Gigabyte Brix given that it comes with RAM, hard drive, better connection options and a GPU of some description.


  • Registered Users, Registered Users 2 Posts: 11 musohead


    I too am considering using a 42-inch TV as a gaming/studio monitor. I was looking at passive 3D, but most budget-end TVs don't say what standards they support. I'm using an AMD 7970. Will it just work, or is there something I need to look out for? Thanks for any help.
    wes wrote: »
    I sit between 4 and 6 feet away depending on where I put my chair. Use the TV for console and PC gaming.


  • Moderators, Social & Fun Moderators Posts: 28,633 Mod ✭✭✭✭Shiminay


    Having used a TV for my gaming for a while, I'm now back at a multi-monitor setup because it's far better and more flexible. Get a comfy office chair, it'll serve you better than slouching on a couch anyway :p


  • Registered Users, Registered Users 2 Posts: 11,749 ✭✭✭✭wes


    musohead wrote: »
    I too am considering using a 42-inch TV as a gaming/studio monitor. I was looking at passive 3D, but most budget-end TVs don't say what standards they support. I'm using an AMD 7970. Will it just work, or is there something I need to look out for? Thanks for any help.

    The thing you need to look out for with TVs is the input lag. This is a good site for information on that:

    http://www.hdtvtest.co.uk/news/input-lag

    Most TVs will have a PC or Game mode, so check for that as well. Also, as long as your card has HDMI you should be fine. I just plugged mine into my TV, set it to the right mode and it worked fine.


  • Registered Users, Registered Users 2 Posts: 86,729 ✭✭✭✭Overheal


    wes wrote: »
    The thing you need to look out for with TVs is the input lag. This is a good site for information on that:

    http://www.hdtvtest.co.uk/news/input-lag

    Haha, the first time we hooked the Wii up to the big screen and played the four-player Mario - fughettabout it. We thought it was the most poorly tuned game ever. Then a week later we figured it out.


  • Closed Accounts Posts: 537 ✭✭✭Yeti Beast


    First review (that I've seen anyway) of the ROG Swift, over on KitGuru.
    We have been inundated with a slew of impressive Ultra HD 4K monitors in recent months and while the new ASUS PB287Q is a great 4K screen at a bargain price point, I would take the ROG Swift PG278Q any day of the week – even though it ‘only’ has a 1440p resolution.
    In closing, we have no hesitation in giving the Asus ROG Swift PG278Q 144hz G-Sync Monitor our MUST HAVE award. Gamers who require the smoothest frame rates and have yet to jump into the Ultra HD 4K sector will find this screen will fulfill most, if not all of their desires.


  • Registered Users, Registered Users 2 Posts: 22,924 ✭✭✭✭ShadowHearth


    Only $799 in America, but £720 in the UK (about $1,235).

    For that price, plus 2x 780 Ti, on top of being tied down to ONLY Nvidia GPUs... nope.


  • Closed Accounts Posts: 537 ✭✭✭Yeti Beast


    Overpriced over here for sure, but the review of the tech itself is promising and it's the first monitor of its kind - the prices will come down.


  • Registered Users, Registered Users 2 Posts: 22,924 ✭✭✭✭ShadowHearth


    Yeti Beast wrote: »
    Overpriced over here for sure, but the review of the tech itself is promising and it's the first monitor of its kind - the prices will come down.

    While it's still Nvidia-only - useless tech.


  • Closed Accounts Posts: 537 ✭✭✭Yeti Beast


    While it's still Nvidia-only - useless tech.

    This monitor review shows that adaptive sync works and is a boon to gamers. I'm also sure Nvidia owners won't think it's "useless".


  • Registered Users, Registered Users 2 Posts: 22,924 ✭✭✭✭ShadowHearth


    Yeti Beast wrote: »
    This monitor review shows that adaptive sync works and is a boon to gamers. I'm also sure Nvidia owners won't think it's "useless".

    Maybe a boon of some sort, but it is not cost-efficient and is limited to one GPU manufacturer. I would rather spend 720 pounds on a Catleap, which overclocks like a boss, plus 2x 290s or 2x 770s.

    At that price that thing is useless. Limited to one GPU line - useless again.
    I will never limit myself to one GPU family. I buy the best bang-for-your-money parts. That monitor is like an expensive marriage.


  • Closed Accounts Posts: 537 ✭✭✭Yeti Beast


    Maybe a boon of some sort, but it is not cost-efficient and is limited to one GPU manufacturer. I would rather spend 720 pounds on a Catleap, which overclocks like a boss, plus 2x 290s or 2x 770s.

    At that price that thing is useless. Limited to one GPU line - useless again.
    I will never limit myself to one GPU family. I buy the best bang-for-your-money parts. That monitor is like an expensive marriage.

    Adaptive sync is the general term for the technology and can be applied to both AMD's solution and Nvidia's solution. This review shows that it (the tech) could be as good as many gamers were hoping. Yet again, you're arguing against a point that nobody is making.


  • Registered Users, Registered Users 2 Posts: 22,924 ✭✭✭✭ShadowHearth


    Yeti Beast wrote: »
    Adaptive sync is the general term for the technology and can be applied to both AMD's solution and Nvidia's solution. This review shows that it (the tech) could be as good as many gamers were hoping. Yet again, you're arguing against a point that nobody is making.

    It can be applied, but it only works for Nvidia and needs hardware in the monitor and an Nvidia GPU. AMD is working on a software solution instead. A lot of 'could be's in there.
    I am not arguing, I am giving my opinion... If you like it and have 720 pounds, then more power to you. I personally don't have jizz in my pants over it. It needs to do a lot more to impress me.


  • Closed Accounts Posts: 537 ✭✭✭Yeti Beast


    It can be applied, but it only works for Nvidia and needs hardware in the monitor and an Nvidia GPU. AMD is working on a software solution instead. A lot of 'could be's in there.
    I am not arguing, I am giving my opinion... If you like it and have 720 pounds, then more power to you. I personally don't have jizz in my pants over it. It needs to do a lot more to impress me.

    No, adaptive sync can mean either Nvidia's G-Sync or AMD's Free-Sync. They implement it in different ways (one via hardware, one via software), but they're both producing adaptive sync. I don't know how I can explain that to you any more clearly.

    And I never said anything about wanting this particular monitor. I actually agreed that it's way overpriced.


  • Registered Users, Registered Users 2 Posts: 22,924 ✭✭✭✭ShadowHearth


    Yeti Beast wrote: »
    No, adaptive sync can mean either Nvidia's G-Sync or AMD's Free-Sync. They implement it in different ways (one via hardware, one via software), but they're both producing adaptive sync. I don't know how I can explain that to you any more clearly.

    And I never said anything about wanting this particular monitor. I actually agreed that it's way overpriced.

    Right, adaptive sync is the actual solution, but there are two different ways of getting it. And I don't know how to explain it to you any more. Nvidia's way of getting adaptive sync is by putting hardware into monitors and limiting you to Nvidia GPUs, on top of that huge price tag. If you think that Nvidia's technology will help out the other camp and make this (the Nvidia way) a benefit to all gamers, then you are very naive.
    To make it even simpler: adaptive sync is a good technology, but the way Nvidia is doing it is wrong, stupid and not cost-efficient. That monitor is stupid in the form it is in now. When there is a monitor with adaptive sync that is not limited to one GPU or another, then you've got my attention.


  • Registered Users, Registered Users 2 Posts: 5,597 ✭✭✭EoinHef


    With a bit of luck, monitors in a few years will all have adaptive sync via the new DisplayPort standard (1.2a), which will hopefully become commonplace.

    VESA and AMD are making progress and have established the open standard; it's just a matter of time until it starts to trickle through to the consumer. It's already at the working phase (in the link below AMD are demoing it), it just needs to be adopted.

    http://hexus.net/tech/news/graphics/71809-amd-publishes-video-showing-freesync-action/


    Personally I'd prefer to wait till then rather than use Nvidia's proprietary tech; I'd be quite annoyed to pay a fortune for a G-Sync monitor now, only for adaptive sync to be standard on monitors in a few years using DisplayPort.

    The tech looks great though, I'd love to get my hands on one for a bit of testing :(


  • Closed Accounts Posts: 537 ✭✭✭Yeti Beast


    EoinHef wrote: »
    With a bit of luck, monitors in a few years will all have adaptive sync via the new DisplayPort standard (1.2a), which will hopefully become commonplace.

    VESA and AMD are making progress and have established the open standard; it's just a matter of time until it starts to trickle through to the consumer. It's already at the working phase (in the link below AMD are demoing it), it just needs to be adopted.

    http://hexus.net/tech/news/graphics/71809-amd-publishes-video-showing-freesync-action/


    Personally I'd prefer to wait till then rather than use Nvidia's proprietary tech; I'd be quite annoyed to pay a fortune for a G-Sync monitor now, only for adaptive sync to be standard on monitors in a few years using DisplayPort.

    The tech looks great though, I'd love to get my hands on one for a bit of testing :(

    Yeah, hopefully there'll be no significant input lag from using software as opposed to hardware, but after being in the doldrums for so long, it's nice to see actual advancements in monitor tech. With a bit of luck we'll see OLED (and then QLED) monitors becoming more mainstream in the next couple of years too.


  • Registered Users, Registered Users 2 Posts: 22,924 ✭✭✭✭ShadowHearth


    EoinHef wrote: »
    With a bit of luck, monitors in a few years will all have adaptive sync via the new DisplayPort standard (1.2a), which will hopefully become commonplace.

    VESA and AMD are making progress and have established the open standard; it's just a matter of time until it starts to trickle through to the consumer. It's already at the working phase (in the link below AMD are demoing it), it just needs to be adopted.

    http://hexus.net/tech/news/graphics/71809-amd-publishes-video-showing-freesync-action/


    Personally I'd prefer to wait till then rather than use Nvidia's proprietary tech; I'd be quite annoyed to pay a fortune for a G-Sync monitor now, only for adaptive sync to be standard on monitors in a few years using DisplayPort.

    The tech looks great though, I'd love to get my hands on one for a bit of testing :(

    That's the point. No one in their right mind should invest in G-Sync, unless you've won the lotto. Proprietary stuff just does not work in this day and age unless it's super high-end professional equipment.
    AMD are working on a software solution, which will most likely become the standard. This makes G-Sync look even worse.


  • Registered Users, Registered Users 2 Posts: 5,597 ✭✭✭EoinHef


    The diversity in monitors is great now - better panels, higher refresh rates, and every resolution and aspect ratio seems to be catered for as well.

    I'm not a fan of proprietary tech either; limiting choice for the consumer is a bad thing imo.

    But for now G-Sync is the only show in town. I'd really love to see a demo in person; it's not something that can really be seen in a video imo.


  • Closed Accounts Posts: 537 ✭✭✭Yeti Beast


    EoinHef wrote: »
    The diversity in monitors is great now - better panels, higher refresh rates, and every resolution and aspect ratio seems to be catered for as well.

    I'm not a fan of proprietary tech either; limiting choice for the consumer is a bad thing imo.

    But for now G-Sync is the only show in town. I'd really love to see a demo in person; it's not something that can really be seen in a video imo.

    Still waiting for my perfect monitor though. I want the high refresh rates and low input lag of a good TN panel but without the dreary colours, and blacks that are really just grey and navy. Add in 21:9 and variable refresh rate technology and I'd be happy. Should only be another five years and cost a few grand. :pac:


  • Registered Users, Registered Users 2 Posts: 22,924 ✭✭✭✭ShadowHearth


    I would be happy with the LG Flatron 34UM95-P, or another panel with the same resolution and size, for now, but at half the price. £800 is a bit steep. :pac:


  • Registered Users, Registered Users 2 Posts: 41 pd40


    Hi. I'm not sure if I'm on the right thread, but I'm looking for some advice. My 10-year-old son wants to get a gaming PC. (He has an Xbox, but his sisters take it over too often for his liking!) I don't know where to start with this. His budget is about 500 euro (is this too optimistic?)... he loves the 'simulator' games (trucks, farm machinery, buses etc.)... can anyone advise?
    Thanks.


  • Closed Accounts Posts: 477 ✭✭McSasquatch II


    You might be better off posting over in the PC Building and Upgrading forum. :)

    www.boards.ie/vbulletin/forumdisplay.php?f=842


  • Registered Users, Registered Users 2 Posts: 1,750 ✭✭✭john the one


    So my R9 270X card isn't really wowing me at the moment. I'm getting 20-25 fps in DayZ in towns and maybe 45 outside towns. Would I be better off getting another 270X and CrossFiring, or upgrading the card?

    The processor is an FX-6300, which I would like to upgrade in the next 6 months.


  • Registered Users, Registered Users 2 Posts: 41 pd40


    Thanks oak76


  • Registered Users, Registered Users 2 Posts: 5,597 ✭✭✭EoinHef


    So my R9 270X card isn't really wowing me at the moment. I'm getting 20-25 fps in DayZ in towns and maybe 45 outside towns. Would I be better off getting another 270X and CrossFiring, or upgrading the card?

    The processor is an FX-6300, which I would like to upgrade in the next 6 months.

    Arma/DayZ is very resource-intensive, and in towns the frame rate chugs on most systems!! Are you playing the mod or the standalone? What sort of settings and resolution have you got it at? If you can manipulate it to get a constant 30+ in towns you should be OK and it won't be too noticeable. Any lower than 30fps and you will really notice.

    I've found using the game's built-in graphics presets to be terrible; in my experience you're better off setting it to the high/very high preset and then adjusting settings down in the advanced menu, although this is for the Arma 2 mod.

    If you're playing the standalone you will probably be waiting for better optimisation from Bohemia; Arma 3 was a resource hog in alpha but runs great now for me.

    Have you overclocked either the CPU or GPU?


  • Registered Users, Registered Users 2 Posts: 22,924 ✭✭✭✭ShadowHearth


    So my R9 270X card isn't really wowing me at the moment. I'm getting 20-25 fps in DayZ in towns and maybe 45 outside towns. Would I be better off getting another 270X and CrossFiring, or upgrading the card?

    The processor is an FX-6300, which I would like to upgrade in the next 6 months.

    DayZ is as optimised as a Sunday cupcake. I get horrible FPS on my GTX 680. I would not even recommend chasing FPS in DayZ, as it is just not cost-efficient. Just get the mod and wait for the standalone to stop sucking balls.


  • Registered Users, Registered Users 2 Posts: 1,750 ✭✭✭john the one


    EoinHef wrote: »
    Arma/DayZ is very resource-intensive, and in towns the frame rate chugs on most systems!! Are you playing the mod or the standalone? What sort of settings and resolution have you got it at? If you can manipulate it to get a constant 30+ in towns you should be OK and it won't be too noticeable. Any lower than 30fps and you will really notice.

    I've found using the game's built-in graphics presets to be terrible; in my experience you're better off setting it to the high/very high preset and then adjusting settings down in the advanced menu, although this is for the Arma 2 mod.

    If you're playing the standalone you will probably be waiting for better optimisation from Bohemia; Arma 3 was a resource hog in alpha but runs great now for me.

    Have you overclocked either the CPU or GPU?

    Playing the standalone, but might just go for the mod now. I have messed around with the settings a little but will do some more; thing is, I like shadows :(
    I haven't overclocked either just yet, and don't plan to just for the sake of an alpha game!
    ShadowHearth wrote: »
    DayZ is as optimised as a Sunday cupcake. I get horrible FPS on my GTX 680. I would not even recommend chasing FPS in DayZ, as it is just not cost-efficient. Just get the mod and wait for the standalone to stop sucking balls.

    This seems to be the only true solution! It's a bummer. I likes me games looking well.


  • Registered Users, Registered Users 2 Posts: 10,766 ✭✭✭✭degrassinoel


    I'm using a GTX 780 too, and DayZ is just a pig for resources... I actually gave up playing it because it never seems to get any better.


  • Registered Users, Registered Users 2 Posts: 5,597 ✭✭✭EoinHef


    As barebones as it is, Arma 3 Breaking Point is looking very good atm. I'd much rather play it than the standalone or the Arma 2 mod, which is undergoing a change with the jump to Steam servers and the GameSpy shutdown, so different launchers have version compatibility issues :(

    The standalone is just not at a point where Bohemia are concentrating on optimisation; they're concentrating on the mechanics and content, albeit slowly so far, so performance in alpha is just not going to be good. Arma 3 was the same in alpha; it runs great now. Have to be realistic: games and mods of this scope are huge. It's a truly online multiplayer simulation like no other I've played, so things like the demands on servers must be huge, and optimisation for all PC configs must be a huge job for the devs as well, with constantly updating drivers etc.

    TBH I am a bit of a "fanboi" of the Arma series and its various mods/missions, and I also think Bohemia have done a very good job so far handling the multiple titles they now have on the go for a relatively small studio, although they are definitely growing. I really do have a lot of goodwill/faith in them.

    It almost reminds me of Minecraft, except Bohemia have developed a sandbox which has potential on a much grander scale, and the best thing about it is that they let the community mod it (hopefully not hackers though :mad:) :)


  • Registered Users, Registered Users 2 Posts: 5,597 ✭✭✭EoinHef


    Anybody use a USB extension lead with a wired 360 controller?

    The standard lead doesn't quite go far enough; an extra metre or so would go a long way! Hoping a female-to-male USB will do the job.

