
Off Topic Thread V2.0


Comments

  • Closed Accounts Posts: 3,006 ✭✭✭_Tombstone_


    Argos are great as well but scum are abusing it like you wouldn't believe.


  • Registered Users, Registered Users 2 Posts: 2,928 ✭✭✭VenomIreland


    _Tombstone_ wrote: »
    Argos are great as well but scum are abusing it like you wouldn't believe.

    In what way?


  • Closed Accounts Posts: 5,824 ✭✭✭RoyalMarine


    So.. Planning a big upgrade next year.
    Looking at Pascal GPUs.

    Any new plans for CPUs for mid next year?
    Will PCIe-based SSDs be the way to go?


  • Registered Users, Registered Users 2 Posts: 11,397 ✭✭✭✭Digital Solitude


    RoyalMarine wrote: »
    So.. Planning a big upgrade next year.
    Looking at Pascal GPUs.

    Any new plans for CPUs for mid next year?
    Will PCIe-based SSDs be the way to go?

    No point upgrading your CPU anyways


  • Closed Accounts Posts: 5,824 ✭✭✭RoyalMarine


    how come?


  • Closed Accounts Posts: 6,934 ✭✭✭MarkAnthony


    RoyalMarine wrote: »
    how come?

    There's been very little real-world improvement since the 2XXX series. PCIe SSDs are all well and good, but again no real-world improvement.


  • Registered Users, Registered Users 2 Posts: 4,698 ✭✭✭Gumbi


    MarkAnthony wrote: »
    There's been very little real-world improvement since the 2XXX series. PCIe SSDs are all well and good, but again no real-world improvement.

    The increments add up though: at this stage there is a pretty big difference between Sandy Bridge and Skylake.

    Plus if you do encoding and stuff x264 is even faster again.


  • Closed Accounts Posts: 29,930 ✭✭✭✭TerrorFirmer


    From a gaming point of view though, there's still almost no reason to go from Sandy Bridge to Skylake, unless you're rocking a GTX 980 Ti/Titan X type setup, and even in some cases then it doesn't really make much difference.

    At the moment I'm playing Fallout 4 at 1440p with my 980 Ti Hybrid (which has a hefty overclock) and the card is the bottleneck.


  • Registered Users, Registered Users 2 Posts: 23,141 ✭✭✭✭TheDoc


    A monitor upgrade is probably where I'll be looking next year if a bonus comes in (normally around March/April).

    I've never had actual hands-on time with the new types. Be it 1440p or ones with 140mhz (I don't even know if what I'm saying are the correct terms).

    Not sure how sold I am on getting a much larger monitor, since I'm happy with the size of my two current ones. My main PC game is WoW, and I need to accept that really: while I play other games, WoW is the main one. That game I have capped at 60fps, even though I could go as high as 100+ stable if I wanted. But I can't really tell any difference.

    But this craic of 100+mhz monitors that seem to make a big difference, I'd say I'd be really into that. I personally rank smoothness > graphical detail.


  • Closed Accounts Posts: 6,934 ✭✭✭MarkAnthony


    It's actually Hz for monitors :) Thankfully we only need 60-144 cycles a second to trick our eyes rather than 100,000,000 (100MHz), otherwise the world would move very slowly indeed! :pac:

    I recently got an ultrawide, and I have to say I absolutely love it. The immersion I get from it filling my FoV is almost as good as the Oculus. I went from 2560x1440 to 3440x1440 and I can't overstate the difference. That said, for me it's all about resolution; FPS doesn't actually matter much to me, I'm not that sensitive to it. In fact I've had this monitor overclocked to 75Hz, but due to issues with the cable (I hope) I'm on 50Hz HDMI at the moment. Can't tell the difference*!

    *Best example is Elite Dangerous, where I'm able to max out the FPS vs refresh. Can't tell the difference between 50FPS and 75FPS even when there's loads of action on screen. Pew pew pew!!!


  • Registered Users, Registered Users 2 Posts: 23,141 ✭✭✭✭TheDoc


    I only bought a new GPU this year, a 290X, so I wouldn't want to be going down the route of upgrading that again. I want to get a few years from it, and while I dip into new releases here and there, nothing I play regularly is taxing in terms of specs.

    I have two BenQ GL2450HM 24" Full HD LED monitors. Really love them, think the colour is great and all that jazz. None of my immediate friends really play PC games, so I'm a bit at a loss in terms of visibly experiencing the new types of monitors first hand.

    Considering how close I sit to the monitors, and WoW being my main game, I'd fear that anything bigger might be a negative experience. For a while I played through a 32" 720p TV that was, like, a foot away from my face. That was excruciating looking back.


    *Granted I have my current monitors wall mounted, so gained a good bit of viewing distance.

    Like, are they exclusive of each other, or do they actually come hand in hand? This 1440p and 144hz stuff, or whatever it's called. My monitors I think are 60hz, so I guess going to something higher in terms of hz would be a massive visual improvement.

    (I'm not overly confident on the tie-ins between monitor hz and game FPS; I thought I knew, but read something recently that debunked my understanding.) I have my games locked at 60fps, on the understanding that with a 60hz monitor that works nicely hand in hand. Is that understanding right?


  • Closed Accounts Posts: 6,934 ✭✭✭MarkAnthony


    There are issues such as tearing, but the only real reason to go for a higher refresh is higher frame rates.

    If you've a 50Hz monitor like I do (at the mo) you only ever see 50FPS; the GPU might be drawing 100FPS but you only see 50, with possible visual issues like tearing. I presume these 144Hz guys are into shooters and are more sensitive to lag etc. Apparently the reason they can't just lock to 60FPS is V-Sync causes lag. I dunno, it doesn't affect me.

    To be honest, if you're not noticing an issue on a 60Hz monitor I doubt you'll gain much from a higher refresh/FPS, especially in an MMO.
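The cap described here, that the monitor limits what you actually see regardless of what the GPU draws, can be sketched in a couple of lines (hypothetical Python; `visible_fps` is my own name, not anything from the thread):

```python
def visible_fps(rendered_fps: float, refresh_hz: float) -> float:
    """FPS you can actually perceive on screen.

    The GPU may render faster than the panel refreshes, but the panel
    only updates refresh_hz times a second; the extra frames are either
    dropped or shown as partial frames (tearing).
    """
    return min(rendered_fps, refresh_hz)

print(visible_fps(100, 50))   # 50Hz monitor, GPU drawing 100FPS -> you see 50
print(visible_fps(60, 144))   # 144Hz monitor, 60FPS game -> still only 60
```

Adaptive-sync tech (G-Sync/FreeSync, mentioned later in the thread) exists precisely to soften this mismatch.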


  • Registered Users, Registered Users 2 Posts: 878 ✭✭✭Luck100


    I thought 60 Hz v-sync was fine until I tried 144 Hz. No going back now!


  • Registered Users, Registered Users 2 Posts: 23,141 ✭✭✭✭TheDoc


    MarkAnthony wrote: »
    There are issues such as tearing, but the only real reason to go for a higher refresh is higher frame rates.

    If you've a 50Hz monitor like I do (at the mo) you only ever see 50FPS; the GPU might be drawing 100FPS but you only see 50, with possible visual issues like tearing. I presume these 144Hz guys are into shooters and are more sensitive to lag etc. Apparently the reason they can't just lock to 60FPS is V-Sync causes lag. I dunno, it doesn't affect me.

    To be honest, if you're not noticing an issue on a 60Hz monitor I doubt you'll gain much from a higher refresh/FPS, especially in an MMO.

    This is one of the things I believed, that FPS was tied to the Hz of the monitor, until I read a few bits that debunked it. I know the PCMasterRace subreddit took massive exception to me asking if that was the case.

    Plenty there felt the need to explain to me how the Hz of a monitor is not directly tied to the FPS that a game outputs.

    It's all so confusing who and what to believe nowadays :/


  • Closed Accounts Posts: 6,934 ✭✭✭MarkAnthony


    TheDoc wrote: »
    This is one of the things I believed, that FPS was tied to the Hz of the monitor, until I read a few bits that debunked it. I know the PCMasterRace subreddit took massive exception to me asking if that was the case.

    Plenty there felt the need to explain to me how the Hz of a monitor is not directly tied to the FPS that a game outputs.

    It's all so confusing who and what to believe nowadays :/

    FPS isn't tied to the monitor but what you see is. Perhaps there is some hidden advantage to having higher FPS than cycles per second on the monitor but I couldn't imagine what it would be.


  • Registered Users, Registered Users 2 Posts: 4,698 ✭✭✭Gumbi


    Think of it this way. Say you are rendering 60FPS on a 60Hz monitor. The key here is, you are rendering 60 frames per second on average.

    Take a second of frames. You render 60. That's a frame every ~16.67ms on average. But of course, it never happens like that. One frame could take 10ms to render (equivalent to 100FPS if sustained for a whole second), but the next frame could take 30ms (equivalent to ~33.3FPS if sustained), and so you get a small stutter in the gameplay.

    That's why having a higher FPS than the monitor's refresh rate can help.
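The point above, that the same average FPS can hide very different frame pacing, can be checked with some made-up per-frame times (a sketch; the numbers and function names are hypothetical):

```python
# Two one-second captures of per-frame render times, in milliseconds.
# Both contain 60 frames, so both report "60FPS" as an average.
steady = [1000 / 60] * 60                 # every frame ~16.67ms
uneven = [10.0, 1000 / 30 - 10.0] * 30    # alternating ~10ms / ~23.3ms frames

def average_fps(frame_times_ms):
    # The headline number: frames divided by total capture time.
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)

def slowest_frame_fps(frame_times_ms):
    # FPS-equivalent of the single worst frame: the hitch you actually feel.
    return 1000 / max(frame_times_ms)

print(average_fps(steady), average_fps(uneven))   # both ~60.0
print(slowest_frame_fps(steady))                  # ~60.0
print(slowest_frame_fps(uneven))                  # ~42.9 - perceived as stutter
```

Both captures claim "60FPS", but only one of them would look smooth.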


  • Moderators, Technology & Internet Moderators Posts: 18,382 Mod ✭✭✭✭Solitaire


    Multi-card configurations with dodgy p!ssing-contest drivers and shockingly awful "next-gen" console ports exposed the horrors of Frame Time Variance to the public. It's a thing, it doesn't just affect bad ports or SLI/CF, and it's usually what our eyes perceive as graphical stuttering. Consistent 60Hz + 60FPS + low-variance frame timing is more than enough for the average human being to consider "silky smooth", and I've never really understood the allure of 144Hz monitors given the limitations of both graphics hardware and human sight. If you're having issues with 60Hz@60FPS content, it's bad variance causing the frames to bunch up, making certain fractions of a second effectively run at 20-30FPS or worse, and not a need to upgrade to 144Hz "goodness" you cannot even perceive :p

    The good news is that nVidia and AMD's awful behaviour over the whole issue, in their quest for misleading FPS "gains", caught them an avalanche of salt from the gaming and tech press over the last couple of years, and they've been forced via the wonderful medium of humiliation to mend their ways since ^_^


  • Registered Users, Registered Users 2 Posts: 4,698 ✭✭✭Gumbi


    Solitaire wrote: »
    Multi-card configurations with dodgy p!ssing-contest drivers and shockingly awful "next-gen" console ports exposed the horrors of Frame Time Variance to the public. It's a thing, it doesn't just affect bad ports or SLI/CF, and it's usually what our eyes perceive as graphical stuttering. Consistent 60Hz + 60FPS + low-variance frame timing is more than enough for the average human being to consider "silky smooth", and I've never really understood the allure of 144Hz monitors given the limitations of both graphics hardware and human sight. If you're having issues with 60Hz@60FPS content, it's bad variance causing the frames to bunch up, making certain fractions of a second effectively run at 20-30FPS or worse, and not a need to upgrade to 144Hz "goodness" you cannot even perceive :p

    The good news is that nVidia and AMD's awful behaviour over the whole issue, in their quest for misleading FPS "gains", caught them an avalanche of salt from the gaming and tech press over the last couple of years, and they've been forced via the wonderful medium of humiliation to mend their ways since ^_^

    Indeed, some games have truly woeful frame time variance.

    By the way, I've a friend who insists he can tell the difference between even 120 and 144Hz; he's quite tech literate and understands what he's talking about, so I dunno, maybe there is something to it.

    I definitely agree though, silky smooth 60FPS (that is, 60FPS with next to no frame time variance) will satisfy almost anyone.


  • Registered Users, Registered Users 2 Posts: 23,141 ✭✭✭✭TheDoc


    Gumbi wrote: »
    Indeed, some games have truly woeful frame time variance.

    By the way, I've a friend who insists he can tell the difference between even 120 and 144Hz; he's quite tech literate and understands what he's talking about, so I dunno, maybe there is something to it.

    I definitely agree though, silky smooth 60FPS (that is, 60FPS with next to no frame time variance) will satisfy almost anyone.

    Well, I guess the next question is how I rule out this frame time variance (which I've never heard of before).

    Like, if I'm honest, watching YouTube 720p 60fps videos looks "smoother" than my WoW, which tells me it's a constant 60fps.


  • Closed Accounts Posts: 6,934 ✭✭✭MarkAnthony


    Tried turning off V-Sync in WoW? WoW isn't graphically intensive, so you should see 100FPS+; see if you notice any difference. Also maybe try other games.

    The thing with 60FPS films is there are other things going on. I've always wondered why we didn't move to higher FPS for movies rather than ever-higher resolutions. Certain *ahem* movies I've seen in 60FPS look more natural.


  • Moderators, Technology & Internet Moderators Posts: 18,382 Mod ✭✭✭✭Solitaire


    TheDoc wrote: »
    Well, I guess the next question is how I rule out this frame time variance (which I've never heard of before).

    Like, if I'm honest, watching YouTube 720p 60fps videos looks "smoother" than my WoW, which tells me it's a constant 60fps.

    Alas, it's not a constant 60FPS, it's an average 60FPS. Dig down below the one-second marker and the FPS changes a lot more often, occasionally dipping low and causing a stutter. Devs and card makers live in a world where FPS is measured by the second, and will lie, cheat and steal to maximize that number regardless of actual output smoothness. AMD and nVidia have been forced to mend their ways, but devs are still cutting corners where they shouldn't, and a certain MMO known for optimization nearly as woeful as modded Minecraft is a particular problem child in this respect...


  • Closed Accounts Posts: 789 ✭✭✭Fakman87


    MarkAnthony wrote: »
    Tried turning off V-Sync in WoW? WoW isn't graphically intensive, so you should see 100FPS+; see if you notice any difference. Also maybe try other games.

    The thing with 60FPS films is there are other things going on. I've always wondered why we didn't move to higher FPS for movies rather than ever-higher resolutions. Certain *ahem* movies I've seen in 60FPS look more natural.

    No chance man, films and TV shows look absolutely ghastly at 60fps!


  • Closed Accounts Posts: 6,934 ✭✭✭MarkAnthony


    Fakman87 wrote: »
    No chance man, films and TV shows look absolutely ghastly at 60fps!

    Watch different films ;)

    There have been some rumblings about eventually making the move. Obviously there's an issue with just taking existing stuff and converting it to 60FPS.


  • Registered Users, Registered Users 2 Posts: 23,141 ✭✭✭✭TheDoc


    MarkAnthony wrote: »
    Tried turning off V-Sync in WoW? WoW isn't graphically intensive, so you should see 100FPS+; see if you notice any difference. Also maybe try other games.

    The thing with 60FPS films is there are other things going on. I've always wondered why we didn't move to higher FPS for movies rather than ever-higher resolutions. Certain *ahem* movies I've seen in 60FPS look more natural.

    I've had V-Sync off for as long as I can remember in games, but recently turned it back on. Don't know why, now that you mention it.

    In a raid at the moment, will turn it off and see if things look steadier. I was watching it throughout a boss fight there, 60fps pretty much all the time.


  • Registered Users, Registered Users 2 Posts: 4,698 ✭✭✭Gumbi


    Solitaire wrote: »
    Alas, its not a constant 60FPS, its an average 60FPS. Dig down below the second marker and the average FPS will change a lot more often, and occasionally dip low causing a stutter. Devs and card makers live in a world where FPS is measured by the second, and will lie, cheat and steal to maximize that number regardless of actual output smoothness. AMD and nVidia have been forced to mend their ways but devs are still cutting corners where they shouldn't and a certain MMO known for optimization nearly as woeful as modded Minecraft is a particular problem child in this respect...

    Bingo.

    As I explained above, TheDoc, 60FPS just means your GPU rendered 60 frames in one second. Theoretically this can mean nothing is rendered for the first 500ms, then all 60 frames are rendered in the final 500ms (obviously an extreme example, but you get my point), and that makes for a stuttery experience.

    A smooth 60FPS will have the frames evenly spread, roughly 16.67ms apart (because 60 × 16.67ms ≈ 1000ms, i.e. one second).

    It's quite possible to have a smoother 45FPS than 60FPS if the frames are more evenly spread.
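The tech press captures exactly this with "1% low" figures rather than average FPS. A rough sketch of that metric (my own simplified version, not an official definition; the example captures are made up):

```python
def one_percent_low(frame_times_ms):
    """Average FPS across the slowest 1% of frames in a capture."""
    slowest_first = sorted(frame_times_ms, reverse=True)
    n = max(1, len(slowest_first) // 100)   # at least one frame
    worst = slowest_first[:n]
    return 1000 / (sum(worst) / len(worst))

# An even 45FPS: every frame ~22.2ms apart.
smooth_45 = [1000 / 45] * 45
# A "60FPS" second where 6 frames are huge hitches (~121.7ms each).
stutter_60 = [5.0] * 54 + [(1000 - 270) / 6] * 6

print(one_percent_low(smooth_45))    # ~45 - no frame worse than the average
print(one_percent_low(stutter_60))   # ~8  - the hitches dominate what you feel
```

By this measure the "slower" 45FPS capture comes out far smoother than the stuttery 60FPS one, which is the point being made above.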


  • Closed Accounts Posts: 789 ✭✭✭Fakman87


    Watch different films ;)

    There has been some rumblings about eventually making the move. Obviously there is an issue in just existing stuff and dumping it to 60FPS.

    My dad had that "motion plus" feature turned on on his new TV and I hated it so much. I find it good for sports, but it gives films a really cheap soap-opera look and the fluidity of the movement feels really weird to me.


  • Closed Accounts Posts: 6,934 ✭✭✭MarkAnthony


    Fakman87 wrote: »
    My dad had that "motion plus" feature turned on on his new TV and I hated it so much. I find it good for sports, but it gives films a really cheap soap-opera look and the fluidity of the movement feels really weird to me.

    That's not 60FPS, that's some TV feature. You need to look at movies specifically rendered at 60FPS.


  • Registered Users, Registered Users 2 Posts: 7,181 ✭✭✭Serephucus


    To chime in here:

    I agree with most of what's been said, except for two main points.

    1) 60FPS is adequate. I can't say I agree with this. I've used 144Hz monitors (albeit briefly), and they're definitely smoother. Certainly not the same jump as going from 30 to 60, but that could simply be a product of my not getting enough time with the monitor to get used to it.

    2) Movies look better at 24FPS. Wrong. After having used SVP (Smooth Video Project) for over a year, I'm never going back. Yes, it's not perfect. Yes, there are artefacts. Yes, it's relatively CPU-heavy compared to normal video playback. I don't care. It's just so much better. Once you get used to it, 24FPS video is like playing games locked at 30FPS. It's just bad.


    Oh, and it goes without saying that G-Sync is great, and every monitor from now on should take its framerate from the GPU driving it.


  • Registered Users, Registered Users 2 Posts: 7,047 ✭✭✭Bazzo


    MarkAnthony wrote: »
    That's not 60FPS, that's some TV feature. You need to look at movies specifically rendered at 60FPS.

    Didn't they do this for the Hobbit movies? I remember thinking the first one looked a bit odd, and thinking it was down to whatever way they did the 3D, but finding out later it was probably due to the high-frame-rate filming.


  • Registered Users, Registered Users 2 Posts: 1,635 ✭✭✭Redfox25


    Got an XFX 390X from Amazon in August. I started hearing a lot of noise from my case in the last few days. At first I thought one of the case fans had a bearing starting to go, but once I tab out of a game the noise stops.
    Disconnected all the fans one by one and it's the GPU causing the issue.

    Contacted Amazon last night at 12; they replied within 3 hours to say that a replacement card was en route.
    Pretty amazing customer service.

    Just wondering if there's any difference between the 390Xs. The XFX one runs pretty hot, hitting the mid-80s and creeping up to 90 at peak.
    Does anyone know if any of the other cards run cooler? I have pretty good airflow in the case: a 200mm fan with no cages in front blowing air in, and three 140mm exhaust fans at the top/back.
    The Sapphire looks good but they don't seem to sell it atm; what is the Asus Strix like, or should I just stick with XFX?


This discussion has been closed.