
Off Topic Thread V2.0


Comments

  • Registered Users, Registered Users 2 Posts: 23,137 ✭✭✭✭TheDoc


    I only bought a new GPU this year, a 290X, so I wouldn't want to be going down the route of maybe upgrading that again. Want to get a few years from it, and while I dip into new releases here and there, nothing I play regularly is taxing in terms of specs.

    I have two BenQ GL2450HM 24" Full HD LED monitors. Really love them, think the colour is great and all that jazz. None of my immediate friends really play PC games, so I'm a bit at a loss in terms of experiencing the new types of monitors first hand.

    Considering how close I sit to the monitors, and WoW being my main game, I'd fear that anything bigger might be a negative experience. For a while I played through a 32" 720p TV that was, like, a foot away from my face. That was excruciating, looking back.


    *Granted I have my current monitors wall mounted, so gained a good bit of viewing distance.

    Like, are they exclusive of each other, or do they actually come hand in hand? This 1440p and 144Hz stuff, or whatever it's called. My monitors I think are 60Hz. So I guess going to something higher in terms of Hz would be a massive visual improvement.

    (I'm not overly confident on the tie-ins between monitor Hz and game FPS; I thought I knew, but read something recently that debunked my understanding.) I have my games locked at 60fps, on the understanding that with a 60Hz monitor that works nicely hand in hand. Is that understanding right?


  • Closed Accounts Posts: 6,934 ✭✭✭MarkAnthony


    There are issues such as tearing but the only real reason to go for a higher refresh is higher frame rates.

    If you've a 50Hz monitor like I do (at the mo) you only ever get 50FPS: the GPU might be drawing 100FPS, but you only see 50, with possible visual issues like tearing. I presume these 144Hz guys are into shooters and are more sensitive to lag etc. Apparently the reason they can't just lock to 60FPS is that V-Sync causes lag. I dunno, it doesn't affect me.

    To be honest, if you're not noticing an issue on a 60Hz monitor I doubt you'll gain much from a higher refresh/FPS, especially in an MMO.
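
    A minimal Python sketch of that point, with made-up numbers and an idealised display that simply shows the most recently completed frame at each refresh (no tearing modelled):

    ```python
    # Idealised model of a fixed-refresh display: at each refresh it shows the
    # most recently completed frame. Numbers are made up for illustration.

    GPU_FPS = 100       # frames the GPU finishes per second (hypothetical)
    REFRESH_HZ = 50     # display refresh rate

    # Completion time (ms) of each rendered frame within one second.
    render_done = [i * 1000 / GPU_FPS for i in range(GPU_FPS)]

    # Time (ms) of each display refresh within that second.
    refreshes = [i * 1000 / REFRESH_HZ for i in range(REFRESH_HZ)]

    shown = set()
    for t in refreshes:
        # Pick the newest frame that was ready by this refresh.
        ready = [f for f in render_done if f <= t]
        if ready:
            shown.add(max(ready))

    print(f"GPU rendered {len(render_done)} frames, display showed {len(shown)}")
    # -> the display can never show more than REFRESH_HZ distinct frames;
    #    the other rendered frames are simply never seen.
    ```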


  • Registered Users, Registered Users 2 Posts: 878 ✭✭✭Luck100


    I thought 60 Hz v-sync was fine until I tried 144 Hz. No going back now!


  • Registered Users, Registered Users 2 Posts: 23,137 ✭✭✭✭TheDoc


    MarkAnthony wrote: »
    There are issues such as tearing but the only real reason to go for a higher refresh is higher frame rates.

    If you've a 50Hz monitor like I do (at the mo) you only ever get 50FPS: the GPU might be drawing 100FPS, but you only see 50, with possible visual issues like tearing. I presume these 144Hz guys are into shooters and are more sensitive to lag etc. Apparently the reason they can't just lock to 60FPS is that V-Sync causes lag. I dunno, it doesn't affect me.

    To be honest, if you're not noticing an issue on a 60Hz monitor I doubt you'll gain much from a higher refresh/FPS, especially in an MMO.

    This is one of the things I believed, that FPS was tied to the Hz of the monitor, until I read a few bits that debunked it. I know the PCMasterRace subreddit took massive exception to me asking if that was the case.

    Plenty there felt the need to explain to me how the Hz of a monitor is not directly tied to the FPS that a game outputs.

    It's all so confusing knowing who and what to believe nowadays : /


  • Closed Accounts Posts: 6,934 ✭✭✭MarkAnthony


    TheDoc wrote: »
    This is one of the things I believed, that FPS was tied to the Hz of the monitor, until I read a few bits that debunked it. I know the PCMasterRace subreddit took massive exception to me asking if that was the case.

    Plenty there felt the need to explain to me how the Hz of a monitor is not directly tied to the FPS that a game outputs.

    It's all so confusing knowing who and what to believe nowadays : /

    FPS isn't tied to the monitor but what you see is. Perhaps there is some hidden advantage to having higher FPS than cycles per second on the monitor but I couldn't imagine what it would be.


  • Registered Users, Registered Users 2 Posts: 4,698 ✭✭✭Gumbi


    Think of it this way. Say you are rendering 60FPS on a 60Hz monitor. The key here is, you are rendering 60 frames per second on average.

    Take a second of frames. You render 60. That's a frame every ~16.67ms on average, but of course it never happens like that. One frame could take 10ms to render (equivalent to 100FPS if sustained for a whole second), but the next frame could take 30ms to render (equivalent to 33.3FPS if sustained), and so you notice a small stutter in the gameplay.

    That's why having a higher FPS than the monitor's refresh rate can help.
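
    As a rough illustration of that in Python (the frame times are invented; only the arithmetic matters):

    ```python
    # Two one-second frame-time traces (in ms) that both average ~60 FPS,
    # invented to show how the same number can feel very different.

    smooth = [1000 / 60] * 60              # every frame ~16.67 ms apart
    uneven = [10.0, 23.3] * 30             # frames alternating 10 ms / 23.3 ms

    for name, frame_times in (("smooth", smooth), ("uneven", uneven)):
        avg_fps = 1000 * len(frame_times) / sum(frame_times)
        worst_ms = max(frame_times)        # the single slowest frame
        print(f"{name}: avg {avg_fps:.1f} FPS, worst frame {worst_ms:.1f} ms "
              f"(~{1000 / worst_ms:.0f} FPS at that moment)")

    # Both read as "60 FPS" on a per-second counter, but the uneven trace keeps
    # dropping to ~43 FPS for individual frames, which is what you feel as stutter.
    ```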


  • Moderators, Technology & Internet Moderators Posts: 18,377 Mod ✭✭✭✭Solitaire


    Multi-card configurations with dodgy p!ssing-contest drivers and shockingly awful "next-gen" console ports exposed the horrors of Frame Time Variance to the public. It's a thing, it doesn't just affect bad ports or SLI/CF, and it's usually what our eyes perceive as graphical stuttering. Consistent 60Hz+60FPS+low-variance frame timing is more than enough for the average human being to consider "silky smooth", and I've never really understood the allure of 144Hz monitors given the limitations of both graphics hardware and human sight. If you're having issues with 60Hz@60FPS content, it's bad variance causing the frames to bunch up, making certain fractions of a second effectively run at 20-30FPS or worse, and not a need to upgrade to 144Hz "goodness" you cannot even perceive :p

    The good news is that nVidia and AMD's awful behaviour over the whole issue, in their quest for misleading FPS "gains", caught them an avalanche of salt from the gaming and tech press over the last couple of years, and they've been forced, via the wonderful medium of humiliation, to rectify their ways since then ^_^


  • Registered Users, Registered Users 2 Posts: 4,698 ✭✭✭Gumbi


    Solitaire wrote: »
    Multi-card configurations with dodgy p!ssing-contest drivers and shockingly awful "next-gen" console ports exposed the horrors of Frame Time Variance to the public. It's a thing, it doesn't just affect bad ports or SLI/CF, and it's usually what our eyes perceive as graphical stuttering. Consistent 60Hz+60FPS+low-variance frame timing is more than enough for the average human being to consider "silky smooth", and I've never really understood the allure of 144Hz monitors given the limitations of both graphics hardware and human sight. If you're having issues with 60Hz@60FPS content, it's bad variance causing the frames to bunch up, making certain fractions of a second effectively run at 20-30FPS or worse, and not a need to upgrade to 144Hz "goodness" you cannot even perceive :p

    The good news is that nVidia and AMD's awful behaviour over the whole issue, in their quest for misleading FPS "gains", caught them an avalanche of salt from the gaming and tech press over the last couple of years, and they've been forced, via the wonderful medium of humiliation, to rectify their ways since then ^_^

    Indeed, some games have truly woeful frame time variance.

    By the way, I've a friend who insists he can tell the difference between even 120 and 144Hz; he's quite tech literate too and understands what he's talking about, so I dunno, maybe there is something to it.

    I definitely agree though, silky smooth 60FPS (that is, 60FPS with next to no frame time variance) will satisfy most anyone.


  • Registered Users, Registered Users 2 Posts: 23,137 ✭✭✭✭TheDoc


    Gumbi wrote: »
    Indeed, some games have truly woeful frame time variance.

    By the way, I've a friend who insists he can tell the difference between even 120 and 144Hz; he's quite tech literate too and understands what he's talking about, so I dunno, maybe there is something to it.

    I definitely agree though, silky smooth 60FPS (that is, 60FPS with next to no frame time variance) will satisfy most anyone.

    Well, I guess the next question is how I rule out this frame time variance (which I've never heard of before).

    Like, if I'm honest, watching YouTube 720p 60fps videos looks "smoother" than my WoW, which tells me it's constant 60fps.


  • Closed Accounts Posts: 6,934 ✭✭✭MarkAnthony


    Tried turning off V-Sync in WoW? WoW isn't graphically intensive, so you should see 100FPS+; see if you notice any difference. Also maybe try other games.

    The thing with 60FPS films is there are other things going on. I've always wondered why we didn't move to higher FPS for movies rather than higher resolutions all the time. Certain *ahem* movies I've seen in 60FPS look more natural.


  • Moderators, Technology & Internet Moderators Posts: 18,377 Mod ✭✭✭✭Solitaire


    TheDoc wrote: »
    Well, I guess the next question is how I rule out this frame time variance (which I've never heard of before).

    Like, if I'm honest, watching YouTube 720p 60fps videos looks "smoother" than my WoW, which tells me it's constant 60fps.

    Alas, it's not a constant 60FPS, it's an average 60FPS. Dig down below the one-second mark and the FPS will change a lot more often, occasionally dipping low and causing a stutter. Devs and card makers live in a world where FPS is measured by the second, and will lie, cheat and steal to maximize that number regardless of actual output smoothness. AMD and nVidia have been forced to mend their ways, but devs are still cutting corners where they shouldn't, and a certain MMO known for optimization nearly as woeful as modded Minecraft is a particular problem child in this respect...
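
    One rough way to see this, sketched in Python: measure FPS over short windows instead of the whole second. The frame times below are invented, with a bunched-up patch standing in for a stutter:

    ```python
    # Measure FPS over short windows instead of the whole second.
    # Frame times (ms) are invented: mostly even, with one bunched-up patch.

    frame_times_ms = [14.9] * 57 + [50.0] * 3      # 60 frames in ~1 second

    # Turn frame times into timestamps.
    timestamps, t = [], 0.0
    for ft in frame_times_ms:
        t += ft
        timestamps.append(t)

    WINDOW = 250.0  # ms
    for start in (0.0, 250.0, 500.0, 750.0):
        frames_in_window = sum(start < ts <= start + WINDOW for ts in timestamps)
        print(f"{start:4.0f}-{start + WINDOW:4.0f} ms: "
              f"{frames_in_window * 1000 / WINDOW:.0f} FPS")

    print(f"whole second: {len(timestamps)} frames -> reported as '60 FPS'")
    # The per-second counter says 60, but the last quarter-second runs well below it.
    ```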


  • Closed Accounts Posts: 789 ✭✭✭Fakman87


    MarkAnthony wrote: »
    Tried turning off V-Sync in WoW? WoW isn't graphically intensive, so you should see 100FPS+; see if you notice any difference. Also maybe try other games.

    The thing with 60FPS films is there are other things going on. I've always wondered why we didn't move to higher FPS for movies rather than higher resolutions all the time. Certain *ahem* movies I've seen in 60FPS look more natural.

    No chance man, films and tv shows look absolutely ghastly at 60fps!


  • Closed Accounts Posts: 6,934 ✭✭✭MarkAnthony


    Fakman87 wrote: »
    No chance man, films and tv shows look absolutely ghastly at 60fps!

    Watch different films ;)

    There have been some rumblings about eventually making the move. Obviously there's an issue with just taking existing stuff and dumping it to 60FPS.


  • Registered Users, Registered Users 2 Posts: 23,137 ✭✭✭✭TheDoc


    MarkAnthony wrote: »
    Tried turning off V-Sync in WoW? WoW isn't graphically intensive, so you should see 100FPS+; see if you notice any difference. Also maybe try other games.

    The thing with 60FPS films is there are other things going on. I've always wondered why we didn't move to higher FPS for movies rather than higher resolutions all the time. Certain *ahem* movies I've seen in 60FPS look more natural.

    I've had V-Sync off for as long as I can remember in games, but recently turned it back on. Don't know why, now that you mention it.

    In a raid at the moment, will turn it off and see if things look steadier. I was watching it throughout a boss fight there: 60fps pretty much all the time.


  • Registered Users, Registered Users 2 Posts: 4,698 ✭✭✭Gumbi


    Solitaire wrote: »
    Alas, it's not a constant 60FPS, it's an average 60FPS. Dig down below the one-second mark and the FPS will change a lot more often, occasionally dipping low and causing a stutter. Devs and card makers live in a world where FPS is measured by the second, and will lie, cheat and steal to maximize that number regardless of actual output smoothness. AMD and nVidia have been forced to mend their ways, but devs are still cutting corners where they shouldn't, and a certain MMO known for optimization nearly as woeful as modded Minecraft is a particular problem child in this respect...

    Bingo.

    As I explained above, TheDoc, 60FPS just means your GPU rendered 60 frames in one second. Theoretically this could mean nothing is rendered for the first 500ms, then all 60 frames are rendered in the final 500ms (obviously this is an extreme example, but you get my point), and this makes for a stuttery experience.

    A smooth 60FPS will have the frames evenly spread, roughly ~16.67ms apart (because 60 × 16.67ms ≈ 1000ms, i.e. one second).

    It's quite possible to have a smoother 45fps than 60fps if the frames are more evenly spread.
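
    Putting rough numbers on that comparison (a Python sketch with invented frame times):

    ```python
    # An evenly paced 45 FPS can feel smoother than a badly paced 60 FPS.
    # Frame times are in ms and invented for illustration.

    even_45 = [1000 / 45] * 45                 # ~22.2 ms, every single frame
    bunched_60 = [5.0] * 50 + [75.0] * 10      # 60 frames, but ten 75 ms hitches

    for name, frame_times in (("even 45 FPS", even_45), ("bunched 60 FPS", bunched_60)):
        avg_fps = 1000 * len(frame_times) / sum(frame_times)
        longest_gap = max(frame_times)
        print(f"{name}: avg {avg_fps:.0f} FPS, longest gap {longest_gap:.1f} ms")

    # The bunched trace wins on average FPS, but its 75 ms gaps (~13 FPS for an
    # instant) are far more visible than the steady 22 ms gaps of the 45 FPS trace.
    ```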


  • Closed Accounts Posts: 789 ✭✭✭Fakman87


    MarkAnthony wrote: »
    Watch different films ;)

    There have been some rumblings about eventually making the move. Obviously there's an issue with just taking existing stuff and dumping it to 60FPS.

    My dad had that "motion plus" feature turned on on his new TV and I hated it so much. I find it good for sports, but it gives films a really cheap soap-opera look and the fluidity of the movement feels really weird to me.


  • Closed Accounts Posts: 6,934 ✭✭✭MarkAnthony


    Fakman87 wrote: »
    My dad had that "motion plus" feature turned on on his new TV and I hated it so much. I find it good for sports, but it gives films a really cheap soap-opera look and the fluidity of the movement feels really weird to me.

    That's not 60FPS, that's some TV feature. You need to look at movies specifically rendered at 60FPS.


  • Registered Users, Registered Users 2 Posts: 7,180 ✭✭✭Serephucus


    To chime in here:

    I agree with most of what's been said here, except for two main points.

    1) 60FPS is adequate - I can't say I agree with this. I've used 144Hz monitors (albeit briefly), and they're definitely smoother. Certainly not the same jump as going from 30 to 60, but that could simply be a product of my not getting enough time with the monitor to get used to it.

    2) Movies look better at 24FPS. Wrong. After having used SVP (Smooth Video Project) for over a year, I'm never going back. Yes, it's not perfect. Yes, there are artefacts. Yes, it's relatively CPU-heavy when compared to normal video playback. I don't care. It's just so much better. Once you get used to it... 24FPS video is like playing games locked at 30FPS. It's just bad.


    Oh, and it goes without saying that G-Sync is great and every monitor from now on should take its framerate from the GPU driving it.


  • Registered Users, Registered Users 2 Posts: 7,047 ✭✭✭Bazzo


    MarkAnthony wrote: »
    That's not 60FPS, that's some TV feature. You need to look at movies specifically rendered at 60FPS.

    Didn't they do this for the Hobbit movies? I remember thinking the first one looked a bit odd and assuming it was down to whatever way they did the 3D, but finding out later it was probably due to the high frame rate (48fps) filming.


  • Registered Users, Registered Users 2 Posts: 1,581 ✭✭✭Redfox25


    Got an XFX 390X from Amazon in August. I started hearing a lot of noise from my case in the last few days. At first I thought one of the case fans had a bearing starting to go, but once I tab out of a game the noise stops.
    Disconnected all the fans one by one and it's the GPU causing the issue.

    Contacted Amazon last night at 12; they replied within 3 hours to say that a replacement card was en route.
    Pretty amazing customer service.

    Just wondering if there is any difference between the 390Xs. The XFX one runs pretty hot, hitting the mid 80s and creeping up to 90°C at peak.
    Does anyone know if any of the other cards run cooler? I have pretty good airflow in the case: a 200mm fan with no cages in front blowing air in, and three 140mm exhaust fans at the top/back.
    The Sapphire looks good but they don't seem to sell it atm. What is the Asus Strix like, or should I just stick with XFX?


  • Closed Accounts Posts: 5,824 ✭✭✭RoyalMarine


    Mid 80s to 90s, in my opinion, shouldn't be reached.

    Running FurMark at 2560 x 1440 with 8x MSAA, my GTX 970 reaches 54°C after 30 mins.


  • Registered Users, Registered Users 2 Posts: 1,581 ✭✭✭Redfox25


    RoyalMarine wrote: »
    Mid 80s to 90s, in my opinion, shouldn't be reached.

    Running FurMark at 2560 x 1440 with 8x MSAA, my GTX 970 reaches 54°C after 30 mins.

    The 970 is a much cooler card. It's the nature of the 390, or so it seems, to hit that sort of temp. I did some more reading of reviews and they all seem to be on the hot side. It does state in a few reviews that they don't suffer any damage from these temps. I had thought originally that the temps might have brought about the coil whine.

    Thumbs up to Amazon, the replacement is on its way; that's less than 16 hours since I made contact with them. Will see how this one goes, and if it's noisy too then I'll see about getting a Sapphire Tri-X card; they seem to be the coolest of the bunch.

    Will play with my case fan profiles too to see if I can get more airflow going. The CPU doesn't get hot enough to spin them up fully, so I might have to tweak it more.

    Anyone know if the Gigabyte app (SIV) or MSI Afterburner will let you set up a custom fan profile for case fans based on the temps of the GPU as well as the CPU?


  • Closed Accounts Posts: 6,934 ✭✭✭MarkAnthony


    It's all down to the thermal design of the card. You can't make a sweeping statement that X temp is the limit for all architectures. AMD have always been about putting as much power through as possible and creating shedloads of heat; Intel (CPU) and Nvidia (GPU) are about better refinement etc. The AMD R9 200 and 300 series are perfectly happy at 95°C and will keep going at that temp well beyond their useful life.

    Your VRMs, on the other hand, are quite happy up to 125°C, but it's amazing the number of people who get close to the limit there and never bother to check :)


  • Registered Users, Registered Users 2 Posts: 7,182 ✭✭✭Genghiz Cohen


    Redfox25 wrote: »
    I have pretty good airflow in the case: a 200mm fan with no cages in front blowing air in, and three 140mm exhaust fans at the top/back.
    The Sapphire looks good but they don't seem to sell it atm. What is the Asus Strix like, or should I just stick with XFX?

    You might have negative air pressure; if you could move one of those 140mms to blow in from the front or bottom, it might help.

    I am an Asus fanboi, I was very impressed with their DCU2 coolers and Strix looks like the next gen of that. That said, Windforce and Tri-X also look like solid coolers. Not so sure about the XFX ones.


  • Registered Users, Registered Users 2 Posts: 1,581 ✭✭✭Redfox25


    Genghiz Cohen wrote: »
    You might have negative air pressure; if you could move one of those 140mms to blow in from the front or bottom, it might help.

    I am an Asus fanboi, I was very impressed with their DCU2 coolers and Strix looks like the next gen of that. That said, Windforce and Tri-X also look like solid coolers. Not so sure about the XFX ones.

    Hi.
    You are probably right about the negative pressure in the case. The case fans are tied into the CPU temp via the fan hub in my case (Phanteks Enthoo Luxe).
    As the CPU doesn't get very hot, they are not spinning up to high enough speeds to get a good airflow going in the case.

    I might take the 200mm fan off the front and put it on the top, and put two of the three 140s on the front. There isn't space on the floor to fit a fan with the PSU shroud in place. I did take out the 3.5-inch drive holders as they were not being used. That seems to have boosted the cool air hitting the GPU.

    Is there a way to set up fan profiles based on the GPU as well as the CPU using Afterburner or the Gigabyte app, System Information Viewer?

    On cards, I do like the Strix alright. It would be a toss-up between that and the Sapphire one. Will see how this card goes.


  • Closed Accounts Posts: 5,824 ✭✭✭RoyalMarine


    But to be 30°C higher surely isn't right?


  • Registered Users, Registered Users 2 Posts: 1,581 ✭✭✭Redfox25


    MarkAnthony wrote: »
    It's all down to the thermal design of the card. You can't make a sweeping statement that X temp is the limit for all architectures. AMD have always been about putting as much power through as possible and creating shedloads of heat; Intel (CPU) and Nvidia (GPU) are about better refinement etc. The AMD R9 200 and 300 series are perfectly happy at 95°C and will keep going at that temp well beyond their useful life.

    Your VRMs, on the other hand, are quite happy up to 125°C, but it's amazing the number of people who get close to the limit there and never bother to check :)

    Always good to know, cheers.
    I can happily say that XFX have improved the VRM cooling issues they had with their 290s. The heatsink now covers them correctly.


  • Closed Accounts Posts: 29,930 ✭✭✭✭TerrorFirmer


    RoyalMarine wrote: »
    But to be 30°C higher surely isn't right?

    Entirely feasible when you compare a superior cooling solution on a cooler-running card to a less effective cooler on a far hotter card.

    90°C is perfectly fine for the card, not dangerous. You'd be safe up to 100-110°C. I used to have a passively cooled 8800GT years ago and it reached almost 120°C under peak load.


  • Closed Accounts Posts: 5,824 ✭✭✭RoyalMarine


    Maybe it's just me worrying so :)

    I'd just feel uneasy about something being that hot inside a closed case, alongside other electrical components that cost me the guts of €2k, running when I'm asleep or at work.

    (Granted, it's never near those temps when I'm not here, so I guess there's nothing to worry about.)


  • Registered Users, Registered Users 2 Posts: 4,698 ✭✭✭Gumbi


    People need to stop using FurMark for stress testing. My card throttles at stock unless I up the power limit. It puts way too much stress on your card.

    The Asus cooler for the 290/390 is terrible as they slapped the design from the 780 right onto it, and one or two of the heatpipes make no contact with the GPU at all!

    The XFX 390 is fine, but 90°C is high under load. Make sure it's mounted correctly and that your case has good airflow.

    My Vapor-X 290X runs at 68°C after hours of gaming in Witcher 3, with all case fans at their lowest speed on the fan controller and my auto fan speed settings (fan at 40%). That's with somewhat low idles, but you can see the headroom I have! Excellent case cooling and a good aftermarket cooler are helping though.


  • Registered Users, Registered Users 2 Posts: 36,169 ✭✭✭✭ED E


    Gumbi wrote: »
    People need to stop using FurMark for stress testing. My card throttles at stock unless I up the power limit. It puts way too much stress on your card.

    Would you suggest Prime95 is too hard on CPUs? Doubt it. FurMark is just fine.


  • Registered Users, Registered Users 2 Posts: 4,698 ✭✭✭Gumbi


    ED E wrote: »
    Would you suggest Prime95 is too hard on CPUs? Doubt it. FurMark is just fine.

    It depends. I mean, if it power-throttles my Vapor-X at stock, then I can imagine that with the power limit raised and a weak cooler it can make chips very hot.

    As regards Prime95, if you have a Haswell or newer chip and use a version of Prime95 with AVX instructions, it can indeed be a bad idea to use it. Even at stock...

    It gets INSANELY hot with AVX instructions under load.


  • Registered Users, Registered Users 2 Posts: 4,698 ✭✭✭Gumbi


    I can find plenty of examples very easily, but here's a guy getting almost 80°C with decent water cooling (equivalent to high-end air) at stock clocks: http://www.tomshardware.co.uk/answers/id-2238396/4790k-prime95-avx-questions.html

    I wouldn't recommend it. It won't damage your chip (plenty of protective features will kick in), but it's not a useful barometer of a stable overclock or anything. Same goes for FurMark in my view.


  • Moderators, Technology & Internet Moderators Posts: 18,377 Mod ✭✭✭✭Solitaire


    "Raptr has now enabled video recording whether you want it or not!"
    And 3 seconds later...
    "Raptr is no longer responding"
    Then
    "Raptr.exe has encountered a problem and needs to close."

    ggwp raptr/AMD... :p

    Ever the master of timing, my old Fractal 140mm top exhaust has now decided it's a good time to start screaming and roaring with an impending bearing failure, which I have taken to fixing by jamming stationery into it until it quits its grumbling. It's beginning to look a lot like Christmas.... :rolleyes::pac:

    Merry Christmas and a booze-laden New Year to those of you who have better things to do than lurk on here tomorrow! :)


  • Registered Users, Registered Users 2 Posts: 6,222 ✭✭✭Calvin


    970 and PSU installed :cool: Happy Christmas, guys!


  • Registered Users, Registered Users 2 Posts: 36,169 ✭✭✭✭ED E


    Next year the father is getting an iPad and I'm revoking his right to use a PC.

    Merry Christmas all.

    MV3ovib.png


  • Moderators, Education Moderators, Technology & Internet Moderators Posts: 35,100 Mod ✭✭✭✭AlmightyCushion


    ED E wrote: »
    Next year the father is getting an iPad and I'm revoking his right to use a PC.

    Merry Christmas all.

    MV3ovib.png

    All located on the desktop.


  • Registered Users, Registered Users 2 Posts: 36,169 ✭✭✭✭ED E


    All located on the desktop.

    Yup.

    I renamed it to stop explorer from crashing out every minute.


  • Moderators, Education Moderators, Technology & Internet Moderators Posts: 35,100 Mod ✭✭✭✭AlmightyCushion


    ED E wrote: »
    Yup.

    I renamed it to stop explorer from crashing out every minute.

    Ha ha. I didn't realise it was actually the desktop folder. That was a joke. Turned out it was true.


  • Registered Users, Registered Users 2 Posts: 5,752 ✭✭✭Mr Blobby


    So which sites are best for January sales? Looking to pick up a 970 and another 8GB of RAM.


  • Closed Accounts Posts: 5,824 ✭✭✭RoyalMarine


    The GF got me a Logitech Orion G910!!
    And an NZXT S340 in black and blue
    4 x Corsair AF120s
    Mac Mini 2012 (second hand), and 16GB of RAM and an SSD to upgrade it

    Will go very nicely with my Proteus Core mouse and G430 headset


    She knows the way to my heart!


  • Closed Accounts Posts: 5,824 ✭✭✭RoyalMarine


    So I finally got around to setting up the keyboard and the other few bits.

    This keyboard is the sexiest thing I've seen


  • Closed Accounts Posts: 6,934 ✭✭✭MarkAnthony


    RoyalMarine wrote: »
    So I finally got around to setting up the keyboard and the other few bits.

    This keyboard is the sexiest thing I've seen

    Err...
    RoyalMarine wrote: »
    The GF got me a Logitech Orion G910!!
    And an NZXT S340 in black and blue
    4 x Corsair AF120s
    Mac Mini 2012 (second hand), and 16GB of RAM and an SSD to upgrade it

    Will go very nicely with my Proteus Core mouse and G430 headset


    She knows the way to my heart!

    That memory of yours needs work, and I'm not talking about the RAM. Expect a lot less RAM if the GF is on Boards :pac:


  • Closed Accounts Posts: 5,824 ✭✭✭RoyalMarine


    MarkAnthony wrote: »
    Err...

    That memory of yours needs work, and I'm not talking about the RAM. Expect a lot less RAM if the GF is on Boards :pac:

    She's foreign. She doesn't even know what boards is 😎


  • Registered Users, Registered Users 2 Posts: 23,137 ✭✭✭✭TheDoc


    Having a read of various threads, I can't help but feel that, like so many other communities and even tech websites, there is a lot of misinformation about the new Pascal range of cards coming out. A card aimed at servers and "supercomputers", with absolutely no gaming-related reveals, and a lot of people drawing conclusions from nothing.

    I rarely keep up to date on the latest range of stuff, as I only start getting interested when I'm going to upgrade, but it seems astonishing the number of conclusions being drawn this early about the Pascal range, with absolutely no details from Nvidia.


  • Registered Users, Registered Users 2 Posts: 2,125 ✭✭✭game4it70


    TheDoc wrote: »
    Having a read of various threads, I can't help but feel that, like so many other communities and even tech websites, there is a lot of misinformation about the new Pascal range of cards coming out. A card aimed at servers and "supercomputers", with absolutely no gaming-related reveals, and a lot of people drawing conclusions from nothing.

    I rarely keep up to date on the latest range of stuff, as I only start getting interested when I'm going to upgrade, but it seems astonishing the number of conclusions being drawn this early about the Pascal range, with absolutely no details from Nvidia.

    Exactly, the key thing to remember about this "info" that is out there is that it's all rumours at this stage, with sites just using it as clickbait.


  • Registered Users, Registered Users 2 Posts: 23,137 ✭✭✭✭TheDoc


    game4it70 wrote: »
    Exactly, the key thing to remember about this "info" that is out there is that it's all rumours at this stage, with sites just using it as clickbait.

    As I don't normally keep myself up to date on the latest, I tend not to get into the "correction" posts, but even a few posts I've seen on here in the last week, and especially on places like BuildaPC on Reddit, are extremely misleading.

    Like, for example, if I was looking to buy a 970 right now, from the information provided by Nvidia in what looked to be a pretty detailed keynote, I'd see no reason to recommend holding off for that card, as it's looking to me like a specialist unit not really for domestic use. And I doubt it will have a price impact on the existing ranges of cards, surely?


  • Registered Users, Registered Users 2 Posts: 7,180 ✭✭✭Serephucus


    I would actually hold off on a 970 purchase (to jump into correction posts myself :P) as recently there's been a lot of demand for them and a slowdown in supply, which may or may not be due to Pascal. The result being that 970s are about 10-15% more expensive than normal, at least in the US. I haven't looked into pricing here much.

    As for Pascal: I've made a few speculative posts myself lately, but I don't think the information has been misleading. What we can say based on known figures from NVIDIA is that the flagship part will be extremely powerful. We've known this for a long time. Now, this is in terms of raw compute power which, you're right, doesn't always translate into games, but it's a fair indication.


  • Moderators, Education Moderators, Technology & Internet Moderators Posts: 35,100 Mod ✭✭✭✭AlmightyCushion


    TheDoc wrote: »
    Having a read of various threads, I can't help but feel that, like so many other communities and even tech websites, there is a lot of misinformation about the new Pascal range of cards coming out. A card aimed at servers and "supercomputers", with absolutely no gaming-related reveals, and a lot of people drawing conclusions from nothing.

    I rarely keep up to date on the latest range of stuff, as I only start getting interested when I'm going to upgrade, but it seems astonishing the number of conclusions being drawn this early about the Pascal range, with absolutely no details from Nvidia.

    Maxwell was originally meant to be produced on a 20nm process node. Something went wrong, TSMC couldn't get to 20nm in time, so nVidia went back to 28nm for it. Pascal should be going to 16nm, and it looks like TSMC is able to do it in volume, so it should be ready for Pascal's launch. With Pascal, nVidia will also be using HBM2. We've seen what HBM has done for AMD with the Fury line (lower power usage vs GDDR5, so they can push core clock speeds higher, and much higher memory bandwidth), and HBM2 is twice as fast and not limited to only 4GB.

    These are two big technological leaps that will help performance regardless of anything else. If you were to take a 980 Ti, shrink it to 16nm and change the memory from GDDR5 to HBM2, it would have a lot more memory bandwidth and would allow for increased clocks and lower power usage and heat. That's without even taking into account the improvements a new architecture can bring. This is part of the reason why people are so excited for Pascal and Arctic Islands.
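
    To put back-of-the-envelope numbers on the bandwidth point, here's a small Python sketch using approximate public bus-width and data-rate figures for those parts (the exact figures are assumptions, not anything quoted in the thread):

    ```python
    # Peak memory bandwidth ~= bus width (bits) / 8 * per-pin data rate (Gbps).
    # Figures below are approximate public specs, used only to compare generations.

    def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
        return bus_width_bits / 8 * data_rate_gbps

    configs = [
        ("GDDR5, 980 Ti class (384-bit @ 7 Gbps)", 384, 7.0),
        ("HBM1, Fury X (4096-bit @ 1 Gbps)",       4096, 1.0),
        ("HBM2 (4096-bit @ ~2 Gbps)",              4096, 2.0),
    ]

    for name, bus, rate in configs:
        print(f"{name}: ~{peak_bandwidth_gb_s(bus, rate):.0f} GB/s")
    # -> roughly 336, 512 and 1024 GB/s: the sort of jump being described here,
    #    before any architecture or clock-speed gains on top.
    ```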


  • Registered Users, Registered Users 2 Posts: 54,626 ✭✭✭✭Headshot




This discussion has been closed.