
Graphics card for watching 4k @60Hz

  • 11-09-2019 8:28pm
    #1
    Registered Users, Registered Users 2 Posts: 66,118 ✭✭✭✭unkel
    Chauffe, Marcel, chauffe!


    Not into gaming, but I like watching high quality video. 4k / 2160p.

    And I'd like something a bit better than my Intel HD graphics. It can do 4k for sure, but only at 30Hz. And it does not have a DisplayPort out, which I believe is needed for 60Hz? My monitor is a professional quality Dell UP3216Q.

    I have an RX570 4GB that I can use, but it is overkill: it gets quite hot and uses too much energy for my needs. My PC is on 24/7.

    If there is a passively cooled graphics card out there, even better!

    Any suggestions (maybe with a link to Amazon) would be greatly appreciated.


Comments

  • Registered Users, Registered Users 2 Posts: 4,028 ✭✭✭H3llR4iser


    Literally anything from the last couple of generations will do; AMD or nVidia makes little difference for your use. Something like a GT 1030 is usually passively cooled.

    Also, many cards are "passively cooled" when their 3D capabilities are not used - my GTX 1070 never spins the fans unless I launch a game or something using D3D / OpenGL / Vulkan and whatnot.

    The problem is that low-end graphics cards tend to be lacking in terms of connections - some of them are even limited to HDMI 1.4. It's also likely that your integrated graphics, if the CPU is recent, can do 4k at 60Hz, and it's just the motherboard that lacks the right output - https://hackintosher.com/blog/want-hdmi-4k-60hz-work-integrated-graphics-buy-one-motherboards/
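
    (For the curious, the 30Hz limit really is just bandwidth. A rough back-of-envelope in Python - the link rates are the commonly published effective figures, and the sum ignores blanking overhead:)

        # Why HDMI 1.4 tops out at 4k30: raw pixel data rate vs link capacity.
        WIDTH, HEIGHT, BPP = 3840, 2160, 24  # 8-bit RGB

        def gbits_needed(refresh_hz):
            return WIDTH * HEIGHT * BPP * refresh_hz / 1e9

        # Approximate effective data rates in Gbit/s (after encoding overhead).
        links = {"HDMI 1.4": 8.16, "HDMI 2.0": 14.4, "DisplayPort 1.2": 17.28}

        for hz in (30, 60):
            need = gbits_needed(hz)
            fits = [name for name, cap in links.items() if cap >= need]
            print(f"4k@{hz}Hz needs ~{need:.1f} Gbit/s -> fits: {', '.join(fits)}")

    (4k30 needs roughly 6 Gbit/s and fits everything; 4k60 needs roughly 12 Gbit/s, which rules out HDMI 1.4.)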

    That said... you already have the solution in hand; that RX570 should be barely using any energy at all just to display video. I've just run a quick test on my GTX 1070 with a 4k video and it stays below 10% load, with the clock at only 139MHz and a temperature of 40°C. The RX570 has all the right connectors and will use essentially no more energy than a smaller card when in its lowest power state (PC idle, doing nothing).
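
    (If anyone wants to reproduce that quick check on an nVidia card, something along these lines does it - it assumes nvidia-smi is on the PATH; AMD owners would reach for radeontop or the sysfs sensors instead:)

        import subprocess

        # One-shot snapshot of GPU load, core clock and temperature,
        # taken while a 4k video is playing. These are real nvidia-smi
        # query fields; the output is a single CSV line.
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu,clocks.sm,temperature.gpu",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        print(out)  # e.g. "9 %, 139 MHz, 40"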


  • Registered Users, Registered Users 2 Posts: 66,118 ✭✭✭✭unkel
    Chauffe, Marcel, chauffe!


    That's interesting about the motherboards. I have a Z370 mobo (with a Coffee Lake i5 CPU), but it is not one of those three. It would be ideal if the mobo could provide the 4k @60Hz. Or does a dedicated graphics card give noticeably better playback quality than Intel HD graphics at the same 4k @60Hz?

    Unfortunately, the RX570 is quite noisy and does get warm. You have to put this into perspective: my PC is very quiet apart from it.


  • Registered Users, Registered Users 2 Posts: 5,583 ✭✭✭EoinHef


    What media are you using that requires 60Hz at 4K?

    If you're not rendering games I don't really see the need for 4K/60Hz.

    Media plays back at the rate it's made in; the framerate does not increase based on the hardware used to play it back.


  • Registered Users, Registered Users 2 Posts: 66,118 ✭✭✭✭unkel
    Chauffe, Marcel, chauffe!


    It's for watching high quality media, but also just for general PC use. I have a high quality monitor (see my OP), and 4k @60Hz via a dGPU and DP just looks far superior to 4k @30Hz from my Intel HD Graphics via HDMI.


  • Registered Users, Registered Users 2 Posts: 5,583 ✭✭✭EoinHef


    Hmm, I'm not sure it should look any different depending on the interface.

    That's not to say there are no cases where media is made with 60Hz in mind, but it's not really widespread.

    Most cinematic stuff is 24fps, which means that as long as your source is capable of outputting 4K you shouldn't really need anything else, as most claimed 4K devices should be able to handle 4K @30Hz.


  • Closed Accounts Posts: 115 ✭✭knockers84


    Try underclocking the card

    Maybe something like a 2nd-hand GTX 750 (not sure of the equivalent these days).


  • Registered Users, Registered Users 2 Posts: 4,028 ✭✭✭H3llR4iser


    unkel wrote: »
    That's interesting about the motherboards. I have a Z370 mobo (with a Coffee Lake i5 CPU), but it is not one of those three. It would be ideal if the mobo could provide the 4k @60Hz. Or does a dedicated graphics card give noticeably better playback quality than Intel HD graphics at the same 4k @60Hz?

    Unfortunately, the RX570 is quite noisy and does get warm. You have to put this into perspective: my PC is very quiet apart from it.

    I don't think there'll be any difference in terms of visual quality between an integrated GPU and a separate card. I've just checked my laptop out of curiosity - it can indeed output 4k at 60Hz while in desktop mode, using the iGPU (it's an i7-8750H). The old one I had until a year ago was limited to 30Hz.

    I find it odd that the RX570 gets hot and noisy while doing nothing - I am a fan of... silence myself. I can definitely hear the 1070 even when it spins up to 800-1000rpm, but 90% of the time it stays in "passive" mode.

    It might be worth trying one of these boards if you like tinkering a bit... or finding a GT 1030 with HDMI 2.0 or DisplayPort on Amazon.

    Alternatively - but this depends on how much you wanna spend AND what CPU you currently have - there are Ryzen CPUs with integrated Radeon graphics, which tend to be vastly superior to the Intel iGPUs. IIRC the most powerful available is in Ryzen 5 form; depending on what i5 you have it could be a bit of an upgrade as well (more cores and generally more powerful for productivity). Also, a motherboard with an HDMI 2.0 port would still be required...


  • Registered Users, Registered Users 2 Posts: 66,118 ✭✭✭✭unkel
    Chauffe, Marcel, chauffe!


    I might underclock the RX570. The card is currently memory-overclocked (for mining), so that might explain the higher temps.


  • Registered Users, Registered Users 2 Posts: 36,170 ✭✭✭✭ED E


    If purchasing (you shouldn't really need to), ensure the card has HEVC decoding support. CPU-decoding 4k60 HEVC will produce about 2fps on a normal system.
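
    (A rough way to test this yourself: decode a 4k60 HEVC sample flat out, with and without hardware acceleration, and compare the speed figure ffmpeg reports. This assumes ffmpeg is installed; the filename is just a placeholder:)

        import subprocess

        # Decode a 4k60 HEVC sample as fast as possible, discarding the
        # output frames. "-hwaccel auto" picks the platform's hardware
        # decoder if there is one; "-benchmark" prints timing so runs can
        # be compared. Drop -hwaccel to see how CPU-only decoding fares.
        sample = "sample_4k60_hevc.mkv"  # hypothetical test file
        subprocess.run(
            ["ffmpeg", "-benchmark", "-hwaccel", "auto",
             "-i", sample, "-f", "null", "-"],
            check=True,
        )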


  • Registered Users, Registered Users 2 Posts: 18,810 ✭✭✭✭K.O.Kiki


    Who TF still mines on GPUs?


  • Registered Users, Registered Users 2 Posts: 66,118 ✭✭✭✭unkel
    Chauffe, Marcel, chauffe!


    Me. Well, I did until the end of May, when I no longer needed to heat the house. I had been winding it down over the last year or so and sold most of the gear. It's currently not very profitable with ETH so low, but I'll start again if the value goes up once winter kicks in. A lot of my daytime electricity comes from my large solar PV install, and at night I pay the 8c/kWh night rate. Overall mining has been profitable for me.


  • Registered Users, Registered Users 2 Posts: 66,118 ✭✭✭✭unkel
    Chauffe, Marcel, chauffe!


    Flashed the RX570 back to stock and it's doing a good job now. Lovely picture quality, with the added benefit that I can now send the sound directly via HDMI into my AV receiver (digitally), and the HDMI out of my AV receiver can also drive a second monitor (and I can even add further monitors if I wish).

    Pretty happy with this solution, but I must have a look at power use. The PC without the RX570 consumes about 30W when idle, and I don't really want this to go to something like 80W, as I prefer to keep the machine on 24/7.


  • Closed Accounts Posts: 14,983 ✭✭✭✭tuxy


    Usually both CPUs and GPUs running at stock speeds can be undervolted. I've not attempted it on desktops, but on the laptops I've tried I've usually been able to save about 10% on power usage. It may be worth it if you're trying to keep power usage to a minimum, and there's no downside once you figure out what voltage the system is stable at.


  • Registered Users, Registered Users 2 Posts: 66,118 ✭✭✭✭unkel
    Chauffe, Marcel, chauffe!


    That's it tuxy, I used to undervolt the card and underclock the core (and overclock the memory) for mining. I've done this mostly in the various Linux OSes I used for mining; in Windows I generally used settings within the mining programs.

    I know you can do it with various Windows applications as well (I think I used MSI Afterburner and maybe one other), but I wasn't as successful with those and I found them cumbersome. But I guess it is a much less tall order to just lower the voltage, core and RAM clocks, compared to the delicate balance I was chasing above :p

    Which program do you suggest for ease of use?

    A modern Intel CPU does not need undervolting; it has very good built-in downscaling when you don't need full capacity. Without a dGPU the whole system idles at just 30W!


  • Registered Users, Registered Users 2 Posts: 4,028 ✭✭✭H3llR4iser


    unkel wrote: »
    Flashed the RX570 back to stock and it's doing a good job now. Lovely picture quality, with the added benefit that I can now send the sound directly via HDMI into my AV receiver (digitally), and the HDMI out of my AV receiver can also drive a second monitor (and I can even add further monitors if I wish).

    Pretty happy with this solution, but I must have a look at power use. The PC without the RX570 consumes about 30W when idle, and I don't really want this to go to something like 80W, as I prefer to keep the machine on 24/7.


    Yep, thought so; power draw for that card at idle should be in the whereabouts of 20W, maybe less. I'm however a bit curious why you're so concerned about a few watts of difference :confused:


  • Registered Users, Registered Users 2 Posts: 66,118 ✭✭✭✭unkel
    Chauffe, Marcel, chauffe!


    Well, at idle it's about 30W without the dGPU, and I was thinking about 80W with the dGPU (hopefully you're right and it will be less).

    That gives an extra consumption of ((80W - 30W)/1000) kW * 24 * 365 h = ~440kWh a year, or about 13% of an average Irish household's annual electricity usage. Not insubstantial at the guts of €100 per year.
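
    (The same sum in Python, for anyone who wants to plug in their own numbers - the wattages are the estimates above and the tariff is just an example rate:)

        # Extra running cost of a dGPU idling in a 24/7 machine.
        idle_without_w, idle_with_w = 30, 80  # watts (estimates above)
        tariff = 0.18                         # EUR per kWh (example rate)

        extra_kwh = (idle_with_w - idle_without_w) / 1000 * 24 * 365
        print(f"{extra_kwh:.0f} kWh/year -> EUR {extra_kwh * tariff:.0f}/year")
        # -> 438 kWh/year -> EUR 79/year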


  • Registered Users, Registered Users 2 Posts: 18,810 ✭✭✭✭K.O.Kiki


    unkel wrote: »
    Well, at idle it's about 30W without the dGPU, and I was thinking about 80W with the dGPU (hopefully you're right and it will be less).

    That gives an extra consumption of ((80W - 30W)/1000) kW * 24 * 365 h = ~440kWh a year, or about 13% of an average Irish household's annual electricity usage. Not insubstantial at the guts of €100 per year.

    Use a PSU calculator - Irish power is ~€0.16/kWh
    https://outervision.com/power-supply-calculator


  • Registered Users, Registered Users 2 Posts: 66,118 ✭✭✭✭unkel
    Chauffe, Marcel, chauffe!


    K.O.Kiki wrote: »
    Use a PSU calculator - Irish power is ~€0.16/kWh
    https://outervision.com/power-supply-calculator

    I don't need a PSU calculator, and no, Irish power is not €0.16/kWh.

    Best rate last time I looked was 18c, so the above saving is about €80.

    In my own case I do have a large solar PV array and a cheap night rate, so the damage is not that bad, but I'd rather not spend the money / waste energy if I don't have to.

