
Off-Topic Thread V3.0


Comments

  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    EoinHef wrote: »
    The GL850 comes with the Gamer 1 profile set as default, which has the brightness at 100%.

    Unless I was trying to use HDR or was working in a very bright room, I really don't see the need for 100% brightness. These panels don't have the peak brightness for HDR to give the full effect anyway, so that use case is a waste as well.

    In a room with a lamp on and not the main light, 45% is perfect for me, but I do agree it's not a setting that has one "right" value; it's going to depend on the context of use.

    Depends on the monitor, I guess, but the GL850's peak brightness is only 350 nits. You need 1000 nits for true HDR. Maybe your monitor is higher. Mine will do 400 and has an HDR 400 rating, but it's still not enough.


  • Registered Users Posts: 5,572 ✭✭✭EoinHef


    BloodBath wrote: »
    Depends on the monitor, I guess, but the GL850's peak brightness is only 350 nits. You need 1000 nits for true HDR. Maybe your monitor is higher. Mine will do 400 and has an HDR 400 rating, but it's still not enough.

    The GL850 also claims the HDR400 standard, but yeah, the peak brightness is way too low for HDR, even HDR 400. That's what I meant: HDR is pretty pointless on the GL850 even at 100% brightness. So the only use case left for high brightness would be a very bright room.

    I've a 4K Sony TV permanently connected to the PC that supports HDR and has FALD (full-array local dimming), so I use that for HDR stuff. But tbh, HDR is something I could easily live without, for gaming anyway.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    HDR for gaming is awful anyway. It introduces a load of input lag.

    I don't know why you're linking the brightness gains to only HDR though. It improves the image regardless.


  • Registered Users Posts: 5,572 ✭✭✭EoinHef


    BloodBath wrote: »
    HDR for gaming is awful anyway. It introduces a load of input lag.

    I don't know why you're linking the brightness gains to only HDR though. It improves the image regardless.

    Ah no, I'm not trying to say brightness is only good for HDR. Just that, depending on the use case, going lower won't really diminish picture quality. Detail is still clear when you're in a dark room at low brightness, for instance.


  • Registered Users Posts: 14,309 ✭✭✭✭wotzgoingon


    Best mod you can do for any computer setup is blackout curtains.


  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    I really want to get myself a new mouse but can't decide between Roccat Kone Pure Ultra & Endgame Gear XM1.

    Used gearsearch's comparison tool, checked dozens of reviews, etc.

    Might end up just buying both & then returning the loser.


  • Registered Users Posts: 13,981 ✭✭✭✭Cuddlesworth


    Best mod you can do for any computer setup is blackout curtains.

    It's not good to have a direct light source shining right into your eye.


  • Registered Users Posts: 7,707 ✭✭✭Mr Crispy


    I hate using an IPS in a dark room. It really highlights the inherent glow and any backlight bleed. You should at the very least have some bias lighting behind the screen, imo.


  • Registered Users Posts: 14,309 ✭✭✭✭wotzgoingon


    It's not good to have a direct light source shining right into your eye.

    If you have a south-facing room with direct sunlight coming in and you can't see the screen, it gets annoying real quick.


  • Registered Users Posts: 36,167 ✭✭✭✭ED E


    ED E wrote: »
    Do I learn? Hell no I don't. Disaster with €70 AIOs so I go and buy a set of €50 AIOs. Genius I am I tell ya.

    [image]


    Mobo doesn't have any RGB headers (I was aware of that in advance), but as the L240 cannot have its LEDs turned off, I decided to hotwire the L240T to 12V and ground the R channel so both are glowing red.

    One runs at a lower RPM and has narrower tubing, I think, so it's a little hotter at idle, but under load they're basically the same. No drips or jets of steam yet.


  • Registered Users Posts: 740 ✭✭✭z0oT


    Has anyone here ever re-purposed old hardware as a DIY router?

    I've cobbled together some old parts to do that and thrown in a 4-port PCIe NIC, but I'm trying to decide on one of the typical free OSes out there for this purpose.

    pfSense - Probably the most popular and capable, but it seems to be massive overkill for my needs.
    OPNsense - A bit easier to find your way around the WebUI. I've gotten pretty much everything to work except the inbuilt OpenVPN server, which is a bit of a showstopper for me.
    IPFire - Linux-based, so much more familiar to me. Everything works, including the inbuilt OpenVPN server. Very easy to set up via the WebUI, but some settings require SSH access via the LAN to change.

    Not sure if I'll actually replace the router in the house (a good off-the-shelf router and Ethernet switch is probably an easier and just-as-good solution), but the advantage is that you can run much more on the router using a repurposed PC.


  • Registered Users Posts: 13,981 ✭✭✭✭Cuddlesworth


    Your typical router is 5-10 watts. Repurposed x86 boxes aren't exactly comparable.

    edit: z0oT's got it down below


  • Registered Users Posts: 740 ✭✭✭z0oT


    I think you mean aren't exactly comparable. A typical repurposed PC will require an order of magnitude more power than an off-the-shelf router.

    The best you could probably do with a DIY setup is 10-20W at idle with a low-power board and a PicoPSU. 40-50W is probably more what you could expect from a standard setup. That's approximately €50 extra per year, which isn't enormous.
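
    For reference, a quick back-of-the-envelope check on that €50 figure. This is just a sketch; the ~€0.15/kWh unit rate is an assumption, not a quoted tariff, so adjust for your own bill:

        # Rough annual running cost: always-on DIY router box vs off-the-shelf router.
        HOURS_PER_YEAR = 24 * 365        # always-on device
        RATE_EUR_PER_KWH = 0.15          # assumed electricity price

        def annual_cost_eur(watts: float) -> float:
            """Euro cost of a constant electrical draw sustained for one year."""
            kwh = watts / 1000 * HOURS_PER_YEAR
            return kwh * RATE_EUR_PER_KWH

        diy = annual_cost_eur(45)        # ~40-50 W standard DIY setup
        router = annual_cost_eur(7.5)    # ~5-10 W off-the-shelf router
        print(f"DIY €{diy:.0f}/yr, router €{router:.0f}/yr, "
              f"difference €{diy - router:.0f}/yr")
        # At €0.15/kWh the difference works out to ~€49/yr, in line with the
        # €50 estimate above.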

    Practically speaking, something like this is probably a better option than a standard DIY PC if you're talking power usage, or even just for the footprint alone.

    https://www.amazon.co.uk/Partaker-Celeron-Computer-Support-Pfsense/dp/B07SC67C4T/ref=sr_1_13?dchild=1&keywords=mini+router+pc&qid=1593516591&sr=8-13


  • Registered Users Posts: 36,167 ✭✭✭✭ED E


    Yeah, something like you linked or a C2550D4I is the way to go.
    Remember your router only needs two ports; stick a switch that meets your requirements underneath something compact running pfSense (if you're gonna DIY, I think it makes sense to go for the full-featured option).


  • Registered Users Posts: 13,981 ✭✭✭✭Cuddlesworth


    ED E wrote: »
    Yeah, something like you linked or a C2550D4I is the way to go.

    Didn't those boards have serious issues with BMC chips and the processors? Not sure I'd be recommending that platform to anybody at this point.


  • Registered Users Posts: 740 ✭✭✭z0oT


    ED E wrote: »
    Yeah, something like you linked or a C2550D4I is the way to go.
    Remember your router only needs two ports; stick a switch that meets your requirements underneath something compact running pfSense (if you're gonna DIY, I think it makes sense to go for the full-featured option).

    Yes, just one for LAN, the other for WAN. It can be pretty hard to find a cheap ITX board that has 2 NICs on it, though; an add-in PCIe card is the cheaper option.

    Didn't those boards have serious issues with BMC chips and the processors? Not sure I'd be recommending that platform to anybody at this point.

    For that particular board, I believe there was an issue with the BMC RAM wearing out, which I think was fixed in a later BMC firmware release, if I'm not mistaken. Might not be a bad choice at all for a router build, given that.

    Another thing, if you were to repurpose that board as a router, is that you would need to know how the BMC's network connection is bonded, if it is.

    The Ryzen ASRock "Server" board I have in my server has 2 Intel NICs plus a dedicated Realtek NIC for the BMC, and by default the BMC's LAN is connected in an active-backup bonding config between its dedicated NIC and one of the 2 Intel NICs.

    It means that if you weren't careful about which port becomes your WAN port and didn't disable the bond (if that board has one), then your BMC could be exposed to the internet directly.
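
    A minimal sketch of how you might check for that from a connection outside your own network. WAN_IP is a placeholder you'd replace with your real WAN address; note that IPMI-over-LAN itself uses UDP port 623, which a TCP probe won't see, so something like ipmitool run against the WAN IP from an external host is the more direct test for that part:

        # Probe the typical BMC web UI ports (80/443) on your WAN address from
        # an OUTSIDE network. An open port here suggests the BMC (or something
        # else) is reachable from the internet.
        import socket

        WAN_IP = "203.0.113.1"  # placeholder (TEST-NET-3); substitute your WAN IP

        def tcp_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
            """Return True if a TCP connection to host:port succeeds."""
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    return True
            except OSError:
                return False

        for port in (80, 443):
            state = "OPEN" if tcp_port_open(WAN_IP, port) else "closed/filtered"
            print(f"{WAN_IP}:{port} -> {state}")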


  • Registered Users Posts: 36,167 ✭✭✭✭ED E


    Didn't those boards have serious issues with BMC chips and the processors? Not sure I'd be recommending that platform to anybody at this point.

    Intel made a balls of the LPC clock; when it failed, it would prevent the board from booting, as the board relied on that clock signal. The boards were all reworked. ASRock Rack did a half-decent job; Synology did a real bodge.
    z0oT wrote: »
    Another thing, if you were to repurpose that board as a router, is that you would need to know how the BMC's network connection is bonded, if it is.

    The Ryzen ASRock "Server" board I have in my server has 2 Intel NICs plus a dedicated Realtek NIC for the BMC, and by default the BMC's LAN is connected in an active-backup bonding config between its dedicated NIC and one of the 2 Intel NICs.

    It means that if you weren't careful about which port becomes your WAN port and didn't disable the bond (if that board has one), then your BMC could be exposed to the internet directly.

    Three NICs, whether you use the dedicated one or not. Just go into the BIOS when the board is new and turn sharing off, and boom, no IPMI over the WAN, because yes, that would be very bad. You could always enable it on the LAN side if you didn't want to loop it through your switch.


  • Registered Users Posts: 53,932 ✭✭✭✭Headshot


    I'm getting that upgrading itch again, lol, BUT I'm unsure whether I'd see a big difference compared to what I have now.

    My current specs:

    Monitor: Alienware AW3418DW G-SYNC Ultra Wide Curved IPS Monitor 3440x1440,
    Case: Cooler Master MasterCase H500P Mesh White
    CPU: AMD Ryzen 9 3900X 12-Core Processor
    CPU Cooler: Corsair Hydro Series H100i RGB Platinum SE
    Motherboard: GIGABYTE X470 AORUS Ultra Gaming AM4/DDR4 Motherboard
    GPU: EVGA GeForce RTX 2080 Ti FTW3 ULTRA GAMING, 11GB
    Hard Drive: Corsair MP600 Force Series, 2 TB High-speed Gen 4 PCIe x4, NVMe M.2 SSD
    Hard Drive: Kingston Technology SSD A1000 (SA1000M8/960G) 960 GB
    Ram: Corsair CMK16GX4M2B3200C16 Vengeance 32GB
    PSU: EVGA SuperNOVA 650 G2, 80+ GOLD 650W, Fully Modular,

    I plan to get into 4K some time, maybe when the new NVIDIA GeForce RTX 3080 Ti comes out. So, looking at the specs above, is it worth upgrading? I want to be ready to play Cyberpunk 2077 at the highest quality :)


  • Registered Users Posts: 36,167 ✭✭✭✭ED E


    Headshot wrote: »
    I'm getting that upgrading itch again, lol, BUT I'm unsure whether I'd see a big difference compared to what I have now.

    You should get some cream for that, seems to never leave you!


  • Registered Users Posts: 53,932 ✭✭✭✭Headshot


    ED E wrote: »
    You should get some cream for that, seems to never leave you!

    It's a curse I tells ya

    I like new shiny things


  • Registered Users Posts: 36,167 ✭✭✭✭ED E


    Mate had a new camera the other day; I'm now 20 videos deep in reviews. It's a curse.


  • Registered Users Posts: 18,703 ✭✭✭✭K.O.Kiki


    Headshot wrote: »
    I'm getting that upgrading itch again, lol, BUT I'm unsure whether I'd see a big difference compared to what I have now. [...]

    I bet your mouse/keyboard are shiiiiiiiiiiit :pac:


  • Registered Users Posts: 7,707 ✭✭✭Mr Crispy


    Yeah, tell us what peripherals you're using. What's your audio set-up?


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    Chair too, if you're really itching. Not much to be done with that PC atm.


  • Moderators, Technology & Internet Moderators Posts: 17,133 Mod ✭✭✭✭cherryghost


    Speaking of chairs, can anyone recommend one that will be used for gaming and office work? I have 4 monitors and need one that's comfortable and easy to swivel, etc. I've had this IKEA one for 8 years and it's been amazing. It's started to shed its material now, so I think it's time for a new one.


  • Posts: 0 [Deleted User]


    A nice audio setup would be my suggestion unless you have one already.


  • Registered Users Posts: 3,495 ✭✭✭Lu Tze


    Speaking of chairs, can anyone recommend one that will be used for gaming and office work? I have 4 monitors and need one that's comfortable and easy to swivel, etc. I've had this IKEA one for 8 years and it's been amazing. It's started to shed its material now, so I think it's time for a new one.

    If it's a Markus, they have a 10-year warranty, I think, so you might be able to get it replaced!


  • Moderators, Technology & Internet Moderators Posts: 17,133 Mod ✭✭✭✭cherryghost


    Lu Tze wrote: »
    If it's a Markus, they have a 10-year warranty, I think, so you might be able to get it replaced!

    Nah, it's the cheaper Millberget.


  • Moderators, Technology & Internet Moderators Posts: 17,133 Mod ✭✭✭✭cherryghost


    A nice audio setup would be my suggestion unless you have one already.

    I have 2 kids, so no chance :) Always have a headset on


  • Posts: 0 [Deleted User]


    I have 2 kids, so no chance :) Always have a headset on

    Sorry, I forgot to quote Headshot; that suggestion was meant for his post.

