
Nvidia RTX Discussion


Comments

  • Registered Users Posts: 1,186 ✭✭✭Squaredude


    MiskyBoyy wrote: »
    You must be one of the only ones. Reddit is full of people complaining about their scummy tactics.

    I went to pre-order one with them myself, but froze my Revolut card just to see, and they tried to take the full amount. Even though it could be months before they deliver.

    I'd rather they take the money, honestly. It stops me from spending it in the meantime. I'm sure the money will be taken in the next day or two. Wish they'd hurry up and ship the case I ordered 10 days ago though.


  • Registered Users Posts: 8,798 ✭✭✭MiskyBoyy


    Squaredude wrote: »
    I'd rather they take the money, honestly. It stops me from spending it in the meantime. I'm sure the money will be taken in the next day or two.

    I'd be afraid they'd take the money without having a card to ship for a long time, and I'd miss out on picking one up somewhere else in the meantime.


  • Registered Users Posts: 4,487 ✭✭✭TheChrisD


    I never would have thought there were this many sharks in the water willing to spend 700 quid on a gfx card.

    Given it's a lower price point than the previous gen, a massive leap in power over the previous gen, and there are still people out there (like me) with a 900 series or 10 series card looking to upgrade... it kinda makes sense, doesn't it?


  • Registered Users Posts: 1,450 ✭✭✭jebidiah


    That money is just resting in their account!


  • Registered Users Posts: 978 ✭✭✭earthwormjack


    MiskyBoyy wrote: »
    I'd be afraid they'd take the money without having a card to ship for a long time.

    Going by their forum, that's exactly what they are doing.


  • Registered Users Posts: 5,415 ✭✭✭.G.


    That's what they've always done though, so it should be no surprise. They did it for the 10 series launch and the 20 series, and cards were in very short supply for both of those too. Scan do it too, I think. No company should be taking money for something when they can't guarantee when it will ship, but outside of Amazon it does seem to be quite common in the tech space.


  • Registered Users Posts: 2,624 ✭✭✭Thor


    It makes more sense to take the money right away and figure it out afterwards.

    I'll go a step further and say it's good they increased the prices a bit as well. They have a massive number of orders over MSRP, which gives the retailer stronger purchasing power to compete against other retailers while still making a decent profit, since they know they already have orders secured. If they don't get units quickly, more and more people will cancel and wait for the AMD launch to bring more competition.


  • Banned (with Prison Access) Posts: 1,306 ✭✭✭bobbyy gee




  • Registered Users Posts: 7,878 ✭✭✭frozenfrozen


    breaking news


  • Moderators, Recreation & Hobbies Moderators Posts: 4,658 Mod ✭✭✭✭Hyzepher


    With those recent 3090 leaked benchmarks it's hard to see how any card released between the 3080 and 3090 could be worth it for gaming. The prospect of 8K gaming is at least another generation away, so I can't see how that's a thing.

    People might prefer a 20GB 3080 for some type of future proofing, but that's probably going to come in around the €1200 mark - especially AIB models - and I'm not really sure 20GB is going to make any difference.


  • Registered Users Posts: 2,624 ✭✭✭Thor


    Hyzepher wrote: »
    With those recent 3090 leaked benchmarks it's hard to see how any card released between the 3080 and 3090 could be worth it for gaming. The prospect of 8K gaming is at least another generation away, so I can't see how that's a thing.

    People might prefer a 20GB 3080 for some type of future proofing, but that's probably going to come in around the €1200 mark - especially AIB models - and I'm not really sure 20GB is going to make any difference.

    Totally agree. I don't see any real advantage to having more VRAM, especially 20GB, unless your workload requires it. Gaming simply won't need that much, even at 4K. VRAM speed is far more important for performance overall.

    Pricing on the 3090 is simply horrible. It's really aimed at those that absolutely need the VRAM and can't go lower. A 3080 with 20GB will most likely be priced over €1000 and offer no performance increase over the 3080 10GB. The idea that it's future proofing makes no sense to me. Developers won't suddenly start pushing more VRAM usage, as they still have to aim at the middle. Designing a game and its levels around higher VRAM would ruin the game on the majority of gamers' setups. Consoles might be a concern since they have 16GB, but the OS and other factors are in play there, so not all of it is available to the developer.

    Ultimately, I feel more than comfortable with a 10GB card, and I think Nvidia have ruined the hype for any 3080 Ti/Super, since it will again fall in the middle of a roughly 10% performance gap.


  • Banned (with Prison Access) Posts: 1,306 ✭✭✭bobbyy gee


    Thor wrote: »
    Totally agree. I don't see any real advantage to having more VRAM, especially 20GB, unless your workload requires it. Gaming simply won't need that much, even at 4K. VRAM speed is far more important for performance overall.

    Pricing on the 3090 is simply horrible. It's really aimed at those that absolutely need the VRAM and can't go lower. A 3080 with 20GB will most likely be priced over €1000 and offer no performance increase over the 3080 10GB. The idea that it's future proofing makes no sense to me. Developers won't suddenly start pushing more VRAM usage, as they still have to aim at the middle. Designing a game and its levels around higher VRAM would ruin the game on the majority of gamers' setups. Consoles might be a concern since they have 16GB, but the OS and other factors are in play there, so not all of it is available to the developer.

    Ultimately, I feel more than comfortable with a 10GB card, and I think Nvidia have ruined the hype for any 3080 Ti/Super, since it will again fall in the middle of a roughly 10% performance gap.
    If AMD bring in good prices, Nvidia will drop its prices or bring out slightly better cards - they'll make them 10% better than the existing Nvidia cards. It will probably take a year to actually buy one, as people are using bots to buy up the cards.


  • Registered Users Posts: 4,338 ✭✭✭Homelander


    What wrecks my head with Nvidia is their blatant anti-consumer allocation of VRAM, based purely on their status as market leader and knowing that the inevitable upgrade will likely lead to another Nvidia purchase.

    They could easily, so easily, give the cards better VRAM allocations, but clearly their long-term strategy was to beat down AMD - which they largely did - and then go into cruise control, artificially limiting GPU longevity to secure future purchases.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    The 3090 is basically a budget Titan for developers/designers that need the VRAM and have the option of SLI.

    I wouldn't be spending €700-800 on a card for only 1-2 years though. 10GB is not enough for a card in that price range imo, and there will be games that want more for the highest settings.

    Larger amounts of VRAM are one of the best ways of increasing noticeable graphics quality, in either static or dynamic lighting scenarios.

    Maybe the new decompression techniques on the GPU will make it less important.

    I think 20GB is still too much for a gaming card for a while. 12GB would have been nice - there are even two spaces left on the board for the extra 2GB. 16GB would have been even better for the long term.


  • Registered Users Posts: 8,798 ✭✭✭MiskyBoyy


    I wish :rolleyes:



  • Registered Users Posts: 4,338 ✭✭✭Homelander


    I've given up on the 3080. I'd be happy enough to see my order officially cancelled; I can wait it out and see what AMD brings to the table. If nothing else, it might drive Nvidia prices down. I have a 2070 Super, so really in many ways I was just being a sucker anyway.

    At the moment my two most played games are Overwatch (runs at 1080p 240Hz) and Paladins (runs at 1080p/175 at 30% usage with 200% res scale), so I was really just being ridiculous.


  • Moderators, Recreation & Hobbies Moderators Posts: 4,658 Mod ✭✭✭✭Hyzepher


    bobbyy gee wrote: »
    If AMD bring in good prices, Nvidia will drop its prices or bring out slightly better cards - they'll make them 10% better than the existing Nvidia cards. It will probably take a year to actually buy one, as people are using bots to buy up the cards.
    Homelander wrote: »
    What wrecks my head with Nvidia is their blatant anti-consumer allocation of VRAM, based purely on their status as market leader and knowing that the inevitable upgrade will likely lead to another Nvidia purchase.

    They could easily, so easily, give the cards better VRAM allocations, but clearly their long-term strategy was to beat down AMD - which they largely did - and then go into cruise control, artificially limiting GPU longevity to secure future purchases.


    I don't think there is enough of a performance gap between the current 3080 and 3090 (if the leaked performance is right) to position any cards between them in a way that would make sense. The 3090, with its large amount of VRAM, isn't showing that much of an increase.

    I still don't believe that VRAM is, or will be, a limiting factor for 90% of gamers.


  • Registered Users Posts: 10,299 ✭✭✭✭BloodBath


    It's rumored the Ti or Super versions will be switching over to TSMC's 7nm process, so there may well be room for more performance than the 3090.


  • Registered Users Posts: 2,624 ✭✭✭Thor


    BloodBath wrote: »
    It's rumored the Ti or Super versions will be switching over to TSMC's 7nm process, so there may well be room for more performance than the 3090.

    It's possible, but for a Ti to offer more performance than the more expensive 3090, we'd be looking at least a year away.


  • Registered Users Posts: 4,027 ✭✭✭H3llR4iser


    Hyzepher wrote: »
    People might prefer a 20GB 3080 for some type of future proofing, but that's probably going to come in around the €1200 mark - especially AIB models - and I'm not really sure 20GB is going to make any difference.
    Thor wrote: »
    Totally agree. I don't see any real advantage to having more VRAM, especially 20GB, unless your workload requires it. Gaming simply won't need that much, even at 4K. VRAM speed is far more important for performance overall.

    VRAM usage goes up deceptively fast - Linus Tech Tips made a video a few days ago where they're running Crysis (yep, the old one) in 4K at max settings, and it's using 4GB of VRAM. That's a game from 2007, when cards had 512MB of memory.

    More interesting, perhaps, is the use of these cards outside of gaming; I do some 3D rendering as a hobby, and I can assure you even a relatively simple scene can run out of 8GB of VRAM very, very easily - all you need is a bunch of high-res textures.

    And this is at an after-work, amateur level - there are plenty of professionals who use gaming cards instead of Quadro ones due to the sheer cost of the latter; in fact, everyone I know who's a designer or architect runs GTX or RTX cards in their rigs.

    In that regard, one really has to wonder how much a 16GB 3080 would affect Quadro sales.


  • Banned (with Prison Access) Posts: 1,306 ✭✭✭bobbyy gee




  • Registered Users Posts: 13,980 ✭✭✭✭Cuddlesworth


    H3llR4iser wrote: »
    VRAM usage goes up deceptively fast - Linus Tech Tips made a video a few days ago where they're running Crysis (yep, the old one) in 4K at max settings, and it's using 4GB of VRAM. That's a game from 2007, when cards had 512MB of memory.

    More interesting, perhaps, is the use of these cards outside of gaming; I do some 3D rendering as a hobby, and I can assure you even a relatively simple scene can run out of 8GB of VRAM very, very easily - all you need is a bunch of high-res textures.

    And this is at an after-work, amateur level - there are plenty of professionals who use gaming cards instead of Quadro ones due to the sheer cost of the latter; in fact, everyone I know who's a designer or architect runs GTX or RTX cards in their rigs.

    In that regard, one really has to wonder how much a 16GB 3080 would affect Quadro sales.

    You can't tell what the actual VRAM usage is, only the allocation. Currently the only real ways to test are via a game engine's debug tools, probably some internal chip-maker tools, or just testing cards with different VRAM amounts and looking for noticeable differences in performance.

    I really doubt that 4K gaming comes close to 10GB of VRAM, and Nvidia knows it. And anything above 4K is unplayable anyway.
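
    For reference, the number monitoring tools show is that allocation counter. A minimal sketch of reading it, assuming the nvidia-ml-py (pynvml) Python bindings are installed - this prints the same driver-side allocation figure nvidia-smi reports, not a game's true working set:

        import pynvml  # pip install nvidia-ml-py

        pynvml.nvmlInit()
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # total / free / used, in bytes

        # 'used' is what the driver has allocated across all processes,
        # not how much of that memory a game is actually touching.
        print(f"allocated: {mem.used / 1024**2:.0f} MiB of {mem.total / 1024**2:.0f} MiB")

        pynvml.nvmlShutdown()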


  • Registered Users Posts: 13,980 ✭✭✭✭Cuddlesworth


    TheChrisD wrote: »
    Given it's a lower price point than the previous gen, a massive leap in power over the previous gen, and there are still people out there (like me) with a 900 series or 10 series card looking to upgrade... it kinda makes sense, doesn't it?

    It's a bit like talking to a person in a bad relationship, where the bad stuff is OK because the partner is really nice sometimes.

    Nvidia has been inflating prices for a few generations now. This is the first time in 5 or 6 years that a new generation simply didn't come in at a higher price point. And even then, it's overpriced. This is the mid-range card; the 3090 is the "high end" part.

    First time in a long time that they will probably have some competition too.

    I get that people like the idea of affordable cards again but the sheer panic around getting one right now is nuts.


  • Registered Users Posts: 12,598 ✭✭✭✭errlloyd


    Cuddlesworth wrote: »
    Nvidia has been inflating prices for a few generations now. This is the first time in 5 or 6 years that a new generation simply didn't come in at a higher price point. And even then, it's overpriced. This is the mid-range card; the 3090 is the "high end" part.

    I find the "rebenchmarking" of premium card pricing really interesting. I have only rekindled my interest in Pc Gaming during Lockdown. But I wonder if any of you have any observations about how that has changed over the past 15 years or so?

    It seems that a premium card would have been €250e at launch in 2005. Nvidia monopoly and an aging cohort of gamers with more money seems to have inflated that hugely. But the other aspects of a gaming build would have been more costly - for example Processors, MBoards and Ram have come down in price or stayed the same because they are also used by non gamers?

    Is that perception correct?


  • Registered Users Posts: 13,980 ✭✭✭✭Cuddlesworth


    errlloyd wrote: »
    I find the "rebenchmarking" of premium card pricing really interesting. I have only rekindled my interest in Pc Gaming during Lockdown. But I wonder if any of you have any observations about how that has changed over the past 15 years or so?

    It seems that a premium card would have been €250e at launch in 2005. Nvidia monopoly and an aging cohort of gamers with more money seems to have inflated that hugely. But the other aspects of a gaming build would have been more costly - for example Processors, MBoards and Ram have come down in price or stayed the same because they are also used by non gamers?

    Is that perception correct?

    Sort of. Volumes do make a difference, and dedicated graphics cards sell nowhere near as many units as CPUs or DRAM modules do.

    Graphics cards have been increasing in size; they are throwing more transistors at the problem every generation. Larger chips = more cost to manufacture.

    They are bleeding edge for certain components, like memory. That increases cost.

    They suck up power. Systems in general have been decreasing in power usage for a good while, while graphics cards usually float around 250-300 watts at the high end. More power = more power stages and components, and more cost.

    So I would expect some price creep. But it's well known that Nvidia's top end is pure profit at this stage (and the balance sheets show it). So the price creep has really come down to a lack of competition.


  • Registered Users Posts: 5,755 ✭✭✭Cordell


    It's a combination of factors indeed. Adjusted for inflation that's around €315, the GPUs are now much larger and more complex, and the games are much more demanding than they used to be. And of course the good old corporate greed :)
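
    Rough back-of-the-envelope on that number - the ~1.55% average annual rate below is just a hypothetical figure picked to land near €315; actual euro-area inflation varied year to year:

        # 2005 launch price carried forward 15 years at an assumed average inflation rate
        launch_price_2005 = 250.0   # EUR, premium card circa 2005 (figure from the post above)
        avg_inflation = 0.0155      # hypothetical ~1.55% average annual rate
        years = 15

        adjusted = launch_price_2005 * (1 + avg_inflation) ** years
        print(round(adjusted))      # ~315 EUR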


  • Registered Users Posts: 710 ✭✭✭mad turnip


    My 3080 from Overclockers arrived today. Hopefully everyone else's ships soon.


  • Registered Users Posts: 4,314 ✭✭✭sink


    I had the Zotac Trinity on order from Amazon.de, but I've changed my mind and pre-ordered the EVGA 3080 XC3 Black from Alternate.de. It doesn't have a stock backplate, but I'm going to get an EK water block + backplate for it, and EK have confirmed they will be supporting this model.

    With EVGA, removing the stock cooler doesn't void the warranty, and I figured that if they release a 20GB version in the next 90 days I can also upgrade to it with the step-up program. So it seems the best choice overall.

    It's not one of the high-power models, but judging by the GN stream there doesn't seem to be much room for overclocking these cards anyway.


  • Registered Users Posts: 858 ✭✭✭one armed dwarf


    I went and cancelled my Ventus order with Scan as well; impressions place it right at the bottom of all the AIB models, so I'm happy to wait for a while tbh.

    Similar to the above, there's not a lot I'm playing right now that requires this kind of power. I mainly just want something that can rock Cyberpunk; it might be difficult to get one before launch, but I'd rather have peace of mind than a substandard AIB card.


  • Registered Users Posts: 13,729 ✭✭✭✭Inquitus


    It's currently pretty much vapourware, sold out in 99% of places.

