
Nvidia RTX Discussion


Comments

  • Registered Users Posts: 668 ✭✭✭MidlanderMan


    Nedved85 wrote: »
    Where did you read this?

    From someone who works for Amazon.


  • Registered Users, Registered Users 2 Posts: 18,303 ✭✭✭✭Dohnjoe


    K.O.Kiki wrote: »
    BTW looking at the RTX 3080 PCB shots, I'm not so sure that 20GB 3080 models will exist - they might keep that configuration for the 3090 & maybe produce a 12GB Ti/Super variant.

    Videocardz, which has been fairly accurate with its leaks about the 30 series, has info from Gigabyte which seems to confirm the 3080 20GB variant:

    https://videocardz.com/newz/gigabyte-confirms-geforce-rtx-3060-8gb-rtx-3070-16gb-and-rtx-3080-20gb
    The product list features a GeForce RTX 3060 (S – SUPER?) 8GB model, GeForce RTX 3070 16GB (S – SUPER?) and GeForce RTX 3080 20GB (also labelled S). It remains unclear if the S means a SUPER series, or as our sources tend to believe, the Ti models.

    The GeForce RTX 3080 20GB model is not expected to feature a higher core count than the just-released 10GB model, thus we are unsure if the product code is not final or the card might actually feature 3080 Ti or 3080 SUPER branding.


  • Registered Users, Registered Users 2 Posts: 6,289 ✭✭✭Cordell


    Hi,

    The PCB for the 3080 and 3090 is the same custom PCB. The memory chip pads on the front add up to 12, with 2 unused on the 3080 and all 12 in use on the 3090; the main difference is that the 3090 uses double-sided mounting. Nvidia did this historically but hasn't for a while, and they seem to have moved back to it to save space. A 20GB 3080 variant would just need the same double-sided installation, so a 20GB card is totally possible. Heck, if you could get your hands on the RAM and were a magician with PCBs and soldering, you could turn your 10GB 3080 into a 20GB one, but you would still have the 320-bit bus, not the 384-bit one.

    Micron hints at 16Gb (2GB) chips: https://www.tomshardware.com/news/micron-reveals-gddr6x-details-the-future-of-memory-or-a-proprietary-dram

    The 384-bit bus width implies 12 chips, and obviously 320-bit implies 10. The unused chip pads on the 3080 probably correspond to defective or disabled memory controller channels, so just soldering a chip there will not work.
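    As a rough sketch of that bus-width arithmetic (a hypothetical helper, assuming each GDDR6X chip sits on its own 32-bit channel, per the Micron details above):

        # Sketch of the bus-width / chip-count arithmetic discussed above.
        # Assumes one 32-bit channel per GDDR6X chip.

        def memory_config(bus_width_bits, chip_gb, sides=1):
            """Return (chip count, total VRAM in GB) for a given bus width."""
            chips = (bus_width_bits // 32) * sides
            return chips, chips * chip_gb

        print(memory_config(320, 1))           # (10, 10) -> 3080 10GB as shipped
        print(memory_config(320, 1, sides=2))  # (20, 20) -> 20GB via double-sided 1GB chips
        print(memory_config(320, 2))           # (10, 20) -> 20GB via 2GB (16Gb) chips
        print(memory_config(384, 1, sides=2))  # (24, 24) -> 3090 24GB, double-sided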


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    Exactly, the Titan RTX is also 384-bit with 12 memory chips. The 3090, also 384-bit, will be the same, which indicates they do indeed have 2GB GDDR6X chips.



  • Registered Users, Registered Users 2 Posts: 14,005 ✭✭✭✭Cuddlesworth


    z0oT wrote: »
    It's the same thing every launch.

    Availability is a pipe dream, and prices are high if you're lucky enough to get your hands on the stuff. It makes the launch far more of a paper launch than Nvidia/AMD/Intel would have you believe. It can often take 6 or so months for things to become more reasonable.

    I'm planning on skipping the 3000 generation along with whatever AMD have coming. Can't really justify an upgrade from the 5700XT for the games I've been playing at 1440p.

    The demand for this card is a bit mental; I'm having trouble understanding why. I never would have thought there were this many sharks in the water willing to spend 700 quid on a gfx card.


  • Registered Users, Registered Users 2 Posts: 8,615 ✭✭✭grogi


    The demand for this card is a bit mental; I'm having trouble understanding why. I never would have thought there were this many sharks in the water willing to spend 700 quid on a gfx card.

    That is probably already half of the overall demand for this generation.


  • Registered Users, Registered Users 2 Posts: 18,751 ✭✭✭✭K.O.Kiki


    The demand for this card is a bit mental; I'm having trouble understanding why. I never would have thought there were this many sharks in the water willing to spend 700 quid on a gfx card.

    Due to the COVID lockdown I've barely spent a dime on takeout coffee, lunch, pubs or public transport for most of the year.

    Even being out of a job, I could buy a 3090 tomorrow & not even touch my savings account.


  • Registered Users, Registered Users 2 Posts: 6,756 ✭✭✭Thecageyone


    K.O.Kiki wrote: »
    Due to the COVID lockdown I've barely spent a dime on takeout coffee, lunch, pubs or public transport for most of the year.

    Even being out of a job, I could buy a 3090 tomorrow & not even touch my savings account.

    It's been the opposite for me; I've found myself spending more on junk and I'm more broke because of it through lockdown. But I have been keeping up with bills a bit better, that'll be a big chunk of it, and I did buy a new PC and phone... but if I was still working I'd have managed all that and more. Starting back next week thankfully, going a bit stir crazy! I'll give it a few weeks to settle back in, but then I am treating myself, F it... I see a 1440p monitor in the not too distant future for starters.


  • Registered Users, Registered Users 2 Posts: 20,558 ✭✭✭✭dreamers75


    Leaked 3090 review, make of it what you want.

    https://wccftech.com/nvidia-geforce-rtx-3090-teclab-review-leaked/

    I'm one of the 20k Amazon buyers; my card hasn't been charged yet, so I'm not holding out much hope.

    I intended to get the 3080 for the kid and a 3090 for me (gotta flex on the kid). They're Christmas presents, so there's no real rush if I don't get the 3080.


  • Registered Users, Registered Users 2 Posts: 4,028 ✭✭✭H3llR4iser


    The demand for this card is a bit mental; I'm having trouble understanding why. I never would have thought there were this many sharks in the water willing to spend 700 quid on a gfx card.
    grogi wrote: »
    That is probably already half of the overall demand for this generation.
    K.O.Kiki wrote: »
    Due to the COVID lockdown I've barely spent a dime on takeout coffee, lunch, pubs or public transport for most of the year.

    Even being out of a job, I could buy a 3090 tomorrow & not even touch my savings account.

    Yeah probably a combination of all of the above - there are a lot of people who either managed to save money or haven't bought anything (nor been on holidays and such) basically looking for an "excuse" to buy something expensive.

    Also, I wouldn't underestimate the volume added by all the scalpers; we've seen individuals managing to order 40+ cards using bots. That's all artificial demand on top of whatever comes from the genuinely interested public.


  • Registered Users, Registered Users 2 Posts: 18,751 ✭✭✭✭K.O.Kiki


    https://twitter.com/TEAMEVGA/status/1305567664534745088

    EVGA charging 40 quid for a cable Seasonic gives away for free.


    Apparently all the AIBs are air-freighting in thousands of cards/week.


  • Moderators, Science, Health & Environment Moderators Posts: 1,425 Mod ✭✭✭✭slade_x


    Did anyone else notice that the Zotac and all the EVGA 3080s are no longer available to pre-order on Overclockers?

    https://www.overclockers.co.uk/pc-components/graphics-cards/nvidia/geforce-rtx-3080


  • Registered Users, Registered Users 2 Posts: 10,684 ✭✭✭✭Samuel T. Cogley


    dreamers75 wrote: »
    Leaked 3090 review, make of it what you want.

    https://wccftech.com/nvidia-geforce-rtx-3090-teclab-review-leaked/

    I wasn't expecting much of the 3090 in fairness. Just looking at the specs, it's not that much better in terms of CUDA cores. I'm sure in semi-professional applications the memory bandwidth and edge in cores will help, but I wasn't expecting more than 15% in games. I think you'd have to be mad buying the 3090 for gaming, unless you're doing something mad like multiple 4K displays, given the price difference and the potential 20GB 3080s.


  • Registered Users Posts: 1,770 ✭✭✭Nedved85


    Do Overclockers charge the full amount for a pre-order?


  • Registered Users, Registered Users 2 Posts: 8,799 ✭✭✭MiskyBoyy


    Nedved85 wrote: »
    Do Overclockers charge the full amount for a pre-order?

    They sure do.


  • Registered Users Posts: 1,770 ✭✭✭Nedved85


    MiskyBoyy wrote: »
    They sure do.

    Lol, they are some shower.


  • Registered Users, Registered Users 2 Posts: 1,188 ✭✭✭Squaredude


    Nedved85 wrote: »
    Do Overclockers charge the full amount for a pre-order?
    Haven't been charged yet for my pre-order with them


  • Registered Users, Registered Users 2 Posts: 8,799 ✭✭✭MiskyBoyy


    Squaredude wrote: »
    Haven't been charged yet for my pre-order with them

    You must be one of the only ones. Reddit is full of people complaining about their scummy tactics.

    I went to pre-order one with them myself but had my Revolut card frozen just to see, and they tried to take the full amount, even though it could be months before they deliver.


  • Registered Users, Registered Users 2 Posts: 1,188 ✭✭✭Squaredude


    MiskyBoyy wrote: »
    You must be one of the only ones. Reddit is full of people complaining about their scummy tactics.

    I went to pre-order one with them myself but had my Revolut card frozen just to see, and they tried to take the full amount, even though it could be months before they deliver.

    I'd rather they take the money, honestly; it stops me from spending it in the meantime. I'm sure the money will be taken in the next day or two. Wish they'd hurry up and ship the case I ordered 10 days ago though.


  • Registered Users, Registered Users 2 Posts: 8,799 ✭✭✭MiskyBoyy


    Squaredude wrote: »
    I'd rather they take the money, honestly; it stops me from spending it in the meantime. I'm sure the money will be taken in the next day or two.

    I'd be afraid they'd take the money without having a card to ship for a long time, and I'd miss out on picking one up somewhere else in the meantime.


  • Registered Users, Registered Users 2 Posts: 4,816 ✭✭✭TheChrisD


    I never would have thought there were this many sharks in the water willing to spend 700 quid on a gfx card.

    Given it's a lower price point than the previous gen, a massive leap in power over the previous gen, and there are still people out there (like me) with a 900 series or 10 series card looking to upgrade... it kinda makes sense, doesn't it?


  • Registered Users, Registered Users 2 Posts: 1,450 ✭✭✭jebidiah


    That money is just resting in their account!


  • Registered Users Posts: 982 ✭✭✭earthwormjack


    MiskyBoyy wrote: »
    I'd be afraid they'd take the money without having a card to ship for a long time.

    Going by their forum, that's exactly what they are doing.


  • Registered Users, Registered Users 2 Posts: 5,420 ✭✭✭.G.


    That's what they've always done though, so it should be no surprise. They did it for the 10 series launch and the 20 series, and cards were in very short supply for both of those too. Scan do it too, I think. No company should be taking money for anything when they can't guarantee when it will ship, but outside of Amazon it does seem to be quite common in the tech space.


  • Registered Users, Registered Users 2 Posts: 2,625 ✭✭✭Thor


    It makes more sense to take the money right away and figure it out afterwards.

    I'll go a slight step further and say it's good they increased the prices a bit as well. They have a massive number of orders above MSRP; this gives the retailer stronger purchasing power to compete against other retailers while still providing decent profits, since they know they already have orders secured. If they don't get units quickly, more and more people will cancel and wait for the AMD launch to bring more competition.



  • Registered Users, Registered Users 2 Posts: 7,902 ✭✭✭frozenfrozen


    breaking news


  • Moderators, Recreation & Hobbies Moderators Posts: 4,667 Mod ✭✭✭✭Hyzepher


    With those recent leaked 3090 benchmarks it's hard to see how any card released between the 3080 and 3090 could be worth it for gaming. The prospect of 8K gaming is at least another generation away, so I can't see how that's a thing.

    People might prefer a 20GB 3080 for some kind of future proofing, but that's probably going to come in around the €1200 mark - especially AIB models - and I'm not really sure 20GB is going to make any difference.


  • Registered Users, Registered Users 2 Posts: 2,625 ✭✭✭Thor


    Hyzepher wrote: »
    With those recent leaked 3090 benchmarks it's hard to see how any card released between the 3080 and 3090 could be worth it for gaming. The prospect of 8K gaming is at least another generation away, so I can't see how that's a thing.

    People might prefer a 20GB 3080 for some kind of future proofing, but that's probably going to come in around the €1200 mark - especially AIB models - and I'm not really sure 20GB is going to make any difference.

    Totally agree. I don't see any real advantage to having more VRAM, especially 20GB, unless your workload requires it. Gaming simply won't need that much, even at 4K. VRAM speed is far more important for performance overall.

    Pricing on the 3090 is simply horrible. It's really aimed at those who absolutely need the VRAM and can't go lower. A 3080 with 20GB will most likely be priced over €1000 and offer no performance increase over the 3080 10GB. The whole idea that it's future proofing makes no sense to me. Developers won't suddenly start pushing more VRAM usage, as they still have to aim at the middle. Designing a game and its levels around higher VRAM would ruin the game on the majority of gamers' setups. Consoles might be a concern since they have 16GB, but the OS and other factors are in play there, so not all of it is available to the developer.

    Ultimately, I feel more than comfortable with a 10GB card, and I think Nvidia have ruined the hype for any 3080 Ti/Super, since it will again fall in the middle of a roughly 10-percent jump.


  • Banned (with Prison Access) Posts: 1,306 ✭✭✭bobbyy gee


    Thor wrote: »
    Totally agree. I don't see any real advantage to having more VRAM, especially 20GB, unless your workload requires it. Gaming simply won't need that much, even at 4K. VRAM speed is far more important for performance overall.

    Pricing on the 3090 is simply horrible. It's really aimed at those who absolutely need the VRAM and can't go lower. A 3080 with 20GB will most likely be priced over €1000 and offer no performance increase over the 3080 10GB. The whole idea that it's future proofing makes no sense to me. Developers won't suddenly start pushing more VRAM usage, as they still have to aim at the middle. Designing a game and its levels around higher VRAM would ruin the game on the majority of gamers' setups. Consoles might be a concern since they have 16GB, but the OS and other factors are in play there, so not all of it is available to the developer.

    Ultimately, I feel more than comfortable with a 10GB card, and I think Nvidia have ruined the hype for any 3080 Ti/Super, since it will again fall in the middle of a roughly 10-percent jump.

    If AMD bring in good prices, Nvidia will drop its prices or bring out slightly better cards; they'll make them 10% better than the existing Nvidia cards. It will probably take a year to be able to buy one, as people are using bots to buy up the cards.


  • Registered Users, Registered Users 2 Posts: 4,516 ✭✭✭Homelander


    What wrecks my head with Nvidia is their blatant anti-consumer allocation of VRAM based purely on their status as market leader, knowing that the inevitable upgrade will likely lead to another Nvidia purchase.

    They could so easily give the cards more VRAM, but clearly their long-term strategy was to beat down AMD - which they largely did - and then go into cruise control, artificially limiting GPU longevity to secure future purchases.


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    The 3090 is basically a budget Titan for developers/designers who need the VRAM and want the option of SLI.

    I wouldn't be spending 700-800 on a card for only 1-2 years though. 10GB is not enough for a card in that price range imo, and there will be games that want more for the highest settings.

    More VRAM is one of the best ways of increasing noticeable graphics quality, in either static or dynamic lighting scenarios.

    Maybe the new decompression techniques on the GPU will make it less important.

    I think 20GB is still too much for a gaming card for a while. 12GB would have been nice; there are even 2 chip pads left for the extra 2GB. 16GB would have been even better for the long term.


  • Registered Users, Registered Users 2 Posts: 8,799 ✭✭✭MiskyBoyy


    I wish :rolleyes:



  • Registered Users, Registered Users 2 Posts: 4,516 ✭✭✭Homelander


    I've given up on the 3080. I'd be happy enough to see my order officially cancelled; I can wait it out and see what AMD brings to the table. If nothing else, it might drive Nvidia prices down. I have a 2070 Super, so really in many ways I was just being a sucker anyway.

    At the moment my two most played games are Overwatch (runs at 1080p 240Hz) and Paladins (runs at 1080p 175 with 30% usage and 200% res scale), so I was really just being ridiculous.


  • Moderators, Recreation & Hobbies Moderators Posts: 4,667 Mod ✭✭✭✭Hyzepher


    bobbyy gee wrote: »
    If AMD bring in good prices, Nvidia will drop its prices or bring out slightly better cards; they'll make them 10% better than the existing Nvidia cards. It will probably take a year to be able to buy one, as people are using bots to buy up the cards.
    Homelander wrote: »
    What wrecks my head with Nvidia is their blatant anti-consumer allocation of VRAM based purely on their status as market leader, knowing that the inevitable upgrade will likely lead to another Nvidia purchase.

    They could so easily give the cards more VRAM, but clearly their long-term strategy was to beat down AMD - which they largely did - and then go into cruise control, artificially limiting GPU longevity to secure future purchases.


    I don't think there is enough of a performance gap between the current 3080 and 3090 (if the leaked performance is right) to position any cards between them in a way that would make sense. The 3090, with its large VRAM, isn't showing that much of an increase.

    I still don't believe that VRAM is a limiting factor, or will be, for 90% of gamers.


  • Registered Users, Registered Users 2 Posts: 10,299 ✭✭✭✭BloodBath


    It's rumored the Ti or Super versions will be switching over to TSMC's 7nm process, so there may well be room for more performance than the 3090.


  • Registered Users, Registered Users 2 Posts: 2,625 ✭✭✭Thor


    BloodBath wrote: »
    It's rumored the Ti or Super versions will be switching over to TSMC's 7nm process, so there may well be room for more performance than the 3090.

    It's possible, but more performance for a Ti than the more expensive 3090? We'd be looking at at least a year away.


  • Registered Users, Registered Users 2 Posts: 4,028 ✭✭✭H3llR4iser


    Hyzepher wrote: »
    People might prefer a 20GB 3080 for some kind of future proofing, but that's probably going to come in around the €1200 mark - especially AIB models - and I'm not really sure 20GB is going to make any difference.
    Thor wrote: »
    Totally agree. I don't see any real advantage to having more VRAM, especially 20GB, unless your workload requires it. Gaming simply won't need that much, even at 4K. VRAM speed is far more important for performance overall.

    VRAM usage goes up deceptively fast - Linus Tech Tips made a video a few days ago where they're running Crysis (yep, the old one) in 4K at max settings, and it's sucking up 4GB of VRAM. That's a game from 2007, when cards had 512MB of memory.

    More interesting, perhaps, is the use of these cards outside of gaming. I do some 3D rendering as a hobby, and I can assure you even a relatively simple scene can run out of 8GB of VRAM very easily; all you need is a bunch of high-res textures.

    And this is at an after-work, amateur level - there are plenty of professionals who use gaming cards instead of Quadro ones due to the sheer cost of the latter; everyone I know, in fact, who's a designer or architect runs GTX or RTX cards in their rigs.

    In that regard, one really has to wonder how much a 16GB 3080 would affect Quadro sales.
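    To put a rough number on how quickly high-res textures add up, here's a minimal sketch of the uncompressed-texture arithmetic (assuming 8-bit RGBA texels and a full mip chain, which adds roughly a third on top; real engines compress textures, so treat it as an upper-bound illustration):

        # Minimal sketch: uncompressed VRAM footprint of high-resolution textures.
        # Assumes 4 bytes per texel (8-bit RGBA) and a full mip chain (~1.33x the base level).

        def texture_mb(width, height, bytes_per_texel=4, mipmaps=True):
            base = width * height * bytes_per_texel
            total = base * 4 / 3 if mipmaps else base
            return total / (1024 ** 2)

        print(round(texture_mb(4096, 4096)))                 # ~85 MB per 4K texture
        print(round(texture_mb(8192, 8192)))                 # ~341 MB per 8K texture
        print(round(24 * texture_mb(8192, 8192) / 1024, 1))  # ~8.0 GB for two dozen 8K textures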



  • Registered Users, Registered Users 2 Posts: 14,005 ✭✭✭✭Cuddlesworth


    H3llR4iser wrote: »
    VRAM usage goes up deceptively fast - Linus Tech Tips made a video a few days ago where they're running Crysis (yep, the old one) in 4K at max settings, and it's sucking up 4GB of VRAM. That's a game from 2007, when cards had 512MB of memory.

    More interesting, perhaps, is the use of these cards outside of gaming. I do some 3D rendering as a hobby, and I can assure you even a relatively simple scene can run out of 8GB of VRAM very easily; all you need is a bunch of high-res textures.

    And this is at an after-work, amateur level - there are plenty of professionals who use gaming cards instead of Quadro ones due to the sheer cost of the latter; everyone I know, in fact, who's a designer or architect runs GTX or RTX cards in their rigs.

    In that regard, one really has to wonder how much a 16GB 3080 would affect Quadro sales.

    You can't tell what the actual VRAM usage is, only the allocation. Currently the only real ways to test are via a game engine's debug tools, probably some internal chip-maker tools, or just testing cards with different VRAM amounts to see whether there are noticeable differences in performance.

    I really doubt that 4K gaming comes close to 10GB of VRAM, and Nvidia knows it. And anything above 4K is unplayable anyway.


  • Registered Users, Registered Users 2 Posts: 14,005 ✭✭✭✭Cuddlesworth


    TheChrisD wrote: »
    Given it's a lower price point than the previous gen, a massive leap in power over the previous gen, and there are still people out there (like me) with a 900 series or 10 series card looking to upgrade... it kinda makes sense, doesn't it?

    It's a bit like talking to someone in a bad relationship, where the bad stuff is OK because the partner is really nice sometimes.

    Nvidia has been inflating prices for a few generations now. This is the first time in 5 or 6 years that a new generation simply didn't come in at a higher price point. And even then, it's overpriced. This is the mid-range card; the 3090 is the "high end" part.

    First time in a long time that they will probably have some competition too.

    I get that people like the idea of affordable cards again but the sheer panic around getting one right now is nuts.


  • Registered Users, Registered Users 2 Posts: 12,616 ✭✭✭✭errlloyd


    Nvidia has been inflating prices for a few generations now. This is the first time in 5 or 6 years that a new generation simply didn't come in at a higher price point. And even then, it's overpriced. This is the mid-range card; the 3090 is the "high end" part.

    I find the "rebenchmarking" of premium card pricing really interesting. I have only rekindled my interest in PC gaming during lockdown, but I wonder if any of you have observations about how that pricing has changed over the past 15 years or so.

    It seems that a premium card would have been around €250 at launch in 2005. The Nvidia monopoly and an ageing cohort of gamers with more money seem to have inflated that hugely. But have the other parts of a gaming build become relatively cheaper - for example, have processors, motherboards and RAM come down in price or stayed the same because they are also used by non-gamers?

    Is that perception correct?


  • Registered Users, Registered Users 2 Posts: 14,005 ✭✭✭✭Cuddlesworth


    errlloyd wrote: »
    I find the "rebenchmarking" of premium card pricing really interesting. I have only rekindled my interest in PC gaming during lockdown, but I wonder if any of you have observations about how that pricing has changed over the past 15 years or so.

    It seems that a premium card would have been around €250 at launch in 2005. The Nvidia monopoly and an ageing cohort of gamers with more money seem to have inflated that hugely. But have the other parts of a gaming build become relatively cheaper - for example, have processors, motherboards and RAM come down in price or stayed the same because they are also used by non-gamers?

    Is that perception correct?

    Sort of. Volumes do make a difference, and dedicated graphics cards sell nowhere near as much as CPUs or DRAM modules do.

    Graphics chips have been increasing in size; they are throwing more transistors at the problem every generation. Larger chips = more cost to manufacture.

    They use bleeding-edge components, like the RAM, which increases cost.

    They suck up power. Systems in general have been decreasing in power usage for a good while, while high-end graphics cards usually float around 250-300 watts. More power = more power stages and components = more cost.

    So I would expect some price creep. But it's well known that Nvidia's top end is pure profit at this stage (and the balance sheets show it), so the price creep has really been down to a lack of competition.


  • Registered Users, Registered Users 2 Posts: 6,289 ✭✭✭Cordell


    It's a combination of factors indeed. Adjusted for inflation that €250 is around €315, the GPUs are now much larger and more complex, and the games are much more demanding than they used to be. And of course the good old corporate greed :)
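    A quick sketch of that inflation figure (the ~1.5% average annual rate here is an assumption for illustration, not an official CPI number):

        # Rough compound-inflation check on the "250 euro in 2005 is about 315 euro today" figure.
        # The 1.5% average annual rate is an assumed value for illustration.

        launch_price_2005 = 250.0
        avg_annual_inflation = 0.015
        years = 2020 - 2005

        adjusted = launch_price_2005 * (1 + avg_annual_inflation) ** years
        print(round(adjusted))  # ~313, in the ballpark of the 315 euro mentioned above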


  • Registered Users, Registered Users 2 Posts: 710 ✭✭✭mad turnip


    My 3080 from Overclockers arrived today. Hopefully everyone else's ships soon.


  • Registered Users, Registered Users 2 Posts: 4,314 ✭✭✭sink


    I had the Zotac Trinity on order from Amazon.de, but I've changed my mind and pre-ordered the EVGA 3080 XC3 Black from Alternate.de. It doesn't have a stock backplate, but I'm going to get an EK water block + backplate for it, and EK have confirmed they will be supporting this model.

    With EVGA, removing the stock cooler doesn't void the warranty, and I figured that if they release a 20GB version in the next 90 days I can also upgrade to it with the Step-Up program. So it seems the best choice overall.

    It's not one of the high-power models, but judging by the GN stream there doesn't seem to be much room to overclock these cards anyway.


  • Registered Users Posts: 898 ✭✭✭one armed dwarf


    I went and cancelled my Ventus order with Scan also; impressions place it right at the bottom of all the AIB models, so I'm happy to wait for a while tbh.

    Similar to the above, there's not a lot I'm playing right now that requires this kind of power. I mainly just want something that can rock Cyberpunk; it might be difficult to get a card before launch, but I'd rather have peace of mind than a substandard AIB.


  • Registered Users, Registered Users 2 Posts: 13,762 ✭✭✭✭Inquitus


    It's currently pretty much vapourware, sold out in 99% of places.

