
NVIDIA Kepler Information

  • 14-03-2012 4:08pm
    #1
    Registered Users, Registered Users 2 Posts: 7,181 ✭✭✭


    Figured we might as well start this.

    NB - When I mention anything, it's all rumour, and could completely change on launch day. Just bear that in mind.

    Right, just some general info for anyone who hasn't been following Kepler as of late. The part that's going to be released soon (all indications are for the 22nd of this month) is going to be based on the GK104 architecture. This was originally slated to be a mid-range Kepler part (similar to the 560 Ti), but because NVIDIA was expecting a hell of a lot more from AMD, they've just said "f*ck it" and have moved to compete directly with the 7970. Indications are that it'll beat it by about 20% or so on average, and cost around $550-600.

    [Image: NVIDIA_GeForce_GTX680_Specs.jpg]

    From the looks of things, it's going under the GTX 680 moniker, with the 780 arriving a couple of months from now based on GK110 - the true flagship part. This is intended to compete with the 7990 at this point. NVIDIA has also said that their Kepler GPUs (it's unclear if they were talking about the 104 or 110 variant) can run the Samaritan demo. That's one card, running it in real time. This is with FXAA, not MSAA though.

    NVIDIA also has a 690 planned that is likely going to be a slightly underclocked GK104 part coming soon-ish (not too sure on timeframe for this).

    As with the Ivy thread, I'll keep this updated with info as I find stuff.


    Personally, at this point I'm probably going to wait for the GK110 part, simply because I don't really want to only get three months out of a 680 waterblock, and not have it be compatible with a 780. (Having said that, if anyone is looking for a 580 and block, feel free to PM me.) Going to be an interesting launch day I think!



Comments

  • Registered Users, Registered Users 2 Posts: 4,983 ✭✭✭Tea_Bag


    they're releasing a 600 and 700 series GPU in the same generation?

    that's going to be annoying when Maxwell comes around. what idiots do the naming conventions for nvidia and AMD?


  • Registered Users, Registered Users 2 Posts: 7,181 ✭✭✭Serephucus


    It's technically the same generation, but it's going to be a while off. It's similar to the 480/580 thing. Odds are Maxwell will be the 800 series or something. (unless they decide to call the GK110 part a 685)


  • Registered Users, Registered Users 2 Posts: 22,929 ✭✭✭✭ShadowHearth


    Well, bad news for me so. I was expecting a 670 instead. I can't afford a 680. I really doubt the 7970 will drop in price if the 680 is priced so drastically above it.

    Are you saying as well that it is based on the current gen architecture? The real deal will be the 7xx series?

    Good time to buy a 570 or 580 though.


  • Registered Users, Registered Users 2 Posts: 7,181 ✭✭✭Serephucus


    God no, the 680 is Kepler, it's just that it was originally going to be the 670 because AMD was supposed to pull something a lot bigger out of the bag. Wait and see though, this could all change before launch / people could have got rumours wrong, etc.

    Also, the 7970 is $550-600 as well, so I'm not sure what you mean by the pricing bit.


  • Registered Users, Registered Users 2 Posts: 4,983 ✭✭✭Tea_Bag


    Thing about nvidia, is they're usually pretty bang on with their roadmaps.

    [Image: nvidia_kepler_maxwell_roadmap.jpg]

    I'll eat a raw potato if the 7970 competitor is less than €100 more expensive though.
    the €500 GPU market is a very touchy subject already, I don't see how anyone will want to spend €600+ on a GPU.

    I can't wait to see the X90 prices though. My bet's on $1000.


  • Registered Users, Registered Users 2 Posts: 22,929 ✭✭✭✭ShadowHearth


    Serephucus wrote: »
    God no, the 680 is Kepler, it's just that it was originally going to be the 670 because AMD was supposed to pull something a lot bigger out of the bag. Wait and see though, this could all change before launch / people could have got rumours wrong, etc.

    Also, the 7970 is $550-600 as well, so I'm not sure what you mean by the pricing bit.

    well 1 week to go so.

    still a 550-600 price tag? so in the EU it will be around 650eu at retailers? damn, you can build a nice budget pc for that money... pricing is really out of control.


  • Registered Users, Registered Users 2 Posts: 7,181 ✭✭✭Serephucus


    I should mention that the prices I gave were 550-600 Dollars.

    The 7970 is 550-600 Dollars as well on Newegg right now, so based on that, they'll be priced pretty similarly.

    And yup, it is supposed to go for around $1,000, so probably about €850-900 or something.

    @Shadow: How do you figure that? If it's $600, the most it's going to be here is €600, and even then, that's very unlikely.


  • Registered Users, Registered Users 2 Posts: 22,929 ✭✭✭✭ShadowHearth


    Tea_Bag wrote: »
    Thing about nvidia, is they're usually pretty bang on with their roadmaps.

    [Image: nvidia_kepler_maxwell_roadmap.jpg]

    I'll eat a raw potato if the 7970 competitor is less than €100 more expensive though.
    the €500 GPU market is a very touchy subject already, I don't see how anyone will want to spend €600+ on a GPU.

    I can't wait to see the X90 prices though. My bet's on $1000.

    all this bull**** pricing, just when PCs became more affordable... :rolleyes: damn, I hope this won't put us into the stone age again.

    out of interest, how much were the 6970 and gtx580 at launch?


  • Registered Users, Registered Users 2 Posts: 7,181 ✭✭✭Serephucus


    GTX 580 was about €500-550 I believe.


  • Registered Users, Registered Users 2 Posts: 22,929 ✭✭✭✭ShadowHearth


    Serephucus wrote: »
    I should mention that the prices I gave were 550-600 Dollars.

    The 7970 is 550-600 Dollars as well on Newegg right now, so based on that, they'll be priced pretty similarly.

    And yup, it is supposed to go for around $1,000, so probably about €850-900 or something.

    @Shadow: How do you figure that? If it's $600, the most it's going to be here is €600, and even then, that's very unlikely.

    well, when they say 550 dollars, it usually means 550eu++ for the EU. they were rumouring the price of the 7970 at 500 dollars, but it's really rare to see one for 500eu. almost all of them are 530ish and up. the only exception is the pixmania one, which is priced at 460eu.


  • Registered Users, Registered Users 2 Posts: 7,181 ✭✭✭Serephucus


    well, when they say 550 dollars, it usually means 550eu++ for the EU. they were rumouring the price of the 7970 at 500 dollars, but it's really rare to see one for 500eu. almost all of them are 530ish and up. the only exception is the pixmania one, which is priced at 460eu.

    Well that's all well and good, but you can't go comparing rumoured launch prices to actual prices in a different continent.

    They go for about $550-600 in the US, and about €500-550 here, which seems perfectly normal to me.
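
    (Rough back-of-the-envelope on why $550-600 in the US and €500-550 here line up: US prices are quoted before sales tax, EU prices include VAT. The figures below - roughly $1.31 per euro and 23% Irish VAT - are illustrative assumptions for early 2012, not quotes from any retailer.)

    # Hypothetical USD-to-EUR retail price sketch. Exchange rate and VAT
    # are assumed example values, not taken from the thread.
    USD_PER_EUR = 1.31   # assumed dollars per 1 euro
    VAT = 0.23           # Irish standard VAT rate in 2012

    def eur_retail(usd_msrp):
        """Convert a pre-tax US dollar MSRP to a VAT-inclusive euro price."""
        return usd_msrp / USD_PER_EUR * (1 + VAT)

    for usd in (550, 600):
        print("$%d -> ~EUR %.0f incl. VAT" % (usd, eur_retail(usd)))
    # $550 -> ~EUR 516, $600 -> ~EUR 563, i.e. roughly the €500-550 bracket above.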


  • Registered Users, Registered Users 2 Posts: 22,929 ✭✭✭✭ShadowHearth


    Serephucus wrote: »
    Well that's all well and good, but you can't go comparing rumoured launch prices to actual prices in a different continent.

    They go for about $550-600 in the US, and about €500-550 here, which seems perfectly normal to me.

    True. Rumors are rumors. If it will be 550eu then why not?


  • Registered Users, Registered Users 2 Posts: 6,696 ✭✭✭Jonny7


    Nvidia could be overindulging in a spot of propaganda - they want their potential customers to wait for their new cards in the belief they will stomp AMD's current offerings - so it will be interesting to see how much translates into performance rather than hype. If the cards are good we all benefit, as AMD will drop prices correspondingly.


  • Registered Users, Registered Users 2 Posts: 2,730 ✭✭✭dan_ep82


    Really wish they drove the 580 prices down a bit more.
    If you could get one for 250 I'd be tempted to get two to replace my 6870s.

    I'm looking forward to seeing what they have to offer in the price range of the 7870/7850. It's where the sweet spot is for price v performance for me.


  • Registered Users, Registered Users 2 Posts: 22,929 ✭✭✭✭ShadowHearth


    I for once would love to buy the most high end card :). I know it's stupid, but at least once in my life!!! I can't afford a new Skyline GTR to satisfy the petrolhead in me, but maybe I can get the best thing for my nerd side :)


  • Registered Users, Registered Users 2 Posts: 7,181 ✭✭✭Serephucus


    Depending on the performance, I might get one. As it is though my 580 trades blows with the 7970, so it'll have to beat it by a fair margin (and overclock well).


  • Registered Users, Registered Users 2 Posts: 618 ✭✭✭pandaboy


    Serephucus wrote: »
    [Image: NVIDIA_GeForce_GTX680_Specs.jpg]

    Looking at the CUDA cores it seems like this is going to be a beast card, especially for video editors. The GTX 580 holds just 512 so this does look like it has its value for money. Will you need the new Ivybridge processor to run this though? Is there a sandybridge board that supports PCIE 3.0?


  • Registered Users, Registered Users 2 Posts: 618 ✭✭✭pandaboy


    http://www.ubergizmo.com/2012/03/geforce-gtx-680-specs/

    Some more details released yesterday on the GTX 680.


  • Registered Users, Registered Users 2 Posts: 22,929 ✭✭✭✭ShadowHearth


    pandaboy wrote: »
    Looking at the CUDA cores it seems like this is going to be a beast card, especially for video editors. The GTX 580 holds just 512 so this does look like it has its value for money. Will you need the new Ivybridge processor to run this though? Is there a sandybridge board that supports PCIE 3.0?

    There are tons of gen 3 boards for Sandy, but I really doubt there will be any benefit from pcie 3.0. Well, if everything goes fine I will get a 680 and put it into my 2500k with pcie 2.0. We will be able to see how 2.0 works with that beast.
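
    (For context, a quick sketch of the theoretical per-direction bandwidth of a x16 slot on PCIe 2.0 vs 3.0 - a single 2012-era GPU rarely comes close to saturating even the 2.0 figure, which is why the practical benefit is expected to be small. These are the standard spec numbers, nothing measured.)

    # Theoretical per-direction bandwidth of a x16 PCIe slot.
    # PCIe 2.0: 5 GT/s per lane, 8b/10b encoding   -> 80% efficiency
    # PCIe 3.0: 8 GT/s per lane, 128b/130b encoding -> ~98.5% efficiency
    def x16_bandwidth_gb_s(gt_per_s, encoding_efficiency, lanes=16):
        bits_per_s = gt_per_s * 1e9 * encoding_efficiency * lanes
        return bits_per_s / 8 / 1e9  # GB/s

    print("PCIe 2.0 x16: %.1f GB/s" % x16_bandwidth_gb_s(5.0, 8.0 / 10.0))     # ~8.0 GB/s
    print("PCIe 3.0 x16: %.2f GB/s" % x16_bandwidth_gb_s(8.0, 128.0 / 130.0))  # ~15.75 GB/s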


  • Registered Users, Registered Users 2 Posts: 618 ✭✭✭pandaboy


    There are tons of gen 3 boards for Sandy, but I really doubt there will be any benefit from pcie 3.0. Well, if everything goes fine I will get a 680 and put it into my 2500k with pcie 2.0. We will be able to see how 2.0 works with that beast.

    Yeah I saw, but they seem fairly redundant. I'm looking to build my first rig, so I was checking out the info on PCIe; going to wait for Ivy Bridge and see if the 2011 socket boards come down in price at all.

    Looking at the power required by the 680, it seems like it should have no hassle on a PCIe 2 board. It uses less power than a 580, but again benchmarks remain to be seen. Looking forward to release though. I'm half tempted to do either a dual 560 Ti or 580 set-up depending on final budget and prices, so hopefully the prices should drop.


  • Registered Users, Registered Users 2 Posts: 22,929 ✭✭✭✭ShadowHearth


    Well prices have already dropped a lot for the 570 and 580. It's a real bargain now!
    On the other hand the 680 does look really impressive, but it won't give much advantage over current games unless you go for something like triple monitors or a very high resolution.

    The 680 is just to increase your e-penis size to be honest, not a practical solution. And I want that bigger e-penis :)


  • Registered Users, Registered Users 2 Posts: 2,730 ✭✭✭dan_ep82


    Well prices have already dropped a lot for the 570 and 580. It's a real bargain now!
    On the other hand the 680 does look really impressive, but it won't give much advantage over current games unless you go for something like triple monitors or a very high resolution.

    The 680 is just to increase your e-penis size to be honest, not a practical solution. And I want that bigger e-penis :)

    You'll need 2 to take advantage of the dual/triple screen set up as well, unless nvidia has something similar to eyefinity now?


  • Registered Users, Registered Users 2 Posts: 7,181 ✭✭✭Serephucus


    Well prices have already dropped a lot for the 570 and 580. It's a real bargain now!
    On the other hand the 680 does look really impressive, but it won't give much advantage over current games unless you go for something like triple monitors or a very high resolution.

    The 680 is just to increase your e-penis size to be honest, not a practical solution. And I want that bigger e-penis :)

    Everyone says this about every flagship card on every launch, and they're almost always wrong. Yeah it's over-priced for what you get, especially when you compare it to the X70 card, but to say there's no benefit in getting one unless you're running 57x12 or something isn't true. (See Unreal Engine 4 ;))


  • Registered Users, Registered Users 2 Posts: 618 ✭✭✭pandaboy


    Well prices have already dropped a lot for the 570 and 580. It's a real bargain now!
    On the other hand the 680 does look really impressive, but it won't give much advantage over current games unless you go for something like triple monitors or a very high resolution.

    The 680 is just to increase your e-penis size to be honest, not a practical solution. And I want that bigger e-penis :)

    For editing though it should be a powerful card. I had to double check when I saw the CUDA cores; now whether the drop in power can fully support the CUDA cores is another thing. Premiere Pro should be a doddle with this card now.


  • Registered Users, Registered Users 2 Posts: 22,929 ✭✭✭✭ShadowHearth


    Serephucus wrote: »
    Everyone says this about every flagship card on every launch, and they're almost always wrong. Yeah it's over-priced for what you get, especially when you compare it to the X70 card, but to say there's no benefit in getting one unless you're running 57x12 or something isn't true. (See Unreal Engine 4 ;))

    Ah yeah, I should be more clear. It's still an awesome card and I will hopefully get it at launch, it's just that after playing bf3 yesterday on all ultra, with only motion blur off and AA down, I said to myself: right, this game is the most demanding on the market and I play it pretty much maxed out... What is the point of spending so much money now on a new gen card?! There is nothing announced for pc that will need very high specs...

    Dead on there bud. The 570 does look like the bargain of the century and makes the new card look like a product which is not really needed, but there are always people who want more performance, even if they won't use much of it, yet!

    I just hope the missus won't see a bill of 600eu for one item on my account... I kid you not, one of our cars - a ford mondeo - cost 500eu... She will flip if she sees that I spent so much on a damn pc part! :)


  • Registered Users, Registered Users 2 Posts: 4,983 ✭✭✭Tea_Bag


    BF3 is actually a beast of a game. on all ultra settings @ 1080p it's using 99% of my 6990 to run. I don't see how people say they max it on something like a 560 Ti. they can't possibly get away with it on a 64-player map without some fps spikes?

    there definitely is a market for the flagships. 6 months ago if you told me I'd own a €700 GPU I'd have laughed in your face. I'm now moving to a 2560*1440 monitor, so I've destroyed any notion of going down the budget card route again.

    thing about these Kepler cards is that Nvidia has pulled its classic move of: let AMD have the early market, give us 3 months to refine our cards to ensure they're 20% faster across the board, and slap on a nice premium.


  • Moderators, Technology & Internet Moderators Posts: 17,137 Mod ✭✭✭✭cherryghost


    My OC'd 570 is on par with the 580 and some 7970 benchmarks, so as Serephucus said it'll need to be competitively priced and be more than 20% better than the 580 to warrant any purchase for me.


  • Registered Users, Registered Users 2 Posts: 2,730 ✭✭✭dan_ep82


    To get a constant 60fps+ I run BF3 on high, no AA but high FXAA, and I'm using 6870 CF.

    To be honest I think VRAM is holding me back from ultra, but I'm going to try OC'ing them again to see if I can push the settings a bit more.

    I've always been an AMD buyer, purely for price v performance. But with the prices of the 2500k and the drop in the GPUs, I'm finding myself swaying towards upgrading to S-Bridge and a 580 of their old line, or one of their newer mid range cards.


  • Registered Users, Registered Users 2 Posts: 4,983 ✭✭✭Tea_Bag


    I'll play tonight and monitor my VRAM usage to see what it uses. I remember reading that BF spikes to 1200mb in some cases.


  • Registered Users, Registered Users 2 Posts: 22,929 ✭✭✭✭ShadowHearth


    My OC'd 570 is on par with the 580 and some 7970 benchmarks, so as Serephucus said it'll need to be competitively priced and be more than 20% better than the 580 to warrant any purchase for me.

    Aren't they saying that it will be 20% more than the 7970, not the 580, m8?


  • Registered Users, Registered Users 2 Posts: 2,730 ✭✭✭dan_ep82


    Tea_Bag wrote: »
    I'll play tonight and monitor my VRAM usage to see what it uses. I remember reading that BF spikes to 1200mb in some cases.

    Thanks, I think the 1gb is holding me back in this and future games :(


  • Registered Users, Registered Users 2 Posts: 22,929 ✭✭✭✭ShadowHearth


    dan_ep82 wrote: »
    Thanks, I think the 1gb is holding me back in this and future games :(

    If you got crossfire, won't the system use space from both GPUs? Like 2x6870 will make 2gb? Or will it use only one GPU's VRAM?


  • Moderators, Technology & Internet Moderators Posts: 17,137 Mod ✭✭✭✭cherryghost


    Aren't they saying that it will be 20% more than the 7970, not the 580, m8?

    The 580 and 7970 are closely matched in some benchmarks. My 570 is pretty much on par with stock 580 performance. So I want it more than 20% faster than my 570 :P


  • Registered Users, Registered Users 2 Posts: 22,929 ✭✭✭✭ShadowHearth


    The 580 and 7970 are closely matched in some benchmarks. My 570 is pretty much on par with stock 580 performance. So I want it more than 20% faster than my 570 :P

    Ya cheeky bugger :)


  • Registered Users, Registered Users 2 Posts: 7,692 ✭✭✭Dublin_Gunner


    The best thing about Kepler supposedly performing so well, is bringing current high end parts down to price levels where value can be had.

    Nvidia are really just screwing with their potential customers if they truly decide to release GK104 as a high end part.

    Up until very recently, it was going to be a mid range part, with performance similar to the 7950/70 - but at mid range prices.

    If they kept with this model, they would sell shed loads of not only the GK104, but also the GK110 when it releases - plus there'd be no need for artificially blinding people with a 2-in-1 generation.


    And no, it's not like the GTX 480 / 580 - as that was a re-spin of the silicon on a new manufacturing node, with some improvements. GK104/110 is the exact same architecture (just with more execution units in 110, obviously).

    The 4xx series should not have been released as early as it was - nvidia needed to get something out the door though.


  • Registered Users, Registered Users 2 Posts: 2,730 ✭✭✭dan_ep82


    If you got crossfire, won't the system use space from both GPUs? Like 2x6870 will make 2gb? Or will it use only one GPU's VRAM?

    Doesn't work that way unfortunately, only uses 1gb of VRAM.


  • Registered Users, Registered Users 2 Posts: 4,983 ✭✭✭Tea_Bag


    If you got crossfire, won't the system use space from both GPUs? Like 2x6870 will make 2gb? Or will it use only one GPU's VRAM?

    both cards mirror the same data before compute, meaning that the 'effective' RAM is still only 1gb.

    if you CF or SLI with two different cards, they'll have the effective RAM of the lesser card, as well as operate at the speed of the slowest card.

    I actually get annoyed when reviewers call the 6990 a '4gb' monster. they should know better, and it only confuses people.
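
    (A small sketch of the rule described above for a mixed CF/SLI pair - each GPU keeps its own full copy of the working data, so the usable VRAM is the smaller card's, not the sum. The clock part is the claim as stated here; see the replies below for a caveat. Card figures are the usual reference specs, used purely as an example.)

    # Effective specs of a two-card CF/SLI pair under the "mirrored data" rule above.
    def effective_pair(card_a, card_b):
        return {
            # each GPU holds a full copy of the textures/framebuffer data,
            # so usable VRAM is that of the smaller card, not the sum
            "vram_mb": min(card_a["vram_mb"], card_b["vram_mb"]),
            # per the rule above, the pair runs at the slower card's clock
            # (disputed for SLI a few posts down)
            "core_mhz": min(card_a["core_mhz"], card_b["core_mhz"]),
        }

    hd6870 = {"vram_mb": 1024, "core_mhz": 900}
    hd6850 = {"vram_mb": 1024, "core_mhz": 775}
    print(effective_pair(hd6870, hd6850))  # {'vram_mb': 1024, 'core_mhz': 775}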


  • Registered Users, Registered Users 2 Posts: 7,692 ✭✭✭Dublin_Gunner


    Only the 'main card' in xfire or SLI uses its framebuffer for output, therefore only the framebuffer of the main card can really be counted as effective gfx ram.

    So yes, only the ram from 1 card is actually used in the traditional sense as the framebuffer output of the xfired / SLI'd cards.


  • Registered Users, Registered Users 2 Posts: 7,181 ✭✭✭Serephucus


    Tea_Bag's right: the data's mirrored across both cards, not stored on one. Equates to the same thing, but I'm just being pedantic. :)

    If you use two different cards though, they'll actually operate at their respective speeds, they won't downclock to the lower. Common misconception. Not sure on framebuffer sizes though.


  • Moderators, Computer Games Moderators Posts: 4,282 Mod ✭✭✭✭deconduo


    Serephucus wrote: »
    If you use two different cards though, they'll actually operate at their respective speeds, they won't downclock to the lower. Common misconception. Not sure on framebuffer sizes though.

    Really? So the performance of a 5770+5750 would be somewhere between 2x5750 and 2x5770? That's interesting to know.


  • Registered Users, Registered Users 2 Posts: 7,181 ✭✭✭Serephucus


    Sorry, I was talking only of SLI there. I'm not sure exactly how CF goes.


  • Registered Users, Registered Users 2 Posts: 4,983 ✭✭✭Tea_Bag


    Serephucus wrote: »
    Sorry, I was talking only of SLI there. I'm not sure exactly how CF goes.
    are you sure? wouldn't that severely increase tearing? each GPU does an alternate line (or in some games one does the top half of the screen and the other does the bottom), and if one line is rendered faster than the one below it, it'll just be awful.

    it might be a misconception, but I don't see how a 5770 + 5750 will perform better than a 5750 + 5750.


  • Registered Users, Registered Users 2 Posts: 7,181 ✭✭✭Serephucus


    Well it wouldn't work very well with AFR, but with SFR I don't see why it wouldn't work. The slower GPU would just render less of the frame than it otherwise would have.

    I'm not 100% sure, but I remember seeing a post by an Admin on the NVIDIA forums a while back (480-ish time) that said as much.
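
    (Rough sketch of that idea - with SFR-style splitting, each GPU can be handed a band of the frame in proportion to its throughput, so a mismatched pair still contributes usefully instead of the faster card idling. The relative-throughput numbers are made-up placeholders, not benchmarks.)

    # Split-frame rendering (SFR) work split between two GPUs of different speed:
    # each GPU gets a band of scanlines proportional to its relative throughput.
    def sfr_split(frame_height, throughput_a, throughput_b):
        rows_a = int(round(frame_height * throughput_a / (throughput_a + throughput_b)))
        return (0, rows_a), (rows_a, frame_height)  # (top band, bottom band)

    # made-up throughputs roughly in 5770-vs-5750 territory
    top, bottom = sfr_split(1080, 1.0, 0.85)
    print("faster GPU renders rows", top)     # (0, 584)
    print("slower GPU renders rows", bottom)  # (584, 1080)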


  • Registered Users, Registered Users 2 Posts: 712 ✭✭✭deejer


    Some "independant" reviews done on performance difference between the 580/6970/680 here.

    http://www.techpowerup.com/162498/GTX-680-Generally-Faster-Than-HD-7970-New-Benchmarks.html


  • Registered Users, Registered Users 2 Posts: 7,181 ✭✭✭Serephucus


    You jerk. :P I found this literally three minutes ago.

    That's interesting alright. About a 20% improvement over my 580, and about a 40% improvement over a stock 580. Might have to pick up one of these!


  • Registered Users, Registered Users 2 Posts: 712 ✭✭✭deejer


    Serephucus wrote: »
    You jerk. :P I found this literally three minutes ago.

    That's interesting alright. About a 20% improvement over my 580, and about a 40% improvement over a stock 580. Might have to pick up one of these!

    I think you should ;)

    I spend a lot of time on that site when I am at work. Helps break up the day :D


  • Registered Users, Registered Users 2 Posts: 618 ✭✭✭pandaboy


    Did that cooler come from Iron Man's chest?


  • Registered Users, Registered Users 2 Posts: 4,983 ✭✭✭Tea_Bag


    deejer wrote: »
    Some "independant" reviews done on performance difference between the 580/6970/680 here.

    http://www.techpowerup.com/162498/GTX-680-Generally-Faster-Than-HD-7970-New-Benchmarks.html
    fishy.

    "an extreme-cooled Intel Core i7-3960X Extreme Edition processor (running at stock frequency)"

    [Image: 179a.jpg]


  • Registered Users, Registered Users 2 Posts: 7,692 ✭✭✭Dublin_Gunner


    Tea_Bag wrote: »
    fishy.

    "an extreme-cooled Intel Core i7-3960X Extreme Edition processor (running at stock frequency)"

    lol so true.

    Let's waste lots of LN2 running it at stock, for no reason whatsoever.

    Maybe they literally had no other CPU cooler in their lab?? :D:D

    **The screenshot just below in that thread actually shows a 3960X @ 5GHz, and it's only 300 points higher than the supposed 'stock' result in 3DM11. Complete BS.


  • Registered Users, Registered Users 2 Posts: 7,181 ✭✭✭Serephucus


    Holy crap, finally! Take that eyefinity!

