
Kepler real specs... maybe?

But it does show people are willing to pay those hefty prices though, eh?

I'd love two in CF, but they're $600+ over here... :eek: 7970s are $700+, and at some online shops $800+. Even though I have the money, there's no way I'm bowing down to those prices.

Is that including the sales tax, or whatever it is that's added at the checkout?
 
AMD might do a small refinement to these cards, but typically they don't. 30% on the clocks is possible, but maybe not realistic.
Why not realistic? Take a look at the figures for power consumption on the significantly overclocked 7970s.


http://hardocp.com/article/2012/01/25/asus_radeon_hd_7970_video_card_review/8

Then think about the kind of cooler a card like that will need.
Cards sold at retail will be put into all sorts of awful systems, and they need to be quite robust as a result.

I don't see why Nvidia wouldn't bring the CUDA core count up to 768, then use the spare die area either to shrink the die, increasing profit, or to enhance the acceleration features mentioned on other websites.

It's rather sweet that you think a big Kepler will use the same or less power than a GTX 580; it won't. A bog-standard card that is 45% faster than a stock-clocked 7970 will NOT use under 300W, likely much more.

Also, a few key things to point out. First, top-end cards sell almost smeg all through Dell and the like; Dell barely gives a damn about high-end cards. Their entire business is based around making $50 on a $400 machine, times tens of millions a year. The couple of thousand $3,000 computers with a $500 profit make almost nothing comparatively.

Secondly, big Kepler is going to use silly amounts of power. Thirdly, FurMark is ENTIRELY irrelevant to everything, in every single way possible.

Average power draw of the 7970 in a demanding game (I forget if it was Metro or Crysis 2): 165W.

Further, the Asus card's cooler dealt with that, so no, it actually wouldn't need another cooler.

FurMark peak power usage means smeg all; average power usage in games is the only metric that matters. Peaking at 200W in a game doesn't matter: it happens for a second, and the heatsink doesn't suddenly overload in half a second. That is the whole point of a heatsink; for all intents and purposes it averages out the heat load. 165W average power usage in games is nothing, and it's not hugely more even when heavily overclocked.
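The "heatsink averages out the heat load" point can be sketched with a toy lumped thermal model. All figures here are assumptions for illustration (thermal capacitance, resistance, and power numbers are made up, not measured from any card): the heatsink responds on a time-scale of tens of seconds, so a half-second 200W spike barely moves the temperature compared with the 165W average.

```python
# Toy lumped-mass heatsink model (illustrative numbers only): temperature
# obeys dT/dt = (P - (T - T_amb)/R) / C, so brief power spikes are filtered
# out on the R*C time-scale rather than hitting the die instantly.
C = 350.0   # J/K  -- assumed heatsink thermal capacitance
R = 0.25    # K/W  -- assumed heatsink-to-ambient thermal resistance
dt = 0.1    # seconds per simulation step

def simulate(power_trace, t_ambient=25.0):
    """Forward-Euler integration of the heatsink temperature."""
    t = t_ambient
    for p in power_trace:
        t += dt * (p - (t - t_ambient) / R) / C
    return t

steady = simulate([165.0] * 6000)                 # 10 minutes at a steady 165 W
spiked = simulate([165.0] * 5995 + [200.0] * 5)   # same, ending with a 0.5 s 200 W spike

# The half-second spike shifts the temperature by only a few hundredths of a degree.
print(round(steady, 2), round(spiked, 2))
```

With these assumed values the steady-state temperature is ambient + 165 × 0.25 ≈ 66°C, and the spike adds roughly 35 × 0.5 / 350 ≈ 0.05°C, which is why average (not peak) power is what the cooler has to handle.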

Big Kepler will come out with power usage that makes the 7970 look poor in performance but great in performance/watt. Then, once Nvidia breaks the 300W single-card barrier big time, AMD will either release an 8970 with 1200MHz clocks and close the gap significantly, almost to nothing, or release a 7980 with the same clocks.

I care less about using 300W average in games than about the 80W idle the 4870 used, which was horrible. 80W all day every day, versus 15W all day every day with 300W for a few hours here and there. Idle is basically "fixed" on both AMD and Nvidia these days, though for dual screens it's not where it should be (which is nuts, for both companies).

Honestly, the last bit I don't know what you're talking about. There's nothing wrong with 768 shaders. The issue is that manufacturing is imperfect: off a wafer you'll get X% of dies that don't work at all, X% that work but not fully, and X% that work fully. You rarely if ever disable more than 20% of a die to "bring yields up", meaning that a native 768-shader die, which is what GK104 will likely be, will without question always produce a bunch of chips that won't work with all 768 shaders enabled but will work with 10-15% fewer. Every generation of GPUs and CPUs has salvage parts.
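The salvage-part argument above can be sketched with a toy binning model. Every number here is an assumption for illustration (cluster count, defect rate, and the 768/640 split are hypothetical, not GK104 specs): defects land randomly on shader clusters, dies with one bad cluster ship as a cut-down part, and worse dies are scrapped.

```python
import math

# Toy die-binning sketch (all numbers assumed, purely illustrative):
# model a hypothetical 768-shader die as 6 clusters of 128 shaders,
# with defects spread uniformly so each cluster survives with
# probability exp(-defects_per_die / clusters).
clusters = 6
defects_per_die = 0.8   # assumed average defect count per die

p_cluster_ok = math.exp(-defects_per_die / clusters)

# All clusters work -> full 768-shader part.
full = p_cluster_ok ** clusters
# Exactly one bad cluster -> salvage part with 640 shaders enabled.
salvage = clusters * (1 - p_cluster_ok) * p_cluster_ok ** (clusters - 1)
# Two or more bad clusters -> scrap.
scrap = 1 - full - salvage

print(f"full 768sp: {full:.1%}, salvage 640sp: {salvage:.1%}, scrap: {scrap:.1%}")
```

Even with this modest assumed defect rate, under half the dies come out fully enabled, which is exactly why a salvage SKU with a cluster disabled always exists alongside the full part.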
 
Is that including the sales tax, or whatever it is that's added at the checkout?

Yeah, it is inc. tax... I'd say most of us are waiting for the prices to drop too. We have less competition over here as well, so prices will stay higher for longer. And Aussie is the most expensive country I've ever lived in... it's mental! 8-10 bucks for a beer... hello!! :confused: (that's 5-6 quid)
 
Well, they are certainly plausible...

They're quite similar to my guessed values from a while ago, actually (see here):



Of course there's no way to know if they're actually real at this stage.


One thing to note though: in order for the performance to be consistent with those estimates, each CUDA core must effectively be performing four floating-point operations per core clock. This can be achieved by either:

a) 'hot-clocked' shaders, running at twice the core clock and performing two FP ops per clock (like Fermi), or

b) 'base-clocked' shaders, running at the core clock and performing four FP ops per clock.
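The two options above give identical peak throughput, which a quick sanity-check shows. The clock figure here is an assumption purely for illustration (not a leaked spec), and the 768-core count is the hypothetical one being discussed:

```python
# Peak FP32 throughput = cores x shader clock x FP ops per clock.
# All figures are illustrative assumptions, not real Kepler specs.
def peak_gflops(cores, shader_clock_mhz, ops_per_clock):
    return cores * shader_clock_mhz * ops_per_clock / 1000.0

core_clock = 950  # MHz -- assumed base clock for a hypothetical GK104

# a) hot-clocked shaders: 2x the core clock, 2 FP ops per clock (Fermi-style)
hot = peak_gflops(768, 2 * core_clock, 2)
# b) base-clocked shaders: 1x the core clock, 4 FP ops per clock
base = peak_gflops(768, core_clock, 4)

print(hot, base)  # identical peak throughput either way
```

Either design lands on the same headline GFLOPS number; the difference is in power, area, and scheduling, which is why the linked paper on dropping hot clocks is relevant.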

On that note, this is an interesting paper to read:
http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=6045685

If any of the recent rumours are true, it may well be that NVIDIA's research in this area has made some forays into Kepler's design... IF.
 
Yeah, it is inc. tax... I'd say most of us are waiting for the prices to drop too. We have less competition over here as well, so prices will stay higher for longer. And Aussie is the most expensive country I've ever lived in... it's mental! 8-10 bucks for a beer... hello!! :confused: (that's 5-6 quid)

It is funny how things work out. You'd kind of expect Fosters to be cheap in Australia, but I bet it's not, in the same way Heineken is expensive in Amsterdam. I know these two beers are brewed in other countries for sale around the world, but when you're in the home town of one, you do expect a cheaper deal, to be honest.
 
It is funny how things work out. You'd kind of expect Fosters to be cheap in Australia, but I bet it's not, in the same way Heineken is expensive in Amsterdam. I know these two beers are brewed in other countries for sale around the world, but when you're in the home town of one, you do expect a cheaper deal, to be honest.

haha... made me laugh!! :D

Mate, they don't drink Fosters over here (I don't even know if they sell it here; I've never seen it). It's made for Europe so they think Aussies drink it... but they don't.
 
I read a few sites saying Nvidia is working heavily on getting physics/fluids running extremely well on Kepler, but I swear the same rumours circulated for Fermi before it launched.
 
I read a few sites saying Nvidia is working heavily on getting physics/fluids running extremely well on Kepler, but I swear the same rumours circulated for Fermi before it launched.

Fluid dynamics is an interesting one. I was thinking strictly in the in-game physics sense before, but it's applicable to a much wider area of maths where the compute side would be very useful.

It would be nice to see some movie-CGI-level water in games, rather than rubbish static polygons and a handful of billboard images trying to look like water.
 
By the time Kepler is out, there will be non-reference HD 7970s on the market, which will be cheaper, cooler and faster than the reference cards.

Taking a step back, it doesn't seem like Nvidia cares too much about this sector; they seem to be concentrating on OEM/mobile solutions rather than enthusiast desktop GPUs.
 
I guess when you consider that going from the GTX 285 to the GTX 480 the SPs doubled from 240 to 480, it isn't too wild a claim that going from the GTX 580 to a GTX 680 they'll double again (512 → 1024); similarly, going from the 8800 GTX to the GTX 280 the shaders nearly doubled (128 → 240).
 
drunkenmaster said:
It's rather sweet that you think a big Kepler will use the same or less power than a GTX 580; it won't. A bog-standard card that is 45% faster than a stock-clocked 7970 will NOT use under 300W, likely much more.
Basing that on what idea? Assuming Nvidia hasn't refined or changed their designs at all in the past 18-24 months:
Starting off, Nvidia shrinks the die from 520mm² down to ~360mm².
They drop the hot clocks, which costs a bit of speed, but the clock headroom of the whole chip is now higher.
So we'd have a GTX 580 that is smaller, roughly the same speed, and with much less power consumption.

drunkenmaster said:
Honestly, the last bit I don't know what you're talking about. There's nothing wrong with 768 shaders
Straw man argument. I never said there was a problem with 768 shaders; in fact, it's probable that their high-end part will have exactly that, which is what I said.

gregster said:
Man I am begging to see some "hard evidence"
You and everyone else looking at the 79x0 series pricing.
 
By the time Kepler is out, there will be non-reference HD 7970s on the market, which will be cheaper, cooler and faster than the reference cards.
You mean like the Rev2 5800s and 6900s that didn't overclock anywhere near as well as the original reference designs? Rev2 cards are usually built to reduce costs rather than increase performance. OEMs use cheaper VRMs, caps etc., and stick on a bigger-looking cooler that actually costs less because it contains less copper. Still, adding a cheap extra fan can make up for the lack of copper, and make people think they've got a good deal.

In my experience with both AMD and Nvidia cards, non-reference cards are mostly inferior to reference designs. You do get the occasional exception, like the Gigabyte SOCs, but most of the time OEMs just want to save a few dollars and charge a bit more for their "fantastic" new cooler design.
 
$650 = £410 sounds about right in terms of pricing, if Nvidia delivers a card which draws level with or beats the 7970.
 
Guess we've only got AMD to thank for what's turning into a truly horrendously priced generation.
Hip hip!

Or you could thank Nvidia for refusing to drop GTX 580 prices before the 79xx series launched.

They are both equally at fault.


Actually, scratch that: it's no one's fault. They are luxury items with the associated luxury pricing. If you don't want one at the current prices then don't buy one, but for the love of god stop moaning about it.

You don't hear me moaning about how expensive the new McLaren is and how we've got Ferrari to thank for that.


Anyhow, I like the current prices, it means I don't have to suffer the stigma of having the same card as all the riff raff!

Toodle pip!
 
Yes.
I sold one of my 6870s to my GF's dad for 80 quid, then sold the other one here.
I wanted to go single-GPU, and I like my silence, so I would have watercooled it.
I read a few reviews; the PCS+ cooler is both quiet and decent, so I opted for a 7950 PCS+ for 370 quid delivered.
It beats my 6870 crossfire setup once it's overclocked, and it's near silent; in that regard it's perfect, but it's a 200-250 quid card, not a 370 quid card.

So first you complained about people trying to justify the high prices, and then you go out and buy a card? That's about as much of a justification as you can give! :p
 