
The first "proper" Kepler news Fri 17th Feb?

It seems the GTX670 will have 6+6 connectors and the GTX680 will have 6+8 connectors according to Chiphell.

Over on HardOCP, Kyle Bennett says he has heard that the GK104 is around 45% to 50% faster than a GTX 580 in Nvidia's own benchmarks, and that it has single-card surround gaming as standard:

http://hardforum.com/showpost.php?p=1038447903&postcount=437

http://hardforum.com/showpost.php?p=1038447905&postcount=438

I wonder if AMD will counter with an HD7980??

It'll be a similar speed to the 7970 so no need to counter with anything other than price.
 

Not true really. Let's say for instance this came out at £250, slightly beat a 7950, and slightly lost to a 7970: you'd see the 7950 hit circa £250 and the 7970 maybe £275-300. If you then bring out an 1150-1200MHz-clocked 7980, you can still sell it for £400 as it's "the fastest card available". For every sensible person who buys the cheapest 7950, overclocks it to 1200MHz, and only loses 5-10% of total performance, you get a few people who will buy the same damn core, called something different, priced at £400.

Why not have a card available to satisfy those people who HAVE to have the top card at any cost, and take their money if it's on offer? We can see from the overclocking results that binning cards to get a dozen per wafer that can be sold at a higher price would be pretty damn easy; AMD would be pretty mental not to do it. Not only would they make an extra 30% profit on those people, but PR-wise they can make the GK104 look a bit poo, and also make the GK110/112/680gtx/whatever the heck it's called, when it's finally released, look less good in the process. "You took 8-9 months to come up with something 15% faster?" looks a lot better than "You may be 9 months late, but it's 40% faster, what a card" ;)
 
Indeed...

I imagine that "synthetic benchmarks" means Heaven, 3Dmark11, and perhaps one other. Tessellation and/or PhysX will undoubtedly play a large role.

Gaming performance will be the real test, particularly performance-per-Watt.

Who really cares about performance per watt in high end gaming?
Performance per pound, yeah, sure. And as long as the heatsink is good enough, if a card uses 50 more watts for equivalent performance but is 100 quid cheaper, I'm getting that faster, cheaper card.
 
Can anyone remember how long after the release of the GTX 580 it took the likes of Gigabyte and MSI to release non-reference cooler versions, i.e. the WINDFORCE and Twin FrozR II/III models?

Hope I don't have to wait too long when the GTX 680 comes out.
 

A while. IIRC MSI did the two Lightning versions first.

There wasn't that much point really, given that the 570 and 580 were easily tamed with the stock cooler.
 
Well basically the 670 will have a smaller fan letting you use the two 6 pins.

The 680 will have a larger fan and will use the stacked 6 + 8 pin.

Both will use the same PCB and be 256-bit or 512-bit (most likely 256-bit).
 
The standard 570/580 cooler with 50+ watts less to dissipate would be perfect ;)

It's just a shame that GPU manufacturers want to be so old fashioned IMO.

Blower coolers have been around forever now. Surely for something that costs upwards of £400 they would at least try to make it look a little more exotic. Kinda like the EVGA 580 Classified.

Tell you what man, have a dig around Youtube for some of the renderings of cards. Some of them are just amazing :)
 
I still say that, given the size restrictions, they do a damn good job of dissipating 200+ Watts; compare them to what people hang on 100W CPUs ;)

The 580 Classified was a standard 580 cooler block with a little add-on on the horn extension. The lower/base plate was the most intelligently designed piece I've ever seen on a gfx card, but generally it could have been so much better with a little more thought.
 
Who really cares about performance per watt in high end gaming?

Because we are increasingly limited by power draw. If we are limited to roughly 300W from a single-GPU card, then performance-per-Watt may determine the maximum performance available from the card.

It shouldn't be an issue with GK104 directly as it is the smaller mid-range architecture, but if GK104 has poor performance-per-Watt (i.e. using >250W) then that really doesn't leave anywhere for the full Kepler GK110 to go - especially when you consider that the extra compute features on GK110 will drop the power efficiency somewhat.

Performance-per-Watt in GK104 should give us a good idea of what to expect from GK110 - and from future 28nm Nvidia refreshes.
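The point above is just arithmetic: with board power capped, the best performance a card can deliver is its efficiency times the power budget. A throwaway sketch, with all figures invented purely for illustration:

```python
# Illustrative sketch only: why a fixed board-power cap makes
# performance-per-Watt the ceiling on absolute performance.
# All numbers are made up; "units" is an arbitrary performance score.

POWER_CAP_W = 300.0  # rough single-GPU board power limit discussed above

def max_performance(perf_per_watt, cap_w=POWER_CAP_W):
    """Best-case performance (units) a card can deliver at the power cap."""
    return perf_per_watt * cap_w

# Two hypothetical chips on the same process, differing only in efficiency:
print(max_performance(0.50))  # 150.0 units at the 300W cap
print(max_performance(0.25))  # 75.0 units at the same cap
```

With the cap fixed, the only way the less efficient chip can close the gap is by breaking the power budget, which is exactly the corner a hot GK104 would paint GK110 into.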
 

I think something will have gone horribly wrong in the Nvidia camp if GTX 580/7950-level performance ends up with a TDP 100 watts higher than AMD's 28nm part, and the same as the 580's on the older process.
 
Hard launch or paper launch??

You forgot block of wood launch.

:D

ZOTAC's Operations Director for the Asia-Pacific region said on Weibo:

"The GK104 launch won't be delayed to April, and the relevant products are ready".

If the relevant products are ready, then why didn't they release them earlier to stop people buying red GPUs?
 