
The first "proper" Kepler news Fri 17th Feb?

Intel own all their fabrication sites, so I would assume they don't charge themselves to use them. ;) As far as I'm aware most, if not all, R&D is done "in-house", so to speak.

Edit - Also, graphics cards have a lot more on them than a CPU does: the ports (HDMI etc.), the VRAM, the capacitors, the PCB, the cooling, the GPU itself. It all adds up.

Yeah, yeah, but the 28nm fab costs are supposed to be less - more chips per wafer?
Maybe supply/cost has not settled down yet then.
 

Although technically it's using less silicon, it's still expensive and very complex. R&D costs lots of pennies.

These are just initial prices, they'll come down eventually.

The whole "the technology is getting cheaper to make, so the cards should be cheaper" argument is a fallacy when this is just the first iteration of 28nm, and there is much, much more going on than just the design of the GPU.
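To put some (entirely invented) numbers on that point: a shrink does give you more candidate dies per wafer, but if the wafers cost more and the process is immature, the cost per good die can still be higher than on the older node. A rough Python sketch, using a standard dies-per-wafer approximation and a simple Poisson yield model; the wafer prices, die areas and defect densities are made up for illustration, not real TSMC figures:

```python
# Rough illustration only: wafer prices, die areas and defect densities below
# are assumptions made up for this sketch, not real TSMC figures.
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """First-order approximation for gross (candidate) dies per wafer."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost: float, die_area_mm2: float,
                      defects_per_mm2: float) -> float:
    """Wafer cost divided by yielded dies, using a simple Poisson yield model."""
    gross = dies_per_wafer(300, die_area_mm2)            # assume 300mm wafers
    yield_fraction = math.exp(-defects_per_mm2 * die_area_mm2)
    return wafer_cost / (gross * yield_fraction)

# Mature 40nm: cheaper wafer, well-sorted yields, but a big ~520mm2 die.
print(cost_per_good_die(4000, 520, 0.0015))   # roughly $82 with these inputs
# Early 28nm: smaller ~365mm2 die, but a pricier wafer and immature yields.
print(cost_per_good_die(6000, 365, 0.0040))   # roughly $164 with these inputs
```

With these assumed inputs the young 28nm part works out roughly twice as expensive per working die as the mature 40nm one, even though far more dies fit on the wafer. As yields mature and wafer prices fall that relationship flips, which is why "it'll come down eventually" is the usual story.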
 
NewEgg on the EVGA 680


Anyone else notice they have raised the cooling fan from the normal center line? That is why the power connectors are moved.
Also better for a reference fan, as it only blows from the bottom half... still a shame they blocked the exhaust with a massive DVI connector :confused:
 
Although technically it's using less silicon, it's still expensive and very complex. R&D costs lots of pennies.

These are just initial prices, they'll come down eventually.

The whole "the technology is getting cheaper to make, so the cards should be cheaper" argument is a fallacy when this is just the first iteration of 28nm, and there is much, much more going on than just the design of the GPU.

Yes again, but it just looks like they are trying to recoup all that R&D in the first production run. :D
 

I just wonder how low the levels of stock will be.

TSMC shut down their 28nm fab for a little while with no reason given. A lot of websites were bemused. :p
 
chispy over at Guru3D, who is always very reliable and unbiased when it comes to things like this.

Yeah, the fan profile is set more for noise than cooling apparently, and for 1400MHz you would need 100% fan speed, but impressive nonetheless, especially when better coolers can be used, as the stock ones are usually pretty guff.
 
Keep in mind when looking at the benchmarks that BF3 tends to perform better on Nvidia by default, so it's not the entire story (unless of course BF3 is your main game, in which case it stacks up).

All the figures I've seen seem to be very close, especially with an overclocked 7970 (not seen anything other than auto overclock for the 680). As a rule though, it looks as if the 680 is slightly quicker at 1080p and the 7970 (perhaps because of the 3GB RAM, or it just scales better) slightly quicker at 1440p.

Either way as it stands I doubt anyone will notice the difference between them in the real world outside of a benchmark.
 
An overclocked 7970 will give better performance in BF3

max_oc_vs_7970.gif
 
What?! Surely smooth, consistent high fps is superior to saw-like highs and lows? Look at the averages as well; GK104 is significantly better. I think you missed the point that BOTH were max overclocked.

It really is amazing that the GTX 560 replacement can now equal or exceed the 6970 replacement, whilst utilising far fewer transistors and consuming 20% less power. The fact that it overclocks just as well may also be important for some.

Just a shame about the price. At ~$50 RRP less than the 7970, it is an absolute rip-off for what was supposed to be a mainstream card. The 7970 is even worse: a now power-hungry, underperforming top-end fail.
 
I really don't understand all this 28nm GFX card pricing; I was just looking at ATI's new midrange... shocking prices.

While undeniably complex, how do these new GFX cards get priced higher than a top-end Sandy Bridge CPU and motherboard combo?

The smaller your process, the better your profit margin, and this leads to more competitive prices. A strange thing happens with microelectronics when you own the fab -- you begin competing against yourself. When you have cutting-edge fabs like Intel does, it actually pays to drive innovation and keep shrinking the process, because it drives your costs down.

This doesn't translate as well for fabless semiconductor companies like AMD, ATI and NVIDIA, as the fab takes the extra profit from driving costs down, while the advantage they get is really only in the efficiency and power of their design -- i.e. being able to pack more transistors -- and thus staying competitive.

As counter-intuitive as it seems, it is actually better for Intel to keep improving its process at a rapid rate, drive down its production costs and keep selling faster and faster processors, even though it has no real competition (from AMD) in the high-end desktop market.

And with GPUs, NVIDIA and ATI spend a lot more on production than Intel does.

The second reason is economies of scale. Far more processors will sell than GPUs, because every computer needs a processor, but not every computer needs a discrete GPU. Especially not a high-end one.
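A back-of-the-envelope way to see that economies-of-scale point: spread a fixed R&D (NRE) budget over the expected sales volume and add the per-unit manufacturing cost. All the figures below are hypothetical, purely for illustration, not actual Intel, NVIDIA or AMD numbers:

```python
# Illustration only: the R&D (NRE) budgets, volumes and per-unit build costs
# below are hypothetical, not actual Intel/NVIDIA/AMD figures.

def unit_cost(nre: float, volume: int, marginal_cost: float) -> float:
    """Fixed development cost amortised over the production run,
    plus the cost of actually manufacturing each unit."""
    return nre / volume + marginal_cost

# Mainstream CPU: a huge R&D bill, but spread over an enormous volume.
print(unit_cost(nre=1_500_000_000, volume=100_000_000, marginal_cost=40))  # 55.0
# High-end discrete GPU: smaller R&D bill, but a far smaller market.
print(unit_cost(nre=400_000_000, volume=3_000_000, marginal_cost=120))     # ~253.3
```

The exact numbers don't matter, the shape does: the CPU's much bigger R&D bill nearly vanishes per unit because of the enormous volume, while the GPU's smaller bill still dominates its unit cost because far fewer of them are sold.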
 