
NVIDIA ‘Ampere’ 8nm Graphics Cards

Only if you assume they don't plan on a 3080Ti SKU.

The problem is that the gap between the RTX 3080 and the RTX 3080 Ti is actually quite large. The rated board powers are nearly the same, yet there is a 17% reduction in shaders and only about 40% of the VRAM, clocked lower. Unless real-world measurements show otherwise, the RTX 3080 looks like it might be bottom-of-the-barrel leaky silicon, pushed somewhat past its performance/watt sweet spot.
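To put numbers on that gap, here is a quick sanity check of the ratios. The shader counts (10496 vs 8704), VRAM sizes (24 GB vs 10 GB) and board powers (350 W vs 320 W) are the rumored pre-launch figures, not confirmed specs:

```python
# Rumored (unconfirmed) specs for the top SKU vs the RTX 3080.
top_shaders, cut_shaders = 10496, 8704
top_vram_gb, cut_vram_gb = 24, 10
top_power_w, cut_power_w = 350, 320

# Ratios quoted in the post above.
shader_reduction = 1 - cut_shaders / top_shaders   # ~17% fewer shaders
vram_fraction = cut_vram_gb / top_vram_gb          # ~42% of the VRAM
power_ratio = cut_power_w / top_power_w            # ~91% of the board power

print(f"shader reduction: {shader_reduction:.0%}")
print(f"VRAM fraction:    {vram_fraction:.0%}")
print(f"board power:      {power_ratio:.0%} of the top SKU")
```

So the cut-down part keeps over 90% of the board power despite losing 17% of the shaders, which is what makes the "leaky silicon pushed past its sweet spot" reading plausible.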

If there is an RTX 3080 Ti, that only reinforces that the RTX 3080 silicon isn't the best! :p

So it will be interesting to see what reviews actually show. Is the RTX 3080 board power rated too high, or is the RTX 3090 board power optimistic?

I actually don't have much faith in Samsung, as their 14nm node was essentially what GF used to make Vega, and it was a disaster both times. At least Nvidia's uarch design is better!

Would be funny if it turns out to be TSMC 7nm; that would be a proper jebait by Nvidia! :p

Still will probably beat AMD though!
 

The price/performance rumors don't have me in a rush to buy now. I will probably wait until AMD bring their offerings to the table if the current rumors hold true.
 

Nvidia will probably still beat AMD anyway through sheer grunt. Turing might have had its problems, but it was still a performance/watt improvement on the same node. But you never know; maybe the top die is TSMC? Either way, it makes sense to see what both have to offer before making a choice. A GTX 1080 Ti is still plenty powerful today.
 

 
What is the public buying when we get the RTX 3080 or 3070 GPUs? Broken silicon: RTX 3090s that did not make the grade. Yet as consumers we are expected to pay through the nose for GPUs that five years ago cost half the price.

NVIDIA needs some competition; the cost increases are not justified by the supposed increase in performance. AMD needs to step up and use some of its Ryzen revenue to invest in R&D.
 
As much as we have a good old moan about price and call out Nvidia's marketing BS, etc.,

I'm still having a real problem with this 3080 10GB thing. If that's true, it really is a deliberate and intentional "we don't want to give you more; it might cost $10 more to make, but you ain't having it from us at any price".

...that's pretty outrageous if true. I guess we find out in a few days, but honestly you can't spin it any other way. Seriously, why rip the consumer off again?
 

I don't need the fastest available to mankind. I'm looking for a decent upgrade over my 1080 Ti without feeling like I'm getting ripped off. The 3080 rumors don't sound like terrible price/performance, but they don't scream "Must buy now!" to me either.

If Turing were the only generation of GPUs I ever knew, the 3080 rumors would look like a bargain. But my point of reference is pretty much *everything other than* Turing.
 
Yeah, it's downright stingy.

And hopefully it will put people off buying.

8GB cards have been around since 2015 or earlier, so 8GB/10GB on the 3070/3080 is almost insulting.
 

I would just wait and see what is available and wait for a good deal!
 
More leaks:
https://wccftech.com/zotac-geforce-...orce-rtx-3070-custom-graphics-cards-pictured/
https://videocardz.com/newz/zotac-geforce-rtx-3090-trinity-pictured

Pictures of the GA102 in the RTX3080:
https://videocardz.com/newz/nvidia-ampere-ga102-rtx-3090-3080-gpu-pictured

Even the RTX 3080 uses three 8-pin PCIe power connectors!!

Edit!!

[Image: Zotac FireStorm overclocking utility screenshot]


The clockspeed slider goes very high!

Second Edit!!

Apparently there are 20GB RTX3080 models too.

If that is the 3080/3090 die on 7nm, it's massive to be honest; no wonder it requires 350 W XD

Just scaling off the GDDR6X chips puts it at roughly 700 mm² (24 mm x 28 mm). For comparison, the 5700 XT is only about 250 mm² on 7nm.
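The estimate above can be written out as a quick calculation. The 24 mm x 28 mm figure is the poster's eyeball measurement from the PCB photo (using the GDDR6X packages as a ruler), not a measured value; Navi 10 (RX 5700 XT) at roughly 251 mm² on TSMC 7nm is the comparison point:

```python
# Eyeball estimate from the PCB photo: GPU package scaled against the
# GDDR6X chips comes out at roughly 24 mm x 28 mm (assumed, not measured).
die_w_mm, die_h_mm = 24, 28
est_area_mm2 = die_w_mm * die_h_mm      # 672 mm^2, "roughly 700"

# Navi 10 (RX 5700 XT) on TSMC 7nm for comparison.
navi10_area_mm2 = 251

print(f"estimated area: ~{est_area_mm2} mm^2")
print(f"vs Navi 10:      {navi10_area_mm2} mm^2 "
      f"({est_area_mm2 / navi10_area_mm2:.1f}x larger)")
```

A caveat on the method: what is visible in a PCB photo is the package, not the bare die, so this kind of scaling only gives an upper bound on die size.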

 