
NVIDIA ‘Ampere’ 8nm Graphics Cards

If we see a 3080 Ti but no Titan then I'll get the Ti, but I'm tempted to hold off for the Titan variant.

I doubt we'll see a release this early if they don't plan to ship until late August/September.
 
Have you seen the latest tests? Apparently the LG OLEDs are not capable of the full speed and are not fully HDMI 2.1 as advertised.

Video here from HDTVTest.
https://youtu.be/GFJmjKJGx5o

I'm not sure how this lower bandwidth would affect gaming with the new 3080 Ti?


The panels are 4K, 120Hz, HDR and 10-bit.

40 Gbps of bandwidth covers 4K, 120Hz, HDR and 10-bit, so as far as I'm aware it makes no difference whatsoever.

It basically makes no meaningful difference.

Hopefully Nvidia will cover this with the 3xxx series and output 10-bit rather than only 8-bit or 12-bit...
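The "40 Gbps is enough for 4K 120Hz 10-bit" claim is easy to sanity-check with back-of-the-envelope maths. The sketch below is my own illustration, not from the thread: it assumes the standard CTA-861 4K120 timing of 4400×2250 pixels including blanking and an RGB/4:4:4 signal (3 components per pixel), and it ignores FRL line-coding overhead, so real link requirements are slightly higher.

```python
# Rough HDMI bandwidth estimate for 4K120 at various bit depths.
# Assumes the CTA-861 total timing for 4K120 (4400x2250 including
# blanking) and RGB/4:4:4 (3 components per pixel). FRL line-coding
# overhead is ignored, so these figures are lower bounds.

H_TOTAL, V_TOTAL, FPS = 4400, 2250, 120

def gbps(bits_per_component: int) -> float:
    """Raw video data rate in Gbit/s for RGB 4:4:4 at the given bit depth."""
    return H_TOTAL * V_TOTAL * FPS * bits_per_component * 3 / 1e9

for bpc in (8, 10, 12):
    rate = gbps(bpc)
    print(f"{bpc}-bit 4K120 RGB: {rate:.1f} Gbps "
          f"(fits 40 Gbps link: {rate <= 40}, fits 48 Gbps: {rate <= 48})")
```

Under these assumptions a 10-bit signal needs roughly 35.6 Gbps, comfortably inside a 40 Gbps port, while a 12-bit signal needs about 42.8 Gbps and therefore only fits the full 48 Gbps spec, which matches what the posts above are arguing.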
 
The panels are 4K, 120Hz, HDR and 10-bit.

40 Gbps of bandwidth covers 4K, 120Hz, HDR and 10-bit, so as far as I'm aware it makes no difference whatsoever.

It basically makes no meaningful difference.

Oh OK, cool, that's good to know. Not sure why people are even talking about it then.
 
The standard for HDMI 2.1 is 48 Gbps.

As far as I know, the HDMI Forum actually hasn't set a bandwidth figure that allows or disallows HDMI 2.1 certification. It's quite a messy situation: HDMI 2.1 has a myriad of features, but for HDMI 2.1 certification you don't need to support them all.
 
Have you seen the latest tests? Apparently the LG OLEDs are not capable of the full speed and are not fully HDMI 2.1 as advertised.

Video here from HDTVTest.
https://youtu.be/GFJmjKJGx5o

I'm not sure how this lower bandwidth would affect gaming with the new 3080 Ti?

Huh? I thought only the new CX had this issue, and even then it doesn't matter, as the panels are 10-bit and only a 12-bit signal would fully saturate the link. Basically a non-issue. Besides, my B9 doesn't suffer from that issue; it has the full 48 Gbps anyway.
 
Oh OK, cool, that's good to know. Not sure why people are even talking about it then.


I think the issue is:
1. LG held back this information.
2. Nvidia GPUs normally operate with 8-bit or 12-bit image passthrough; they leave 10-bit to the Quadro GPUs. If that's the case for the 3xxx series, then we're actually in trouble with the CX range, because it won't be able to accept a 12-bit HDR 4K 120fps signal and convert it to 10-bit.

I've been crossing my fingers for a 3.5k or 4k 77'' LG OLED C9, but sadly it hasn't come up.
 
Perceived value.



Not in raw numbers, no, but consoles have lower overhead than PCs, which significantly boosts their comparative performance.

I haven't seen this performance boost though. Assassin's Creed Valhalla runs at 4K 30fps on the Series X, which doesn't seem up to the level of a 2080 Ti. The only game claiming 120fps is Dirt 5, a small-track racing game, and those are known for modest performance requirements. Take Forza 7 as an example: it only uses about 70% of the GPU on the One X to do 4K 60fps, and that's just an RX 580. The rest of the Series X games shown so far are all 4K 60fps with mediocre graphics.
 
The panels are 4K, 120Hz, HDR and 10-bit.

40 Gbps of bandwidth covers 4K, 120Hz, HDR and 10-bit, so as far as I'm aware it makes no difference whatsoever.

It basically makes no meaningful difference.

Hopefully Nvidia will cover this with the 3xxx series and output 10-bit rather than only 8-bit or 12-bit...

HDR still looks awesome right now on Nvidia GPUs. But are you saying that even in HDR it's always 8-bit? Meh, if that's the case it still looks awesome.
 
I'm probably going to embarrass myself here, but I'm going to make a bold prediction: if the 3080 Ti is released next week, it will be for around £1,000. And there will be plenty of people who will buy it at that price if it reviews well. But after the release of the consoles it will be reduced to about the same price as the consoles.
If they do then I'll be biting. I've been wanting to get rid of my 1080s in SLI for some time now, and the 2000 series was a massive disappointment.
 
Don't get your hopes up, guys. 99% chance this will just be data-centre Ampere and nothing more.

Exactly this. It will be a stream about how great the compute power is, how much extra ray-tracing performance there is and how good the AI stuff is. Nothing more; there will be no mention of consumer graphics cards, they will come later.
The only thing we will get from this upcoming GTC stream is a look at just how good the Ampere architecture is, and it might give us a rough idea of what the Titan card might be like.
 
Exactly this. It will be a stream about how great the compute power is, how much extra ray-tracing performance there is and how good the AI stuff is. Nothing more; there will be no mention of consumer graphics cards, they will come later.
The only thing we will get from this upcoming GTC stream is a look at just how good the Ampere architecture is, and it might give us a rough idea of what the Titan card might be like.

It's still worth watching so we can:

* see how much IPC gain Ampere has
* see how many cores Nvidia has managed to fit on 7nm EUV
* see what clock speeds Nvidia has managed

These details will give us a rough idea of the gaming performance improvement we can expect from Ampere.

What I'm looking for is how much IPC improvement Nvidia has generated, whether they've managed to increase clock speeds, and lastly how many extra cores they can fit in the same die sizes. These are the main factors that would affect the performance of the Ampere gaming cards.
 
I haven't seen this performance boost though. Assassin's Creed Valhalla runs at 4K 30fps on the Series X, which doesn't seem up to the level of a 2080 Ti. The only game claiming 120fps is Dirt 5, a small-track racing game, and those are known for modest performance requirements. Take Forza 7 as an example: it only uses about 70% of the GPU on the One X to do 4K 60fps, and that's just an RX 580. The rest of the Series X games shown so far are all 4K 60fps with mediocre graphics.

Do you know for sure AC Valhalla will do 4K@60fps "full options" on a 2080 Ti?
 
I'm probably going to embarrass myself here, but I'm going to make a bold prediction: if the 3080 Ti is released next week, it will be for around £1,000. And there will be plenty of people who will buy it at that price if it reviews well. But after the release of the consoles it will be reduced to about the same price as the consoles.
No embarrassment, but no chance either I'm afraid; it's just not how Nvidia releases gaming GPUs.
 