NVIDIA ‘Ampere’ 8nm Graphics Cards

As consoles are so well optimised, the system/OS doesn't use much memory, and the majority can be dedicated to the GPU as VRAM. The next gen of consoles has 16GB of shared memory...

Expect AAA games 'ported' from the PS5/next-gen Xbox to effectively double in VRAM requirements versus what we see now. This is planned obsolescence, as current 8GB VRAM cards will not be able to run at graphics settings equivalent to the new consoles due to this VRAM limitation.

Nvidia will make a killing.

They also have to run the entire system with that same 16GB.

The Xbox Series X, for example, has 10GB of fast and 6GB of slower memory. While it all could be used, you would see a fairly big difference in performance, with memory speed dropping 40-50% mid-game.

The PS5 gives games access to nearly all of its memory, I think around 13-14GB, but is obviously not as quick in the specs department, with 2070-ish levels of performance most likely.

I'd imagine the next line of GPUs will be packing 12-16GB. 8GB is starting to look lacking, especially if a next-gen card, say a 3060/3070, matches a 2080 Ti; I don't think 8GB will be enough.
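For anyone curious where that 40-50% figure comes from, here's a rough back-of-envelope sketch. The bandwidth figures (560GB/s for the fast 10GB pool, 336GB/s for the slower 6GB) are the widely reported Series X specs rather than anything stated in this thread, so treat them as assumptions:

```python
# Back-of-envelope look at the Series X split memory pool.
# Assumed figures (not from this thread): 10GB "GPU-optimised" memory at
# 560 GB/s and 6GB standard memory at 336 GB/s, 16GB total.
fast_pool_gb, fast_bw = 10, 560   # GB, GB/s
slow_pool_gb, slow_bw = 6, 336    # GB, GB/s

drop = 1 - slow_bw / fast_bw
print(f"Bandwidth drop when spilling into the slow pool: {drop:.0%}")   # 40%

# Naive blended figure if data were striped evenly across the whole 16GB --
# purely illustrative, real allocation is smarter than this.
blended = (fast_pool_gb * fast_bw + slow_pool_gb * slow_bw) / (fast_pool_gb + slow_pool_gb)
print(f"Blended bandwidth across all 16GB: {blended:.0f} GB/s")          # ~476 GB/s
```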
 
Hard to see 8K being a thing for a long time. Even for movies, a lot of 4K Blu-ray releases, including new films, are often upscaled rather than native 4K.

4K is the endgame when it comes to TVs, much as the manufacturers would like to convince us that we all need 8K now.

Previous increases in resolution have been accompanied by increases in screen size, which have made effective use of the higher resolution. Even with new technology, there's a limit to how large a TV people actually want dominating their living rooms. Even at 85", which is probably the absolute limit for 99.9% of people, you'd need to be sitting about 5ft away from it to see the difference between 4K and 8K.
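If anyone wants to sanity-check that 5ft figure, here's a quick sketch using the usual rule of thumb that 20/20 vision resolves roughly one arcminute per pixel; the function and its assumptions (16:9 panel, 1 arcminute acuity) are mine, not from any standard:

```python
import math

def max_useful_distance_ft(diagonal_in, horizontal_px, aspect=16 / 9):
    """Distance beyond which adjacent pixels can no longer be told apart,
    assuming ~1 arcminute of angular resolution (roughly 20/20 vision)."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)   # panel width
    pixel_pitch_in = width_in / horizontal_px                 # one pixel
    one_arcmin_rad = math.radians(1 / 60)
    return pixel_pitch_in / math.tan(one_arcmin_rad) / 12     # inches -> feet

# Beyond ~5.5ft an 85" 4K panel's pixels can't be individually resolved,
# so 8K only buys you anything if you sit closer than that.
print(f'85" 4K: ~{max_useful_distance_ft(85, 3840):.1f} ft')   # ~5.5 ft
print(f'85" 8K: ~{max_useful_distance_ft(85, 7680):.1f} ft')   # ~2.8 ft
```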
 
Too good to be true?

24GB? Nonsense. There's literally no need for it, as hardly anything uses 11/12GB at present, and GDDR6 isn't cheap.
 
They also have to run the entire system with that same 16GB.

The Xbox Series X, for example, has 10GB of fast and 6GB of slower memory. While it all could be used, you would see a fairly big difference in performance, with memory speed dropping 40-50% mid-game.

The PS5 gives games access to nearly all of its memory, I think around 13-14GB, but is obviously not as quick in the specs department, with 2070-ish levels of performance most likely.

I'd imagine the next line of GPUs will be packing 12-16GB. 8GB is starting to look lacking, especially if a next-gen card, say a 3060/3070, matches a 2080 Ti; I don't think 8GB will be enough.

If you read my post, I specifically mention that it's shared memory and that the OS has to run off that RAM, so no clue why you're repeating my words? My point was that the consoles are getting DOUBLE the memory, i.e. a massive VRAM upgrade. The OS of the next-gen consoles will not consume double the memory of the current gen, after all.

The current Titan RTX has 24GB
The current 2080 Ti has 11GB
The current Radeon VII has 16GB

I'd expect another 4GB of VRAM at every price point, minimum. A 12GB 3070 and 3080, along with a 16GB 3080 Ti, would be my bet, based on common sense. The 4000 series in 2 years is likely to have 24GB on the 4080 Ti, IMO :)
 
I think they're deliberately stalling on performance because, once most cards can do 4K easily with decent AA etc., the incentive to upgrade disappears. I have an iPhone 6 and can see no reason whatsoever to upgrade.

Agreed, and I have no issue with Nvidia steadily increasing prices, but it should follow performance.

Intel seemed to be doing that around Haswell time, but it's clear now that they were struggling.

Game development will continue to force hardware to keep up, as long as the hardware keeps PC gamers interested and involved.
 
That's around the price point I'm interested in, regardless of the nomenclature. If it's anything more expensive than that, then the only dilemma I'm going to have is whether to buy Xbox or PS5. I'm actually really looking forward to seeing the performance figures for the consoles.

These are the posts I like to read.
 
If you read my post, I specifically mention that it's shared memory and that the OS has to run off that RAM, so no clue why you're repeating my words? My point was that the consoles are getting DOUBLE the memory, i.e. a massive VRAM upgrade. The OS of the next-gen consoles will not consume double the memory of the current gen, after all.

The current Titan RTX has 24GB
The current 2080 Ti has 11GB
The current Radeon VII has 16GB

I'd expect another 4GB of VRAM at every price point, minimum. A 12GB 3070 and 3080, along with a 16GB 3080 Ti, would be my bet, based on common sense. The 4000 series in 2 years is likely to have 24GB on the 4080 Ti, IMO :)

True, console OSes are very light compared to Windows - the console only needs about 2GB of memory for the OS, and that is enough even though these new consoles feature real-time video capture as well as 5 VMs for instant gaming.
 
It's also power hungry; I was surprised when I found out how much fast GDDR6 RAM and a memory controller add to total GPU TDP.

Not that much, contrary to what most people have read.

Compare the TDP of the 2080 Ti (11GB) to the Titan RTX (24GB): there is not that much difference.

https://www.techpowerup.com/gpu-specs/geforce-rtx-2080-ti.c3305

https://www.techpowerup.com/gpu-specs/titan-rtx.c3311

There is a 30W increase, but part of this is also down to the higher stock GPU boost clock speed on the Titan RTX.

Or, putting it another way, the extra 13GB of VRAM on the Titan uses less than 30W of extra power.
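A quick bit of arithmetic on those TechPowerUp figures (250W vs 280W TDP). Attributing the whole gap to the memory overstates it, since part of the 30W is the Titan's higher boost clock, so treat the result as an upper bound:

```python
# 2080 Ti: 11GB GDDR6, 250W TDP; Titan RTX: 24GB GDDR6, 280W TDP.
# Part of the 30W gap is the Titan's higher boost clock, so this is an
# upper bound on what the extra memory actually costs.
extra_vram_gb = 24 - 11    # 13GB more VRAM
extra_tdp_w = 280 - 250    # 30W more board power
print(f"<= {extra_tdp_w / extra_vram_gb:.1f} W per extra GB of GDDR6")   # ~2.3 W/GB
```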
 
4K is the endgame when it comes to TVs, much as the manufacturers would like to convince us that we all need 8K now.
8k is the "endgame" when it comes to TV's? LOL at statements like this... :D

The resolution of the human eye is 576MP which is 32000x18000. That is the "endgame" with visuals, as then things will be indistinguishable from reality. That is where we are heading, or as close to it as possible. And yes, it will take many decades.
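To put that number next to today's resolutions, here's a quick sketch (the 576MP / 32000x18000 figure is the claim above rather than an established fact):

```python
# How far current resolutions sit from the claimed 576MP "endgame".
resolutions = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
    "576MP": (32000, 18000),   # the figure claimed above
}
target_mp = 32000 * 18000 / 1e6   # 576 megapixels
for name, (w, h) in resolutions.items():
    mp = w * h / 1e6
    print(f"{name:>6}: {mp:7.1f} MP  ({target_mp / mp:5.1f}x short of 576MP)")
```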
 
8k is the "endgame" when it comes to TV's? LOL at statements like this... :D

The resolution of the human eye is 576MP which is 32000x18000. That is the "endgame" with visuals, as then things will be indistinguishable from reality. That is where we are heading, or as close to it as possible. And yes, it will take many decades.

Got some source for that? I'm rather curious about the claim :)
 
Agreed, and I have no issue with Nvidia steadily increasing prices, but it should follow performance.

Intel seemed to be doing that around Haswell time, but it's clear now that they were struggling.

Game development will continue to force hardware to keep up, as long as the hardware keeps PC gamers interested and involved.
+1
 
Not sure if we'll ever get to 1:1 real-life image resolution - 576MP is nothing to sniff at. The idea that a graphics card could push out that many pixels, or that we could fit that much data on a disc for video playback, is nuts, but yeah, anything is possible with time.

E.g. extrapolating out from a 4K Blu-ray, a 576MP movie would be pushing out about 14400Mb/s. Can't wait till we've got 10TB Blu-ray discs to store all that data. The most difficult part will always be the screen, though - trying to find a manufacturing process that can yield a screen with 18 times more pixels than an 8K TV.
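For the curious, here's roughly how that extrapolation pans out if you just scale bitrate linearly with pixel count. The ~100Mb/s base for a 4K UHD Blu-ray and the linear scaling are both assumptions, and depending on what base you pick you land anywhere between this result and the ~14400Mb/s above, so take it as order-of-magnitude only:

```python
# Naive scaling: bitrate grows linearly with pixel count, starting from an
# assumed ~100 Mb/s 4K UHD Blu-ray stream (both assumptions, not thread facts).
uhd_pixels = 3840 * 2160        # ~8.3 MP
target_pixels = 32000 * 18000   # 576 MP
base_bitrate_mbps = 100

bitrate_mbps = base_bitrate_mbps * target_pixels / uhd_pixels
film_tb = bitrate_mbps / 8 * 2 * 3600 / 1e6   # 2-hour film, Mb/s -> decimal TB
print(f"~{bitrate_mbps:,.0f} Mb/s, ~{film_tb:.1f} TB for a 2-hour film")
```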
 
Got some source for that? I'm rather curious about the claim :)
Here is a summary of all of the sources: https://www.dpreview.com/forums/thread/2791860

Of course it's not as simple as "576MP is what we need, end of story", but it shows there is a long way to go before media resolution actually achieves any kind of endgame... we are still in the relative infancy of what media realism and resolutions will be like.

Not sure if we'll ever get to 1:1 real-life image resolution - 576MP is nothing to sniff at. The idea that a graphics card could push out that many pixels, or that we could fit that much data on a disc for video playback, is nuts, but yeah, anything is possible with time.

E.g. extrapolating out from a 4K Blu-ray, a 576MP movie would be pushing out about 14400Mb/s.

If you think only within the technical limitations of the time then yes, of course it seems 'nuts'. But then again, so did something with the power of an Intel Core 2 Q6600 50 years earlier.

Technology (processing power, data transfer methods) will progress rapidly over the coming decades.
 
Got some source for that? I'm rather curious about the claim :)

From what I remember it is a little more complicated than that, in that the image the average person sees is around an 8000x6000-pixel equivalent, but the eye can resolve around 4x that detail in a focused area.
 
Here is a summary of all of the sources: https://www.dpreview.com/forums/thread/2791860

Of course it's not as simple as "576MP is what we need, end of story", but it shows there is a long way to go before media resolution actually achieves any kind of endgame... we are still in the relative infancy of what media realism and resolutions will be like.

If you think only within the technical limitations of the time then yes, of course it seems 'nuts'. But then again, so did something with the power of an Intel Core 2 Q6600 50 years earlier.

Technology (processing power, data transfer methods) will progress rapidly over the coming decades.

Cheers for sharing, I'll have a look.
 
I think they're deliberately stalling on performance because, once most cards can do 4K easily with decent AA etc., the incentive to upgrade disappears. I have an iPhone 6 and can see no reason whatsoever to upgrade.

Almost any video card can do its basic stuff: render an image at a certain resolution and refresh rate, maybe encode and decode some video. Just like the iPhone 6 can do basic stuff. For more advanced stuff, you need more advanced hardware and software.

Until a computer can render an image indistinguishable from reality at whatever frame rate you want, there's always room for improvement. If you had a card that could do 4K@120fps at full details today, it could only do that for current "limited" games. For the next ones it would already be "obsolete" for someone with high standards.
 
From what I remember it is a little more complicated than that, in that the image the average person sees is around an 8000x6000-pixel equivalent, but the eye can resolve around 4x that detail in a focused area.

In future, a lot of content will likely be projected onto or very close to our eyes (retinas), so the DPI will need to be pretty insane. Traditional TVs will likely be very, very old-skool by 2050.

Almost any video card can do its basic stuff: render an image at a certain resolution and refresh rate, maybe encode and decode some video. Just like the iPhone 6 can do basic stuff. For more advanced stuff, you need more advanced hardware and software.

Until a computer can render an image indistinguishable from reality at whatever frame rate you want, there's always room for improvement. If you had a card that could do 4K@120fps at full details today, it could only do that for current "limited" games. For the next ones it would already be "obsolete" for someone with high standards.

Yup, this. That people don't seem to understand this basic concept always surprises me a little, because it's really not rocket science.
 