
NVIDIA ‘Ampere’ 8nm Graphics Cards

Surely it depends on the person though, some having better eyesight than others. My missus can barely tell the difference between Blu-ray and 4K; I have to convince her it's better.
 
Almost any video card can do its basic stuff: render an image at a certain resolution and refresh rate, maybe encode and decode some stuff. Just like an iPhone 6 can do basic stuff. For more advanced stuff, you need more advanced hardware and software.

Until a computer can render an image indistinguishable from reality at whatever frame rate you want, there's always room for improvement. If you had a card today that could do 4K@120fps at full detail, it could only do that for current "limited" games. For the next ones it would already be "obsolete" for someone with high standards.
High-end cards can't even do that now, unless you cherry-pick current games which have pretty low requirements.

I won't be moving to 4K until an xx70 or xx80 class card can do 4K with relatively high lows in demanding/multiplayer games, or ones that, in your words, aren't "limited". Until then I am happy with 1440p to mostly keep a high frame rate (above 60fps).
 
Almost any video card can do its basic stuff: render an image at a certain resolution and refresh rate, maybe encode and decode some stuff. Just like an iPhone 6 can do basic stuff. For more advanced stuff, you need more advanced hardware and software.

Until a computer can render an image indistinguishable from reality at whatever frame rate you want, there's always room for improvement. If you had a card today that could do 4K@120fps at full detail, it could only do that for current "limited" games. For the next ones it would already be "obsolete" for someone with high standards.
I agree, however there are certain milestones along the way. Most people or families now own an HD TV, but the availability of content is still limited, so despite the rise in the number of 4K TVs and monitors there is an even more pronounced lack of content. To maximise profit, the thing to do is to increase GPU performance slowly, so your loyal fan base continue buying slightly faster GPUs for ever-increasing prices to just about keep pace with game developments. Instead of making a card that can run current games at 4K with 4-8x MSAA, all ultra, at 60+fps, you reset the situation by offering RT and you're back to 1080p at 60fps if you're lucky. They can probably string this out for 1000 years at the current rate of progress.
 
From those links it's mainly the "Clarkvision" one that talks about resolving power. Unfortunately there's some man maths going on with it :eek::D. We don't resolve that high at all; it's only over a very small angle of view that we have detailed vision, and it's still not that high. There are some very good medical and medical imaging books you can buy if you're interested in the physical limits of the eye.
 
Surely it depends on the person though, some having better eyesight than others. My missus can barely tell the difference between Blu-ray and 4K; I have to convince her it's better.
Yeah, because new media standards are of course designed with people like your girlfriend in mind.
 
8K is the "endgame" when it comes to TVs? LOL at statements like this... :D

The resolution of the human eye is 576MP, which is 32000x18000. That is the "endgame" with visuals, as then things will be indistinguishable from reality. That is where we are heading, or as close to it as possible. And yes, it will take many decades.
Haha. And you've got many people saying 1440p is the endgame and they can't see the difference between it and 4K. It's like, best head to Specsavers ASAP mate! :p:D

When mentioning 8K, people laugh like it would be a waste of time as there would be no difference. Of course there will be.
 
Haha. And you've got many people saying 1440p is the endgame and they can't see the difference between it and 4K. It's like, best head to Specsavers ASAP mate! :p:D

When mentioning 8K, people laugh like it would be a waste of time as there would be no difference. Of course there will be.
I have seen people say the same thing with literally every new generation since CRT. First it was LCDs will never catch on, then it was 720p is all you will ever need, then it was how 1080p is too much, then how 4K was pointless and a waste of good bandwidth... now 8K is apparently the 'endgame'. :D

I don't think people quite understand how technology progression works... we have a looong way to go before things reach any kind of endgame, if ever.
 
Having only recently moved to a 4K OLED, watching the Champions League football in 4K is already a vast improvement over 1080p.

Frankly, 8K broadcasts would be astonishing, but storage capacities have a long way to go too before that becomes standard for media. Not sure how the likes of Sky/Virgin will figure it out for broadcasts either, as the data bandwidth requirements must be mental.
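
For a rough sense of scale, here's a quick back-of-envelope in Python (the bit depth, chroma subsampling and compression ratio are just my own assumptions for illustration, not any broadcaster's actual spec):

[CODE=python]
# Rough 8K broadcast bandwidth estimate.
# Assumed (not official) figures: 7680x4320, 10-bit, 4:2:0 chroma, 60 fps.
width, height = 7680, 4320
bits_per_pixel = 10 * 1.5   # 10-bit samples, 4:2:0 = 1.5 samples per pixel
fps = 60

raw_bps = width * height * bits_per_pixel * fps
print(f"Raw video: {raw_bps / 1e9:.1f} Gbit/s")          # ~29.9 Gbit/s

# Even assuming a generous ~300:1 compression from a modern codec
# (purely illustrative), you are still looking at roughly 100 Mbit/s.
compressed_bps = raw_bps / 300
print(f"Compressed (assumed 300:1): {compressed_bps / 1e6:.0f} Mbit/s")
[/CODE]

So even heavily compressed, a single 8K channel would be an order of magnitude beyond current HD broadcast bitrates.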
 
Yeah, because new media standards are of course designed with people like your girlfriend in mind.
Well they need to be, because my girlfriend will make 50% of the decision-making process when it comes to buying a new TV. I had to take her along with me to Richer Sounds in order to hopefully make her see the benefit of buying an £800 TV, rather than the £300 one she would have been happy with.

Those that want the absolute cutting-edge bestest are a very niche market; the money is in the mainstream.
 
8K is the "endgame" when it comes to TVs? LOL at statements like this... :D

The resolution of the human eye is 576MP, which is 32000x18000. That is the "endgame" with visuals, as then things will be indistinguishable from reality. That is where we are heading, or as close to it as possible. And yes, it will take many decades.

I find the above resolution claim very hard to believe.

If, for example, you had a 30" monitor that supported 32000x18000 and used it to look at two otherwise identical images where a few pixels had been altered, it could take you days or weeks to find the difference, or you might never find it.

Eyesight also does not work quite like a camera either, and you don't look at an entire scene; you look at points within the scene.
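
To put that 30" thought experiment into numbers, here's a rough sketch (the 60 cm viewing distance and the ~1 arc-minute acuity figure are my own assumed ballparks):

[CODE=python]
import math

# Hypothetical 30-inch 16:9 panel at 32000x18000, viewed from 60 cm.
diag_in = 30
h_res, v_res = 32000, 18000

width_in = diag_in * 16 / math.hypot(16, 9)        # ~26.1 inches wide
pixel_pitch_m = (width_in * 0.0254) / h_res        # ~21 micrometres per pixel

viewing_distance_m = 0.60
pixel_angle_arcmin = math.degrees(math.atan(pixel_pitch_m / viewing_distance_m)) * 60
print(f"Pixel pitch: {pixel_pitch_m * 1e6:.0f} um")
print(f"Angular size per pixel: {pixel_angle_arcmin:.2f} arcmin")   # ~0.12 arcmin

# Against a commonly quoted ~1 arcmin acuity, each pixel at that density
# and distance is several times smaller than the eye can resolve, so a
# handful of altered pixels would be essentially invisible.
[/CODE]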
 
As mentioned above, it's the content that will drive it. Far Cry when it released was amazing; roll forward to now and games are taking years to produce, some cost 200 million plus to make, and innovation is pretty much zero, with reskin after reskin released.

I personally don't think visuals on an OLED/VA/IPS screen are where it's going; increased DPI in headsets, better design of them, etc. are what will drive the graphics card market ultimately.
 
I find the above resolution claim very hard to believe.

If, for example, you had a 30" monitor that supported 32000x18000 and used it to look at two otherwise identical images where a few pixels had been altered, it could take you days or weeks to find the difference, or you might never find it.

Eyesight also does not work quite like a camera either, and you don't look at an entire scene; you look at points within the scene.

For reference, 576MP is the resolution of the human eye focusing on an object; 7MP is the resolution at the periphery of the eyesight. Most of the eyesight is blurry, low resolution; it's only the centre, where the eye is focused, that is clear and crisp.
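
For what it's worth, that 576MP number looks like the sort of back-of-envelope you get from the Clarkvision page mentioned earlier: take roughly 0.3 arc-minute acuity and spread it over something like a 120x120 degree field. Run the same sum over the ~2 degrees of sharp foveal vision and the figure collapses. All of these angles are assumptions for illustration, not measured values:

[CODE=python]
# Back-of-envelope for the "576 MP" figure vs the sharp foveal region.
# Assumed (illustrative) numbers: 0.3 arcmin acuity, 120x120 deg total
# field of view, ~2 deg of truly sharp foveal vision.
acuity_arcmin = 0.3

def megapixels(fov_deg):
    samples_per_side = fov_deg * 60 / acuity_arcmin   # degrees -> arcmin -> samples
    return samples_per_side ** 2 / 1e6

print(f"120x120 deg field : {megapixels(120):.0f} MP")   # ~576 MP
print(f"2x2 deg fovea     : {megapixels(2):.2f} MP")     # ~0.16 MP

# The big number only holds if you treat the eye as scanning the whole
# scene at foveal sharpness; any single "snapshot" is far lower resolution.
[/CODE]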
 
For reference, 576MP is the resolution of the human eye focusing on an object; 7MP is the resolution at the periphery of the eyesight. Most of the eyesight is blurry, low resolution; it's only the centre, where the eye is focused, that is clear and crisp.

Even when focusing, I find it hard to believe the 576MP figure.

I use equipment at work which checks stuff far better than the human eye can and it most definitely does not run at anything near the figure quoted above.
 
Well they need to be, because my girlfriend will make 50% of the decision-making process when it comes to buying a new TV.
That's your problem, not the technology manufacturers'. If you see a clear benefit to something like this then you should be able to convince your other half, and she should be able to trust your judgement in areas you are more capable than her at, just as I am sure you do in areas she is more capable at.

Even when focusing, I find it hard to believe the 576MP figure.

I use equipment at work which checks stuff far better than the human eye can and it most definitely does not run at anything near the figure quoted above.

Ahhh... you have "equipment at work that checks stuff far better than the human eye can"? Thanks for that very detailed and scientific information. :D
 
I agree, however there are certain milestones along the way. Most people or families now own an HD TV, but the availability of content is still limited, so despite the rise in the number of 4K TVs and monitors there is an even more pronounced lack of content. To maximise profit, the thing to do is to increase GPU performance slowly, so your loyal fan base continue buying slightly faster GPUs for ever-increasing prices to just about keep pace with game developments. Instead of making a card that can run current games at 4K with 4-8x MSAA, all ultra, at 60+fps, you reset the situation by offering RT and you're back to 1080p at 60fps if you're lucky. They can probably string this out for 1000 years at the current rate of progress.

Hardware is not everything; you still need to make software that works well based on the HW you have. Right now the weakest link is the console; not much thought is put into adding stuff on top. If gamers stop buying consoles and start buying more high-end PC stuff and more games on the PC, maybe the mentality will change. And that's why having stuff at high prices is not a good thing, no matter how much some people try to justify high prices on certain products.

You can make games run at 4K@120Hz with 8x MSAA, but it won't be the next Crysis. Most likely Half-Life 2, for instance, you can run like that. Who knows, maybe even at 200fps+.

PS: Ray/path tracing, although expensive, is miles better than the usual raster stuff. The problem is that so far (1) the hardware is not quite there yet and (2) developers don't adopt it all hands on deck.
 
Yep, technology will improve, as is the nature of such things, but we're so far off it's hardly worth bickering about.

We're still waiting for proper mainstream 4K gaming, let alone million-megapixel displays.
 
If you read my post, I specifically mention that it's shared memory, and that the OS has to run off that RAM - so no clue why you're repeating my words?

I did... you stated that next-gen consoles have 16GB of memory and are so well optimised that we'd be looking at what, 14-15GB available for VRAM?

I was simply pointing out that isn't the case, at least not for the Xbox. Hence why the XSX has essentially 10GB for VRAM and 6GB for the OS. That's only 1-2GB more than what the Xbox One X has available for VRAM, which makes sense, as with the new features I'd be expecting higher RAM usage.

I was mainly implying that VRAM on cards will increase, sure, but I don't think it will be anything overly meaningful, as current and even the next lineup of games don't use all that much anyway.
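
Taking the figures quoted in this thread at face value (treated here as assumptions rather than confirmed specs), the sums look like this:

[CODE=python]
# Rough memory-budget comparison using the figures quoted in this thread
# (assumptions for illustration, not confirmed console specifications).
xsx_total_gb = 16
xsx_game_gb = 10                     # portion assumed usable as "VRAM" for games
xsx_system_gb = xsx_total_gb - xsx_game_gb

one_x_game_gb = 9                    # illustrative figure for the previous gen

print(f"Assumed XSX split : {xsx_game_gb} GB games / {xsx_system_gb} GB OS & system")
print(f"Gen-on-gen change : ~{xsx_game_gb - one_x_game_gb} GB more for games")
[/CODE]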
 