Almost any video card can do its basic stuff: render an image at a certain resolution and refresh rate, maybe encode and decode some video. Just like an iPhone 6 can do basic stuff. For more advanced stuff, you need more advanced hardware and software.

Until a computer can render an image indistinguishable from reality at whatever frame rate you want, there's always room for improvement. If you had a card today that could do 4K@120fps at full detail, it could only do that for current "limited" games. For the next ones, it would already be "obsolete" for someone with high standards.

High-end cards can't even do that now, unless you cherry-pick current games, which have pretty low requirements.
I agree, however there are certain milestones along the way. Most people or families now own an HD TV, but the availability of HD content is still limited; despite the rise in the number of 4K TVs and monitors, the lack of 4K content is even more pronounced. To maximise profit, the thing to do is to increase GPU performance slowly and incrementally, so your loyal fan base keeps buying slightly faster GPUs at ever-increasing prices, just about keeping pace with game development. Instead of making a card that can run current games at 4K with 4-8x MSAA, everything on ultra, at 60+fps, you reset the situation by offering RT, and you're back to 1080p at 60fps if you're lucky. They can probably string this out for 1000 years at the current rate of progress.
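Just to put rough numbers on that "reset", here's a quick back-of-the-envelope sketch. The RT cost multiplier is purely an assumption for illustration, not a measured figure:

```python
# Pixel counts explain most of the "reset": 4K is 4x the pixels of 1080p.
px_1080p = 1920 * 1080   # 2,073,600 pixels
px_4k    = 3840 * 2160   # 8,294,400 pixels
print(px_4k / px_1080p)  # 4.0

# Toy model: assume (hypothetically) ray tracing triples per-pixel cost.
# A card that just managed 4K at 60 fps then has budget for roughly:
rt_cost = 3                                  # assumed multiplier
print(60 / rt_cost)                          # ~20 fps at 4K with RT
print(60 / rt_cost * (px_4k / px_1080p))     # ~80 fps at 1080p with RT
# i.e. back to "1080p at 60fps if you're lucky" territory.
```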
Surely it depends on the person, though, some having better eyesight than others. My missus can barely tell the difference between Blu-ray and 4K; I have to convince her it's better.

Yeah, because new media standards are of course designed with people like your girlfriend in mind.
Haha. And you've got many people saying 1440p is the endgame and that they can't see the difference between it and 4K. It's like, best head to Specsavers ASAP, mate!
I have seen people say the same thing with literally every new generation since CRT. First it was "LCDs will never catch on", then it was "720p is all you will ever need", then it was how 1080p is too much, then how 4K was pointless and a waste of good bandwidth... now 8K is apparently the "endgame".
When 8K is mentioned, people laugh as if it would be a waste of time because there would be no difference. Of course there will be.
Well, they need to be, because my girlfriend will make 50% of the decision when it comes to buying a new TV. I had to take her along with me to Richer Sounds to hopefully make her see the benefit of buying an £800 TV rather than the £300 one she would have been happy with.
8K is the "endgame" when it comes to TVs? LOL at statements like this...

The resolution of the human eye is 576MP, which is 32000x18000. That is the "endgame" with visuals, as then things will be indistinguishable from reality. That is where we are heading, or as close to it as possible. And yes, it will take many decades.
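For anyone who wants to sanity-check the arithmetic, here's a quick comparison of the figures being thrown around. The 576MP number itself is the claim above; this only verifies the dimensions, not the claim:

```python
# How far current display standards are from the 576MP figure quoted above.
resolutions = {
    "1080p":       (1920, 1080),
    "4K UHD":      (3840, 2160),
    "8K UHD":      (7680, 4320),
    "claimed eye": (32000, 18000),  # the 576MP claim from the post above
}

for name, (w, h) in resolutions.items():
    print(f"{name:12s} {w}x{h} = {w * h / 1e6:6.1f} MP")

# 32000 * 18000 = 576,000,000, so those dimensions do match the 576MP
# figure, and 8K (~33 MP) is still only around a seventeenth of it.
```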
I find the above resolution claim very hard to believe.
If, for example, you had a 30" monitor that supported 32000x18000 and used it to look at two identical images where a few pixels had been altered between them, it could take you days or weeks to find the difference, or you might never find it.
Eyesight also does not work quite like a camera, either; you don't look at an entire scene, you look at points within it.
For reference, 576MP is the resolution of the human eye when focusing on an object; 7MP is the resolution at the periphery of the eyesight. Most of the visual field is blurry and low-resolution; it's only the center, where the eye is focused, that is clear and crisp.
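Taking those numbers at face value, here's a rough sketch of why hunting for a few changed pixels on such a display would be so slow. The sharp-patch size and fixation rate below are illustrative assumptions, not measured values:

```python
# Illustrative only: scanning a 32000x18000 image when only a small
# central patch is sharp at any instant.
total_px  = 32000 * 18000   # 576,000,000 pixels
sharp_px  = 2_000_000       # assumed sharply-resolved patch per fixation
fix_per_s = 3               # assumed fixations per second

fixations = total_px / sharp_px          # ~288 non-overlapping fixations
print(fixations / fix_per_s, "seconds")  # ~96 s for one perfect pass

# And that single pass assumes no overlap and perfect recall of both
# images; actually spotting a handful of altered pixels would take many
# imperfect passes, which is why it could take days or never happen.
```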
That's your problem, not the technology manufacturers'. If you see a clear benefit to something like this, then you should be able to convince your other half, and she should be able to trust your judgement in areas where you are more capable than her, just as I am sure you do in areas where she is more capable.
Even when focusing, I find the 576MP figure hard to believe.

I use equipment at work which checks things far better than the human eye can, and it most definitely does not run at anything near the figure quoted above.
The only reason you would buy an 8K TV at the minute is so you could tell YOURSELF, "Look at my 8K TV, isn't it amazing".
If you read my post, I specifically mention that it's shared memory, and that the OS has to run off that RAM - so no clue why you're repeating my words?
You can make games run at 4K@120Hz with 8x MSAA, but it won't be the next Crysis.
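For a sense of the raw throughput that sentence implies, some rough math. Note this is only about sample counts; MSAA multiplies coverage/depth samples rather than full shading work:

```python
# Raw sample throughput for 4K @ 120 Hz with 8x MSAA.
w, h = 3840, 2160
fps  = 120
msaa = 8    # samples per pixel

px_per_s      = w * h * fps          # ~995 million pixels/second
samples_per_s = px_per_s * msaa      # ~8 billion samples/second
print(f"{px_per_s / 1e6:.0f} Mpx/s, {samples_per_s / 1e9:.1f} Gsamples/s")

# Hitting that budget means keeping per-pixel shading cheap, which is
# exactly why such a game "won't be the next Crysis".
```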
We're still waiting for proper mainstream 4K gaming, let alone million-megapixel displays.