I'll take 1440p RT over 4K non-RT any day.
Me too.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Probably because it is something that belongs more in the console space than on the desktop - and it is potentially relatively short-lived in usefulness, though that remains to be seen.
If something like this provided the user with a considerable uplift, then we would be fools not to embrace it. AMD are operating in both the CPU and GPU space, so it's only right for them to find a way to tie their products together.
The amount of extra detail you can perceive on a large screen at 4K is night and day. Otherwise we would all be gaming on 1080p 7" screens, since they have the highest pixel density. Add to that how far monitors have been left behind compared to modern TVs. If you are gaming on a tiny 1440p monitor with no real HDR in 2020 you are doing it totally wrong, no matter how powerful your PC is. Your experience is vastly inferior to the one that even an average console gamer is getting.

It's not like TVs are much bigger and have different pixel densities compared to smaller monitors or anything.
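For anyone who wants to put numbers on the pixel density argument: what you can actually perceive depends on pixels per degree at your viewing distance, not raw PPI. A rough back-of-the-envelope sketch in Python, where the screen sizes and viewing distances are purely illustrative assumptions:

```python
import math

def pixels_per_degree(diag_inches, horiz_res, distance_inches):
    """Approximate horizontal pixels per degree for a 16:9 panel viewed head-on."""
    width = diag_inches * 16 / math.hypot(16, 9)   # screen width from the diagonal
    ppi = horiz_res / width                        # pixels per inch across the panel
    inches_per_degree = 2 * distance_inches * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# Illustrative setups - the sizes and distances are assumptions, not anyone's real desk:
print(pixels_per_degree(7, 1920, 12))     # 7" 1080p screen held about a foot away
print(pixels_per_degree(27, 2560, 24))    # 27" 1440p monitor at roughly 2 ft
print(pixels_per_degree(55, 3840, 96))    # 55" 4K TV viewed from about 8 ft
```

With those assumed distances the big 4K TV from the sofa actually resolves the most pixels per degree of the three, which is why raw pixel density on its own doesn't settle anything.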
I, on the other hand, find it insane to still game at 1440p, a resolution that even console gamers are now shunning. My 8-year-old nephew will be gaming at more than twice your resolution on his little console.
Plus ray tracing will work just fine on RDNA 2.
Stuff like Infinity Cache tends to only really work where game developers are creating their products from the ground up with it in mind, as happens on consoles - in the desktop space you much more quickly run into its limitations - as it is, the cache hit rate with it is only around 50-60%, and that will drop with newer games on PC.
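To put the hit rate point into rough numbers: a very simplistic model treats effective bandwidth as a blend of cache and VRAM bandwidth weighted by the hit rate. The figures below are placeholder assumptions for illustration, not official specs:

```python
def effective_bandwidth(hit_rate, cache_bw_gbs, vram_bw_gbs):
    """Very simple model: requests hit the on-die cache at hit_rate, go to VRAM otherwise."""
    return hit_rate * cache_bw_gbs + (1 - hit_rate) * vram_bw_gbs

# Placeholder figures purely for illustration, not official specs:
cache_bw = 1600.0   # assumed on-die cache bandwidth, GB/s
vram_bw = 512.0     # assumed GDDR6 bandwidth on a 256-bit bus, GB/s

for hit_rate in (0.6, 0.5, 0.4):
    print(f"{hit_rate:.0%} hit rate -> ~{effective_bandwidth(hit_rate, cache_bw, vram_bw):.0f} GB/s effective")
```

On those made-up numbers the effective figure drops from around 1165 GB/s at a 60% hit rate to around 950 GB/s at 40%, so a falling hit rate in newer PC titles would bite quickly.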
If the technology starts to get utilised in consoles then there is a chance that AMD could pull ahead in future releases, as AMD PC hardware starts to get used in the same way.
Ultrawides are far too claustrophobic for me with their limited vertical resolution. 43" 4K is the best monitor size at the moment in my opinion. A 48" curved OLED would be the dream monitor. No ultrawide nonsense though.

You sit closer to a monitor and it fills a similar field of vision as sitting back on a sofa playing on a TV.
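The field of vision claim is easy to sanity check with a bit of trigonometry; the sizes and distances below are assumptions purely for the sake of the example:

```python
import math

def horizontal_fov_degrees(diag_inches, distance_inches):
    """Horizontal angle a 16:9 screen subtends at a given viewing distance."""
    width = diag_inches * 16 / math.hypot(16, 9)
    return math.degrees(2 * math.atan(width / (2 * distance_inches)))

# Distances are assumptions for the sake of the example:
print(horizontal_fov_degrees(27, 24))   # 27" monitor at about 2 ft
print(horizontal_fov_degrees(43, 30))   # 43" monitor at about 2.5 ft
print(horizontal_fov_degrees(55, 96))   # 55" TV from a sofa about 8 ft away
```

On those assumed distances the 27" monitor at arm's length subtends roughly 50 degrees horizontally while the 55" TV from the sofa is under 30, so sitting closer really does make up for the smaller panel.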
Stop being obtuse.
I'm on a 1440p ultrawide. I'll take it over a 4K TV any day.
Now if I had a 4K monitor, sure, I can see the argument for fidelity. But not a TV.
Was expecting a response along those lines.
Won't happen - game development on PC never goes like that. Maybe if AMD captured 80-90% of the desktop add-in board market, but developers don't tend to have the same focus on getting the most out of the hardware on desktop in the way they do on console (amongst other things, too many variations).
RAM is supposedly one of the things hampering 30xx availability.
Yeah but if... IF it was a "quick win" and didn't require a huge amount of work then it could happen.
The one thing reported all over is that Nvidia have bucket loads of GDDR6X RAM; it's the 8nm yield that's the problem, on top of launching early because of AMD.
Also it's funny people are still comparing 10GB of GDDR6X vs 16GB of normal GDDR6... They work on totally different tech to achieve the same thing; one relies on sheer volume, the other doesn't need it...
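For the bandwidth half of that comparison, peak bandwidth is just bus width times per-pin data rate - GDDR6X gets a higher per-pin rate (via PAM4 signalling), while a 16GB GDDR6 card can lean on a big on-die cache instead. The bus widths and data rates below are the commonly quoted figures, used here as assumptions rather than confirmed specs:

```python
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# Commonly quoted figures, used here as assumptions rather than confirmed specs:
print(bandwidth_gbs(320, 19))   # 320-bit GDDR6X at 19 Gbps -> 760 GB/s
print(bandwidth_gbs(256, 16))   # 256-bit GDDR6 at 16 Gbps  -> 512 GB/s
```

Capacity (10GB vs 16GB) is a separate question from bandwidth, which is the point being made above.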
Nothing. Most people, given the chance, buy Nvidia graphics cards: better features, better drivers, and faster in more games.
https://www.extremetech.com/gaming/316522-report-nvidia-may-have-canceled-high-vram-rtx-3070-3080-cards#:~:text=Micron's yields on GDDR6X are also reportedly poor.&text=GDDR6X is faster-clocked GDDR6,stands for picoJoules per bit).
They say it's a RAM yield issue =/ I've seen other places saying the same thing.