A slight correction for you. FidelityFX Super Resolution will kill off DLSS.
Why? What additional advantages will it have? Last I heard it may have inferior image quality but equal performance. Happy to be proved wrong though.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
He is also "ignorant" because you don't need that much bandwidth if you use DLSS, which means the 3000 series could have used a 256-bit bus if DLSS were the most important feature. But they were built for native 4k, and the 8k/60 fps or 8k RT with DLSS were marketing tricks. It's not that long ago that we used to render games at a higher res than our monitors supported.
Now here we are, where the future is glorified upscaling.
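For a rough sense of the bandwidth point (a back-of-envelope sketch only, assuming the commonly quoted per-axis DLSS scale factors of ~2/3 for Quality and 1/2 for Performance; the mode names and factors here are assumptions, not spec-sheet figures):

```python
# Back-of-envelope pixel counts: native rendering vs. the internal render
# resolution DLSS would upscale from. Scale factors are the commonly quoted
# per-axis values and are assumptions, not official specifications.

NATIVE = {"4K": (3840, 2160), "8K": (7680, 4320)}
DLSS_SCALE = {"Quality": 2 / 3, "Performance": 1 / 2}

for name, (w, h) in NATIVE.items():
    native_px = w * h
    print(f"{name} native: {native_px / 1e6:.1f} MPix")
    for mode, scale in DLSS_SCALE.items():
        internal_px = int(w * scale) * int(h * scale)
        print(f"  DLSS {mode}: {internal_px / 1e6:.1f} MPix "
              f"({internal_px / native_px:.0%} of native)")
```

Roughly a quarter to a half of the pixels actually get rendered, which is why the bandwidth requirement drops so much when upscaling is in play.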
News at 10: faster cards are faster; current GPUs can't game at 8k native.
Do you have Nvidia stock or something? Unless Nvidia opens up that solution for others to adopt (consoles etc.) OR makes the implementation game-engine agnostic, that will just never happen. Just look at G-Sync etc.
A DLSS-type technology that isn't HW-specific may help out, but really no one will be looking at 8K rendering for a very VERY long time, so for now that (and RT) are small fry in comparison to native rasterisation performance. I mean, the 1440p / 4k user base is still in the single percentage points.
From the Steam survey (main display resolution):
1080p: 65%
1440p: 7%
4k: 2%
Other (including 8k): 2.11%
A slight correction for you. FidelityFX Super Resolution will kill off DLSS.
It can't: it's compute-based, while DLSS runs on tensor cores, which is faster.
The 30 series is better placed to do 8k, with double the cores, much higher memory bandwidth and DLSS. 8k games really need that, and the 6900 XT and 6800 XT are not suited to it. The 6000 series has half the number of cores and much lower memory bandwidth. With the weak RT of the 6900 XT, only the 3090 can do 8k with RT, with DLSS of course. The video cherry-picks its games for the 8k resolution, which makes it more or less a waste of time. You can do that with any card, but the 3090 is the best placed to run 8k @ 60 fps.
The 30 series is better placed to do 8k with double the cores
DLSS will kill off native rendering in future.
Sure, there's no problem with stock, fake news.
https://www.overclockers.co.uk/zota...dr6x-pci-express-graphics-card-gx-123-zt.html
If it were an AMD feature you'd be calling it a performance hack which spoils image quality.
With all the arguing going on, I have one question this Friday.
Can I buy a card yet?
Great, I was due to get a lobotomy on Monday. Once the stupidity has kicked in I can buy that.
You really do read from the Jensen marketing handbook; it's no wonder some people think you are a literal Nvidia shill.
How many cores it has is utterly meaningless: the 10700K has the same number of cores as the 5800X, but the latter is 30% faster per clock. Besides that, the 3090 has 82 CUs vs 80 on the 6900 XT, and both have 64 dual cores per SM: 5120 ALUs and 5120 RT cores for the 6900 XT, and 5248 ALUs with 5248 RT cores for the 3090. The only difference is that Nvidia call their dual SM cores individual CUDA cores because both are capable of FP32, which is fair enough, but Nvidia have 2 more CUs, i.e. 128 more shaders (2.5%).
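To sanity-check that arithmetic, here is a quick sketch using only the figures quoted in the post above (they are the poster's numbers, not verified spec-sheet values):

```python
# Shader-count comparison as argued in the post: 82 units vs 80 units,
# with 64 ALUs assumed per SM/CU. All figures are taken from the post.

alus_per_unit = 64            # ALUs per SM / CU assumed in the post
rtx_3090_sms = 82
rx_6900xt_cus = 80

alus_3090 = rtx_3090_sms * alus_per_unit      # 5248
alus_6900xt = rx_6900xt_cus * alus_per_unit   # 5120
extra = alus_3090 - alus_6900xt               # 128

print(f"3090: {alus_3090} ALUs, 6900 XT: {alus_6900xt} ALUs, "
      f"difference: {extra} ({extra / alus_6900xt:.1%})")
```

Which works out to the 128 extra shaders, i.e. the ~2.5% difference quoted.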