Looks like the 6800 Nitro has sold out
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
It's karma telling me to wait on a 6800xt coming back into stock.
Wow, this thread blew up. It's been interesting to read through, to say the least.
I'll say that I've gone ahead and bit the bullet. The Nitro was out of stock, so I settled for a Pulse, as it's basically the same just without adjustable RGB. I'm convinced this card is going to be the best for price/performance for a good while (720 quid is still a stinger though). But it's not a whole lot more than what I'd normally be paying, I guess. I just wanted something that performs well, and the 6800 sounded right up my alley. Ray tracing is nice, but I've never been that bothered about it.
And I'm certainly not bothered about Cyberpunk after that horror show of a release lol.
meh, maybe amd will add something like dlss at some point
I'll be amazed if AMD have something comparable to DLSS 2.0 for any of their current cards, given they don't have the hardware for it...
ATM their answer to DLSS is FidelityFX, as found in a few games, e.g. Cyberpunk, which does a good job but still isn't a patch on DLSS.
RDNA 3 for proper ray tracing performance and a DLSS competitor, imo.
You don't really need dedicated hardware for upscaling because the cost of rendering at 4K is much higher than the cost of upscaling from a lower resolution to 4K. Dedicated hardware can do it faster and at a lower performance cost, so for example you can have almost the same FPS at upscaled 4K as you had at 1440p or 1080p without upscaling. Without dedicated hardware you will lose some FPS, but much less than you lose if you render the game at native 4K.
The bad thing is we don't hear any news from AMD and Microsoft (because a lot of AMD's upscaling problem depends on Microsoft and DirectML).
FidelityFX is not the same thing; it renders the game at a lower/dynamic resolution for increased or stable FPS.
Which makes it not a patch on Nvidia's DLSS.
Yes, you don't need dedicated hardware, but the results won't be as good; there's only so much you can do through optimisation/drivers.
You would think that if AMD had anything remotely comparable to DLSS upcoming, they would at least acknowledge it or give some info, which is why I can't see them releasing anything comparable to DLSS any time soon. Instead, FidelityFX (yes, not the same thing, but it's the next best thing for AMD users atm) will have to do.
I used the numbers as an arbitrary example of the difference between upscaling with or without dedicated hardware. So let's say that in CP you get 75 FPS at 1440p on the 3080 and 70 FPS on the 6800 XT.

Yup, it's no problem if you're already pushing 100+ FPS (in which case DLSS or anything like it isn't really required), but what about games where FPS is already less than 60, e.g. Cyberpunk? A 20 FPS boost from dedicated hardware when you're in the 30/40/50 FPS range is quite a good chunk better.
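To put rough numbers on why the same boost matters more at the low end: in frame-time terms, 40 to 60 FPS saves several times more milliseconds per frame than 100 to 120. A quick sketch (illustrative arithmetic only, not benchmarks):

```python
# Frame-time view of an FPS boost: the same +20 FPS saves far more
# milliseconds per frame at a 40 FPS baseline than at 100 FPS.

def ms_saved(base_fps: float, boost: float) -> float:
    """Milliseconds shaved off each frame by raising base_fps by boost."""
    return 1000.0 / base_fps - 1000.0 / (base_fps + boost)

print(f"40 -> 60 FPS saves {ms_saved(40, 20):.1f} ms per frame")    # ~8.3 ms
print(f"100 -> 120 FPS saves {ms_saved(100, 20):.1f} ms per frame")  # ~1.7 ms
```

Same +20 FPS either way, but roughly five times the per-frame saving at the low end, which is where it's actually felt.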
Up to triple FPS comes from rendering the game at lower resolutions; the dedicated hardware won't give you more FPS. For example, if you upscale from 1440p to 4K and you get 100 FPS at 1440p, you'll get up to 100 FPS at 4K.

Running 1440p on a 4K TV vs running 4K on a 4K TV with DLSS, the answer is 4K with DLSS; the quality degradation is hardly noticeable.
The performance improvement of DLSS when running at 4K can be as much as triple the framerate, and as you say the hardware lowers the performance cost compared to a GPU with no dedicated hardware.
AMD need a DLSS equivalent or they simply get blown out of the water this generation, where playing at 4K with the best image quality for the lowest potential framerate loss means using this kind of upscaling system.
And because AMD don't have the dedicated hardware, it's hard to see how they can come up to Nvidia's standard of higher-res gaming performance.
Up to triple FPS comes from rendering the game at lower resolutions; the dedicated hardware won't give you more FPS. For example, if you upscale from 1440p to 4K and you get 100 FPS at 1440p, you'll get up to 100 FPS at 4K.
Without dedicated hardware you won't get 100, but you'll get 80, which is still more than the 60 you get rendering at native resolution. This is what I explained: the upscaling cost is less than the cost of rendering at native.
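That cost argument can be sketched as a toy frame-time model (all timings here are made-up assumptions for illustration, not measurements): rendering cost scales roughly with pixel count, and upscaling adds a small per-frame cost that shrinks when dedicated hardware handles it.

```python
# Toy frame-time model of upscaling cost vs native rendering.
# All timings are illustrative assumptions, not real benchmarks.

def fps(frame_time_ms: float) -> float:
    return 1000.0 / frame_time_ms

native_1440p_ms = 10.0  # assume 100 FPS at native 1440p
# Rough scaling by pixel count: 4K has 2.25x the pixels of 1440p.
native_4k_ms = native_1440p_ms * (3840 * 2160) / (2560 * 1440)

upscale_hw_ms = 0.5  # assumed per-frame cost with dedicated hardware
upscale_sw_ms = 2.5  # assumed per-frame cost on general-purpose shaders

print(f"native 4K:          {fps(native_4k_ms):.0f} FPS")
print(f"1440p + HW upscale: {fps(native_1440p_ms + upscale_hw_ms):.0f} FPS")
print(f"1440p + SW upscale: {fps(native_1440p_ms + upscale_sw_ms):.0f} FPS")
```

With these made-up costs, upscaled 4K stays close to the 1440p framerate when dedicated hardware absorbs the upscaling step, loses noticeably more without it, and both comfortably beat native 4K, which is the shape of the argument above.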
So even if tomorrow they both go by the same standard, Nvidia will have more FPS at upscaled resolutions. Not by a lot, because AMD has an advantage at lower resolutions; the performance difference could be the same as it is now at 4K, maybe a little bigger.
But then you can always drop the rendered resolution and swear the image looks better than native, and much better than it looks on Nvidia (or AMD) at a higher resolution, because that's the nature of upscaling and image quality: one thinks it looks better than native, another thinks it looks blurrier, and so on.
Once we stop judging performance based on native resolution, anything is possible.