
RX 6800 good enough?

Wow, this thread blew up. It's been interesting to read through, to say the least.

I'll say that I've gone ahead and bitten the bullet. The Nitro was out of stock, so I settled for a Pulse as it's basically the same, just without adjustable RGB. I'm convinced that this card is going to be the best for price/performance for a good while (720 quid is still a stinger though). But it's not a whole lot more than what I'd normally be paying, I guess. I just wanted something that performs well, and the 6800 sounded right up my alley. Ray tracing is nice, but I've never been that bothered about it.

And I'm certainly not bothered about Cyberpunk after that horror show of a release, lol.
 

Welcome to the club, boiii. I'm sure the Pulse will serve you well!

Meh, maybe AMD will add something like DLSS at some point.

They are in fact already working on it and will release it for the 6000 series soon.
 
I'll be amazed if AMD have something comparable to DLSS 2.0 for any of their current cards, given they don't have the hardware for it...

ATM, their answer to DLSS is FidelityFX, as found in a few games, e.g. Cyberpunk, which does a good job but is still not a patch on DLSS.

RDNA 3 for proper ray tracing performance and a DLSS competitor, IMO.
 
You don't really need dedicated hardware for upscaling, because the cost of rendering something at 4K is much higher than the cost of upscaling from a lower resolution to 4K. Dedicated hardware can do it faster and at a lower performance cost, so, for example, you can have almost the same FPS at upscaled 4K as you had at 1440p or 1080p without upscaling. Without dedicated hardware you will lose some FPS, but much less than you lose if you render the game at native 4K.
The bad thing is we don't hear any news from AMD and Microsoft (a lot of the upscaling problem AMD has depends on Microsoft and DirectML).

FidelityFX is not the same thing; it renders the game at a lower/dynamic resolution for increased or stable FPS.
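To put rough numbers behind that argument, here is a minimal frame-time sketch. Every millisecond cost below is an assumption chosen for illustration, not a measurement of any real card:

```python
# Frame-time arithmetic behind the upscaling argument. All costs are
# illustrative assumptions, not benchmarks of real hardware.

def fps(frame_time_ms: float) -> float:
    """Convert a frame time in milliseconds to frames per second."""
    return 1000.0 / frame_time_ms

render_1440p_ms = 10.0      # assumed: 100 FPS at the internal resolution
render_4k_ms = 16.7         # assumed: ~60 FPS at native 4K
upscale_dedicated_ms = 0.7  # assumed: upscale pass on dedicated hardware
upscale_shader_ms = 2.5     # assumed: same pass on ordinary shader cores

print(f"native 4K:              {fps(render_4k_ms):5.1f} FPS")                            # ~60
print(f"1440p + dedicated HW:   {fps(render_1440p_ms + upscale_dedicated_ms):5.1f} FPS")  # ~93
print(f"1440p + shader upscale: {fps(render_1440p_ms + upscale_shader_ms):5.1f} FPS")     # ~80
```

Under those assumptions the shader-only path gives up some FPS to the dedicated path, but both land well above native 4K, which is the point being made.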
 
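For contrast, the dynamic-resolution behaviour FidelityFX is being described as can be sketched as a simple feedback loop. This is a generic illustration of the technique, not how FidelityFX is actually implemented:

```python
# Generic dynamic-resolution controller: render fewer pixels when frames
# run long, claw quality back when there is headroom. Illustrative only;
# not FidelityFX's actual implementation.

TARGET_MS = 16.7  # frame-time budget for ~60 FPS

def adjust_render_scale(scale: float, last_frame_ms: float) -> float:
    """Return the render scale to use for the next frame."""
    if last_frame_ms > TARGET_MS * 1.05:    # over budget: drop resolution
        scale -= 0.05
    elif last_frame_ms < TARGET_MS * 0.90:  # headroom: raise resolution
        scale += 0.05
    return max(0.5, min(1.0, scale))        # clamp between 50% and 100%

# e.g. a 20 ms frame at full resolution pushes the next frame down to 95%:
print(adjust_render_scale(1.0, 20.0))  # 0.95
```

The final upscale to display resolution is then a cheap spatial filter, which is why it trades image quality rather than reconstructing detail the way DLSS does.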

Which makes it not a patch on Nvidia's DLSS :p

Yes, you don't need dedicated hardware, but the results won't be as good; there's only so much you can do through optimisation/drivers.

You would think that if AMD had anything remotely comparable to DLSS upcoming, they would at least acknowledge it or give some info, which is why I can't see them releasing anything comparable to DLSS any time soon. Instead, FidelityFX (yes, not the same thing, but it's the next best thing for AMD users ATM) will have to do.
 

The results won't be as good if we talk about FPS. There is no reason why the results shouldn't be as good if we talk about IQ. But it won't matter too much: you'll have, let's say, 100 FPS without dedicated hardware and 120 FPS with dedicated hardware. The numbers are arbitrary, but the idea is that the cost of upscaling is far less than the cost of rendering at native resolution.
There are rumors that AMD will add dedicated hardware to the next generation, but that is because something like the Tensor cores in Nvidia cards does more than just help with upscaling; they are also involved in RT denoising, and AMD probably wants to close the gap there too. But as is the case with upscaling, denoising can also be done without dedicated hardware, and you can see that in RDNA 2 (again at a performance cost, even a huge one when the denoiser used was made by Nvidia and built to work specifically with their dedicated hardware).

The problem with AMD upscaling is that it depends more on Microsoft and the state of DirectML than we think. Nvidia has proprietary tech with a different set of instructions that works only if/when they have access to the game code and are able to put their instructions inside it. It is expensive, but Nvidia can afford it because it helps their marketing.

Now let's assume that AMD also has the servers to do the AI upscaling and comes up tomorrow with their own version of DLSS. It will work just like it does for Nvidia: AMD will have to work with the developers of every game, and probably pay them, to put their code inside the game. I am not sure AMD can afford that ATM.

Instead, if you go down the DirectML route, it means that in the future game developers will have a standard that they all use inside their games. And probably in the future Nvidia will abandon DLSS and use the same DirectML instructions for upscaling, because they will get the same thing for free.
But I am not sure what the state of DirectML is ATM. I am sure that this is the future of upscaling, and that you can do it even on older generations like the 5700 XT or GTX 1080, but I am not sure when we will see games that support DirectML, or how close Microsoft is to making it a standard set of instructions.
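The "standard API" argument is essentially about where the integration work lives. Here is a hypothetical sketch of what that buys game developers; every class name is invented for illustration, and none of them corresponds to a real SDK:

```python
# Hypothetical illustration of proprietary vs standard upscaling paths.
# Upscaler, DlssStyleBackend and DirectMlStyleBackend are invented names.

from abc import ABC, abstractmethod

class Upscaler(ABC):
    """The one interface a game would code against."""
    @abstractmethod
    def upscale(self, frame: bytes, target: tuple[int, int]) -> bytes: ...

class DlssStyleBackend(Upscaler):
    """Proprietary path: the vendor wires its SDK into each title."""
    def upscale(self, frame, target):
        return frame  # placeholder for per-game, vendor-integrated work

class DirectMlStyleBackend(Upscaler):
    """Standard path: one implementation any vendor's GPU can run."""
    def upscale(self, frame, target):
        return frame  # placeholder for a shared, cross-vendor code path

def present(upscaler: Upscaler, frame: bytes) -> bytes:
    # The game ships once; swapping backends needs no per-title deal.
    return upscaler.upscale(frame, (3840, 2160))
```

With a shared interface like this, the per-game integration cost the poster describes falls away, which is the appeal of a DirectML-style standard.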
 
Yup, it's no problem if you're already pushing 100+ FPS (in which case DLSS or anything like it isn't really required...), but what about games where FPS is already less than 60, e.g. Cyberpunk? A 20 FPS boost from dedicated hardware when you're in the 30/40/50 FPS range is quite a good chunk better.
 
I used the numbers as an arbitrary example of the difference between upscaling with or without dedicated hardware. So let's say that in CP you get 75 FPS at 1440p on the 3080 and 70 FPS on the 6800 XT.
Then at upscaled 4K you will get >70 FPS on the 3080 and around 60 FPS on the 6800 XT. The RDNA 2 card will lose more FPS, but compared with the 35-40 FPS you'd get at native 4K, it is still a huge gain.
Again, the numbers I put here are just an example; I am not sure how close they are to real performance, or how big the performance penalty of upscaling without dedicated hardware will be. But in theory it is far less than rendering at native resolution.
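Those figures are at least internally consistent. Converting them to frame times (taking the poster's numbers purely as assumptions) shows the overheads they imply:

```python
# Sanity check of the 6800 XT example above, treated as assumptions:
# 70 FPS at 1440p, ~60 FPS at upscaled 4K, 35-40 FPS at native 4K.

def ms(fps: float) -> float:
    """Frame time in milliseconds for a given FPS."""
    return 1000.0 / fps

implied_upscale_cost = ms(60) - ms(70)    # ~2.4 ms shader-upscale overhead
native_4k_extra_cost = ms(37.5) - ms(70)  # ~12.4 ms extra for native 4K

print(f"implied upscale overhead: {implied_upscale_cost:.1f} ms/frame")
print(f"extra cost of native 4K:  {native_4k_extra_cost:.1f} ms/frame")
```

So even a ~2.4 ms upscale pass on shader cores is a small fraction of the ~12.4 ms it would cost to render the extra pixels natively, which is the gain being described.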
 
Well, I got my 6800, and as I'm still waiting on my CPU and RAM to arrive, I thought I'd stick it in my i5 3570K @ 4.5GHz machine for a play about in AC: Odyssey. For an old CPU it's extremely playable at 3440x1440 with the settings cranked right up. The GPU is sometimes dropping as low as 35% usage, though, with the CPU glued to 100%, so I'd expect that when I get my Ryzen 5600X installed it should play very nicely.
 
I jumped on a 6800 MBA when I got the chance of one at RRP, even though I really wanted an XT. I'd already promised my old card (Radeon VII) to a friend and didn't want to let him down.

I'm really pleased with the 6800. Cool, quiet and a big performance upgrade from the VII. I bought it for Cyberpunk and am not disappointed at all, even though it's not the card I really wanted. I initially thought that I'd use it until prices normalised and then I'd buy a 6800XT or maybe a 6900XT, but I'm not at all disappointed with it in reality and I will most likely just keep it until the next upgrade cycle.
 

Running 1440p on a 4K TV vs running 4K on a 4K TV with DLSS: the answer is 4K with DLSS; the quality degradation is hardly noticeable.

The performance improvement from DLSS when running at 4K can be as much as triple the framerate, and as you say, the dedicated hardware lowers the performance cost compared to a GPU without it.

AMD need a DLSS equivalent or they simply get blown out of the water for this generation, where playing at 4K with the best IQ and the lowest potential framerate loss means using this kind of upscaling.

And because AMD don't have dedicated hardware, it's hard to see how they can come up to Nvidia's standard of higher-resolution gaming performance.
 
Up to triple FPS comes from rendering the game at a lower resolution; the dedicated hardware won't give you more FPS. For example, if you upscale from 1440p to 4K and you get 100 FPS at 1440p, you'll get up to 100 FPS at 4K.
Without dedicated hardware you won't get 100, but you will get 80, which is still more than the 60 you'd get rendering the game at native resolution. This is what I explained: the upscaling cost is less than the cost of rendering at native resolution.

So even if tomorrow they both go by the same standard, Nvidia will have more FPS at upscaled resolutions. Not by a lot, because AMD has an advantage at lower resolutions; the performance difference could be the same as it is now at 4K, maybe a little bigger.
But then you can always drop the rendered resolution and swear the image looks better than native and much better than it looks on Nvidia (or AMD) at a higher resolution, because this is the nature of upscaling and image quality: one person thinks it is better than native, another thinks it is blurrier, and so on. :D
Once we stop judging performance based on native resolution, anything is possible.
 

No one cares if the game is actually rendered at a lower resolution (at least, no one should care, unless they're waving epeen) if the game looks as good as it does native. As I say, the difference between native and DLSS is so negligible that if you sat most people in front of it with DLSS on and said, "right, this is native, now we'll turn DLSS on", then turned DLSS off and said, "right, this is with DLSS on", you'd catch them out, as they wouldn't realise.

So the fact is that the hardware enabling DLSS without the performance cost will allow an additional increase in framerate compared to a GPU without the hardware.

I'm not sure what you're saying; it's obvious that rendering at 1440p rather than 4K is going to cost less. DLSS allows rendering at a lower resolution and upscaling to a higher one, which can degrade image quality, but DLSS does the best job of retaining it, to the point of HWGAF whether it's native or not when the quality is excellent.
 