6600XT vs 3060, 1 month in!

Image quality :D

Ray-tracing is a fake "feature". There is nothing in these games that couldn't be done with the classic lighting environments.

But even there, Nvidia cheats and can't do anything properly.

At least learn a little on the subject before posting :rolleyes:
 
So I came to this thread hoping to find some more info about these two cards off the back of that video.

Instead what we get is people accusing others of being children, while all parties are arguing like children.

Any chance we could get back on topic please?
 
Can RDNA2 run CUDA code? Next, where did I imply CUDA was magical??

You did say CUDA was included as a modern feature, and it's not modern; as someone said, it's been around for years.
RDNA2 can't run CUDA code because that's an Nvidia-exclusive feature, just like AMD cards can't use the G-Sync hardware in G-Sync monitors.

Nothing to do with RDNA2 not being "able" to run it.

Why not get back to discussing the actual video posted instead of the tit-for-tat, which is going nowhere?
 
What is of interest is HUB saying that in the wider benchmark suite the 6600XT is not as `fast` as the AMD-favouring titles in the first review suggested, and that at the lower end of the market RT works almost equally for both (unless you render the 3060 at 720p and upscale with DLSS). But the most relevant metric is cost: the 3060 costs a lot more to buy (the ones actually in stock, not the tiny drops of FE cards), which is something HUB agrees with. Overall the 6600XT is the better value option.
 
You did say CUDA was included as a modern feature, and it's not modern; as someone said, it's been around for years.
RDNA2 can't run CUDA code because that's an Nvidia-exclusive feature, just like AMD cards can't use the G-Sync hardware in G-Sync monitors.

Nothing to do with RDNA2 not being "able" to run it.

Why not get back to discussing the actual video posted instead of the tit-for-tat, which is going nowhere?

What I said was in response to the claim that RDNA2 could do everything that RTX does:

Just because you refuse to accept facts doesn't make it a troll post. Show me RDNA2 running CUDA, DLSS, or even matching Ampere's RT performance. I admit it competes in rasterisation, but this is on a smaller node with up to 50% higher clock speeds, and sadly at a time when rasterisation is certainly fast enough and the focus is moving to RT and AI-based super sampling.

I agree the content of the video is what should be discussed.
 
What I said was in response to the claim that RDNA2 could do everything that RTX does

No, this was your distorted view of it, and you want to come out of it without admitting the accuracy (or lack thereof) of such a statement. If you don't program with it or use it, then it's something barely any gamer would use. Stop clutching at straws and arguing over petty factoids and get back on topic.
 
Image quality is something that you see with your eyes, if you don't mind. Have you heard of 4K or 8K?
Have you seen how poor the image quality is on a GeForce card?

Post some screenshots at both 4K and 8K comparing Nvidia and AMD so that we can see the difference. Even better, start a new thread for it.
 
What is of interest is HUB saying that in the wider benchmark suite the 6600XT is not as `fast` as the AMD-favouring titles in the first review suggested, and that at the lower end of the market RT works almost equally for both (unless you render the 3060 at 720p and upscale with DLSS). But the most relevant metric is cost: the 3060 costs a lot more to buy (the ones actually in stock, not the tiny drops of FE cards), which is something HUB agrees with. Overall the 6600XT is the better value option.

Summarised the video nicely, good post.

My brother with a Vega 56 is looking to upgrade to one (he runs 1080p 144Hz). It's not a massive upgrade, around 30-40%, but if he can sell his card for around £400 on fleabay at current distorted prices (is Vega good at mining?) then it'll be a cheap, value-conscious upgrade.
 
No, this was your distorted view of it, and you want to come out of it without admitting the accuracy (or lack thereof) of such a statement. If you don't program with it or use it, then it's something barely any gamer would use. Stop clutching at straws and arguing over petty factoids and get back on topic.

No, you know this is a troll post, but I'll bite.

RDNA 2 can do everything RTX can do, just done differently. RDNA 2 is also more power efficient, something new for AMD, to be beating Nvidia in this regard, and RDNA 2 raster performance is also leading the way in most games.
Just because you refuse to accept facts doesn't make it a troll post. Show me RDNA2 running CUDA, DLSS, or even matching Ampere's RT performance. I admit it competes in rasterisation, but this is on a smaller node with up to 50% higher clock speeds, and sadly at a time when rasterisation is certainly fast enough and the focus is moving to RT and AI-based super sampling.
 
Summarised the video nicely, good post.

My brother with a Vega 56 is looking to upgrade to one (he runs 1080p 144Hz). It's not a massive upgrade, around 30-40%, but if he can sell his card for around £400 on fleabay at current distorted prices (is Vega good at mining?) then it'll be a cheap, value-conscious upgrade.

Yes, worth the switch if you can recoup the cost from the Vega sale. No-brainer.
 

Shankly posted nothing about CUDA-type activities and was blatantly implying it's a gaming card, not compute-focused like AMD had been beforehand. As @varkanoid pointed out correctly, we both know it's a proprietary tech used by niche devs. Why would an AMD card need to replicate this, especially for an average gamer?

Ahh, Ampere's great, but it can't do the ROCm stuff that I pretend I need it to do (to win arguments)... :cry:
 
Shankly posted nothing about CUDA-type activities and was blatantly implying it's a gaming card, not compute-focused like AMD had been beforehand. As @varkanoid pointed out correctly, we both know it's a proprietary tech used by niche devs. Why would an AMD card need to replicate this, especially for an average gamer?

Ahh, Ampere's great, but it can't do the ROCm stuff that I pretend I need it to do (to win arguments)... :cry:

Well, if we are just going to consider gaming: Ampere has DLSS, up to twice the performance with RT, and excellent video encoding with voice cancellation, making it the preferred choice for streamers. Is that more to your liking? I'd add more about the GeForce Experience tools, but I never install it.
 
Well, if we are just going to consider gaming: Ampere has DLSS, up to twice the performance with RT, and excellent video encoding with voice cancellation, making it the preferred choice for streamers. Is that more to your liking? I'd add more about the GeForce Experience tools, but I never install it.

Did you watch the HUB video at all? They mention that AMD's equivalent of NVENC is really good, at 8m 11s.

"...is also commonly pushed as a big plus for GeForce graphics cards, but I found very little between ShadowPlay and ReLive for capturing gameplay. AMD's solution now appears to work very well and the quality is excellent..."

I get your point about noise cancelling and some of the extra features, but you're coming across as trying to find flaws because you side with Nvidia and want it to be the better solution (rather than accept HUB's view that the 6600XT is "the better value buy").
 
AMD has both excellent video encoding (you didn't watch the HUB video, did you? They covered this as well) and noise cancelling. Features both vendors have, and they work well. RT works well unless it's a cherry-picked title (remember, even a 3090 at 4K doesn't hold 60 fps with RT enabled and is forced to use DLSS in CP2077).
 
AMD has both excellent video encoding (you didn't watch the HUB video, did you? They covered this as well) and noise cancelling. Features both vendors have, and they work well. RT works well unless it's a cherry-picked title (remember, even a 3090 at 4K doesn't hold 60 fps with RT enabled and is forced to use DLSS in CP2077).

Nope, I flicked through a couple of timestamps and thought it click-bait. I don't consider CP2077 to be cherry-picked. It's just perhaps the only title so far using the complete suite of features offered by RT.
 
Just as you also didn't read the Reddit thread or even the Nvidia dev page, which is why you carried on with the trolling even when proven so utterly wrong. CUDA is a software API for direct-to-GPU programming and has been in use for 14 years. It's Nvidia-branded GPGPU, nothing more. Please read the above links and learn something today.
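
For anyone who hasn't actually seen GPGPU code, here's a minimal sketch of what a trivial CUDA program looks like (my own illustration, not from the video or the linked pages): it's essentially C++ with a vendor-specific kernel launch syntax and runtime API, which is exactly why it only targets Nvidia GPUs natively and would need porting (e.g. to HIP/ROCm) to run on RDNA2.

```
// Minimal illustrative CUDA sketch: add two vectors on the GPU.
// Uses only the standard CUDA runtime API (cudaMalloc, cudaMemcpy, kernel launch).
#include <cstdio>
#include <cuda_runtime.h>

__global__ void addVectors(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);
    float *a = new float[n], *b = new float[n], *c = new float[n];
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Allocate device memory and copy the inputs over.
    float *dA, *dB, *dC;
    cudaMalloc((void**)&dA, bytes);
    cudaMalloc((void**)&dB, bytes);
    cudaMalloc((void**)&dC, bytes);
    cudaMemcpy(dA, a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dB, b, bytes, cudaMemcpyHostToDevice);

    // The <<<blocks, threads>>> launch syntax is the Nvidia-proprietary part:
    // this file won't build for RDNA2 without a port (e.g. HIP).
    const int threads = 256, blocks = (n + threads - 1) / threads;
    addVectors<<<blocks, threads>>>(dA, dB, dC, n);
    cudaMemcpy(c, dC, bytes, cudaMemcpyDeviceToHost);

    printf("c[0] = %f (expect 3.0)\n", c[0]);

    cudaFree(dA); cudaFree(dB); cudaFree(dC);
    delete[] a; delete[] b; delete[] c;
    return 0;
}
```

Compiled with nvcc; the same kernel body can be ported to AMD via the hipify tools, but then the toolchain and runtime are ROCm, not CUDA. None of which matters one bit for playing games.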
 
Nope, I flicked through a couple of timestamps and thought it click-bait. I don't consider CP2077 to be cherry-picked. It's just perhaps the only title so far using the complete suite of features offered by RT.

There will almost certainly be a bigger margin in RT between the cards using proper levels of ray tracing in games like Doom Eternal, Metro Exodus Enhanced Edition, Quake 2 RTX, etc., even ignoring CP2077. Even HUB's older videos show a massive margin, and drivers and game updates aren't going to claw all that back a few weeks later.

They seem to have avoided enabling RT in a number of games in this latest video where it would show a bigger difference, and used lower-profile settings and/or been selective with the features enabled, such as VRS, so as to avoid hitting any RT bottlenecks.
 
There will almost certainly be a bigger margin in RT between the cards using proper levels of ray tracing in games like Doom Eternal, Metro Exodus Enhanced Edition, Quake 2 RTX, etc., even ignoring CP2077. Even HUB's older videos show a massive margin, and drivers and game updates aren't going to claw all that back a few weeks later.

They seem to have avoided enabling RT in a number of games in this latest video where it would show a bigger difference, and used lower-profile settings and/or been selective with the features enabled, such as VRS, so as to avoid hitting any RT bottlenecks.


Unfortunately, with the way RT is handled by GPUs (i.e. it's not full path tracing), there are thousands of equilibrium points across games and settings that anyone can use to cherry-pick the results they want.

For example, anyone can make RDNA2 look very fast in RT by picking a game that uses it very lightly, playing at 1080p with medium settings. Tech-savvy peeps know Nvidia can do much more in the right scenario; with the way RT is done, those settings are like putting a giant turbo on your car and capping it at 5 psi.

And conversely, I can just as easily pick Cyberpunk with Ultra settings at 4K as an example of why RDNA2 is very slow at RT, which completely contradicts any "findings" I might have drawn from the first test.
 