The thread which sometimes talks about RDNA2

Status
Not open for further replies.
I'm coming from a Sandy Bridge 2600K and a GTX 980 Ti, so anything is going to seem like an upgrade to me, but I'm in a genuine bit of a pickle.

I'm pretty set on a Gen3 / 500-series motherboard, and I would like to pair this up with a 6800XT, but my issue is that I'm a reasonably medium-to-heavy V-Ray user (also TwinMotion / Lumion), and sadly OpenCL support for AMD-based rendering has been dropped in V-Ray Next; currently only CUDA (Nvidia) rendering is supported.

I have some tough choices to make.
 
The good story in that point was correct for the early adopters; my observation is that the price it commanded set a precedent for the GPU predicament we are in today. The 1080 Ti was what, £800 or thereabouts? This £1,200+ pricing is the issue, and now it won't go away. It has also been a factor in prices creeping up lower down the stack.
Yup, you are correct: once a precedent with that kind of price is set, and people STILL buy it regardless, then that price sets a new benchmark for the future and will even be exceeded, as we have seen in the case of the outrageously priced 3090. :mad:

I mean seriously wtf... just think about it... 150% extra cost for 15% extra performance. It would make me feel physically sick to my stomach and ashamed of myself to buy one and fuel that madness. :o

So many people in the AMD Big Navi thread were saying that if AMD released a 3090 competitor it wouldn't be any cheaper, as "they're not a charity and don't want to be seen as budget". Well, AMD have completely destroyed that logic by actually pricing the 6900XT FAIRLY: a far better engineered and more efficient gaming GPU that is cheaper to produce and far smaller, yet performs on a par with a card costing 60-70% more. :)
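The "150% extra cost for 15% extra performance" point above can be put into rough numbers; a minimal sketch of the performance-per-pound comparison, where the £650 base price and the 100-point baseline are my own illustrative assumptions, not figures from the thread:

```python
# Hedged sketch: relative value (performance per pound) of a premium
# card priced at +150% for +15% performance over a base card.

def value_ratio(base_price, base_perf, premium_price, premium_perf):
    """Performance-per-pound of the premium card relative to the base card."""
    base_value = base_perf / base_price
    premium_value = premium_perf / premium_price
    return premium_value / base_value

# Illustrative numbers only: base card at 100 perf for £650,
# premium card at +15% perf (115) for +150% price (£1,625).
ratio = value_ratio(650, 100, 650 * 2.5, 115)
print(f"{ratio:.2f}")  # → 0.46: the premium card delivers under half the value per pound
```

In other words, under these assumed numbers the flagship offers roughly 46% of the base card's value per pound, which is the maths behind the frustration in the post.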
 
I play similar games to you (WoW, D4, Cyberpunk, CoD and soon Godfall)

To answer your questions:

1. No; I think 14 so far, with more rumoured for next year. It is in WoW, Warzone and Cyberpunk, although in WoW it has an almost unplayable impact on your FPS and, personally, a negligible impact on visuals.
2. Yes, you would more than likely turn it off; if you stop to admire the pretty lights, someone is putting a bullet in you :)
3. Currently I would say ray tracing is for screenshot modes only. In FPS games you're too busy running around to pay much attention to the scenery, and in something like WoW, where you're in a raid with 24 or so other people, you often dial down some settings to keep it playable. WoW's implementation is not overly great looking either.

Cyberpunk may be the good one; however, ray tracing even on Nvidia relies on the crutch of DLSS to make it really usable, and AMD are bringing their answer to DLSS via a driver update. I think ray tracing will be better used in less fast-paced games where you have time to take in the scenery, so Cyberpunk, or something like Watch Dogs: Legion, may be a better option: something where you're not having to rely on twitch reactions to survive.

AMD demoed what looked to be some kind of futuristic ARPG yesterday with ray tracing enabled. It looked OK; I'm hoping D4 gets an implementation as well.

Cheers dude. Pretty sure the ARPG they demoed was Godfall :)
 
AMD crushed Ampere like a bug, and we haven't even seen OC numbers yet! :D

As for people buying NV for RT performance, may I just - LOL!


4K has always been more e-peen than a smart choice. Now the argument is that people will be using 4K TVs, but then they sit 2+ metres from the screen. Run it at a sensible resolution such as 1440p. Very playable.
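The viewing-distance argument can be made concrete with a quick angular-resolution estimate; a rough sketch, where the 55" screen size and the ~60 pixels-per-degree acuity figure are my own illustrative assumptions, not numbers from the post:

```python
import math

# Hedged sketch: pixels-per-degree for a 16:9 screen viewed head-on,
# to illustrate why 4K vs 1440p is hard to tell apart from 2+ metres.

def pixels_per_degree(h_pixels, diagonal_in, distance_m, aspect=16 / 9):
    """Angular pixel density across the horizontal field of view."""
    diag_m = diagonal_in * 0.0254                      # diagonal in metres
    width_m = diag_m * aspect / math.hypot(aspect, 1)  # physical screen width
    fov_deg = math.degrees(2 * math.atan(width_m / (2 * distance_m)))
    return h_pixels / fov_deg

# Assumed setup: 55" TV viewed from 2.0 m.
for label, h in (("4K", 3840), ("1440p", 2560)):
    print(f"{label}: {pixels_per_degree(h, 55, 2.0):.0f} ppd")
```

At that distance both resolutions land above the commonly quoted ~60 ppd visual-acuity limit, which is consistent with the claim that 1440p remains very playable there.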

Personally I would have preferred a drop in raster performance and more of the die be devoted to RT.
 
Yup, you are correct: once a precedent with that kind of price is set, and people buy it regardless, then that price sets a new benchmark and will even be exceeded, as we have seen in the case of the outrageously priced 3090.

I mean seriously wtf... just think about it... 150% extra cost for 15% extra performance. It would make me feel physically sick to my stomach and ashamed of myself to buy one and fuel that madness.

So many people in the AMD Big Navi thread were saying that if AMD released a 3090 competitor it wouldn't be any cheaper, as "they're not a charity and don't want to be seen as budget". Well, AMD have completely destroyed that logic by actually pricing the 6900XT FAIRLY: a far better engineered product that is cheaper to produce and smaller, yet performs on a par.

I think even AMD realise they will sell 6900 cards, just not by the truckload, but if there is a market there, they may as well join in. They did the smart thing by dropping the price as well. I still think the best card in the announced stack is the 6800XT, though. Partner models could be something quite brilliant.
 
Cheers dude. Pretty sure the ARPG they demoed was Godfall :)

Yeah, it was Godfall. It kinda looks like a Division 1/2 type game with swords and armour to me :) I'll have a copy of it once AMD send me my code. I will be honest though: it's not entirely my cup of tea, as I'm not overly fond of the whole Japanese-anime huge-sword-dragging-across-the-floor kind of thing. It reminds me a bit of Monster Hunter: World, or Dark Souls with the weapons, but with more of a looter-grinder system.
 
I think even AMD realise they will sell 6900 cards, just not by the truckload, but if there is a market there, they may as well join in. They did the smart thing by dropping the price as well. I still think the best card in the announced stack is the 6800XT, though. Partner models could be something quite brilliant.
There is a 6800XT with my name on it. It also sounds like stock shouldn't be a problem if you order within the first couple of days. Then, 2 months or so before the next big releases, I will sell it, stick my 1660 Super back in and wait for the next big thing. :D

A 5900X and 6900XT are going to be the biggest overall performance upgrade I have ever done. I have been using gaming laptops for the last 7 years, and even when I was constantly staying on the bleeding edge during the glory days of PC upgrading (1996-2008), I likely never experienced a jump this big. It's good to be back in the high-end desktop performance world, and I will benefit as much in my older modded games like Skyrim as in new ones like CP2077. :)
 
Because until AMD Super Resolution is ready, Nvidia are "cheating".

Not cheating, no... they are using DLSS as a crutch to support ray tracing, but it's extremely clever of them to do so, and I have zero issues with it. It does show, though, that hardware is still not really at the point where it can run RT without workarounds to make it properly playable. Again, I will point out that AMD have seemingly beaten Nvidia's first attempt at RT (Turing) with their method in RDNA2. I feel this deserves more applause than it is getting; people seem to be overlooking it.

Yes, they don't match Nvidia's second go at RT, but they beat Nvidia's first go. That can only bode well for the future, no? Especially if they bring out a DLSS competitor, which would aid the uptake of RT as a whole. Sure, many will still turn it off (myself included, unless I want to take screenshots).
 
Not cheating, no... they are using DLSS as a crutch to support ray tracing, but it's extremely clever of them to do so, and I have zero issues with it. It does show, though, that hardware is still not really at the point where it can run RT without workarounds to make it properly playable. Again, I will point out that AMD have seemingly beaten Nvidia's first attempt at RT (Turing) with their method in RDNA2. I feel this deserves more applause than it is getting; people seem to be overlooking it.

Yes, they don't match Nvidia's second go at RT, but they beat Nvidia's first go. That can only bode well for the future, no? Especially if they bring out a DLSS competitor, which would aid the uptake of RT as a whole. Sure, many will still turn it off (myself included, unless I want to take screenshots).

They have done an OK job at RT from first impressions, but they have had two extra years since Nvidia's first attempt, so let's not go crazy with the praise just yet in relation to RT.
 
Watch Dogs: Legion uses DXR, not RTX, but it also supports DLSS, although with a few bugs. This video highlights its effectiveness, and as AMD said, DXR is supported out of the box with no driver updates needed. I guess games like this are where it's at:

 
Because until AMD Super Resolution is ready, Nvidia are "cheating".
Credit where it's due, the latest DLSS version is incredible tech, and the charts should include both DLSS on and off.

With how capable DLSS is, I'd rather play a game at 4K max detail at 120 FPS with DLSS than 4K max at 60 FPS with it off.

Obviously mileage varies and those numbers are just examples, but it is an amazing feature and should be considered where it's available.
 
I'm coming from a Sandy Bridge 2600K and a GTX 980 Ti, so anything is going to seem like an upgrade to me, but I'm in a genuine bit of a pickle.

I'm pretty set on a Gen3 / 500-series motherboard, and I would like to pair this up with a 6800XT, but my issue is that I'm a reasonably medium-to-heavy V-Ray user (also TwinMotion / Lumion), and sadly OpenCL support for AMD-based rendering has been dropped in V-Ray Next; currently only CUDA (Nvidia) rendering is supported.

I have some tough choices to make.
Which CPU are you going to get? A 5900X/5950X would do a very good job at V-Ray on its own.
 
I'm coming from a Sandy Bridge 2600K and a GTX 980 Ti, so anything is going to seem like an upgrade to me, but I'm in a genuine bit of a pickle.

I'm pretty set on a Gen3 / 500-series motherboard, and I would like to pair this up with a 6800XT, but my issue is that I'm a reasonably medium-to-heavy V-Ray user (also TwinMotion / Lumion), and sadly OpenCL support for AMD-based rendering has been dropped in V-Ray Next; currently only CUDA (Nvidia) rendering is supported.

I have some tough choices to make.
You could also switch render engines.
 
Watch Dogs: Legion uses DXR, not RTX, but it also supports DLSS, although with a few bugs. This video highlights its effectiveness, and as AMD said, DXR is supported out of the box with no driver updates needed. I guess games like this are where it's at:


A vs video without an FPS counter, how pointless. And where did he get an RX 6000 card to benchmark?
 
They have done an OK job at RT from first impressions, but they have had two extra years since Nvidia's first attempt, so let's not go crazy with the praise just yet in relation to RT.
If anything, it's Nvidia who should be performing far better at RT than they actually are, given how much additional experience they have had. They are now into their second generation and, according to the benchmarks, have managed only a 20% or so increase over Turing in real-world gaming performance (from what I remember; I need to go and check). That's hardly impressive.
 