
Nvidia to AMD

Caporegime
Joined
8 Nov 2008
Posts
29,016
As mentioned, provided the independent reviews hold up OK, I'll be looking at one of the high-end ones seen in the video presentation.

I think the first thing I might be better off doing before shelling out for a card is changing the monitor to one which supports both FreeSync (2) and G-Sync, as well as having a better panel for my needs - so probably IPS.
 
Soldato
Joined
25 Oct 2010
Posts
5,350

From everything I've heard "RAGE" adds around 2-3% at best.

I'm interested in the smart access memory tech, but again, is it dependent on game developers to implement? I want to see more on it, and while I'm wary of being locked into proprietary technology, I was planning on Zen 3 and potentially an AMD GPU anyway.

That aside, I don't really give a damn about raytracing at the moment. It's fantastic technology, don't get me wrong, and it's going to be a big part of future releases, but I don't see it being a huge thing for another 2-3 years, by which point I'll be upgrading anyway.
 
Permabanned
Joined
21 Jan 2012
Posts
1,640
Location
Doncaster
A 6800XT/3080 would probably help you maintain 144fps in newer titles; my 1080Ti, before I sold it, had issues keeping up sometimes.
Shame I won't get my hands on either this year eh haha.

I'll probably cancel the 3070. I'd imagine the 2080 would fetch 350ish, so 170 for an upgrade of a few frames isn't good value to me.
 
Soldato
Joined
4 Feb 2006
Posts
3,204
From everything I've heard "RAGE" adds around 2-3% at best.

I'm interested in the smart access memory tech, but again is it dependent on game developers to implement?

That aside, I don't really give a damn about raytracing at the moment. It's fantastic technology, don't get me wrong, and it's going to be a big part of future releases, but I don't see it being a huge thing for another 2-3 years, by which point I'll be upgrading anyway.

Rage mode is just increasing the power limit a bit so the core clock stays at a higher speed. The GPU probably uses around 320W in Rage mode, which is acceptable for a small gain.
RT should gain traction since most console games will use it for reflections or shadows. The performance seems acceptable from what I've seen in PS5 games. RDNA2 should be pretty good with RT since it's the same architecture used in the consoles. Also, AMD is developing a 'super resolution' feature which will compete with DLSS, so when RT is enabled it won't impact performance as much.
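To put rough numbers on why a power-limit bump only buys a small gain, here's a back-of-envelope sketch. It assumes GPU power scales roughly with the cube of clock speed near the top of the voltage/frequency curve, and assumes a ~300W stock board power - both are illustrative assumptions, not confirmed figures.

```python
# Back-of-envelope sketch, not official figures: if power scales roughly with
# the cube of clock speed near the top of the V/F curve, a small power-limit
# bump like Rage mode only buys a small clock (and FPS) gain.
stock_power_w = 300   # assumed stock board power (illustrative)
rage_power_w = 320    # figure mentioned above, not confirmed

clock_gain = (rage_power_w / stock_power_w) ** (1 / 3) - 1
print(f"Estimated extra clock headroom: {clock_gain:.1%}")  # ~2.2%, in line with the '2-3%' estimate
```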
 
Associate
Joined
14 Mar 2003
Posts
1,147
I'm going to keep an eye out for a 3080 drop, but if I don't get one by the 6800XT launch, I'll be going AMD. That's assuming there's no stock shenanigans again.
 
Associate
Joined
22 Jul 2012
Posts
1,182
Location
UK
Waiting for the independent benchmarks of all the cards. I've got a 1080Ti at the moment, which has been a great GPU, but I'll be doing a full system rebuild to update my outdated build - mainly for VR.

The AMD synergy between CPU and GPU looks interesting. Depending on stock levels, reviews and prices I'm aiming for Feb 2021 before I decide on a new system.
 
Associate
Joined
3 Jul 2012
Posts
425
Shame I won't get my hands on either this year eh haha.

I'll probably cancel the 3070. I'd imagine the 2080 would fetch 350ish, so 170 for an upgrade of a few frames isn't good value to me.

You might have more luck trying to buy from a physical shop (which I can't name here) that only allows collection on these GPUs from the end of November onwards.
 
Associate
Joined
8 Oct 2020
Posts
2,329
Moving back to a desktop so it’s a full build. Going Zen3 and the 6800XT looks like the right fit.

Haven’t had an AMD product in years, but I like the direction they’re taking.
 
Soldato
Joined
25 Oct 2010
Posts
5,350
Rage mode is just increasing the power limit a bit so the core clock stays at a higher speed. The GPU probably uses around 320W in Rage mode, which is acceptable for a small gain.
RT should gain traction since most console games will use it for reflections or shadows. The performance seems acceptable from what I've seen in PS5 games. RDNA2 should be pretty good with RT since it's the same architecture used in the consoles. Also, AMD is developing a 'super resolution' feature which will compete with DLSS, so when RT is enabled it won't impact performance as much.

A lot of interesting stuff going forward, that's for sure.

As I said, I think RT being a big part of future gaming is a given; I just don't see it being a requirement for another couple of years or so. While the new consoles support it, we're also looking at a big lack of launch titles for both the new Xbox and the PS5, as well as probably limited launches for the hardware itself. I expect Covid has set back a lot of things by at least a year, and those issues are ongoing even now to some degree. I think we'll be looking at a lot of 'updated' games and potential delays on anything somewhat new, and I do believe that a lot of intended launch titles were simply never mentioned due to setbacks.

We've got tons of interesting technology hitting the market at a horrible time that nobody could have accounted for.
 
Soldato
Joined
6 Jan 2013
Posts
21,845
Location
Rollergirl
I intend to switch out my 3080 for a 6900XT at the start of next year, if the following conditions are met.

  • It is available to buy at MSRP ($999)
  • The performance figures stack up - looking for 25% over 3080 in raster (**** RTX) @ 4k
  • I don't need to invalidate the warranty to get 25% better performance
  • I don't need to buy a new CPU to get 25% better performance
  • The card performs at full throttle and doesn't exceed 75C at room temperature
  • Drivers aren't a buggy mess
 
Associate
Joined
3 Jul 2012
Posts
425
I intend to switch out my 3080 for a 6900XT at the start of next year, if the following conditions are met.

  • It is available to buy at MSRP ($999)
  • The performance figures stack up - looking for 25% over 3080 in raster (**** RTX) @ 4k
  • I don't need to invalidate the warranty to get 25% better performance
  • I don't need to buy a new CPU to get 25% better performance
  • The card performs at full throttle and doesn't exceed 75C at room temperature
  • Drivers aren't a buggy mess

The 3090 is only ~15% faster than the 3080, and to match the 3090 AMD had SAM/Rage turned on, so where the extra 10% will come from, idk...
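Quick arithmetic on the figures quoted in this thread (the ~15% gap and AMD's SAM/Rage comparison), purely illustrative rather than benchmark data:

```python
# Rough arithmetic on the figures quoted above (not benchmark data):
# 3090 ~15% faster than a 3080, 6900XT roughly matching the 3090 only with
# SAM/Rage enabled. How far short of "+25% over a 3080" does that leave it?
rtx_3080 = 1.00
rtx_3090 = rtx_3080 * 1.15        # ~15% over the 3080
rx_6900xt_sam_rage = rtx_3090     # assumes rough parity with SAM/Rage on, per AMD's slides
target = rtx_3080 * 1.25          # the wish-list figure in the quoted post

shortfall = target / rx_6900xt_sam_rage - 1
print(f"Extra needed beyond the SAM/Rage result: {shortfall:.1%}")  # ~8.7%
```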
 
Don
Joined
20 Feb 2006
Posts
5,226
Location
Leeds
I have a 3080 on the way and a G-Sync monitor. I also have a 3900X and an X570 motherboard.

I will try to resist, but I think by early next year I will give Zen 3 and a 6800XT a go (as long as reviews back up the AMD presentation). More out of shiny shiny syndrome.

Part of me does think AMD are tying me into their ecosystem, but I bought G-Sync, so who am I to think that :p
 
Associate
Joined
3 Mar 2015
Posts
385
Location
Wokingham
As mentioned, provided the independent reviews hold up OK, I'll be looking at one of the high-end ones seen in the video presentation.

I think the first thing I might be better off doing before shelling out for a card is changing the monitor to one which supports both FreeSync (2) and G-Sync, as well as having a better panel for my needs - so probably IPS.

I do recommend the LG 27GN950-b.

Although I wish they also made a '-w' version, because my office room is small. I need everything light or white to avoid it feeling like a 'room of gloom' :) I'm long past my teenage goth years of everything-in-morbid-black, hah!
 
Soldato
Joined
25 Oct 2010
Posts
5,350
Part of me does think AMD are tying me into their ecosystem, but I bought G-Sync, so who am I to think that :p

I find it sort of interesting that there's a bit of a turnaround in that regard.

AMD is in a unique position when it comes to their new memory technology. While Intel is looking at launching their own GPUs, I don't think anyone expects them to be any good at launch (whenever/if that happens), and Nvidia simply doesn't have a comparable product in terms of CPUs.

I see it as the following:

1. Intel and Nvidia are forced to adopt some sort of open source technology similar to what AMD is doing.
2. They work together to do something similar in opposition to AMD.
3. They ignore it.

Let's face it, option 2 is never going to happen, and option 3 would be a bit awful for everyone. It's a situation where AMD has two tech superpowers on the back foot, to the point where they need to work with them rather than against them.

Interesting times.
 