
The thread which sometimes talks about RDNA2

Status
Not open for further replies.
Associate
Joined
12 Jan 2003
Posts
2,009
I'm not saying the 3900 is a good price, just that the comparison is bad; the 3090 is crap value. But I was hoping for something more, or cheaper, from the 6800XT. Therefore I'll be waiting for the 3080 TUF OC.
Huh?
The 6800XT looks like it could beat the 3080 at 1440p and match it at 4K, and is cheaper (650 vs 699 RRP), yet you are already committed to the 3080? That's without even acknowledging the extra benefits if you are upgrading to an AMD CPU, or the fact that you simply can't get the 3080 for its RRP at present.

Uphill battle for AMD regardless of performance, but I guess we knew that already.
nVidia won't drop prices if people buy their cards regardless of the value they offer.
 
Caporegime
Joined
8 Sep 2005
Posts
27,425
Location
Utopia
Ok I just did some quick and dirty figures and here are the relative price and performance figures across all tested games in the AMD slides, formatted in crappy and slightly awkward forum-o-vision: :o

-----------------------------------

6800 (Base 100%) / 6800XT / 6900XT

Price:

$579 / $649 / $999
100% / +12% / +73%


1440p avg frame all games

159.3 / 177.4 / NA
100% / +11% / NA


4K avg frame all games

92 / 104.2 / 117.4
100% / +13% / +28%


-----------------------------------

6800
6800XT: +12% price, +11% 1440p, +13% 4K
6900XT: +73% price, NA 1440p, +28% 4K

The 6800 and 6800XT are actually very close in terms of performance at both 1440p and 4k and you basically pay 12% extra for 11% performance, so a very linear step up.

The 6900XT is a far worse value proposition, giving you 28% additional performance over the 6800 for 73% more cost. Still way better than the 3090 though. :D

Seems like you can't go wrong with either the 6800 or the 6800XT and it will be interesting to see how much difference overclocking makes to the gap! :)
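To double-check the percentages above, here is a quick sketch (Python, names are my own) that recomputes the deltas from the raw slide figures, with the 6800 as the 100% baseline:

```python
# Recompute the relative price and 4K performance deltas from the
# raw figures read off AMD's slides, using the 6800 as the baseline.

prices = {"6800": 579, "6800XT": 649, "6900XT": 999}
fps_4k = {"6800": 92.0, "6800XT": 104.2, "6900XT": 117.4}

def pct_over_base(values, base="6800"):
    """Return each card's delta vs the baseline as a rounded percentage."""
    return {k: round((v / values[base] - 1) * 100) for k, v in values.items()}

print(pct_over_base(prices))   # price deltas vs the 6800
print(pct_over_base(fps_4k))   # 4K performance deltas vs the 6800
```

Running it gives +12%/+73% on price and +13%/+28% on 4K performance for the 6800XT and 6900XT respectively.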
 
Last edited:
Permabanned
Joined
24 Jul 2016
Posts
7,412
Location
South West
Currently using a 3950X and a Radeon VII. Planning to sell my 3950X and...

Think I will switch to either a 5800X or perhaps a 5900X (if I'm feeling frivolous) and a 6800XT.

6900XT is really tempting, but I'm going for the cost-effective option for slightly less performance, I think. :)
Do you not get a discount on retail prices then? Or do you not work for Amd anymore?
 
Soldato
Joined
18 Oct 2002
Posts
4,333
Those 6900XT clocks are utterly monstrous. @humbug's wild calculations, which were widely laughed at on the forum, turned out to be correct and beyond the expectations of pretty much everyone.

It makes the £1,700 2080ti cards look like one of the biggest turkey GPU releases of all time.

The biggest kick in the teeth to Nvidia is 16GB all round, they conned their own user base with that one (gimping the vram).

Well done to AMD. Game on.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
Yes, today, because we've been going off the baseline set by consoles. Gaming revolves around the PS4/Xbox at present.

Now we're making the move to PS5, you will see, just like every single console generation, a generational leap in game complexity and demand on GPU and VRAM.

"The XSX has 13.5GB of VRAM to allocate to games, and must share this amount with the CPU. Also, only 10GB of the XSX's VRAM has the faster bandwidth of 560GB/s; the other 3.5GB runs at 336GB/s. Developers will likely not allocate more than 10GB to the GPU, otherwise they run into a bandwidth penalty." ... so still more than 8GB. Let's also not ignore the optimisations and streamlined approach of consoles, nor the fact that our Windows PC also uses some of the VRAM.

Right. With both new consoles the realistic expectation is that ~6GB will be needed for the OS plus the game engine (which on a PC would typically run on the CPU and use system RAM), leaving a maximum of about 10GB for video processing.

There's a leap in complexity, but it won't be past the PC; it will be an attempt to catch up with the PC. They're leaping to GPUs that are mid-tier PC GPUs at best, and memory usage that's no better than our high end. The simple fact is that as we cram more assets into VRAM on the PC, the GPU load becomes too high to be playable; the only games we've got that exceed 8GB on the PC (measuring actual memory used, not memory allocated) tank the frame rate of even a 3090 into unplayable territory.

You're not going to cram 10GB of assets into the consoles' VRAM equivalent and have their pathetic GPUs use all of that at an acceptable frame rate. We've seen what happens when these GPUs try to play at 1440p or 4K, and it's laughable: they have to use dynamic resolution to scale back when they struggle with frame rate, and scale back quality options like shadow quality and asset complexity because the GPUs choke. DF did a review of the technical aspects of Dirt 5 which shows all of this in action: https://www.youtube.com/watch?v=CF9A935XFkU

There's loads of info on this topic in the dedicated threads asking whether 8GB and 10GB are enough.
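As a rough illustration of the bandwidth penalty mentioned in the quote above, here is a back-of-envelope sketch (my own, using only the figures quoted: 10GB at 560GB/s, 3.5GB at 336GB/s) of what the capacity-weighted average bandwidth would be if a game spread its allocation across the full 13.5GB:

```python
# XSX game-visible memory pool, per the figures quoted above:
# 10 GB runs at 560 GB/s, the remaining 3.5 GB at 336 GB/s.
fast_gb, fast_bw = 10.0, 560.0
slow_gb, slow_bw = 3.5, 336.0

# Capacity-weighted average bandwidth if the full 13.5 GB were
# accessed uniformly (real access patterns will differ).
total_gb = fast_gb + slow_gb
avg_bw = (fast_gb * fast_bw + slow_gb * slow_bw) / total_gb
print(round(avg_bw, 1))  # average bandwidth in GB/s, well under 560
```

The weighted average drops to roughly 502GB/s, which is why developers are expected to keep GPU allocations inside the fast 10GB pool.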
 
Associate
Joined
22 Sep 2020
Posts
32
I just cancelled my 3080 TUF OC order, as AMD seems to offer enough of an upgrade with the 6800XT at a reasonable price point for me, coming from a 1080 Ti.

Also, I'm not impressed with what I see as a 'botched' launch by Nvidia, which I think was a cynical attempt to artificially create scarcity and drive up prices, while also trying to get in first before AMD with a product they're going to replace sooner rather than later with Ti models in just a few months.
 
Associate
Joined
12 Jan 2003
Posts
2,009
Ok I just did some quick and dirty figures and here are the relative price and performance figures across all tested games in the AMD slides, formatted in crappy and slightly awkward forum-o-vision: :o

-----------------------------------

6800 (Base 100%) / 6800XT / 6900XT

Price:

$579 / $649 / $999
100% / +12% / +73%

1440p avg frame all games

159.3 / 177.4 / NA
100% / +11% / NA

4K avg frame all games

92 / 104.2 / 117.4
100% / +13% / +28%

-----------------------------------

6800
6800XT: +12% price, +11% 1440p, +13% 4K
6900XT: +73% price, NA 1440p, +28% 4K

The 6800 and 6800XT are actually very close in terms of performance at both 1440p and 4k and you basically pay 12% extra for 11% performance, so a very linear step up.

The 6900XT is a far worse value proposition, giving you 28% additional performance over the 6800 for 73% more cost. Still way better than the 3090 though. :D

Seems like you can't go wrong with either the 6800 or the 6800XT and it will be interesting to see how much difference overclocking makes to the gap! :)

Did you take into account that the 6800 benches use Smart Access Memory (a 4-13% uplift on the 6800XT) and the 6900XT benches use both that and Rage Mode?
 
Associate
Joined
3 Mar 2012
Posts
293
Did you take into account that the 6800 benches use Smart Access Memory (a 4-13% uplift on the 6800XT) and the 6900XT benches use both that and Rage Mode?

I was just about to post this, asking how much difference Smart Access makes. If you knock, say, 10% across the board off the 6800 slide to remove Smart Access, the 6800 would be pretty close to 2080 Ti (3070) performance, discounting those AMD cherry-picked games.

My gut is telling me the 6800 seems a bit of a pointless GPU: it's not quite a 6800XT, and its performance isn't far enough beyond a 2080 Ti to warrant the extra money. It's like you should either get a 3070 or a 6800XT. But then of course with a 3070 you have the issue of the 8GB of VRAM.

I just wish there was a 16GB 3070 coming tomorrow to make my mind up.
 
Associate
Joined
27 Apr 2004
Posts
475
Was talking about the mid-range mate :p

460 was £150-£180 ish.
970 was £250 ish.
1070 was £400 ish iirc.
2070 was £450 ish.
5700XT is about £400 ish but was more expensive at launch.
3070 is going to be about £500 when you can find one in stock :p
6700XT will be close to £500 I'd wager.

The UK prices can't really be considered though; Brexit ruined our exchange rate, and prior to that the financial crash did a lot of damage too.

Look at the US launch prices for the 70 series from nVidia.

2012 GTX 670 Launch Price $400 / Adjusted $453
2013 GTX 770 Launch Price $400 / Adjusted $446
2014 GTX 970 Launch Price $329 / Adjusted $361
2016 GTX 1070 Launch Price $379 / Adjusted $411
2018 RTX 2070 Launch Price $500 / Adjusted $518
2020 RTX 3070 Launch Price $500 / Adjusted $500

The 970 was the outlier and one of the all-time great cards. The RTX 2070 was a piece of sh!t.

Prices have crept up a bit but people are making out prices have doubled when they haven't!
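The "haven't doubled" claim checks out against the adjusted figures listed above. A quick sketch (my own, using only the post's 2020-dollar numbers):

```python
# Inflation-adjusted launch prices for nVidia's x70 cards,
# in 2020 dollars, as listed in the post above.
adjusted = {
    "GTX 670": 453, "GTX 770": 446, "GTX 970": 361,
    "GTX 1070": 411, "RTX 2070": 518, "RTX 3070": 500,
}

# Change across the whole series, first card to latest.
first, last = adjusted["GTX 670"], adjusted["RTX 3070"]
print(round((last / first - 1) * 100))  # % change, 670 -> 3070

# Largest spread between any two cards in the series.
spread = max(adjusted.values()) / min(adjusted.values())
print(round((spread - 1) * 100))        # worst-case % gap
```

In real terms the 670-to-3070 change is about +10%, and even the cheapest-to-priciest gap (970 to 2070) is around +43%: creep, not doubling.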
 
Caporegime
Joined
12 Jul 2007
Posts
40,737
Location
United Kingdom
Those 6900XT clocks are utterly monstrous. @humbug's wild calculations, which were widely laughed at on the forum, turned out to be correct and beyond the expectations of pretty much everyone.

It makes the £1,700 2080ti cards look like one of the biggest turkey GPU releases of all time.

The biggest kick in the teeth to Nvidia is 16GB all round, they conned their own user base with that one (gimping the vram).

Well done to AMD. Game on.
Humbug deserves a lot of credit and I'm expecting a few grovelling apologies (who am I kidding? :D) from a few users. All hail the bug!

(note to Bug, don't forget to update your CBR20 thread :p)
 
Associate
Joined
10 Jan 2006
Posts
1,791
Location
Scotland
When current-gen games that run on 8GB consoles already push towards 10GB of VRAM on PC, what do you think will happen with next-gen games running on 16GB consoles? It's really simple logic...

My original point was that the new consoles DON'T have 16GB of VRAM, nothing more; nor did I imply that 8GB was plenty, or anything of the sort. I was merely pointing out the blatantly obvious fact that consoles don't have 16GB of VRAM, as was being asserted on here. We are getting into an entirely different area, and one which I never even commented on if you look at the thread of my messages.
 
Permabanned
Joined
12 Sep 2013
Posts
9,221
Location
Knowhere
Based on the graphs I think an RX 6800 will do great with my 75Hz 3440x1440 FreeSync panel, and AIB models might offer slightly higher framerates.

I got a feeling rage mode will cause some massive rage driver problems.:eek:

Me too, I haven't been impressed by AMD's recent attempts at auto overclocking software so I'm not holding my breath on that one.
 

TNA
Caporegime
Joined
13 Mar 2008
Posts
27,933
Location
Greater London
Prices have crept up a bit but people are making out prices have doubled when they haven't!
It is because they cannot get their heads around the exchange rate being different and maybe some other bias I guess.

That is why I always look at the dollar value, cuts out all the rubbish.
 