
The thread which sometimes talks about RDNA2

It would have lost out even harder without the cache though. GDDR6X wasn't available to them, so they had to come up with something to deal with regular GDDR6's lower bandwidth.

A lot of people seem to be grabbing the wrong end of the stick though by saying that the 6800 XT scales poorly at 4K. It doesn't. If you compare it to the 5700 XT and Turing, it sees the largest gains over those cards at 4K. Using TechPowerUp's numbers as an example, a 2080 Ti offers 84% of a 6800 XT's performance at 1440p, but only 80% at 4K, despite its wider memory bus and higher memory bandwidth. It's not that the 6800 XT is scaling poorly, it's just that Ampere is plain faster. In terms of scaling, it's more accurate to say that Ampere scales poorly below 4K (something we already knew), which is why AMD have been able to take the lead at lower resolutions. Whatever bottlenecks exist in Ampere's compute-focused architecture prevent it from achieving its full potential at lower resolutions, else it'd likely be just plain faster across the board.
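
To put rough numbers on that, here's a quick back-of-envelope in Python using only the two TechPowerUp relative-performance figures quoted above (the little helper function is mine, just for illustration):

```python
# Back-of-envelope for the scaling point above, using the two quoted
# TechPowerUp figures: the 2080 Ti at 84% of a 6800 XT at 1440p and 80% at 4K.

def lead_over(other_card_relative: float) -> float:
    """6800 XT's lead when the other card sits at this fraction of its performance."""
    return 1.0 / other_card_relative - 1.0

lead_1440p = lead_over(0.84)   # ~0.19 -> roughly 19% ahead of the 2080 Ti at 1440p
lead_4k    = lead_over(0.80)   # ~0.25 -> roughly 25% ahead at 4K

print(f"6800 XT lead over 2080 Ti at 1440p: {lead_1440p:.1%}")
print(f"6800 XT lead over 2080 Ti at 4K:    {lead_4k:.1%}")
# The lead grows at 4K, i.e. the 6800 XT scales *better* than Turing with
# resolution; it only looks like poor 4K scaling when compared to Ampere.
```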

That's a good spot. Nice one.

In the interview he said there is something else they will do with Infinity Fabric. What that is hasn't been revealed.

On a Blender forum, a developer from AMD mentioned that at some point there will be content/news surrounding the RDNA 2 cards aimed at content creation. I wonder if any advantages it brings will be mainly for that.


That is seriously impressive. I wonder what FPS boost we will see from these AIB cards.
 

Blimey, the performance jump between bottom and top is pretty awesome.

a) The 6800 reference power limit maxes out at +15%; if you write a BIOS value above +15% to the card, the AMD driver detects it and downclocks the card :D (rough numbers on what that cap means in watts are sketched below)
b) At 2400MHz the 6800 packs a punch
c) The intentional stock memory clock gimpage can be overridden at the BIOS level.
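
For anyone wondering what that +15% cap works out to in watts, a tiny sketch (the 250W reference TBP is my assumption for illustration; check your own card's spec):

```python
# What a +15% power-limit ceiling works out to in watts.
# The 250 W reference TBP is an assumed figure for illustration only.

REFERENCE_TBP_W = 250      # assumed reference total board power for a 6800
DRIVER_CAP = 0.15          # the +15% ceiling mentioned in point (a)

max_board_power = REFERENCE_TBP_W * (1 + DRIVER_CAP)
print(f"Max board power the driver will accept: {max_board_power:.0f} W")  # 288 W with these numbers

# Per point (a): write a BIOS power-limit value beyond this cap and the AMD
# driver reportedly detects it and downclocks the card.
```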

Interesting that AMD did not driver-check the clock speeds but did check the power limit, probably to keep the AIBs sweet this time around.

It will be really interesting to see what the AIB power limits provide percentage-wise. Caveat emptor if you are on liquid cooling, peeps; feeling a bit happier I did not go reference this time around.
 
That is some strong denial you got going there. Is this how you cope? What does that have to do with the 6800 xt setting a world record on air? You are all over the place lol.


It is only defending if you can actually refute the evidence set before you. Since you have shown no capacity to engage with why you feel it's not true, it's only you defending Nvidia as usual. :D


Well, it does appear that you won't get any good discussion from some of the more rabid defenders. They will simply call you a fanboy, as they have no means to admit that games are not coded neutrally.


It is truly staggering, especially when they cannot refute the truth and resort to name calling.



And that is the secret to the lies they post. But don't worry, all they can do is call you names because you exposed the truth.

It is amazing what depths they will go to: lying, hiding the truth, feigning ignorance, employing deception and name calling in order to protect Nvidia.

And what makes it so profound is that some of them who have been here for years know no more about the ins and outs of PC gaming than someone who just joined the forum from console. And it does appear they are not capable of providing any legitimate feedback beyond name calling, lying, deception, etc.
:D

The 6800 XT is massively slower in Port Royal as well. Port Royal is just a DXR benchmark, and it mirrors Control for performance. The RTX 3080 has better RT performance and memory bandwidth, which helps at higher resolutions like 4K. The 6800 XT is a better 1080p/1440p card for high-fps rasterization games, with DXR performance just a little better than the 2080 Ti at stock; it gets a little more from Rage Mode and overclocking, but not enough to get near the 3080.

If the game is more rasterization than RT, the 6800 XT can win, but as soon as the game fully uses RT the 3080 will dominate. The RT cores are just far better on Ampere; there is no escaping the performance lead here, as it's too large.
 
Why would it not beat the 3080 at 4K too? It has higher frequencies than the 3080. If it's not the bandwidth then there is no reason for it not to be better than the 3080. But then again, if it's the bandwidth, why do you need 16GB? You will decrease settings long before you fill 12GB of VRAM, because the fps will already be too low.

AMD FidelityFX LPM in Godfall 'could' be the reason the 6800/6800 XT needs 12GB of VRAM for the highest settings in that game. https://forums.guru3d.com/threads/godfall.432745/page-2

https://www.dsogaming.com/pc-performance-analyses/godfall-pc-performance-analysis/

Graphics-wise, Godfall looks gorgeous. While this game does not take advantage of Ray Tracing, it can push amazing visuals. There are a lot of reflective surfaces, a lot of particle effects, and some truly amazing lighting effects. Furthermore, Counterplay Games has used a lot of high-resolution textures, making everything look crisp and sharp. Speaking of textures, the game did not require more than 8GB of VRAM at 4K/Epic settings. So yeah, despite Counterplay’s claims, Godfall can run smoothly on GPUs that have 10GB of VRAM.

Godfall requires only 6GB-8GB VRAM at 4K maxed out, and not 12GB like claimed by AMD
https://www.reddit.com/r/nvidia/comments/jspy61/godfall_requires_only_6gb8gb_vram_at_4k_maxed_out/


Godfall 4K Epic settings (maxed out)


6GB-8GB allocation, and only 5.5GB-6.5GB actual usage.

Sooooooo, 12GB my ass lmaaao

I am definitely buying the 3080 10GB now; it's not gonna be an issue for a while :p (1440p/144Hz), unless AMD can come out with a DLSS of their own and the benchmarks are great and their drivers are awesome...

This is a next-gen game... so we can be sure that for 90% of games 8GB won't cause huge issues. 10GB is enough for anything you throw at it, though.

EDIT - I would like to acknowledge that my title is a bit misleading; it turns out that it was actually the developers who said the game uses 12GB, not AMD.

Also, the game does allocate (I'm not sure how much it actually uses here) 11GB-12GB of VRAM when you turn on FidelityFX LPM (which provides an open source library to easily integrate HDR and wide-gamut tone and gamut mapping into your game), basically an HDR implementation.
If you value this setting, 10GB is at the edge for this game, though I think it should still run, as the usage should be right around 10GB.
 
LOL, 2.6GHz on air - that is totally bonkers @humbug

For reference, my liquid-cooled 5700 XT with a binned 9900KF at 5.3GHz gives a graphics score 61% lower than this 6800 XT bench run on air.

61% slower.

That score totally destroys anything posted in our own Firestrike thread; the highest posted score there is 41,787!
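
If anyone wants to sanity-check what "61% slower" means, here's the trivial maths with placeholder scores (the only real number quoted above is the 41,787 thread high score):

```python
# How a "61% slower" figure is computed. The two scores below are placeholders
# for illustration, not the actual Firestrike graphics scores.

def percent_slower(score: float, reference: float) -> float:
    """Fraction by which `score` falls short of `reference`."""
    return 1.0 - score / reference

example_6800xt_gfx = 50_000   # hypothetical 6800 XT graphics score
example_5700xt_gfx = 19_500   # hypothetical 5700 XT graphics score

print(f"{percent_slower(example_5700xt_gfx, example_6800xt_gfx):.0%} slower")  # 61% with these numbers
```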
Yea but that is AMD biased and anything AMD wins in is AMD biased. Clearly obvious and anything NVidia wins in is GPU agnostic and stuff :D

@TNA
 
The 6800 XT is massively slower in Port Royal as well. Port Royal is just a DXR benchmark, and it mirrors Control for performance. The RTX 3080 has better RT performance and memory bandwidth, which helps at higher resolutions like 4K. The 6800 XT is a better 1080p/1440p card for high-fps rasterization games, with DXR performance just a little better than the 2080 Ti at stock; it gets a little more from Rage Mode and overclocking, but not enough to get near the 3080.

If the game is more rasterization than RT, the 6800 XT can win, but as soon as the game fully uses RT the 3080 will dominate. The RT cores are just far better on Ampere; there is no escaping the performance lead here, as it's too large.

Turn on ray tracing in Cyberpunk = 1080p with low quality settings on big Navi, and 4K Ultra on big Ampere... ouch

Inb4 comments that CD Projekt Red is an Nvidia shill


https://www.tomshardware.com/news/c...equirements-official-plus-our-recommendations
 
Godfall requires only 6GB-8GB VRAM at 4K maxed out, and not 12GB like claimed by AMD

Had a quick play last night with the new patch and hit over 11GB no problem; I should imagine it'd hit 12GB with a longer session. Looks absolutely fantastic too with all the bells and whistles on :cool:

(RT gave me about a 30fps hit on Ultrawide)
 
AMD DE cancelled my 6800XT order and refunded the Paypal transaction. No message, no excuse. I had my order confirmed 2 minutes after it went live. Cards were up for another 2-3 minutes afterwards.

I emailed asking for an explanation and they said something about not being able to confirm my details, and that I should make sure they're correct then pay again ...

LOL. Everything was confirmed and paid via PP. Also there's no option to amend orders or 're-pay'.

Called them out... they changed tack and said the money was only a pre-authorisation that would lapse in a few days, that no money had been taken, and that the order was cancelled because they couldn't get the payment to go through. Again, a lie. The money was taken, and they had already issued me a REFUND.

I know one other guy who had the same thing happen, with different ******** excuses, and his order was paid & confirmed 70 seconds after going live. Another guy had his order shipped yesterday... and his was paid & confirmed 6 minutes after going live.

The system obviously broke down and took too many orders. And then, unforgivably, they didn't allocate them in order .. they appear to have done it randomly.

Really makes me want to not buy any AMD products again for a long time.

P.S. Retailers I've spoken to in NL are expecting first stock of 6800/XT in mid January ... LOL.
 
Had a quick play last night with the new patch and hit over 11GB no problem; I should imagine it'd hit 12GB with a longer session. Looks absolutely fantastic too with all the bells and whistles on :cool:

(RT gave me about a 30fps hit on Ultrawide)

There is a difference between what the game needs and what it allocates. Without FidelityFX LPM the VRAM usage is just like every other normal game at 4K. The YouTube video shows that you don't need 12GB of VRAM for Epic settings. You 'may' get away with 10GB if you use AMD FidelityFX LPM.
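
If anyone wants to watch that allocation figure for themselves on the green side, here's a minimal sketch using the pynvml bindings. Note this reports VRAM the driver has allocated across the whole GPU, which is still allocation rather than what a game actively needs frame-to-frame; I believe the Radeon Software overlay shows the equivalent number on AMD cards.

```python
# Minimal sketch (assumes an Nvidia card and the pynvml / nvidia-ml-py package):
# prints how much VRAM the driver currently has allocated on the first GPU.
# This is still *allocation* across the whole GPU, not a game's true working set,
# which is exactly the distinction made above.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)      # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)       # .total / .used / .free in bytes

gib = 1024 ** 3
print(f"VRAM allocated: {mem.used / gib:.1f} GiB of {mem.total / gib:.1f} GiB")

pynvml.nvmlShutdown()
```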

Turn on ray tracing in Cyberpunk = 1080p with low quality settings on big Navi, and 4K Ultra on big Ampere... ouch

Inb4 comments that CD Projekt Red is an Nvidia shill


https://www.tomshardware.com/news/c...equirements-official-plus-our-recommendations

Cyberpunk 2077 looks like it has the same RT features as Control. It will be hard going if your card's RT performance is not the best, like my 2060, but at least I can use DLSS 2.x to get it playable. In Control with DLSS (720p to 4K) I can get 4K playable with all the RT features on, while I wait for a 3080 card.
 
That Red Devil 6800 XT Firestrike score was run with the voltage slider maxed at 1.15V and the power limit at the maximum +15% inside the AMD Radeon tuning software.

When he plays a game, the in-game boost clock sits at 2390-2400MHz.

 
I don't understand why Nvidia fanboys think that Radeon owners can't play at 1440p while they wait for AMD to come up with their own version of fake 4K resolution.
I don't always play at 4K, but when I do, I like to use fake 4K resolution. It looks so much better than real 4K. :)
 