
The thread which sometimes talks about RDNA2

Status
Not open for further replies.
Associate
Joined
16 Aug 2017
Posts
1,049
The high clocks look really impressive, but we've seen it all before - people were already hitting 3000MHz on Turing cards last year. The performance gain is what matters anyway, and the high clocks don't translate into that much more performance for whatever reason: comparing a 6800 XT at 2.2GHz with one at 2.8GHz, the latter is just 10% faster in games despite having ~27% higher clocks.

3GHz on Turing? Exotic cooling perhaps, but definitely not on air. Here people are already hitting the limits hard-set by AMD (2.8GHz), and likely soon 3GHz. On air. And the only limit so far is that BIOS hard-limit, not the chip's actual capabilities.
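The clock-vs-performance figures quoted above can be sanity-checked with a quick scaling calculation (editor's sketch using the post's own numbers, not a benchmark):

```python
# Rough scaling-efficiency check for the claim above: a 2.2GHz -> 2.8GHz
# overclock is ~27% more clock, but reportedly only ~10% more performance.
base_clock, oc_clock = 2.2, 2.8        # GHz (figures from the post)
perf_gain = 0.10                       # observed fps gain (the post's claim)

clock_gain = oc_clock / base_clock - 1  # fractional clock increase
efficiency = perf_gain / clock_gain     # share of the clock gain that shows up as fps

print(f"clock gain: {clock_gain:.1%}, scaling efficiency: {efficiency:.0%}")
# -> clock gain: 27.3%, scaling efficiency: 37%
```

A scaling efficiency well under 100% like this usually points to a bottleneck elsewhere (memory bandwidth, power limit, or CPU) rather than the core clock.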
 
Soldato
Joined
6 Feb 2019
Posts
17,705
3GHz on Turing? Exotic cooling perhaps, but definitely not on air. Here people are already hitting the limits hard-set by AMD (2.8GHz), and likely soon 3GHz. On air. And the only limit so far is that BIOS hard-limit, not the chip's actual capabilities.

I wasn't referring to cooling, just that clocks in isolation don't mean much. Technically one of AMD's older Bulldozer CPUs was pushed past 8GHz under LN2, but it's still slower than a stock 5600X.

Edit: and Intel's Xe HPG architecture GPUs are supposedly pushing close to 50 TFLOPS at 1.2GHz.
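Headline TFLOPS claims like the one above are easy to sanity-check: peak FP32 throughput is 2 FLOPs (one FMA) per shader per cycle times shader count times clock. The sketch below is the editor's illustration using the 6800 XT's published specs (4608 shaders, ~2.25GHz boost), not a figure from the thread:

```python
# Sanity-check formula for headline TFLOPS numbers:
# peak FP32 = 2 FLOPs (FMA) x shader count x clock.
def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS assuming one FMA (2 FLOPs) per shader per cycle."""
    return 2 * shaders * clock_ghz / 1000

# Illustration: a 6800 XT (4608 shaders) at its ~2.25GHz boost clock.
print(fp32_tflops(4608, 2.25))  # -> 20.736
```

By the same formula, hitting 50 TFLOPS at only 1.2GHz would need over 20,000 shaders, which shows why such claims deserve scrutiny.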
 
Soldato
Joined
4 Feb 2006
Posts
3,219
3GHz on Turing? Exotic cooling perhaps, but definitely not on air. Here people are already hitting the limits hard-set by AMD (2.8GHz), and likely soon 3GHz. On air. And the only limit so far is that BIOS hard-limit, not the chip's actual capabilities.

Grim5 is the most extreme example of fanboyism I've seen on here. He is so insecure that he's constantly in the AMD thread trying to promote Nvidia rather than using his Nvidia card, if he even has one.
 
Last edited:
Soldato
Joined
19 Apr 2012
Posts
5,206
If you can handle some tweaking, you will probably find the 6800 can be pushed to match a 6800 XT, or squeezed enough to get almost as good as one.

Yeah, I can do tweaking. I guess by the time they are all readily available we will have loads of users running the cards and finding the best settings and tweaks.
 
Caporegime
Joined
17 Mar 2012
Posts
48,006
Location
ARC-L1, Stanton System
Looks like AMD still has major driver overhead issues even with RDNA2 in DX11 titles - you can even see the 6800 XT beaten by 50% by the old 2080 Ti in some DX11 games.

https://translate.googleusercontent...rowych&usg=ALkJrhhfVEUjVX2Cf6UU_CE1c3jpOjGbWA

It's a bit of a stretch to take two DX12 games, one of which is also Vulkan, run them in DX11 mode to find bad performance, and from that conclude "AMD has DX11 driver overhead issues".

I bet there are a hundred other native DX11 games that don't have any "DX11 driver overhead issues". It seems to me they were simply looking for proof of something they wished to make a clickbait article out of. It's complete BS, and it's obvious.
 
Last edited:
Soldato
Joined
4 Feb 2006
Posts
3,219
It's a bit of a stretch to take two DX12 games, one of which is also Vulkan, run them in DX11 mode to find bad performance, and from that conclude "AMD has DX11 driver overhead issues".

I bet there are a hundred other native DX11 games that don't have any "DX11 driver overhead issues". It seems to me they were simply looking for proof of something they wished to make a clickbait article out of. It's complete BS, and it's obvious.

Don't feed the troll mate.
 
Soldato
Joined
18 Feb 2015
Posts
6,487
Unfortunately the performance issues in KCD are very real. For whatever reason it seems worse on RDNA2 than on Vega. Mind you, it's down to the game's DX11 implementation, because it's not perfect on Nvidia either, but for me this is a big disappointment with the new card because I consider it one of the best games of all time and re-play it every year.
 
Caporegime
Joined
17 Mar 2012
Posts
48,006
Location
ARC-L1, Stanton System
Unfortunately the performance issues in KCD are very real. For whatever reason it seems worse on RDNA2 than on Vega. Mind you, it's down to the game's DX11 implementation, because it's not perfect on Nvidia either, but for me this is a big disappointment with the new card because I consider it one of the best games of all time and re-play it every year.

It's CryEngine, and from a time when Crytek couldn't even afford to pay their staff; that engine has some serious flaws.

Far Cry New Dawn (Dunia) is a butchered branch of an older CryEngine revision.
 
Last edited:
Man of Honour
Joined
13 Oct 2006
Posts
91,475
You know I've been a member of all sorts of forums covering a vast range of different topics and I think you are possibly the saddest, most insecure 'member' of a forum I've ever come across. It's staggering in its patheticness.

I dunno, I think there is some competition for that spot.
 
Caporegime
Joined
17 Mar 2012
Posts
48,006
Location
ARC-L1, Stanton System
Ubisoft have been known for decades now to ship very broken games and then ignore it. It was a running joke as far back as my Silent Hunter III days - run that today on a high-end system and it still runs like absolute crap.

They are incompetent, and they never care. Good artistry and storytelling is where their budget goes; even one remotely competent coder/programmer? No... "The cinematic experience is the best experience" - they actually used that excuse a few years ago to account for the horrendous performance of one of their games at the time.

For driver developers they are a nightmare: thousands of hours trying to work around their buggy code. I'm sure AMD will get around to it eventually, but it's not really worth spending too much time and money fixing Ubisoft's crap for them after they baked all the bugs in.
 
Man of Honour
Joined
21 May 2012
Posts
31,940
Location
Dalek flagship
Because they are preparing the XTX and they don't want the 6900 XT to go too far. :)

In use the 6900 XT won't get anywhere near 3.0GHz, the same as the 6800 cards don't get anywhere near 2.8GHz.

I think even extreme overclockers using LN2 will struggle to get close to 3.0GHz.

Having said that, Nvidia and 8nm Ampere are in serious trouble even if the 6900 XT is just a fraction faster than the 6800 XT on clock speeds; even if end users manage just 2.6GHz or 2.7GHz, it is going to be a major headache for the green team.
 
Caporegime
Joined
17 Mar 2012
Posts
48,006
Location
ARC-L1, Stanton System
In use the 6900 XT won't get anywhere near 3.0GHz, the same as the 6800 cards don't get anywhere near 2.8GHz.

I think even extreme overclockers using LN2 will struggle to get close to 3.0GHz.

Having said that, Nvidia and 8nm Ampere are in serious trouble even if the 6900 XT is just a fraction faster than the 6800 XT on clock speeds; even if end users manage just 2.6GHz or 2.7GHz, it is going to be a major headache for the green team.

So far the AIB ones are reaching just over 2700MHz:

HUB has a Sapphire one that runs at 2660MHz.
Der8auer got his PowerColor Red Devil to just over 2700MHz.
JayzTwoCents has an XFX one that also ran at just over 2700MHz.

These are consistent and stable.

The problem getting past that is twofold: the 2800MHz limit and the +15% power limit. If the 6900 XT gets a higher power limit and a 3GHz cap, I think they would get 2800MHz+, and some might make it to 2900MHz. The 6900 XTs are also going to be the best-binned chips, like the Zen 2 3950X was and the Zen 3 5950X is.
 