The thread which sometimes talks about RDNA2

I also think the theoretical 3080 Ti for £999, far more palatable though it would be than the disgraceful alternative, is still a waste when compared against the 6800XT for £650... it seems like it's still one for people who just really want an Nvidia card that competes in performance and VRAM with AMD's latest and greatest. We also don't know how far away RDNA3 and Hopper are at the moment, and it is pure speculation as to whether we will actually see them in 2021 or not. They may decide to delay those launches due to the very late availability of their 2020 launches, as otherwise they are going to **** off a LOT of people who are not willing to upgrade so soon after buying an expensive card. But who knows, maybe at this point they simply don't give a **** and will do it regardless.
The 6800XT is definitely the most attractive high-end card from a price/performance perspective. I would have got one if I did not have a G-Sync monitor.

There is also the consideration that Nvidia and AMD have spent a lot of money on R&D for the current archs. I am sure they would have costed for a two-year lifespan for these releases. So ultimately releasing a new model earlier will be eating into their bottom line, as they will need to accelerate new GPUs.
 
Rumour back in the day had a 5nm Navi23, but nothing has surfaced regarding it as of yet and it remains an unconfirmed rumour. Something sneaky about that, because that's how the other AMD mega thread started, if you read the title.

I've said it once and will say it again. Although I'm taking a guess, for me the writing is on the wall: if you want seamless PC gaming in the not-so-distant future, AMD's PC ecosystem is the way to go: Zen 3, RDNA/RDNA 2, 3600MHz RAM.
You know what, I'd forgotten about Navi23... yeah, all quiet on that one... that will be interesting for 2021...

As for the ecosystem, fully agree; you can see what they're trying to do, and good on them if it works... Nvidia needs that CPU hahaha
 
Navi23 is still the Nvidia killer. That hasn't changed. It's not Big Navi, and vice versa.

But yeah, AMD should have put together this ecosystem a long time ago with Zen 2 and RDNA 1. Why they didn't market it better is beyond me.
 
Well, more (quite reliable) leakers have also stated the same: https://twitter.com/kopite7kimi/status/1327125321422868480

Correct. I cannot vouch for it yet, as I would have to see it with my own eyes, but the regular 5700XT can hash at 50MH/s, so that is the best comparison point. However, it looks like the 3080 does hash at 100MH/s, which is a good number; I would have to see at what power consumption, though.

It sounds like the story was concocted out of fantasy so that people would make a bigger rush for the AMD stock, likely in the hope that it would leave the 3080s in less demand. Or just internet skids **** stirring in general.
 
If those 6800XTs hit, say, 70MH/s at the rated power draw, they will be useless for miners; even undervolted they will be a bad return.
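
For a sense of why 70MH/s at rated draw would be a bad return, here's a quick MH/s-per-watt sketch in Python (the 70MH/s is the hypothetical figure above; the wattages are the official total board powers, before any undervolting):

# Hash-rate efficiency at stock board power. The 6800XT entry is the
# hypothetical 70MH/s from the post above, not a measured number.
cards = {
    "5700XT": (50, 225),   # (MH/s, rated board power in watts)
    "6800XT": (70, 300),   # hypothetical
    "3080":   (100, 320),
}
for name, (mhs, watts) in cards.items():
    print(f"{name}: {mhs / watts:.3f} MH/s per watt")

At stock it would barely edge out a 5700XT per watt and sit well behind a 3080.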
 
I don't do any mining... just happened to look at the ethash algorithm when the news broke out,
and I seem to have a good idea how that rumour originated.
AMD had advertised 1664GB/s effective memory bandwidth for Big Navi, but it was under the assumption that the Infinity Cache hit probability would stabilise at 58%; see the fineprint below:
Measurement calculated by AMD engineering, on a Radeon RX 6000 series card with 128 MB AMD Infinity Cache and 256-bit GDDR6. Measuring 4k gaming average AMD Infinity Cache hit rates of 58% across top gaming titles, multiplied by theoretical peak bandwidth from the 16 64B AMD Infinity Fabric channels connecting the Cache to the Graphics Engine at boost frequency of up to 1.94 GHz. RX-547

So someone just took 10% of that (which is a rough rule of thumb for MH/s throughput) and blurted out a hash rate of 1.5x a 3090.
However, the ethash algorithm randomly reads a 128-byte page from a 3.5GB DAG (which has now grown to 3.95GB), so the hit rate falls to about 3.7%,
and the effective memory bandwidth for mining is now just ~580GB/s... it's probably even lower now as the DAG has increased.
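
If it helps, here's that arithmetic as a quick Python sketch. The additive model (cache bandwidth x hit rate + the full 512GB/s of 256-bit GDDR6 at 16Gbps) is my assumption, but it reproduces AMD's advertised 1664GB/s at a 58% hit rate, and the ~3.7% mining hit rate falls straight out of 128MB of cache over a 3.5GB DAG:

# Effective-bandwidth model assumed here: Infinity Cache bandwidth scaled by
# the hit rate, plus the full GDDR6 bandwidth. Figures from AMD's fineprint.
CHANNELS = 16            # Infinity Fabric channels between cache and engine
BYTES_PER_CHANNEL = 64   # 64B per channel per cycle
BOOST_GHZ = 1.94         # boost frequency from the fineprint
GDDR6_GBPS = 512         # 256-bit GDDR6 at 16Gbps (assumed)

cache_bw = CHANNELS * BYTES_PER_CHANNEL * BOOST_GHZ   # ~1987 GB/s peak

def effective_bw(hit_rate):
    return cache_bw * hit_rate + GDDR6_GBPS

print(effective_bw(0.58))             # ~1664 GB/s, AMD's 4K gaming figure
mining_hit = 0.128 / 3.5              # 128MB cache / 3.5GB DAG ~= 3.7%
print(effective_bw(mining_hit))       # ~585 GB/s, in line with the ~580GB/s above
print(0.1 * effective_bw(mining_hit)) # ~58 MH/s via the rough 10% rule of thumb

So on this model, the rumoured 1.5x-a-3090 hash rate only works if you (wrongly) feed the 58% gaming hit rate into the rule of thumb.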

Also on topic, I am a bit concerned about Big Navi's transistor budget. With 7-8 billion of its 26.8 billion transistors sunk into the Infinity Cache, that workload dependency presents a kind of risk that might unravel in a less than pleasant way... someone with a better understanding of specific gaming workloads might be able to work out a few interesting scenarios.
 
Great analysis of the GPU in Xbox Series X towards the end.

tl;dr - In Legion with RT, the GPU ranges from slightly worse than to about the same as a 2060 Super.

 
Seems like I was right. Sadly, RT performance doesn't seem so great. Xbox Series X is running RT at lower quality than PC's "lowest" and still underperforming. Raster: 2080-like; RT: lower than a 2060 Super.

Excerpt from the Digital Foundry Watch Dogs video linked earlier:

I came away from this testing with several conclusions. We may well be seeing a different level of scaling from AMD's console GPUs with RT enabled. After all, in the non-RT Gears 5, an RTX 2080 is said to be comparable to Series X, yet here in Watch Dogs: Legion with RT features active, an RTX 2060 Super seems to be more performant. This means that consoles may require reduced resolution, distance, or material settings compared to mid-range PC graphics hardware. As for Ubisoft taking all of the console compromises and porting them back to PC, I'd say that this may well be a worthwhile option depending on how AMD RDNA 2 cards run the game. With Nvidia tech, there's no real need to reduce RT settings lower than medium - turn on DLSS and you get performance back that mostly covers the hit RT incurs. However, with that said, I found the test highly enlightening and I would love for PC versions to feature console configurations as an option - it would be great for our analyses, but more importantly it would greatly benefit users who just want an easy console-like experience without having to think about graphical settings too deeply. After all, consoles typically deliver the best bang for the buck and those optimisations do tend to transfer across nicely to PC as well.
 
Our money is worth less to them than new-customer money though... because new-customer money can be turned into loyal-customer money from this sale... whereas we're already stuck as loyal customers :p

Yup, it's why big companies always have the best offers for new customers whilst spitting on existing ones :D
 
Even Ampere doesn't provide adequate RT performance without DLSS, so I can handle AMD having less RT and better rasterisation performance, because that's what matters more to me. I think in the future RT will be amazing, but for this generation the hardware performance is not sufficient for anything more than mild RT effects anyway.

PS: The 6800XT and 6900XT are still significantly more powerful than the console GPUs.
 
Wow I missed a lot in my screen caps.

I did not expect it to be that bad: the mighty RDNA2 52CU 1.8GHz monster Series X loses to a weak RTX 2060.

Where I was right is that the console's RT settings are lower than the lowest Nvidia allows on PC.

I fear for the desktop cards' RT performance; AMD needs DirectML like yesterday already.

And I'm not saying it's all doom and gloom. The desktop cards will be better than the console, no doubt, but on paper specs the 6900XT won't even be double the Series X performance, and even if it were double, that would still only put it on par with a 2080 Ti.
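
To put numbers on the paper-specs point, here's a rough FP32 sketch in Python (RDNA2 does two FP32 ops per shader per clock; Series X runs 52CUs at a fixed 1.825GHz, while the 6900XT is 80CUs at up to 2.25GHz boost):

# Paper FP32 throughput: CUs * 64 shaders * 2 ops/clock * GHz / 1000 = TFLOPS.
def tflops(cus, ghz):
    return cus * 64 * 2 * ghz / 1000

series_x = tflops(52, 1.825)   # ~12.1 TFLOPS
rx6900xt = tflops(80, 2.25)    # ~23.0 TFLOPS

print(f"6900XT is {rx6900xt / series_x:.2f}x Series X on paper")  # ~1.9x

So roughly 1.9x on paper, which is indeed short of double.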
 
We've been through this already
https://www.overclockers.co.uk/forums/posts/34162719

I also have the game on PC and haven't received an update for those expensive features yet. And there is no mention of it in the video description. And from the look of it, it doesn't appear to be using the updated version on console either.

Doesn't matter, Microsoft keeps lying - next up in a few months from Phil Spencer: Halo 6 is 4K 60fps! After launch: reviews show it's actually 720p with ray tracing on.
 
If it's that bad, then maybe RT won't be as big a thing this generation as some people have said, which could be a good thing for these new AMD GPUs.
 