
The thread which sometimes talks about RDNA2

People seem to be ignoring the fact that the AMD GPUs all have 16GB of VRAM and are capable of SAM if coupled with a Zen 3 CPU. That is a significant bonus as more games start to use higher-res textures. SAM will allow devs to use some of the spare 16GB as a cache for CPU operations, which should greatly increase performance beyond what AMD has shown so far. The console devs will make use of the feature for sure.

Some will claim that the RT and DLSS features are better, but we now know AMD is preparing its own competitor to DLSS, which promises to work with a wider range of games. RT will also be at least as good as on the consoles, if not better than on the RTX 3000 cards, and that is good enough for me. I've seen RT in some of the console games and it's great.

Here's an analysis of RT on PS5. Especially watch the Pragmata bit starting at 10:37

After today I'm not sure why people would not be cancelling their 3080 orders to be honest :confused:.

That footage has bad shimmering; they need to replace that RT with some proper AA. Bring back SGSSAA.

The edges are horrific. I hope RT can be toggled.

Disappointed in DF (and other review outlets): loads of analysis on the Series X and PS5, but nothing on the Series S. No review units for the S either.
 
Some of the posts in here today were cringeworthy though. People posting things in all caps with exclamation marks, like:

RIP NVIDIA!!!!

BOOM!!!!

NVIDIA ARE DEAD!!!!

Ridiculous. The main two GPUs perform pretty much the same for the same price. It's hardly going to kill Nvidia off, ffs.
 
Ok so who was closest on pricing?
I appear to have called the 6800XT dead on. I did actually think $579 for the 6800, but that seemed too close to the 6800XT so I dropped to $549, so I was wrong there, though I was fairly accurate on performance. Totally wrong on the 6900XT; I didn't even think they'd announce one yet!
 
So as per my earlier post

https://videocardz.com/newz/amd-ray...dia-rt-core-in-this-dxr-ray-tracing-benchmark

AMD was slower back in August when testing ray tracing, and we don't know the clock speed, driver version, etc., so to still beat Ampere handily at their first attempt is actually quite amazing.

With the way AMD are able to squeeze more performance out of their drivers over time, the 6800XT is shaping up to be something quite special.
 
Nvidia is still looking good at the RTX 3070 price point, even affordable I dare say.

There's been a shift with the RTX 3070/3080, almost as if Nvidia has become the company more interested in offering good value, with AMD now more concerned with offering the best performance.
 
From the bits and pieces that AMD has disclosed, the company has told me that the tech [sic: Smart Access Memory] adjusts how data is transferred between the GPU and the CPU by giving the CPU direct access to the full 16GB of the GPU’s VRAM, avoiding the usual 256MB aperture limitation between CPUs and PCIe devices. The net result is that Smart Access Memory will be able to reduce memory fragmentation within the VRAM pool, which will improve performance.
https://www.anandtech.com/show/1620...-starts-at-the-highend-coming-november-18th/2
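
For anyone wondering what that "aperture" actually is: it's the PCIe BAR window through which the CPU normally sees VRAM, and SAM/resizable BAR simply has the driver expose the whole 16GB rather than a 256MB slice. Here's a minimal sketch, assuming Linux with sysfs and a hypothetical PCI address (swap in your own card's address from lspci), that prints the BAR sizes so you can see whether the full pool is exposed:

```python
# Minimal sketch: print the PCIe BAR sizes a GPU exposes on Linux.
# Assumes Linux sysfs; "0000:0b:00.0" is a hypothetical address you would
# replace with your own card's (see `lspci | grep VGA`). With SAM /
# resizable BAR enabled, one BAR should cover the whole VRAM pool rather
# than the usual 256 MiB window.
from pathlib import Path

def bar_sizes(pci_address: str) -> list[tuple[int, int]]:
    """Return (bar_index, size_in_bytes) for each populated standard BAR."""
    resource = Path(f"/sys/bus/pci/devices/{pci_address}/resource")
    sizes = []
    # The first six lines of the resource file are the standard BARs,
    # each as "start end flags" in hex; unpopulated BARs read as zeros.
    for index, line in enumerate(resource.read_text().splitlines()[:6]):
        start, end, _flags = (int(field, 16) for field in line.split())
        if end > start:
            sizes.append((index, end - start + 1))
    return sizes

if __name__ == "__main__":
    for index, size in bar_sizes("0000:0b:00.0"):  # hypothetical address
        print(f"BAR{index}: {size / 2**20:,.0f} MiB")
```

If the largest BAR only reports around 256 MiB, the CPU is still going through the small window the AnandTech quote describes.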

These cards are poised to beat Ampere flat out on next gen ported games IMO. We will have to wait and see.
 
Some of the posts in here today were cringeworthy though. People posting things in all caps with exclamation marks, like:

RIP NVIDIA!!!!

BOOM!!!!

NVIDIA ARE DEAD!!!!

Ridiculous. The main two GPUs perform pretty much the same for the same price. It's hardly going to kill Nvidia off, ffs.

Didn't they knock $500 off with their highest offering? Personally I'm happy AMD are back competing at the top end; it only benefits us consumers.

Somehow feel Nvidia would have priced the 3080 at £900-£1000 if AMD didn't have competing cards

Sorry I couldn't resist

[gif]
 
Some of the posts in here today were cringeworthy though. People posting things in all caps with exclamation marks, like:

RIP NVIDIA!!!!

BOOM!!!!

NVIDIA ARE DEAD!!!!

Ridiculous. The main two GPUs perform pretty much the same for the same price. It's hardly going to kill Nvidia off, ffs.
You must be fun at parties.
 
This...

Plus, we haven't actually seen any real reviews yet, guys. It's brilliant that AMD are back competing, but until the real reviews come in we only have their word for the figures they quoted.

I hope it's all true, but until I see some actual reviews from someone like Gamers Nexus or Hardware Unboxed etc., I'll hold judgement.


Some of the posts in here today were cringeworthy though. People posting things in all caps with exclamation marks, like:

RIP NVIDIA!!!!

BOOM!!!!

NVIDIA ARE DEAD!!!!

Ridiculous. The main two GPUs perform pretty much the same for the same price. It's hardly going to kill Nvidia off, ffs.
 
So in your opinion, AMD's 16GB of VRAM is redundant and a waste of space, and Nvidia have hit the sweet spot with 8GB and 10GB?

I think you need an amount of VRAM that is appropriate for the GPU that is on the card. Most of the assets you put into VRAM are things the GPU is working on to produce the next frame, so there's a maximum realistic amount any specific GPU can use. We've seen this with testing on large-VRAM cards like the 3090: if you find games which push VRAM above about 8GB, the frame rate becomes unplayable (FS2020, Avengers).

There's a bunch of caveats to this which you need to understand to make those claims credible, primarily that for years we've measured VRAM wrong: we've measured only what the game has requested to be allocated to it, not how much of that VRAM it's actually using and needs. Once you realise games today are barely breaking 5-6GB for 4K Ultra, 8GB and 10GB seem more reasonable.
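
To make that caveat concrete, here's a minimal sketch, assuming an NVIDIA card and the pynvml (nvidia-ml-py) bindings, that prints the figures most overlays and monitoring tools are built on. They are driver-side allocation counters, i.e. the "requested" number, not a measurement of how much of that memory the GPU actually touches each frame:

```python
# Minimal sketch: print the VRAM numbers most monitoring overlays are based on.
# Assumes an NVIDIA GPU and the pynvml (nvidia-ml-py) package. Everything
# reported here is memory *allocated* as the driver sees it, not how much of
# that allocation the GPU actually reads while rendering a frame.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# Device-wide total/used (used = sum of allocations, including other apps).
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Device: {mem.used / 2**30:.1f} GiB allocated of {mem.total / 2**30:.1f} GiB")

# Per-process figures are also allocation sizes, not working-set sizes.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used_gib = (proc.usedGpuMemory or 0) / 2**30  # can be None without privileges
    print(f"pid {proc.pid}: {used_gib:.1f} GiB allocated")

pynvml.nvmlShutdown()
```

So a game showing 9-10GB "used" in an overlay may simply have been allowed to allocate that much, which is the distinction being drawn above.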
 
I'm seeing a lot about gaming performance, but what about the rest? How is RDNA2 at rendering, encoding, machine learning, etc.? Where is AMD on PyTorch, MXNet, TensorFlow? Does Nvidia have advantages beyond gaming?
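
On the PyTorch question specifically, AMD's route is ROCm, and ROCm builds of PyTorch reuse the torch.cuda namespace. Below is a minimal sketch for checking whether a given install can see an AMD GPU at all; it assumes a ROCm-enabled PyTorch build, and whether ROCm actually supports RDNA2 is exactly the open question being asked:

```python
# Minimal sketch: check whether this PyTorch install is a ROCm (HIP) build
# and whether it can see a GPU at all. Assumes PyTorch is installed; whether
# ROCm supports a given RDNA2 card is the open question, not something this
# snippet can answer.
import torch

hip_version = getattr(torch.version, "hip", None)  # None on CUDA/CPU-only builds
print("ROCm/HIP build:", hip_version or "no")

# ROCm builds expose AMD GPUs through the same torch.cuda API as NVIDIA cards.
if torch.cuda.is_available():
    print("GPU visible:", torch.cuda.get_device_name(0))
else:
    print("No supported GPU visible to this build.")
```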
 
Didn't they knock $500 off with their highest offering? Personally I'm happy AMD are back competing at the top end; it only benefits us consumers.

Somehow feel Nvidia would have priced the 3080 at £900-£1000 if AMD didn't have competing cards

Sorry I couldn't resist

[gif]
That’s why I said the main two cards. The 3090 card was DOA imo. I knew this as soon as I saw the benches compared to the 3080. AMD’s offering was just the final nail in the coffin for it.
 