The thread which sometimes talks about RDNA2

They don't have 16GB of VRAM but they have more VRAM than 8GB lol.
3070 will be a very disposable GPU for 4K gaming IMO.

The 3070 can barely manage 4K already, but that's not due to VRAM limitations.
Performance of the card will suffer before VRAM becomes an issue, just like with the 3080.

It's nice the AMD cards come with 16GB instead of 10GB, but in reality that extra 6GB will be a placebo for the next two years minimum (apart from maybe one or two games), and by then people will be upgrading anyway.
 
Ah I see, the "no answer to DLSS" argument is already in full swing. Nice deflection, boys n girls, nice deflection. I guess you didn't hear Scott Herkelman mention "Super Resolution is in the works" then. And I also see the intentional misreading of charts is in play too. Let's ignore the first batch of numbers and only focus on the Rage Mode and Smart Access scores like AMD are lying about something.

The more things change, the more they stay the same.


We have plenty of deflections.

DLSS in what? 10 games?
RT in how many games? And what does it do to your FPS?
And then finally we have the VRAM card.. we only need as much VRAM as NVIDIA tell us we need.


AMD just put 16GB of VRAM on their cards for fun. :D :D
 
The line-up seems weird.

-Why charge so much for the 6800? It was the perfect opportunity to completely bury the rather pointless 3070, but AMD got greedy.

-The 6800XT looks like a 3080 killer, but it's really only its equal and loses out on RT (albeit with more RAM and slightly lower power consumption).

-The 6900XT could be awesome, but the graphs aren't showing its true performance, and it's still very expensive for a few extra fps when the goal is 4K60.

It's great there is competition but I am a little underwhelmed and will wait for proper reviews. Either way, I don't think I'll be getting any graphics card until next year even ignoring stock issues. I want to see how all this plays out before upgrading my 2080ti.
 
They don't have 16GB of VRAM but they have more VRAM than 8GB lol.
3070 will be a very disposable GPU for 4K gaming IMO.

The 3070 is not a 4K gaming card.

And I will eat my left foot if both consoles aren't heavily reliant on dynamic resolution scaling to hit 4K / ray tracing etc., which to me is not really native 4K. You can't just build a smaller box and claim it does everything with all the bells and whistles turned on, same as a PC GPU. The consoles won't even be in the same league.
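For anyone unsure what dynamic resolution scaling actually does, it's roughly this loop: drop the internal render resolution when the frame time blows the 60 fps budget, and creep it back up when there's headroom. A minimal sketch, with thresholds and step sizes that are purely illustrative assumptions, not any console's actual values:

```python
# Hypothetical dynamic resolution scaling loop: shrink the internal
# render resolution when the last frame blew the 60 fps budget, grow
# it back when there is headroom. Thresholds/steps are illustrative.

TARGET_FRAME_MS = 1000 / 60          # 16.7 ms budget for 60 fps
NATIVE_W, NATIVE_H = 3840, 2160      # 4K output

def update_render_scale(scale: float, last_frame_ms: float) -> float:
    """Return the per-axis resolution scale (clamped to 0.5..1.0)."""
    if last_frame_ms > TARGET_FRAME_MS * 1.05:    # over budget: shrink
        scale -= 0.05
    elif last_frame_ms < TARGET_FRAME_MS * 0.90:  # headroom: grow back
        scale += 0.05
    return min(1.0, max(0.5, scale))

scale = 1.0
for frame_ms in (14.0, 18.5, 19.2, 16.0, 14.8):  # fake frame times
    scale = update_render_scale(scale, frame_ms)
    print(f"render at {int(NATIVE_W * scale)}x{int(NATIVE_H * scale)} "
          f"(scale {scale:.2f})")
```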
 
And due to that VRAM limitation it's going to be even lower settings :)
Hahah no... settings will be reduced for computational power way before VRAM. And the new GDDR6X stuff doesn't work like normal GDDR6: Nvidia throws raw speed at moving data in and out of memory on the fly, while AMD brings Infinity Cache to do the same job. Same job, different approach.
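For what it's worth, the "same job, different approach" claim is easy to sanity-check with a back-of-the-envelope effective-bandwidth model. All the figures below are illustrative assumptions, not vendor specs:

```python
# Back-of-the-envelope effective bandwidth for a large on-die cache
# in front of GDDR6, vs. raw GDDR6X. All figures are assumptions.

def effective_bandwidth(hit_rate: float, cache_gbs: float,
                        dram_gbs: float) -> float:
    """Weighted average: hits served from cache, misses from DRAM."""
    return hit_rate * cache_gbs + (1 - hit_rate) * dram_gbs

GDDR6_BW  = 512    # GB/s, 256-bit GDDR6 (assumed)
CACHE_BW  = 1600   # GB/s, on-die cache (assumed)
GDDR6X_BW = 760    # GB/s, 320-bit GDDR6X (assumed)

for hit in (0.4, 0.6):
    eff = effective_bandwidth(hit, CACHE_BW, GDDR6_BW)
    print(f"{hit:.0%} hit rate -> ~{eff:.0f} GB/s effective "
          f"(vs {GDDR6X_BW} GB/s raw GDDR6X)")
```

The point being: with a decent hit rate, a big on-die cache in front of slower GDDR6 can match or beat raw GDDR6X bandwidth, which is the whole bet.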
 
I'm happy to wait and see. :)
NVIDIA owners are trying to convince themselves 8GB is enough, but even some developers have said it's not.

Maybe AMD just put 16GB of VRAM on their cards for fun.


Which proves decisively, once and for all, that NV cheaped out on 70 and 80 VRAM, in line with their long-held tradition of ripping people off with low VRAM. Case closed.
 
That's because they were highlighting the uplift in that particular slide.

So why is Forza Horizon 4 so much faster than the 3080 on that slide compared to the normal one? :D

And why do the 6800 slides clearly include the footnote?

[attached: two AMD RX 6800 benchmark slides]
 
I'm happy to wait and see. :)
NVIDIA owners are trying to convince themselves 8GB is enough, but even some developers have said it's not.

Maybe AMD just put 16GB of VRAM on their cards for fun.

I never implied that 8GB was "enough", and obviously RDNA2 having 16GB across the board is amazing. That doesn't translate to consoles, however: it's 16GB of shared system memory which has to take care of EVERYTHING, including VRAM, so your inane comments about consoles having more VRAM than the new RTX cards are just silly.
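A rough budget sketch makes the point concrete; the OS and game-side reservations below are my own assumptions for illustration, not published platform figures:

```python
# Rough unified-memory budget for a 16 GB console. The reservations
# are assumptions for illustration, not published platform figures.

TOTAL_GB       = 16.0   # unified GDDR6 pool, shared by everything
OS_RESERVED_GB = 2.5    # OS/system reservation (assumed)
GAME_CPU_GB    = 3.5    # game code, logic, audio, streaming (assumed)

vram_like = TOTAL_GB - OS_RESERVED_GB - GAME_CPU_GB
print(f"~{vram_like:.1f} GB left to act as 'VRAM'")
# ~10 GB -- same ballpark as a 3080's 10 GB, not "16 GB of VRAM"
```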
 
Which proves decisively, once and for all, that NV cheaped out on 70 and 80 VRAM, in line with their long-held tradition of ripping people off with low VRAM. Case closed.


Thank you.
People act as if NVIDIA haven't done this before.


Similarly with DLSS and RTX: they're just proprietary NVIDIA features which will at some point be superseded by an adopted open standard, and then NVIDIA will slowly let them go.
PhysX, G-Sync... this is their DNA.
 
Anyone looking at the 3070 as a better card than the 6800 is just really, really stupid; no other way to put it.

I also find it funny how obsessed some are with RT performance all of a sudden, but in reality even Nvidia, WITH DLSS on, still can't deliver a playable RT experience at 4K (and let's be honest, DLSS is sub-native-res quality, so not really 4K anyway). So what are we actually talking about? You want to play with stutters and sub-30 fps minimums?? lol
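On the sub-native-res point, here's what "DLSS at 4K" actually renders internally, using the commonly cited per-axis scale factors for DLSS 2's modes (treat the exact numbers as assumptions):

```python
# Internal render resolution per DLSS mode at 4K output, using the
# commonly cited per-axis scale factors (treated as assumptions).

OUT_W, OUT_H = 3840, 2160
MODES = {
    "Quality":     2 / 3,   # ~2560x1440
    "Balanced":    0.58,
    "Performance": 0.50,    # 1920x1080
    "Ultra Perf.": 1 / 3,   # 1280x720
}

for mode, s in MODES.items():
    print(f"{mode:<12} -> {round(OUT_W * s)}x{round(OUT_H * s)}")
# Even Quality mode renders ~1440p internally and upscales to 4K.
```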

Good luck to ya! : )))


 
Looks like good gaming competition from the Red team. What about other uses? Where are AMD on machine learning? What's the support for PyTorch, MXNet, TensorFlow etc., and what GPU-accelerated libraries are available for AMD hardware?
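Partly answering my own question: AMD's route on this front is ROCm, and PyTorch's ROCm builds deliberately reuse the torch.cuda API. So, assuming a ROCm-supported card and Linux install (which may or may not cover consumer RDNA2 cards), a capability check looks identical to the NVIDIA path; a minimal sketch:

```python
# Minimal GPU check under PyTorch. On ROCm builds, PyTorch reuses the
# torch.cuda namespace, so this exact code covers AMD as well as
# NVIDIA (assuming a ROCm-supported card, driver and Linux install).
import torch

if torch.cuda.is_available():            # True on CUDA *and* ROCm builds
    print("accelerator:", torch.cuda.get_device_name(0))
    x = torch.randn(1024, 1024, device="cuda")  # HIP device under ROCm
    print("matmul checksum:", (x @ x).sum().item())
else:
    print("no CUDA/ROCm device found; running on CPU")
```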
 
Not sure what to think. It might or might not be faster than the 3080; we need benches now: DLSS on/off, RT on/off, etc...
 
Well, the echo chamber effect is strong. However much you'd like to ignore DLSS and RT, nothing much will change; market share will most likely look the same in two years.

If it had been as good at RT as the 3080 and had an answer to DLSS, only then would it have made sense to ask for the same price (or well, $50 lower, my bad).

Either that, or it would have to be way better in raster (+20%) like the rumours suggested.


Only a small percentage of people seem to care about RT (mostly the cult of Nvidia), and DLSS does have an answer coming from AMD. Tbh, DLSS is pants on Nvidia atm too; I've had it for two years and used it urmmm NEVER!!
 