More Memory Matters - AMD highlights the importance of more VRAM

You mean like this statement? ;) (pot ... kettle)

Using just TLOU, what would you say has happened here with RDNA 2 and the 3090:

[TLOU benchmark charts]

And it's basically the same for the other listed titles, i.e. Hogwarts, Forgotten.
 
Using just TLOU, what would you say has happened here with RDNA 2 and the 3090:

[TLOU benchmark charts]

And it's basically the same for the other listed titles, i.e. Hogwarts, Forgotten.

You do realise the game was designed to be played at 30fps all the way back in the PS3 days. Then came the PS4 remaster, where only the PS4 Pro, I think, had the 60fps performance mode, and then the PS5 remake. That's also why the game is broken with a mouse when you look around: it was never designed for the camera to move that fast, which is why a controller is the best way to play it.

Let's move on from the sort of thing that gets threads like this locked. I get your point, as you know, but you have posted these graphs enough times that we all know them by heart now. Let's try to keep this topic open and on topic, without walls of text or images to show something I'm sure 99% of people in this thread already know.
 
I see all the talk is about Nvidia, but let's not forget AMD are guilty of short-changing customers on VRAM too.

This was what AMD said in 2020

[AMD's 2020 slide arguing 4GB of VRAM is not enough]


Then over two years later the successor arrived with just 4GB, offering less performance for more money.

It's funny you posted that, as I was going to do the same and remind everyone it's not an AMD vs Nvidia thing but something they all do at some point, be it VRAM or some other hardware or feature issue. Thanks for posting it; that was a classic that at one stage they tried to hide, but they got called out on it and it came back after "technical issues".
 
You do realise the game was designed to be played at 30fps all the way back in the PS3 days. Then came the PS4 remaster, where only the PS4 Pro, I think, had the 60fps performance mode, and then the PS5 remake. That's also why the game is broken with a mouse when you look around: it was never designed for the camera to move that fast, which is why a controller is the best way to play it.

Let's move on from the sort of thing that gets threads like this locked. I get your point, as you know, but you have posted these graphs enough times that we all know them by heart now. Let's try to keep this topic open and on topic, without walls of text or images to show something I'm sure 99% of people in this thread already know.

That's one way to avoid answering the question :D

Let me ask it a different way: why is the 4090 24GB getting considerably higher fps than a 3090 24GB? :p ;)



You're the one who started this side discussion with:

[screenshot of the earlier post]


i.e. insinuating that the 3090 and RDNA 2 have enough grunt, and saying I am misleading people when, as you can see, I'm not ;) If you're going to debunk, provide evidence :p
 
It's funny you posted that, as I was going to do the same and remind everyone it's not an AMD vs Nvidia thing but something they all do at some point, be it VRAM or some other hardware or feature issue. Thanks for posting it; that was a classic that at one stage they tried to hide, but they got called out on it and it came back after "technical issues".

+1
 
It's funny you posted that, as I was going to do the same and remind everyone it's not an AMD vs Nvidia thing but something they all do at some point, be it VRAM or some other hardware or feature issue. Thanks for posting it; that was a classic that at one stage they tried to hide, but they got called out on it and it came back after "technical issues".
The technical issue was that they had sent that slide to the shredder!

BoM obviously matters, but segmentation and planned obsolescence are why we are not given the choice of buying a 16GB 6600 XT or 3070, a 20GB 3080, or a 24GB 6700 XT.
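
For anyone wondering why those are the exact doubled capacities on offer, here's a minimal sketch of the bus-width arithmetic: capacity is (bus width / 32) memory channels times chip density, with "clamshell" mode doubling the chips per channel. The bus widths and chip densities below are assumptions taken from public spec sheets, not from this thread.

```python
# Sketch of GDDR capacity options: (bus width / 32) channels x chip density,
# doubled in clamshell mode. Bus widths and densities are assumptions from
# public spec sheets, not from this thread.

def vram_gb(bus_bits: int, chip_gb: int, chips_per_channel: int = 1) -> int:
    return (bus_bits // 32) * chip_gb * chips_per_channel

print("6600 XT (128-bit):", vram_gb(128, 2), "or", vram_gb(128, 2, 2))  # 8 or 16 GB
print("3070    (256-bit):", vram_gb(256, 1), "or", vram_gb(256, 2))     # 8 or 16 GB
print("3080    (320-bit):", vram_gb(320, 1), "or", vram_gb(320, 1, 2))  # 10 or 20 GB
print("6700 XT (192-bit):", vram_gb(192, 2), "or", vram_gb(192, 2, 2))  # 12 or 24 GB
```

If that arithmetic holds, every doubled SKU in that list was buildable on the existing bus, which is the point about segmentation.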
 
Let me ask it a different way: why is the 4090 24GB getting considerably higher fps than a 3090 24GB? :p ;)

4090: 76,300 million transistors
Base clock: 2235 MHz
Boost clock: 2520 MHz
Memory clock: 1313 MHz (21 Gbps effective)

3090: 28,300 million transistors
Base clock: 1395 MHz
Boost clock: 1695 MHz
Memory clock: 1219 MHz (19.5 Gbps effective)


The 4090 has 2.7x the transistors of the 3090 and much higher clocks on core and memory, yet is only 55-65% faster depending on the game. Neither is suffering VRAM issues in games so far, but with next-gen games they will be coming close at extreme ultra settings, if not sooner, as mods for certain games can already use more than 24GB.


So with what you know now... this is why the 4090 is faster, as it should be: it's a newer gen with 2.7 times the transistors and higher clocks on core and memory.
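
Purely as a sanity check on those ratios, here's a quick back-of-the-envelope sketch in Python. The 384-bit memory bus on both cards is an assumption taken from public spec sheets rather than anything stated in this thread.

```python
# Back-of-the-envelope comparison from the specs quoted above.
# Assumption (not from this thread): both cards use a 384-bit memory bus.

def mem_bandwidth_gbs(effective_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate times bus width in bytes."""
    return effective_gbps * bus_width_bits / 8

cards = {
    "RTX 3090": {"transistors_m": 28_300, "boost_mhz": 1695, "mem_gbps": 19.5},
    "RTX 4090": {"transistors_m": 76_300, "boost_mhz": 2520, "mem_gbps": 21.0},
}

old, new = cards["RTX 3090"], cards["RTX 4090"]
bw_old = mem_bandwidth_gbs(old["mem_gbps"], 384)  # ~936 GB/s
bw_new = mem_bandwidth_gbs(new["mem_gbps"], 384)  # ~1008 GB/s

print(f"Transistors: {new['transistors_m'] / old['transistors_m']:.2f}x")  # ~2.70x
print(f"Boost clock: {new['boost_mhz'] / old['boost_mhz']:.2f}x")          # ~1.49x
print(f"Bandwidth:   {bw_new / bw_old:.2f}x")                              # ~1.08x
```

Which goes some way to explaining the 55-65% figure: the transistor ratio says 2.7x, but the clock uplift is about 1.5x and memory bandwidth barely moved, so real-game gains landing well short of the raw transistor ratio is roughly what you'd expect.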
 
4090: 76,300 million transistors
Base clock: 2235 MHz
Boost clock: 2520 MHz
Memory clock: 1313 MHz (21 Gbps effective)

3090: 28,300 million transistors
Base clock: 1395 MHz
Boost clock: 1695 MHz
Memory clock: 1219 MHz (19.5 Gbps effective)

The 4090 has 2.7x the transistors of the 3090 and much higher clocks on core and memory, yet is only 55-65% faster depending on the game. Neither is suffering VRAM issues in games so far, but with next-gen games they will be coming close at extreme ultra settings, if not sooner, as mods for certain games can already use more than 24GB.

So in other words, it has the grunt over the 3090 :p :D ;)



Which is going to be a very real problem for the PC gaming scene, i.e. developers see people blaming hardware instead of their games, so they don't do anything to address the issues on their end, and we keep getting **** ports that need two or more patches to be in a playable state.

Regarding the question raised in the other thread about what % of the market has more than 12GB of VRAM, it seems Steam now offers this stat:

[Steam Hardware Survey chart: VRAM distribution]


So if 12GB cards are going to have issues going forward, the majority of PC gamers are screwed unless they pay ££££ to avoid them.
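
The chart itself isn't reproduced here, but as a sketch of how that "more than 12GB" share falls out of the survey's VRAM buckets (the numbers below are placeholders, NOT the real Steam figures):

```python
# Hypothetical illustration only: these bucket shares are placeholders,
# NOT the real Steam Hardware Survey numbers.

vram_share_pct = {4: 15.0, 6: 12.0, 8: 35.0, 10: 4.0, 12: 18.0, 16: 6.0, 24: 2.0}

over_12gb = sum(pct for gb, pct in vram_share_pct.items() if gb > 12)
print(f"Share of surveyed users with more than 12GB of VRAM: {over_12gb:.1f}%")  # 8.0% here
```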

Whilst pushing Nvidia to provide more VRAM, why not also push developers to do a better job?

Keep Calm and Downplay the Lack of VRAM


also
  • It's a console port.
  • It's a conspiracy against nVidia.
  • 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  • It's completely acceptable to disable ray tracing on nVidia cards while AMD users have no issue.

:cry:

With points like this, this is where people are completely missing the point... Why should people playing at 1080p with 8GB have to upgrade to a ££££ GPU with 12GB+ of VRAM just to avoid issues at 1080p? At 4K, and even 1440p, yeah, 8GB is not a great idea regardless of game quality, but 1080p... really?
 
The technical issue was that they had sent that slide to the shredder!

BoM obviously matters, but segmentation and planned obsolescence are why we are not given the choice of buying a 16GB 6600 XT or 3070, a 20GB 3080, or a 24GB 6700 XT.

But remember we did get GA103 (the GPU that should have been the real 3080) in laptops as the 3080 Ti with 16GB VRAM. Funny how the 4080 16GB is on AD103 now, huh?

As I said back then, AMD threw a spanner in the works for Nvidia with Ampere, and that's why the 3080 ended up on GA102: the 6800 XT was faster than their original plan to build the 3080 on GA103, and I bet the original 3080 didn't have 10GB of VRAM in mind and would have been 16GB, as now. Then the 3080 Ti on GA102 would have had 16GB too, or 20GB, and the 3090 and/or 3090 Ti would have been their Ampere Titan(s).

AMD saved the day for some of us with the 6000 series, and helped many who wanted a Titan-class card but weren't willing to pay Titan prices get one at £1,400 for the FE, with a decent amount of VRAM and NVLink.
 
So it should; it's a card with 2.7x the transistor count and a new gen replacing the 3090... if it went backwards and got worse, would you find that normal?

Don't make me use the gif with "I feel the answer is in front of you" :cry::p

Finally we got there :D So the 3090 has the VRAM but not the grunt :p

Think I could use that gif too tbh :p
 
Finally we got there :D So the 3090 has the VRAM but not the grunt :p

Think I could use that gif too tbh :p
Never said it didn't have the grunt. Don't put words in my mouth :P


I said a 4090 had better be faster with 2.7x the transistors and faster clocks... but even with all that it's only 55-65% faster depending on the game.
 