10GB VRAM enough for the 3080? Discuss...

If AMD fans claim 12GB isn't enough, show them this: https://youtu.be/jjBqaGLRycc?t=445

Better to remind them of - https://screenrant.com/amd-launches-gpu-4gb-vram-not-good-enough/

AMD is launching a new GPU today with 4GB of VRAM — something the company made fun of just a couple of years prior. With each year that passes, PC games become more and more demanding. Textures get sharper, ray tracing is more realistic, and developers can cram more things on the screen at once. All of this helps games feel more realistic than ever before, assuming someone has the necessary hardware to handle them.

Along with a capable CPU and ample memory, GPUs are a critical part of this process. They're the main component that brings a game's graphics to life, and there are a lot of things to consider when buying one — such as the core count, clock speed, memory type, etc. Another one of these factors is the amount of VRAM. The more VRAM a GPU has, the more graphics data it can keep close at hand, which in turn means smoother performance and better-looking games at higher settings. VRAM amounts have gradually gone up over the years, with 4GB generally considered the absolute bare minimum in 2022.

That all leads us to today's news. On January 19, AMD launched its Radeon RX 6500 XT GPU. On the surface, it's a pretty modest card. It's based on a 6nm design, touts 16 compute units, and promises up to 108 FPS and 78 FPS in Resident Evil Village and Halo Infinite, respectively. However, it comes with just 4GB of VRAM. Not only is that a pretty small amount by 2022 standards, but as KitGuru cleverly pointed out, it's something AMD trash-talked a couple of years ago.

AMD Claimed 4GB Wasn't 'Enough For Today's Games'

Per KitGuru's sleuthing, an AMD blog post from 2020 had a very interesting comment from Radeon marketing specialist Adit Bhutani. Trying to hype up AMD's new cards with 6 and 8GB of VRAM, Bhutani said the following: "AMD is leading the industry at providing gamers with high VRAM graphics solutions across the entire product offering. Competitive products at a similar entry level price-point are offering up to a maximum of 4GB of VRAM, which is evidently not enough for today's games."

Yep, that's correct. In 2020, AMD flat-out said 4GB of VRAM wasn't good enough for gamers. Fast forward to 2022, and the company's now launching a new GPU with 4GB of VRAM. It doesn't get much funnier than that. The Verge reports that AMD quickly removed the old blog post after this news broke, but it's since been restored to its original state.

As poor of a look as this is for AMD, it's far from the first time a tech company has had to awkwardly walk back something it said. Samsung is notorious for this kind of thing. Right after the iPhone X came out, Samsung had a whole ad campaign mocking the iPhone's notch. A couple of years later, Samsung launched the Galaxy S10 with a cutout for its selfie camera. Samsung's also made fun of Apple for removing the headphone jack and not shipping chargers with its phones — two things Samsung's since started doing, too. Ultimately, this isn't that big of a deal for AMD. The RX 6500 XT will launch, people will still buy it, and that's all there is to it. Still, it's always humorous when something like this happens. The internet is forever, and that's especially true when people want to hold companies accountable for overzealous remarks.
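
To put some very rough numbers on the VRAM point in the article above: the sketch below is purely illustrative, with every value assumed (uncompressed RGBA8 data, a triple-buffered 4K swap chain, a made-up texture count), not taken from any real game.

# Back-of-envelope VRAM budget - all figures are assumptions, not measurements.
BYTES_PER_PIXEL = 4  # uncompressed RGBA8

def framebuffer_mb(width, height, buffers=3):
    """VRAM for a triple-buffered swap chain at the given resolution."""
    return width * height * BYTES_PER_PIXEL * buffers / 1024**2

def texture_mb(size, mip_overhead=1.33):
    """One square uncompressed texture, with roughly a third extra for mipmaps."""
    return size * size * BYTES_PER_PIXEL * mip_overhead / 1024**2

fb = framebuffer_mb(3840, 2160)   # ~95 MB just for the 4K output buffers
tex = 300 * texture_mb(2048)      # ~6.5 GB for 300 uncompressed 2K textures
print(f"Frame buffers: {fb:.0f} MB, textures: {tex / 1024:.1f} GB")

Real engines use block compression (typically around 4:1 for colour data) and stream assets in and out, so the texture figure shrinks considerably in practice, but the arithmetic shows how quickly budgets climb as resolutions and texture counts grow.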


AMD hides 2020 blogpost claiming 4GB VRAM is ‘not enough for today’s games

Update, Jan 22 15:16 GMT: The original blogpost is now live (again). It was offline for at least four hours (from when we spotted it), but possibly longer. We do not accept that the timing was coincidental. Every other blog post we opened at the time from community.amd.com remained accessible, and multiple media outlets have confirmed our story.

AMD has hidden a blog post from its own website that, in June 2020, claimed ‘4GB of VRAM… is evidently not enough for today’s games.’ This comes ahead of the RX 6500 XT launch at 2pm today, a graphics card with just 4GB of video memory that has already received heavy criticism online.

The blog, which was previously accessible via this URL, was public as recently as yesterday – which I know because I accessed it and took a screenshot. When re-visiting the blog today, I was met with an initial login window which was not there yesterday, asking me to enter my AMD Community details. I registered an account, entered the login details, and was then met with an ‘Access Denied’ message. Another team member independently tried the same process and we were able to replicate the denial of access.

If you try to find this blog on Google, it appears as the top result, with the full URL showing as ‘https://community.amd.com/t5/gaming/game-beyond-4gb/ba-p/414776’. Today, the words ‘private-archive-blog’ have been added to the URL, while it is no longer visible on the author’s AMD Community profile, confirming to us that AMD has hidden it from public view.

This does not look good for AMD, at all. We can’t comment on the RX 6500 XT, but it simply cannot be a coincidence that on the same day AMD is launching a 4GB graphics card, a previous blog post from the company saying ‘4GB of VRAM… is evidently not enough for today’s games’ is hidden from the public.

The blog, written by Adit Bhutani, Product Marketing Specialist for Radeon and Gaming, also makes reference to ‘AMD… leading the industry at providing gamers with high VRAM graphics solutions across the entire product offering,’ which obviously doesn’t apply to the 4GB RX 6500 XT when the RTX 3050 will ship with 8GB of VRAM. It also points out that ‘gamers might expect several issues’ when playing with ‘insufficient levels of Graphics Memory, even at 1080p.’

This is in sharp contrast to the dialogue around the RX 6500 XT, with Radeon Vice President Laura Smith claiming that ‘the four gigs of frame buffer, that’s a really nice frame buffer size for the majority of triple-A games,’ and also suggesting a ‘gamer-first’ mentality around this card.

Unfortunately for AMD, the article is preserved on Wayback Machine and was widely covered by the press at the time (see Tweaktown, OC3D, ExtremeTech), which only makes this attempt to hide the blog post even more baffling. If this really is an underhand attempt to hide comments AMD made about 4GB GPUs ahead of the RX 6500 XT launch, it really speaks volumes and is honestly shocking behaviour from the company.
 
Bunch of guys running a budget Nvidia GPU laughing at a 4GB AMD entry-level card and embarrassing themselves mocking guys with more VRAM and more cash. :p :cry:

I was simply enjoying the irony of AMD's 4GB U-turn rather than the 4GB card itself.

I don't see the problem with mocking Grim, as he's found himself in a position where his PC resets when idle, he has no real use for the GPU he purchased, his CPU stutters more than King George VI and his OLED still displays what he did last week :D

Why so triggered?
 
The idiocy of jealousy - throw in some envy with an added sprinkling of insecurity. I'd be the last person to laugh at someone with a bigger shlong than me; I'd just do that Homer-in-a-bush thing and hope no one noticed.... :cry:

As usual people see what they want to see. Maybe put the bias away and read again, and you will see we are not jealous of Grim but rather laughing at him for continually making an arse of himself. After all, he is the one showing off that he could buy multiple 3090s every month and could smash his one with a hammer. Then, when called out on it, it turns out he is full of **** :cry:
 
Bias?
I don't know Grim, couldn't care less about Grim either?:confused:

But the guys with 90's are laughing harder at the 'saved this much and can buy a faster 70 next gen', when they'll just go out and buy another 90 next gen anyway.
 
Given how hard said 3090 owners are trying to belittle the 3080 (and failing spectacularly), I think otherwise (especially when one even admitted they wanted the 3080 more than the 3090 but had no choice but to overpay for the 3090 :cry:) ;) Must be gutted spending all that extra money to only get 10-15% more perf., if that.... and knowing that the 4070 is right around the corner, about to drop the 3090 down to mid-range tier :D
 
But the guys with 90's are laughing harder at the 'saved this much and can buy a faster 70 next gen', when they'll just go out and buy another 90 next gen anyway.

Or they mined and can afford 4 cards - the guys in that club you are referring to. Whichever way you spin it, you only lost out if you didn't sell it on at the peak or didn't mine. Judging by those snipes, these fellas held onto their 3080s, which is why they spout this weird clique line that they got the best and everyone else sucks.
 
As per all the GPU awards from reputable tech reviewers, and HUB's recent pick for best GPU of the generation, 3080 MSRP owners did in fact get the best :cool:

I also find the claims of making a certain amount back from mining a bit far-fetched. I'm no mining expert, but a quick Google suggests it's not as profitable as people like to make out, at least if you're only using 1 GPU and only mining every now and then ;)


[attached screenshot: FiWfJuK.png]
 
Bias?
I don't know Grim, couldn't care less about Grim either?:confused:

But the guys with 90's are laughing harder at the 'saved this much and can buy a faster 70 next gen', when they'll just go out and buy another 90 next gen anyway.

Not buying one does not mean one can't buy one though, does it? I sold my 3080 for £1620 and could have got a 3090 with it; instead I got a 3070... Not that I needed to sell something to buy a 3090. Again, as I said, people see what they want to see.
 
The idiocy of jealousy - throw in some envy with an added sprinkling of insecurity. I'd be the last person to laugh at someone with a bigger shlong than me; I'd just do that Homer-in-a-bush thing and hope no one noticed.... :cry:
The 3080 shlong is pretty much the same as a 3090's, the main difference being that the 3090 comes with elephant balls which, outside of the odd scenario, aren't adding to performance.
 
As per all the GPU awards from reputable tech reviewers, and HUB's recent pick for best GPU of the generation, 3080 MSRP owners did in fact get the best :cool:

I also find the claims of making a certain amount back from mining a bit far-fetched. I'm no mining expert, but a quick Google suggests it's not as profitable as people like to make out, at least if you're only using 1 GPU and only mining every now and then ;)
The quickest way to make money from mining was to sell cards to miners. That was why we had so much scalping.
 
As per all the GPU awards from reputable tech reviewers, and HUB's recent pick for best GPU of the generation, 3080 MSRP owners did in fact get the best :cool:

I also find the claims of making a certain amount back from mining a bit far-fetched. I'm no mining expert, but a quick Google suggests it's not as profitable as people like to make out, at least if you're only using 1 GPU and only mining every now and then ;)


[attached screenshot: FiWfJuK.png]
That thread was posted before the crypto boom; I think it was very profitable at one point.
 
Was posted just before the massive boom, tbf. The point still stands though: you would only have made thousands (after electricity costs) if you were mining pretty much 24/7 with just the one card. The people in proper/worthwhile profit would have had multiple GPUs mining 24/7.
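
For what it's worth, the arithmetic behind that is easy to sketch. Every figure below is a made-up placeholder (real per-day revenue swung wildly with coin price and difficulty), so treat it as an illustration of the 24/7-versus-occasional point rather than an earnings claim.

# Back-of-envelope single-GPU mining profit - every number here is an assumed
# placeholder, not real data.
daily_revenue_gbp   = 6.00   # assumed gross revenue per day for one card
card_power_kw       = 0.22   # assumed power draw while mining
electricity_gbp_kwh = 0.20   # assumed unit price of electricity
uptime              = 1.0    # 1.0 = mining 24/7; 0.3 = "every now and then"
days                = 365

daily_cost  = card_power_kw * 24 * electricity_gbp_kwh
net_per_day = (daily_revenue_gbp - daily_cost) * uptime
print(f"Net over {days} days: £{net_per_day * days:,.0f}")
# At these assumed rates: roughly £1,800 net for a year of 24/7 mining,
# falling to about £550 at 30% uptime.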
 