
When the Gpu's prices will go down ?

They would be surpassed in graphical fidelity by consoles, and it would be hard to sell GPUs for gaming.

AMD designs console GPUs.

Anyway I wasn't actually suggesting they would do it. Just an example of why GPUs will always have enough VRAM.

It's like the chicken and the egg scenario. The GPU came first and then the game was designed around it.
 
My answer to some of your content is that people were gaming on 1080p monitors because the tech available was terrible compared to now. Fast forward to 2020 and beyond and our sample demographic is people on 144Hz 1440p; we could go further and say people are on 4K, but that won't be many. So while playing remastered reboots of old classics might cut it for some, modern games plus either RT or HD texture packs will eat up VRAM for fun. We have already seen HU dig into the state of affairs on this topic, and as they are on the accepted sources list it's time some of the credible YT channels put it to bed and state their opinion on it, because let's face it, this only gets stuck in a loop on here until these influencers and shills start to go deeper.
That's a reasonable point; although I was using 144Hz 1440p in 2017, modern games are obviously more VRAM hungry. I've yet to have any issues though: whenever a game approaches 8GB of VRAM I just lower the texture settings. In my experience a lack of VRAM is super easy to resolve, whereas resolving low framerate can be a lot harder without making severe compromises like dropping below native resolution. As CuriousTomCat says, it's kind of a self-fulfilling prophecy though - so many gamers have 8GB of VRAM or less (just checked the Steam hardware survey, this is about 90% of users) that developers will rarely make games that can't be enjoyed to something approaching their fullest on 8GB cards if they have enough grunt. This will change moving forward obviously, but I would love to get 6800 performance 'for cheap' even if it only had 8GB of VRAM.
 
That's a reasonable point; although I was using 144Hz 1440p in 2017, modern games are obviously more VRAM hungry. I've yet to have any issues though: whenever a game approaches 8GB of VRAM I just lower the texture settings. In my experience a lack of VRAM is super easy to resolve, whereas resolving low framerate can be a lot harder without making severe compromises like dropping below native resolution. As CuriousTomCat says, it's kind of a self-fulfilling prophecy though - so many gamers have 8GB of VRAM or less (just checked the Steam hardware survey, this is about 90% of users) that developers will rarely make games that can't be enjoyed to something approaching their fullest on 8GB cards if they have enough grunt. This will change moving forward obviously, but I would love to get 6800 performance 'for cheap' even if it only had 8GB of VRAM.

I understand, and for the entry/low end it isn't really an issue. Bear in mind we are on an enthusiast forum and have seen people SLI at the other end of the scale over the years. I think the point is more of an extension, and you have left out the only segment that remains: is it OK to turn down settings when you have just shelled out £500+ on a new GPU? Particularly if the limitation you explained there would not be the case if the vendor soldered on a larger RAM module? The answer in the past was releasing variants of the same card, for example the 1060 in Pascal. It's also strange that you mention game developers a lot, yet none of them seem to come out and declare what they target or get asked to work towards. It seems the hardware vendors slap on what they think is the needed amount and, as we have seen, it isn't very consistent. :)

Before I forget, on the point about displays: we are now seeing decent panels from Alienware, Samsung and LG with UW 1440 or 4K. It's good to see, and even better if they can bring down the cost. With more people upgrading to demanding panels, you're not going to want to go backwards on VRAM size.
 
Oh lordy, the VRAM debate infesting this thread now, and what a surprise, it's by the same people that don't even own said GPUs insisting on there being "loads" of problems because of a lack of VRAM.... :cry:

Can you elaborate on this please? If anything I've said the opposite in the final paragraph. But this agenda you seem to be pushing about the evil corporations isn't what I'm debating; I'm talking about why I don't want to pay them even MORE money for VRAM I don't want [for the price premium].

I would rather pay £500 for 6800 performance with 8GB VRAM than say £700 for 6800 performance with 12GB VRAM. Simple as that.

Exactly, same card twice. The VRAM just isn't worth it, otherwise it wouldn't be the same card.

The reason why prices are high really doesn't matter; it's evil corps filling their boots, whatever. The fact is you get offered a card with limited VRAM a LOT cheaper than a similar card with excess VRAM. So my money is on the better value based on current conditions, even if it's gash value compared to historical pricing.

Really it's the suckers (I say this tongue in cheek, it's their prerogative based on their needs) paying the premium for the high VRAM that are the problem. The people accepting the 'low' VRAM for hundreds of pounds less may be overpaying compared to history, but they are not overpaying anywhere near the level of the people dropping a grand on the "ooh loads of VRAM party party party" posse.

Just 2 of your many well-articulated points, but very well said :)

Remember, nvidia bad, amd good!

Fact: as it stands right now, the 3080 has been from the start, and still is, the best value GPU you can get at MSRP. HUB even stated this in their recent video on what they thought was the best GPU of this gen. Nothing can come close to the RT perf and rasterization perf you get (which is still better in games "overall" compared to the AMD equivalent 6800 XT, especially as you move up to higher resolutions; just see HUB's and TPU's 50-game comparisons [of which neither contain any ray tracing except for Metro EE]), alongside having access to FSR 1 and eventually FSR 2, as well as DLSS and any other upsampling tech coming, i.e. Intel's one and TSR.

As many of us who have these GPUs know, having been PC gamers/builders for years: by the time VRAM becomes a problem, new GPUs will be out where the mid-range beats current-gen flagships for half the price.

I do think 8GB is a bit on the low side for 1440p and definitely 4K going forward, though. But as you and others have said, generally it is the grunt which is the problem first, and by turning down settings to achieve acceptable fps in the first place, you are also reducing VRAM usage in return. As demonstrated, a 3070 is still kicking ass when compared to the 2080 Ti 11GB....

As of right now, the 3080 10GB is still blitzing through 99.9% of games with ease at 3440x1440 144Hz and 4K @ 60 (I have both displays and switch back and forth depending on what I'm playing or what kind of mood I'm in). Of course DLSS is required to get the best visuals AND perf in certain RT titles. The only reason I'll be upgrading to RDNA 3 or a 4080 will be for better RT, as I'm not hitting the FPS on my 3440x1440 144Hz display that I'm personally happy with (I want a "constant" 100+ fps in RT titles like DL 2 and CP 2077).

If people are so desperate for VRAM, they can go and spend the extra £500+ for the 12GB 3080 (Ti) models... Or buy AMD cards, if they're happy to vastly reduce RT settings or turn them off entirely across several titles....
 
HUB reviewed the 2060 6GB vs 12GB: in the average game at 1080p or 1440p you will not be able to tell the difference between them with the naked eye, https://youtu.be/9BR9HtSe6H0?t=504
There is no need for people who want to play esports games like Overwatch etc. at 1080p to buy a card like the 6700 XT with 12GB. A card like the 6600 with 8GB meets a certain market, while the 6800 XT meets the needs of another market segment.

Getting back to prices coming down, I hope that we get stock at good prices if the AMD refresh is next week as suggested, https://videocardz.com/newz/powercolor-teases-white-and-pink-radeon-rx-6650xt-hellhound
 
It feels like the 3060/Ti and 6700 XT have plateaued over the last couple of weeks. Do we think this is where they're likely to sit now? I'd rather they come down to RRP, but if they're not actually going to move, I'd prefer to have one sooner rather than later :p

Though naturally the model I want is out of stock rn :(
Just noticed at least some 6700 XTs seem to have come down by £50 today; with this being a premium model, that makes the difference to MSRP much more agreeable, https://www.overclockers.co.uk/-pow...ddr6-pci-express-graphics-card-gx-1a3-pc.html

Prices also look to be down on 6800 and 6800xt today
 
is it OK to turn down settings when you have just shelled out £500+ on a new GPU? Particularly if the limitation you explained there would not be the case if the vendor soldered on a larger RAM module?
Yes. I don't care what the label says in the menu, it's about whether a game looks good and performs well. Just imagine the higher texture setting wasn't there.

Some people think they have a divine right to run all games cranked up just because they have an expensive GPU, but this is a bit of a fallacy as Crysis proved. Game A on medium settings can look better (and be more demanding) than Game B on max settings.
 
Yes. I don't care what the label says in the menu, it's about whether a game looks good and performs well. Just imagine the higher texture setting wasn't there.

Some people think they have a divine right to run all games cranked up just because they have an expensive GPU, but this is a bit of a fallacy as Crysis proved. Game A on medium settings can look better (and be more demanding) than Game B on max settings.
Not to mention, how many people are willing to spend an "extra" £500/600+ just so they can enable that one setting? :cry:
 
AMD designs console GPUs.

Anyway I wasn't actually suggesting they would do it. Just an example of why GPUs will always have enough VRAM.

It's like the chicken and the egg scenario. The GPU came first and then the game was designed around it.

Actually, MS and Sony design them, not AMD.

AMD essentially gives MS and Sony a list of technology they can use; MS and Sony engineers take it to the drawing board and design the system, then they go back to AMD and ask how much it will cost.

AMD designs the off-the-shelf parts and the technologies behind those components; the custom SoCs etc. are designed by MS and Sony, and AMD says "OK, we can make it, this is what it costs".
 
There's an entire thread with dozens of pages of games that show 8GB being very lacklustre and it being a problem - and that was for the 3070 et al, but it's worse for AMD because their memory management actually requires more VRAM than Nvidia's. Not sure why people pretend like it doesn't exist, but the VRAM hunger is real, particularly for new games that push LODs properly and don't have textures from 2012. For the best VRAM investigations you've got to read ComputerBase, though.

Just a recent example:
Pretty textures need more than 8GB of VRAM
Forza Horizon 5 has really nice textures, but they also take up quite a bit of memory. 8GB is not enough for this, although a distinction must be made between AMD and Nvidia graphics cards: the game shows once again that a GeForce has better memory management than a Radeon when things get tight.

The result is that GeForce graphics cards with 8GB have no problems in Full HD and WQHD, and only run out of steam in Ultra HD. On a Radeon, on the other hand, there are already limitations in Full HD. When that happens, the game displays a warning that recognises fairly reliably when things become problematic. The good news is that the frame rate simply drops. Forza Horizon 5 still runs well with the most beautiful textures up to WQHD, albeit slower.
 
There's an entire thread with dozens of pages of games that show 8GB being very lacklustre and it being a problem - and that was for the 3070 et al, but it's worse for AMD because their memory management actually requires more VRAM than Nvidia's. Not sure why people pretend like it doesn't exist, but the VRAM hunger is real, particularly for new games that push LODs properly and don't have textures from 2012. For the best VRAM investigations you've got to read ComputerBase, though.

Just a recent example:


For these edge cases you can bend over, vaseline optional
 
Re-reading the OP title again, all I can think of singing is:

"When the Gpu's prices will go down, you'd better be ready, when the price goes down....."
 
Actually, MS and Sony design them, not AMD.

AMD essentially gives MS and Sony a list of technology they can use; MS and Sony engineers take it to the drawing board and design the system, then they go back to AMD and ask how much it will cost.

AMD designs the off-the-shelf parts and the technologies behind those components; the custom SoCs etc. are designed by MS and Sony, and AMD says "OK, we can make it, this is what it costs".

There is also a shared memory resource, I believe (which is greater than 10GB), and this is where the similarities with PC end, as the console can be manipulated far more under the hood, runs in one set ecosystem and overall ends up with mostly AMD-friendly parts. Apparently Nvidia seems to be influential here..
 
There are two easy ways to resolve a lack of VRAM: reduce texture quality, e.g. don't load a high-res texture pack and/or turn down graphics settings, or use an upscaling tech to lower the render resolution, https://youtu.be/f1n1sIQM5wc (see the sketch below for checking whether you're actually near the limit).
Perhaps now that the price of the 6700 XT has gone down today, more people might buy it for the VRAM, but it does have less grunt than the 3070 in most games despite having more VRAM, https://youtu.be/f0yo2Sc-DyI?t=601
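Since the advice above hinges on knowing whether a game is actually approaching the card's VRAM limit before you turn textures down or enable upscaling, here is a minimal sketch of how you might monitor that while playing. It is only an illustration, not anyone's tooling from this thread, and it assumes an Nvidia card with the stock nvidia-smi utility on the PATH; AMD users would need a different tool.

```python
# Minimal sketch (assumption: Nvidia GPU with nvidia-smi available on the PATH).
# Polls VRAM usage every few seconds so you can see how close a game gets to the
# card's limit before deciding to drop texture quality or enable upscaling.
import subprocess
import time

def vram_usage_mib(gpu_index=0):
    """Return (used, total) VRAM in MiB for the given GPU, as reported by nvidia-smi."""
    out = subprocess.check_output(
        [
            "nvidia-smi",
            f"--id={gpu_index}",
            "--query-gpu=memory.used,memory.total",
            "--format=csv,noheader,nounits",
        ],
        text=True,
    )
    used, total = out.strip().splitlines()[0].split(",")
    return int(used), int(total)

if __name__ == "__main__":
    while True:
        used, total = vram_usage_mib()
        print(f"VRAM: {used} / {total} MiB ({used / total:.0%})")
        time.sleep(5)
```

Run something like this in a second window while gaming; if reported usage sits well below the total, the texture and upscaling tweaks are optional rather than necessary.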
 
The pivot, @ASMB, being: was X enough? The denial was that it doesn't happen. The resolution is exactly what you say (methods to get around it). The cause is a lack of VRAM..
 