When will GPU prices go down?

12GB for high end seems fine, but the 600 series is more mid-range I would say (albeit with prices like the old high end). What I really don't want is prices being propped up by being loaded with GDDR6, like not being able to buy 6800 performance for £500 because they've lumped on more VRAM. Don't get me wrong, I'd welcome more VRAM as much as the next person, but I also know I've never had a problem keeping below my VRAM limit by tweaking settings, whereas I have suffered from a lack of raw GPU power in the past.
Expensive GDDR6 is (one of) the potential reasons for a high MSRP, not for pricing above MSRP, which was driven by other factors. If it's all a big scam we should be seeing prices slashed and savings of hundreds of pounds off MSRP soon.
 
This is the problem: people have been brainwashed into thinking they can have either performance or today's VRAM, but not both, even as cards have risen exponentially in price. There's no excuse for £500-£700 cards having the same VRAM as £200 cards from 2017, none whatsoever.

The 3080/80ti case study proves the RAM scam completely. They knew enthusiasts would feel the FOMO as soon as the ti came out and they could then sell them basically the same card twice. It's so obvious.
 
Demand isn't going to be there as it has been, and with Intel arriving I really can't see much higher prices. With AMD's refresh and Intel in Q2, even if that's a paper launch, the low to lower-mid end will be pretty competitive; I see the flagships taking the brunt of the price increases.
 
Exactly, same card twice. VRAM just not worth it otherwise it wouldn't be the same card.

The reason why prices are high really doesn't matter; it's evilcorps filling their boots, whatever. Fact is, you get offered a card with limited VRAM a LOT cheaper than a similar card with excess VRAM. So my money is on the better value based on current conditions, even if it's gash value compared to historical pricing.

Really it's the suckers (I say this tongue in cheek, it's their prerogative based on their needs) paying the premium for the high VRAM that are the problem. The people accepting the 'low' VRAM for hundreds of pounds less may be overpaying compared to history, but they are not overpaying anywhere near the level of the people dropping a grand on the "ooh loads of VRAM party party party" posse.
 
Nah, that just goes to prove how they scam people with 2018's VRAM. Nothing to do with cost of VRAM, or whether 10GB was enough or not, just a way to scam people. And then people try and rationalise it as if they're somehow saving you money by skimping on the VRAM, while profits go to the moon... the naivete of people just blows my mind at times :D

There's just simply no way to justify 2017-18's VRAM on cards in the £500-800 range. But people do, because that's where their money went. Just cope.
 
Scalper YouTubers and Discord channels are salivating over the new 4000 series cards and the profits to be made.
 
It's worth remembering that the VRAM in a particular GPU is related to the bus width. This means that the seemingly lower than ideal amount of VRAM in a given card isn't necessarily just down to the cost of extra memory chips; it would affect other aspects of the design. I'm not 100% sure, but I think that to keep the same bus width you'd need to double the number of chips (or their density), doubling the VRAM on board, and I think it's obvious to most people that 16GB on anything but a flagship/flagship-adjacent card is overkill. Worth checking my facts as I'm not certain on the specifics, just that it isn't as simple as "why not 12GB instead of 8GB, grrr, must be because of VRAM cost". I believe this is why the 3060 had 12GB: the design lent itself to 6 or 12GB, with 6GB being obviously not practical nowadays.
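To make that concrete, here's a rough sketch of the constraint as I understand it. The numbers are my own assumptions (a 32-bit interface per GDDR6 chip, 1GB or 2GB densities, clamshell mode to double up), not anything confirmed in the thread:

```python
# Rough sketch: how bus width constrains VRAM options.
# Assumptions: each GDDR6 chip uses a 32-bit interface and comes in
# 1 GB or 2 GB densities; "clamshell" mode pairs two chips per
# 32-bit channel to double capacity again.
CHIP_BITS = 32
DENSITIES_GB = (1, 2)

def vram_options(bus_width_bits: int, clamshell: bool = False) -> list:
    """Return the VRAM capacities (GB) a given bus width allows."""
    chips = bus_width_bits // CHIP_BITS
    if clamshell:
        chips *= 2
    return [chips * d for d in DENSITIES_GB]

for bus in (128, 192, 256, 320, 384):
    print(f"{bus}-bit bus -> {vram_options(bus)} GB")

# 128-bit bus -> [4, 8] GB
# 192-bit bus -> [6, 12] GB   (why the 3060 is 6GB or 12GB)
# 256-bit bus -> [8, 16] GB
# 320-bit bus -> [10, 20] GB  (why the 3080 is 10GB, not 12GB or 16GB)
# 384-bit bus -> [12, 24] GB
```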
 
Mining constantly has one benefit: fewer heat cycles. Holding electronics at a constant temperature is far better than constant heat and cool cycles.
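For anyone curious why cycles matter so much, the usual rule of thumb is the Coffin-Manson fatigue relation for solder joints. This is my own illustration, not something from the thread, and the exponent is only a ballpark figure:

```python
# Illustrative only (my assumption): Coffin-Manson models solder-joint
# fatigue, with cycles-to-failure scaling roughly as (delta_T) ** -q,
# where q is ~2-2.5 for solder.
def life_ratio(delta_t_small: float, delta_t_big: float, q: float = 2.0) -> float:
    """How many times more thermal cycles a part survives when each
    cycle's temperature swing shrinks from delta_t_big to delta_t_small."""
    return (delta_t_big / delta_t_small) ** q

# A card held near-steady (~10C swings) vs one heating and cooling
# by ~40C every gaming session:
print(life_ratio(10, 40))  # ~16x as many cycles before joints fatigue
```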

As with most second-hand electronics, it's very difficult to know if it's going to go poof.

Exactly. Something could die regardless of how it's used. It's a gamble buying out of warranty in general.
 
For the first batch, as always, but for the main batch I don't see it; there are no miners left to buy, only gamers, imo.

HUB are talking months after launch before there's stock; when scalpers bulk buy to constrain supply, prices will go up, mining or not. Which is what happened this time around.
 
Simple Google -

GeForce GTX 1080 8GB $599 May 27, 2016
GeForce RTX 2080 8GB $699 September 20, 2018
GeForce RTX 3080 10GB $649 September 17, 2020
Cool, now let's see below from a recent benchmark: https://www.techpowerup.com/review/msi-geforce-rtx-3080-suprim-x-12-gb/31.html

1080p: the RX 580 8GB is equal to or slower than cards with 4GB and 6GB. VRAM is not the limit.
1440p: the 2060 6GB beats the 3050 8GB, and the RTX 2080 8GB beats other 8GB cards, meaning VRAM is not the limit.
4K: the 3070 Ti 8GB beats the 2080 Ti 11GB, the 6700 XT 12GB and other 8GB cards. VRAM is not the limit.

So my question is... why the need for extra VRAM if games make no real use of it? You do need certain amounts to fit the bus width, but going for weird setups (like the GTX 970) doesn't seem to make sense...

Maybe future games will actually require more VRAM, and maybe the 6800/XT will age better because of it (if you don't mind the modest RT performance), but this "maybe" has been going on for a very long time. Funny, since better textures are a relatively cheap way to make a game look good, but the consoles weren't that great at that...
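As a quick sanity check on the launch prices quoted further up, cost per GB of VRAM hasn't moved in one direction. A rough calc using the figures exactly as listed:

```python
# Cost per GB of VRAM at launch, using the prices quoted above.
cards = [
    ("GTX 1080", 2016, 599, 8),
    ("RTX 2080", 2018, 699, 8),
    ("RTX 3080", 2020, 649, 10),
]
for name, year, price, gb in cards:
    print(f"{name} ({year}): ${price / gb:.0f} per GB")

# GTX 1080 (2016): $75 per GB
# RTX 2080 (2018): $87 per GB
# RTX 3080 (2020): $65 per GB
```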
 
Radeon 280X, Aug 2013, 3GB
Radeon RX 480, Jun 2016, 8GB/4GB

Radeon Vega 64, Aug 2017, 8GB
Radeon VII, Feb 2019, 16GB
Radeon RX 6800, Nov 2020, 16GB

*Edited for the £500-800 range. Most AMD cards were under this bracket tbh
 
I agree. Adding additional VRAM that is not required is simply madness, adding more cost that I'd rather not have to pay. It's far more sensible to balance VRAM with GPU performance, which can already be seen with the 3080 10GB running out of GPU grunt before it runs out of VRAM.

I'd also add that RT is growing up fast, thus it's just not feasible to hang on to current-gen GPUs expecting them to run next-gen games well, which is the TL;DR of the 'Is 10GB enough' thread.
 
It's an interesting point, and one that I won't disagree on. To be fair though, I never bought this gen for the ray tracing anyway; the way I see it, if the game can utilise it, happy days, and if it tanks fps too much or the implementation is poor I never enable it.

However, I think it is odd to assume everyone will be upgrading each gen anyway. Most people carry on beyond the gen after. If you broaden out to the wider audience, as the Steam hardware survey would have you believe, the 1060 end of the SKUs means most won't be interested in the best graphics and certainly wouldn't be into ray tracing.
 
The money hasn't gone anywhere yet because you simply can't preorder, never mind buy, the card in question (7600XT) yet. And it's not being pitched as manufacturers "saving you money", it's more a case of wasting less money. The example you gave of the 3080ti is exhibit 1A: a slightly boosted speed with an extra 2GB VRAM and an obscene pricetag. People aren't saying "I really want a card with 8GB VRAM, these lovely manufacturers are my heroes, being absolute saints and releasing 8GB cards so I don't have to pay 4 figures for a 3080ti". They are saying "modern cards are expensive but some of these high VRAM cards are really taking the ****".
I keep coming back to the same point: you talk about 2017-18 VRAM, but that's comparing top-tier cards then with midrange cards now. 8GB was a lot in 2017. I mean, why aren't we saying whoa whoa whoa, hang on a minute, in 2015 we were packing 12GB cards like the Titan X, so what were these cards doing cheeking us in 2018 with their puny 8GB VRAM? The equivalent tier in 2017-18 was 2-4GB VRAM. It's now 8GB VRAM. Next gen I predict it will be 8-12GB VRAM. Titan class is not 12GB any more, it's 24GB. No idea what the 4000 series brings, but I've heard rumours of a 48GB 3090ti.

What you have to do is take a step back and look at things objectively: where do I want to be spending my money? Don't get drawn into a debate about the morals of the people controlling the pricing. You always have the option of not spending money; what you don't have is the option of dictating how much VRAM a card has, you just get to choose between cards at different price points with differing amounts of VRAM.
 
My answer to some of your content is that people were gaming on 1080p monitors because the tech available was terrible compared to now. Fast forward to 2020 and beyond and we have people on 144Hz 1440p as our sample demographic; we could push it and say people are on 4K, but that won't be many. So whilst playing remastered reboots of old classics might cut it for some, modern games plus either RT or HD texture packs will eat up VRAM for fun. We have already seen HU dig into the state of affairs on this topic, and as they are in the accepted sources list, it's time some of the credible YT channels put this to bed and stated their opinion on it, because let's face it, it only gets stuck in a loop on here until these influencers and shills start to go deeper.
 
Game developers want sales. They aren't going to release a game that needs 32GB VRAM or whatever until graphics cards catch up. They look at GPUs that exist already, realise that most have 6GB to 8GB VRAM, and design their games accordingly.

Nvidia, AMD, and Intel could use 10GB VRAM on all future GPUs for the next 50 years if they wanted to. In that scenario we'd never get to a point where a game needs more VRAM during that time, because game developers aren't stupid. They design games based on VRAM amounts that actually exist.
 