When will GPU prices go down?

The correct answer is that Nvidia and AMD were too short-sighted (or simply didn't care) to design their lower-end products with a bus width that would allow more appropriate VRAM configurations, and the market has moved on and decided for them.

Given the above, I can't say I get the objection to 16GB that I see bandied about. It effectively amounts to arguing against having ample space with room to spare, and for halving it into a challenging tight squeeze instead. Why would anyone want that?

More than 8GB should be the standard now. You could say 8GB should be reserved for the very lowest tier.

Adding another 8GB for an extra $100 to a product that's already overpriced.
 
LOL, lower power draw... let's look at the 4090: a 450W card that comes with a 600W power cable, and there are models with a VBIOS that can use more. Power draw for the high-end cards has gone through the roof; they use more power now than two high-end cards did in SLI/CrossFire before.

Look at the average PSU people are using now compared to a few years ago. A 650W PSU was classed as overkill a few years ago, even with a high-end card and CPU. Now people are saying get 1000W minimum to make sure your system doesn't shut down on power spikes. Power use on PCs has become silly, as has the price of these high-powered components.

Look at laptops with high-end components now: they use as much power as your average high-end PC did a few years ago.

The 4070 only draws 200W for 3080 performance; march of progress and all that. In a couple of generations, we'll have cards with 4090 performance at half the power draw.
 
The 4070 only draws 200W for 3080 performance; march of progress and all that. In a couple of generations, we'll have cards with 4090 performance at half the power draw.

I'm hoping next gen we will have 4090 performance at 300W and under £800.

I don't think that's too outlandish, unless Nvidia get mega greedy AGAIN (...OK, this might be likely).
 
The 4070 only draws 200W for 3080 performance; march of progress and all that. In a couple of generations, we'll have cards with 4090 performance at half the power draw.
Roll on 2024? That's when the new stuff is due.

However, both AMD and Nvidia have been very slow to release the full generation of cards. At this rate, the stupidly priced "4050 Ti" will be released in November 2024. Oh, just in time for us to get the RTX 5090 for £2,500... and Jensen will tell consumers how lucky we are to get such cheap graphics cards. And people will be taking out small second mortgages to get one.

Personally, I am sticking with the RTX 3070 until a true value for money GPU comes along again.
 
Exactly, and that's why it was instantly sold out when it released, which was before the mining boom properly kicked off. Remember the thousands of people on ocuk's backorder list from release day?

Given that demand, maybe the 3080 was too cheap (it also was a visible outlier on the price to performance chart).

But Nvidia have set the precedent now as far as I'm concerned: £700 should get you the second-from-top SKU.
How was the 3080 a visible outlier on the price to performance chart? The 3070 was 26.7% slower and 28.6% cheaper. The 3060 Ti was 36.6% slower and 42.9% cheaper. Both the 3070 and 3060 Ti had better price to performance than the 3080.
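To put the same point as a quick calculation (this uses only the percentages quoted above, nothing else, so treat it as a sketch rather than benchmark data):

```python
# Relative value check using only the percentages quoted above.
# Performance and price are expressed as fractions of the 3080's figures.
cards = {
    "RTX 3080":    {"perf": 1.000,     "price": 1.000},
    "RTX 3070":    {"perf": 1 - 0.267, "price": 1 - 0.286},  # 26.7% slower, 28.6% cheaper
    "RTX 3060 Ti": {"perf": 1 - 0.366, "price": 1 - 0.429},  # 36.6% slower, 42.9% cheaper
}

for name, c in cards.items():
    value = c["perf"] / c["price"]  # performance per unit of cost, 3080 = 1.00
    print(f"{name}: {value:.2f}x the 3080's performance per unit of cost")
```

By that measure the 3070 comes out roughly 3% better value than the 3080 and the 3060 Ti roughly 11% better, which is exactly the objection being raised here.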

Clever marketing has made people think "twice as much RAM!" means "I must pay a significant amount extra for it", when in reality it doesn't add anything like that much cost.
Going to quote my own post.
Whatever it costs Nvidia to buy, you can double or even triple that for the cost to the end user because they're not going to lower their margins specifically for the higher VRAM cards.
People are deluded if they were expecting Nvidia to just slap on some extra VRAM at cost and lower their profit margins for the higher VRAM card.
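For a rough sense of scale (the memory price here is an assumed ballpark figure, not a quoted BOM cost, so treat the output as illustrative only):

```python
# Rough illustration of the markup argument above. The $/GB figure is an
# assumed ballpark, not a quoted BOM price.
gddr6_cost_per_gb = 3.00   # assumed spot price in USD per GB
extra_vram_gb = 8          # e.g. going from an 8GB to a 16GB card

bom_cost = gddr6_cost_per_gb * extra_vram_gb
for markup in (2, 3):      # "double or even triple" the cost to Nvidia
    print(f"{markup}x markup: roughly ${bom_cost * markup:.0f} added at retail")
```

On that assumed price, even a triple markup lands below the $100 premium being complained about earlier in the thread.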
 
Not true, as Nvidia has pulled a fast one with the top-end cards... the 3090 had a 350W TGP, while the 4090 is 450W TGP. Yes, the 3090 Ti was 450W TGP too, but that's not apples to apples. Let's wait for the 4090 Ti at 550W, which will come for sure. They're sneaking in those 100W power increases every gen now.


I also understand what you are saying, but Nvidia at the top end cannot be said to be the same as last gen: from the 30 series to the 40 series there has so far been a 100W increase in power use for the same class of card. Let's also not forget the 2080 Ti was 250W. See those 100W increases from gen to gen at the high end? From the 2080 Ti to the 3090 Ti it's a 200W increase, and from the 2080 Ti to the 3090 a 100W increase.
During gaming the 4090 uses about the same as a 3090, though. All the others use less than their equivalent Ampere cards, as they are on a tier-lower die/bus.
 
How was the 3080 a visible outlier on the price to performance chart? The 3070 was 26.7% slower and 28.6% cheaper. The 3060 Ti was 36.6% slower and 42.9% cheaper. Both the 3070 and 3060 Ti had better price to performance than the 3080.

Ah... any excuse to fish out my old chart. If you draw a straight line between the 3070 and the 3090, the 3080 is higher performance at lower cost than would be expected.


[attached chart: price vs performance]
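For anyone who can't see the chart, here's the same check done numerically. The prices are approximate UK FE launch prices and the performance index is just an assumption for illustration (3070 = 100), so don't read the exact numbers as chart data:

```python
# Where does the 3080 sit relative to a straight line drawn between the
# 3070 and the 3090? Prices are approximate UK FE launch prices; the
# performance index (3070 = 100) is an assumption for illustration.
price_3070, perf_3070 = 469.0, 100.0
price_3090, perf_3090 = 1399.0, 150.0
price_3080, perf_3080 = 649.0, 135.0

# Performance the 3070-3090 line predicts at the 3080's price point.
slope = (perf_3090 - perf_3070) / (price_3090 - price_3070)
predicted = perf_3070 + slope * (price_3080 - price_3070)

print(f"Line predicts ~{predicted:.0f} at £{price_3080:.0f}; the 3080 is closer to ~{perf_3080:.0f}")
```

On those assumed figures the 3080 sits well above where the line would put it, which is the sense in which it stands out on the chart.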
 
Ah... any excuse to fish out my old chart. If you draw a straight line between the 3070 and the 3090, the 3080 is higher performance at lower cost than would be expected.


[attached chart: price vs performance]
The 3090 is the outlier card, so if you draw a straight line from the 3060 to the 3090, then they all have higher performance at lower cost than would be expected.
 
The 3090 is the outlier card, so if you draw a straight line from the 3060 to the 3090, then they all have higher performance at lower cost than would be expected.
Agreed, the 'correct line' is the one drawn from the 3060 through to the 3080. The 3080, though, was only £180 more than the 3070 for performance nearly equal to the 3090, and is 'on the line'. That's an easily justifiable marginal cost for the performance leap, and it also contributes to the judgement of value.

The 3090 would not have sold well if it wasn't for mining.
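Putting rough numbers on the marginal-cost point (same assumed prices and performance index as the earlier sketch, so again illustrative only):

```python
# Marginal cost per extra performance point between adjacent Ampere cards.
# Prices are approximate UK FE launch prices; the performance index is an
# assumption for illustration, as in the earlier sketch.
cards = [("RTX 3070", 469, 100), ("RTX 3080", 649, 135), ("RTX 3090", 1399, 150)]

for (name_a, price_a, perf_a), (name_b, price_b, perf_b) in zip(cards, cards[1:]):
    per_point = (price_b - price_a) / (perf_b - perf_a)
    print(f"{name_a} -> {name_b}: +£{price_b - price_a} for +{perf_b - perf_a} points "
          f"(~£{per_point:.0f} per point)")
```

On those assumptions the step from the 3080 to the 3090 costs around an order of magnitude more per point of performance, which is why it's hard to see the 3090 selling on gaming value alone.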
 
The 4070 only draws 200W for 3080 performance; march of progress and all that. In a couple of generations, we'll have cards with 4090 performance at half the power draw.
Going on current trajectory they'll also have 4090 price tags. Can't wait to pay $1.5k for a 7060.
 
Agreed, the 'correct line' is the one drawn from the 3060 through to the 3080. The 3080, though, was only £180 more than the 3070 for performance nearly equal to the 3090, and is 'on the line'. That's an easily justifiable marginal cost for the performance leap, and it also contributes to the judgement of value.

The 3090 would not have sold well if it wasn't for mining.
I still don't understand how the 3080 was a visible outlier on the price to performance chart?
 
If you draw a line through all the 80-class cards, it's right around where it should be before the greed went into overdrive.

Yep. The 4080 should at most have been $799 if they wanted to play the inflation card.

$1199 is just hilarious.

The 4080 should be $799
The 4070 Ti should just be called the 4070 and be $549-$600
The 4070 should be renamed the 4060 Ti and be $399

That would be an acceptable stack in my eyes and give a decent price/perf boost over the 3xxx series in all tiers.
 
I still don't understand how the 3080 was a visible outlier on the price to performance chart?
Maybe outlier is the wrong term, but in terms of the optimum value-for-money product, that chart demonstrates to me that the 3080 (FE) was it.

Evaluating the optimum product means you want the most performance for the cheapest price, right? In terms of that chart, that means the closer a card sits to the bottom-right corner, the more optimal it is. The 3080 occupies the position closest to the bottom-right corner. You could also describe it as the card before the kink up in the curve; both give the same result. In the previous gen, the 2070S would have been the card closest to the corner, and before that, the 1080 Ti. If I'm not mistaken, both the 2070S and the 1080 Ti were best sellers too.

If the 3080 was $1000, all three cards (3070, 3080 & 3090) would have been equidistant from the bottom-right corner, and so there would have been no clear winner.

[attached chart: price vs performance, with lines from each card to the bottom-right corner]


Basically, what I'm saying is that line B is the shortest, so the 3080 represents the optimum card.
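The corner-distance idea can also be written down directly. The normalisation to a 0-1 scale is my own choice here (some scaling is needed before price and performance can be combined into one distance), and the prices/performance index are the same assumptions as in the earlier sketches:

```python
# "Closest to the ideal corner" (cheapest price, highest performance) as a
# number. Axes are normalised to 0-1 so price and performance are weighted
# equally; that choice, and the figures themselves, are assumptions.
from math import hypot

cards = {"RTX 3070": (469, 100), "RTX 3080": (649, 135), "RTX 3090": (1399, 150)}

prices = [price for price, _ in cards.values()]
perfs = [perf for _, perf in cards.values()]
p_lo, p_hi = min(prices), max(prices)
q_lo, q_hi = min(perfs), max(perfs)

for name, (price, perf) in cards.items():
    x = (price - p_lo) / (p_hi - p_lo)   # 0 = cheapest, 1 = most expensive
    y = (perf - q_lo) / (q_hi - q_lo)    # 0 = slowest, 1 = fastest
    dist = hypot(x - 0.0, y - 1.0)       # distance to the ideal point (cheap + fast)
    print(f"{name}: distance to the ideal corner = {dist:.2f}")
```

With these numbers the 3080's distance is comfortably the shortest, which is the 'line B' argument in numeric form. Weighting price more heavily than performance (or the reverse) would move the lines around, which is one reason this is a judgement of value rather than a proof.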
 