When will GPU prices go down?

Soldato
Joined
2 Jan 2005
Posts
8,453
Location
Leeds
Not sure how much you can optimise a game to run on things like the 1050, 1060, 3060 laptop GPU, 2060 and 3060 (those being the top 5 GPUs according to the Steam HW survey). I'm not saying there's no game that will run on those, but even though I've not looked at benchmarks for them, I suspect they'd struggle with the newest triple-A games at 1080p.
If anything, it will be pointless for game developers to make games that take advantage of these newer graphics cards - I don't see how it's financially viable when so few people can afford the cards. Might as well not bother.
 
Soldato
Joined
31 Dec 2010
Posts
2,559
Location
Sussex
If anything, it will be pointless for game developers to make games that take advantage of these newer graphics cards - I don't see how it's financially viable when so few people can afford the cards. Might as well not bother.
Cyberpunk's budget is allegedly €316 million (https://en.wikipedia.org/wiki/List_of_most_expensive_video_games_to_develop), yet it seems that the rendering engine is mostly being done to showcase Nvidia ray tracing. With such a huge budget, even if Nvidia threw $1 million at them, I cannot figure out why CDPR would want to spend so much effort making a tech demo.

Plus, while the lighting and reflections are fancy, am I the only one who thinks the polygon count and model standards are actually rather poor in that game?
 
Soldato
Joined
16 Sep 2018
Posts
12,728
If anything, it will be pointless for game developers to make games that take advantage of these newer graphics cards - I don't see how it's financially viable when so few people can afford the cards. Might as well not bother.
True, and that was sort of my point: it would be hard to argue that the current price-to-performance is not harming PC gaming, and probably, to a lesser extent, the entire PC ecosystem.
 
Soldato
Joined
6 Aug 2009
Posts
7,107
True, and that was sort of my point: it would be hard to argue that the current price-to-performance is not harming PC gaming, and probably, to a lesser extent, the entire PC ecosystem.
Going by the tiny sample of people I know, many have lost interest in PC gaming and hardware. Too expensive and very little excitement to upgrade, not helped by there not being many good games. Sadly, once you've seen decades of games there's rarely much that surprises you; the industry seems to have gone very corporate, much like the film industry. I saw how Rare became a shadow of its former self. These things always go in cycles, so hopefully we are at the low point now, the only way is up ;)
 
Soldato
Joined
6 Jun 2009
Posts
5,445
Location
No Mans Land
Going by the tiny sample of people I know, many have lost interest in PC gaming and hardware. Too expensive and very little excitement to upgrade, not helped by there not being many good games. Sadly, once you've seen decades of games there's rarely much that surprises you; the industry seems to have gone very corporate, much like the film industry. I saw how Rare became a shadow of its former self. These things always go in cycles, so hopefully we are at the low point now, the only way is up ;)

I feel you on that one. My excitement for gaming just isn't there anymore. I've been a massive BF fan since 1942 came out back in 2002. Sadly, the last couple of games have been total garbage, and I'm simply bored of BF4 now. Hence I'm not that bothered about upgrading.
 
Associate
Joined
2 Sep 2013
Posts
1,932
I have an RX 580 8GB, and technically it could still run everything I play with (nearly) all the bells and whistles on, without negatively impacting gameplay. However, the rig I have now is GPU-limited, and some of the Deep Learning stuff I do is VRAM-heavy, so I jumped on the 7900 XTX when it became cheaper yesterday here on OCUK. Otherwise, I could easily have sat out another generation or two.

If I were to guess/predict, the first GPU price reductions will come in May, with major reductions in August and September. My guess is that's as far as the companies can hold out before biting the bullet and dropping prices. There will also be more games being released around then, so if they want to shift stock, a temporary reduction will do them good while giving gamers something to grab. But again, that's just a guess on my part. :)
 
Associate
Joined
3 May 2021
Posts
1,237
Location
Italy
Prices will go down under 3 conditions:

1) Datacenter demand slows down - less silicon competition from higher-margin products (unlikely IMHO with the current rise of large language models)
2) Competition picks up - this means primarily Intel, with long trails by devices such as the Steam Deck or new Chinese entries like Loongson (we're looking at a 2-5 year horizon IMHO)
3) Demand drops further - with salaries hit hard by cost-of-living rises, this is pretty likely

There is also a hidden "nuclear clause":
- China being banned from purchasing anything under X nanometers. With such a shock, everybody would have to rush to dump inventories elsewhere.
 
Associate
Joined
14 Aug 2017
Posts
1,195
China being banned from purchasing anything under X nanometers
Wasn't there already a ban on them buying a lot of things... yeah, here we go - https://www.sec.gov/ix?doc=/Archives/edgar/data/0001045810/000104581022000146/nvda-20220826.htm

A100, H100, A100X and "any future NVIDIA integrated circuit achieving both peak performance and chip-to-chip I/O performance equal to or greater than thresholds that are roughly equivalent to the A100"
I don't know how that translates to desktop GPUs, but presumably after a few generations even a mid-to-low-end card should have similar performance.
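
As a rough back-of-envelope (the starting ratio and per-generation uplift below are illustrative assumptions, not measured figures), here is a sketch of how many generations it might take a mid-range card to reach an A100-class threshold:

```python
import math

# Illustrative assumptions, not measured figures:
# a mid-range card today delivers ~25% of A100-class compute,
# and each GPU generation brings a ~1.5x uplift at the same tier.
current_ratio = 0.25   # mid-range performance / A100-class threshold
uplift_per_gen = 1.5   # assumed generational improvement

# Solve uplift_per_gen ** n >= 1 / current_ratio for n.
generations = math.ceil(math.log(1 / current_ratio, uplift_per_gen))
print(f"~{generations} generations until a mid-range card crosses the threshold")
# -> ~4 generations under these assumptions
```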
 
Associate
Joined
18 Sep 2012
Posts
515
Cyberpunk's budget is allegedly €316 million (https://en.wikipedia.org/wiki/List_of_most_expensive_video_games_to_develop), yet it seems that the rendering engine is mostly being done to showcase Nvidia raytracing. With such a huge budget, even if Nvidia threw a $1 million at them, I cannot figure out why CDPR would want to spend so much effort doing a tech demo.

Plus, while the lighting and reflections are fancy, am I the only one who things the polygon count and model standards are actually rather poor in that game?

Yeah, there is something about Cyberpunk that doesn't quite do it for me. I'm not sure if it's seeing things appear/disappear, the lack of range with sniping, or the graphics themselves. Maybe the graphics are just too polished and clean, if that makes sense? I actually prefer ARMA 3's graphics. RDR2 as well.
 
Soldato
Joined
9 Nov 2009
Posts
24,981
Location
Planet Earth
Cyberpunk's budget is allegedly €316 million (https://en.wikipedia.org/wiki/List_of_most_expensive_video_games_to_develop), yet it seems that the rendering engine is mostly being done to showcase Nvidia ray tracing. With such a huge budget, even if Nvidia threw $1 million at them, I cannot figure out why CDPR would want to spend so much effort making a tech demo.

Plus, while the lighting and reflections are fancy, am I the only one who thinks the polygon count and model standards are actually rather poor in that game?

Yet, the NPC AI sucked, the NPCs all reused the same few voice lines, most of the shops were static, etc. The world as a whole seemed very empty.
 
Soldato
Joined
31 Dec 2010
Posts
2,559
Location
Sussex
Well, I haven't played it, so I'm mainly going by the videos.
For me it wasn't the AI so much as the setting. Too clean for a run-down place, and clean means low-polygon and flat, without even bump mapping. If we were talking about a Star Trek: The Next Generation utopia that would be fine, but a gangster-filled dystopian city?

But talking about AI on the other graphics card thread, I really hope the console makers put some kind of AI framework in the next consoles. Not just for pathfinding but for some more interactive things too. I just hope it isn't as shallow as Bethesda's Radiant AI.
 
Associate
Joined
3 May 2021
Posts
1,237
Location
Italy
Wasn't there already a ban on them buying a lot of things... yeah, here we go - https://www.sec.gov/ix?doc=/Archives/edgar/data/0001045810/000104581022000146/nvda-20220826.htm

A100, H100, A100X and "any future NVIDIA integrated circuit achieving both peak performance and chip-to-chip I/O performance equal to or greater than thresholds that are roughly equivalent to the A100"
I don't know how that translates to desktop GPUs, but presumably after a few generations even a mid-to-low-end card should have similar performance.
I meant on consumer electronics as well.
This is why Chinese companies are investing a lot in packaging: they expect to have to work with >14nm processes in the near future, and it's the only thing that can make that viable.
 
Soldato
Joined
9 Nov 2009
Posts
24,981
Location
Planet Earth
Well, I haven't played it, so I'm mainly going by the videos.
For me it wasn't the AI so much as the setting. Too clean for a run-down place, and clean means low-polygon and flat, without even bump mapping. If we were talking about a Star Trek: The Next Generation utopia that would be fine, but a gangster-filled dystopian city?

But talking about AI on the other graphics card thread, I really hope the console makers put some kind of AI framework in the next consoles. Not just for pathfinding but for some more interactive things too. I just hope it isn't as shallow as Bethesda's Radiant AI.

It's a beautiful game, and a lot of effort has been put into the art direction/design; different parts of the city are distinctive. Some parts definitely look far more run-down, if you venture to those areas. The character models are also very well done, especially the facial animation (which uses technology from JALI Research). But the rest of it feels empty once you go outside the main quests. In that sense Skyrim, Fallout, etc. seem like more lived-in and dynamic worlds. So as much as Bethesda Game Studios makes buggy games using janky, ancient technology, they still seem to do certain things better, and it made me appreciate some aspects of their games more.
 
Associate
Joined
2 Dec 2022
Posts
619
Location
-
I think a price drop on the 4080 to £1,000 will be inevitable within the next few months: people who want the best will prefer a 4090, and people who care about price/performance will prefer a 4070 Ti. The 4090 will probably only get a small price cut when the 4090 Ti is released and it's no longer top dog. I gave in and got a 4070 Ti after putting it off for a while because it looked like it was selling reasonably well at £800, so I'm not convinced it will get much of a price cut, if any, until the next-gen cards in 2024/2025. I was tempted by the 7900 XTX, but the power efficiency of the Lovelace cards meant that the overall price/performance was going to be much better with the 4070 Ti.
 
Soldato
Joined
9 Nov 2009
Posts
24,981
Location
Planet Earth
I think a price drop on the 4080 to £1,000 will be inevitable within the next few months: people who want the best will prefer a 4090, and people who care about price/performance will prefer a 4070 Ti. The 4090 will probably only get a small price cut when the 4090 Ti is released and it's no longer top dog. I gave in and got a 4070 Ti after putting it off for a while because it looked like it was selling reasonably well at £800, so I'm not convinced it will get much of a price cut, if any, until the next-gen cards in 2024/2025. I was tempted by the 7900 XTX, but the power efficiency of the Lovelace cards meant that the overall price/performance was going to be much better with the 4070 Ti.

According to TPU, in Cyberpunk 2077 the RTX 4070 Ti isn't any more efficient than a reference RX 7900 XTX.

Even an aftermarket RX 7900 XTX is not massively off an aftermarket RTX 4070 Ti.

They measure dGPU power consumption directly at the power connectors. Now, considering the RX 7900 XTX is using a 384-bit memory controller with 24GB of VRAM, and the RTX 4070 Ti is using a 192-bit memory controller with only 12GB of VRAM, a lot of that extra power consumption is probably the VRAM.

The RTX 4080 FE and RTX 4090 FE are the efficient SKUs in the Ada Lovelace line-up.
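
As a rough sketch of that VRAM argument (the per-module and PHY wattages here are illustrative assumptions, not TPU measurements), doubling the module count and bus width plausibly accounts for a couple of dozen watts:

```python
# Back-of-envelope for the memory subsystem power difference.
# All per-part figures are assumed values, not measurements.
GDDR6_WATTS_PER_MODULE = 2.0  # assumed draw of one 2GB memory module under load
PHY_WATTS_PER_64BIT = 3.0     # assumed controller/PHY cost per 64-bit channel

def memory_power(vram_gb, bus_bits):
    modules = vram_gb // 2    # 2GB per module
    channels = bus_bits // 64
    return modules * GDDR6_WATTS_PER_MODULE + channels * PHY_WATTS_PER_64BIT

rx7900xtx = memory_power(24, 384)  # 12 modules on a 384-bit bus
rtx4070ti = memory_power(12, 192)  # 6 modules on a 192-bit bus
print(f"7900 XTX memory subsystem: ~{rx7900xtx:.0f} W")
print(f"4070 Ti memory subsystem:  ~{rtx4070ti:.0f} W")
print(f"difference:                ~{rx7900xtx - rtx4070ti:.0f} W")
# -> roughly 21 W separating the two under these assumptions
```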
 
Associate
Joined
2 Dec 2022
Posts
619
Location
-
According to TPU, in Cyberpunk 2077 the RTX 4070 Ti isn't any more efficient than a reference RX 7900 XTX.

Even an aftermarket RX 7900 XTX is not massively off an aftermarket RTX 4070 Ti.

They measure dGPU power consumption directly at the power connectors. Now, considering the RX 7900 XTX is using a 384-bit memory controller with 24GB of VRAM, and the RTX 4070 Ti is using a 192-bit memory controller with only 12GB of VRAM, a lot of that extra power consumption is probably the VRAM.

The RTX 4080 FE and RTX 4090 FE are the efficient SKUs in the Ada Lovelace line-up.
The 4070 Ti is pushed hard out of the box compared to other cards; when you apply a sensible power limit it becomes one of the most efficient cards, with very little performance loss. Even if you don't limit the power manually, an in-game frame rate limiter largely achieves the same thing, and most people will be using one anyway to prevent screen tearing on a VRR display. I know the 12GB of VRAM will likely become limiting in a couple of years, but I'd most likely want to upgrade and sell it anyway when the next-gen GPUs are out. Another thing that swayed me is that it is massively overbuilt even on the cheapest £800 card, so it stays extremely quiet under load. Not forgetting DLSS, of course, which is still superior to FSR 2.1 and supported on a lot of older games that FSR isn't, and Reflex low latency, which is important in competitive games to keep input lag low in GPU-bound scenarios.
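
For a feel of why a modest power limit costs so little performance, here is a hedged sketch: the cube-root relationship is a common rule of thumb for the top of the voltage/frequency curve, and the 220 W limit is an assumed example, not a recommendation.

```python
# Rule-of-thumb sketch: near the top of the DVFS curve, dynamic power grows
# roughly as f*V^2 while clocks rise roughly linearly with voltage, so
# performance scales approximately with the cube root of board power.
STOCK_WATTS = 285  # RTX 4070 Ti nominal board power
LIMIT_WATTS = 220  # assumed user-applied power limit

power_ratio = LIMIT_WATTS / STOCK_WATTS
perf_retained = power_ratio ** (1 / 3)
print(f"power cut:       {1 - power_ratio:.0%}")
print(f"perf retained:   {perf_retained:.0%}")
print(f"efficiency gain: {perf_retained / power_ratio:.2f}x frames per watt")
# -> ~23% less power for ~8% less performance under this rule of thumb
```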


Also, it's not just about gaming power efficiency: my PC spends more time playing videos than gaming, and just look at the atrocious power consumption on the 7900 XTX.
[TPU chart: power consumption during video playback]
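
To put that playback gap in perspective, a rough cost sketch; the wattages, viewing hours and tariff below are illustrative assumptions, not TPU's figures:

```python
# Illustrative annual running-cost difference for video playback.
XTX_PLAYBACK_W = 80   # assumed RX 7900 XTX draw during video playback
ADA_PLAYBACK_W = 20   # assumed RTX 4070 Ti draw during video playback
HOURS_PER_DAY = 4     # assumed daily viewing time
PRICE_PER_KWH = 0.34  # GBP, roughly a 2023 UK capped tariff

extra_kwh = (XTX_PLAYBACK_W - ADA_PLAYBACK_W) / 1000 * HOURS_PER_DAY * 365
print(f"extra energy: {extra_kwh:.0f} kWh/year")
print(f"extra cost:   £{extra_kwh * PRICE_PER_KWH:.0f}/year")
# -> ~88 kWh and ~£30 a year under these assumptions
```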
 
Soldato
Joined
9 Nov 2009
Posts
24,981
Location
Planet Earth
The 4070 Ti is pushed hard out of the box compared to other cards; when you apply a sensible power limit it becomes one of the most efficient cards, with very little performance loss. Even if you don't limit the power manually, an in-game frame rate limiter largely achieves the same thing, and most people will be using one anyway to prevent screen tearing on a VRR display.


Also, it's not just about gaming power efficiency: my PC spends more time playing videos than gaming, and just look at the atrocious power consumption on the 7900 XTX.
[TPU chart: power consumption during video playback]

You can do the same with AMD cards too - their driver suite has had things like Radeon Chill since the Polaris days; it's been baked into their drivers for years.

Nvidia basically copied it. That TPU review is still using the launch press driver for the RX 7900 XTX figures (22.40.00.57 Press Driver, according to the test setup page). AMD already started fixing the idle power, and made Radeon Chill work better, a few weeks after launch.

It's around 15W~25W worse than the RTX 4070 Ti, and most likely it will improve over the next few months.

Also, as a person who has only built SFF systems since 2005, I have told people plenty of times that you need to look at the other parts in your system too (not just the CPU or GPU). For example:
1.) Motherboards can apply different voltage profiles, even at stock.
2.) Motherboards can also have inefficient VRM setups.
3.) A lot of the 80 Plus certifications don't cover lower loads, i.e., under 20% load, so you really need to check measured efficiency curves to make sure you don't get caught out.
4.) Monitors can vary in power consumption! It can be by as little as 10W or by even more. It's something reviewers don't really measure.

So even if you just look at the CPU and GPU idle figures, not properly looking at the other components can easily add dozens of watts to idle and low-load power consumption.
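
A small sketch of that point; the component draws and the PSU efficiency curve are made-up illustrative values, not measurements:

```python
# Illustrative low-load wall-power estimate: the same DC load pulls
# noticeably more from the wall once low-load PSU efficiency is factored in.
component_draw_w = {
    "cpu_idle": 10,
    "gpu_idle": 15,
    "motherboard_vrm_chipset": 15,
    "ram_fans_storage": 10,
}
dc_load = sum(component_draw_w.values())  # 50 W on the DC side

# Assumed efficiency curve for a 750 W PSU: (load fraction, efficiency).
curve = [(0.05, 0.70), (0.10, 0.80), (0.20, 0.87), (0.50, 0.90)]

def efficiency(load_w, rated_w=750):
    """Linearly interpolate PSU efficiency at a given DC load."""
    frac = load_w / rated_w
    for (f0, e0), (f1, e1) in zip(curve, curve[1:]):
        if frac <= f1:
            return e0 + (e1 - e0) * (frac - f0) / (f1 - f0)
    return curve[-1][1]

wall_w = dc_load / efficiency(dc_load)
print(f"DC load: {dc_load} W -> wall draw: {wall_w:.0f} W")
# -> ~50 W of components becomes ~68 W at the wall with this curve
```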

Nvidia inventory is increasing.

That can only be down to newer-generation cards also staying unsold. The last time this happened, IIRC, with Turing, Nvidia made a massive rejig of the whole range.

They basically relaunched the range by pushing the dGPUs one tier lower. This is why the RTX 2070, which had a TU106 chip, was replaced by the RTX 2070 Super with a TU104 chip.

The RTX 2060 6GB was then replaced by the RTX 2060 Super 8GB, which was essentially a cut-down RTX 2070 8GB. So what I can see happening is that Nvidia pushes tiers down another level.

So the AD104 RTX 4070 Ti 12GB gets replaced by an AD103 RTX 4070 Super 16GB, which is a cut-down RTX 4080 16GB. The AD106 RTX 4060 Ti 8GB gets replaced by an AD104 RTX 4060 Super 12GB, which is a cut-down RTX 4070 Ti 12GB, and so on. Also, an AD102 RTX 4080 Ti replaces the AD103 RTX 4080. It might stay at 16GB at that tier, though.

Personally, I think both companies could do with a price cut and a reorganisation of their ranges. They are basically chancing it this generation to see what they can get away with.
 