Relationship between PC specification & games

@starshshock.

Question 2 the OP asked: "... prebuilt gaming machines from £300 to £3000 will play all of the games reasonably well?"
You said yes, within reason, but a £300 machine is not going to play all games reasonably well, unless I'm misunderstanding the question.

If it were possible to quantify all the properly put-together machines from £300 to £3000 on a scale of 1 to 10, i.e. going up by £300 each time ... where would you need to be to ensure any game would play reasonably well? I realise it would only be a rough estimate.

Can an ordinary human being tell the difference between an 8 and a 10 (£2400 & £3000), or a 3 and a 5 (£900 & £1500)?
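The 1-to-10 scale described above maps to budgets like this (a trivial sketch; nothing here beyond the question's own arithmetic):

```python
# The 1-to-10 scale from the question, stepping up by £300 per point.
budgets = {tier: tier * 300 for tier in range(1, 11)}

for tier in (3, 5, 8, 10):
    print(f"tier {tier}: £{budgets[tier]}")  # tier 3 -> £900 ... tier 10 -> £3000
```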
 
If it were possible to quantify all the properly put-together machines from £300 to £3000 on a scale of 1 to 10, i.e. going up by £300 each time ... where would you need to be to ensure any game would play reasonably well? I realise it would only be a rough estimate.

Can an ordinary human being tell the difference between an 8 and a 10 (£2400 & £3000), or a 3 and a 5 (£900 & £1500)?

Manufacturers generally quantify gaming hardware by resolution nowadays; even games developers have started to give their minimum and recommended specs for each resolution (1080p, 1440p & 4K).

For a 'proper' gaming PC (i.e. new AAA games and games up to 5 years old, which is when Intel expects you to upgrade) then you need an entry-level 720p/1080p PC, which is a current-gen i3 or equivalent (approx £100) and a low-end graphics card (though laughably that's currently £200). All-in around £600-£700.

For a decent gaming experience (again, 1080p at a minimum), which might last a year or two at the top end with new AAA games, you need an i5 and a midrange graphics card (currently £400). All-in, around £800-£1000.

For 1440p and 4K, an i5 is still fine really, but you need a better graphics card. For 1440p at least a 3060 Ti or 6700 XT.

PCs have been like this for a long time: a decent gaming experience at the 'standard' resolution of the day is an i5 and a midrange GPU. The only inconsistency is that graphics cards are far too expensive, which pushes performance-per-pound down and will make a GPU last fewer years than previously.

As for the 900 & 1500 and 2400 & 3000, it really depends what experience you're going for. If you try to game at 4K on a £900 PC, then good luck! But from a value perspective the old rule of thumb is still true: you pay a big premium to have the best and it's rarely worth it financially (e.g. there's usually very little difference between an i7-12700K and an i9-12900K). Though, since graphics card prices are ridiculously inflated across the board, the high-end isn't as bad value as it used to be. Most reviewers have FPS-per-dollar/pound charts you can look at for that. It's almost always better to buy, say, a £1000 PC every 5 years (with what reviewers say are the value picks at the time) than to spend £3000 now and expect it to last 10.
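The FPS-per-pound metric those charts use is simple enough to sketch; the card names, prices and frame rates below are made-up placeholders, not benchmark data:

```python
# Value comparison in the style of reviewers' FPS-per-pound charts.
# All figures are illustrative placeholders, not real benchmarks.
cards = {
    "midrange-card": {"price_gbp": 400, "avg_fps_1080p": 90},
    "high-end-card": {"price_gbp": 1000, "avg_fps_1080p": 140},
}

for name, c in cards.items():
    fps_per_pound = c["avg_fps_1080p"] / c["price_gbp"]
    print(f"{name}: {fps_per_pound:.3f} FPS per £")
```

On these made-up numbers the midrange card delivers more frames per pound, which is the usual pattern such charts show.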
 
On the subject of heat ... what is it on a graphics card that produces so much of it that the card can't contain itself?

While everything else seems to be getting smaller, video cards are making their way towards being hooked up to an air conditioning unit.
 
The problem is the manufacturers are competing on performance. So they're always going to shove as much voltage through it as possible to gain performance. CPUs are the same. GPUs are just bigger.

You can play at a lower resolution and cap the framerate to restrain the GPU if you want to use less power and keep things cooler and quieter, without needing to buy a top-end GPU in the first place.
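A frame cap works by making each frame sleep away the unused part of its time budget instead of immediately rendering the next one; a minimal sketch (the render step here is just a placeholder):

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allotted to each frame

def run_capped(frames, render=lambda: None):
    """Render `frames` frames, sleeping out the remainder of each
    frame's budget so the GPU idles instead of racing ahead."""
    for _ in range(frames):
        start = time.perf_counter()
        render()  # placeholder for the real frame render
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)

t0 = time.perf_counter()
run_capped(frames=5)
total = time.perf_counter() - t0
print(f"5 frames in {total:.3f}s")
```

Real games do this in the engine or driver, but the principle is the same: time the GPU isn't rendering is power it isn't drawing.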

"Everything else seems to be getting smaller" isn't quite right. Manufacturing processes improve, but manufacturers just cram more into a GPU/CPU and push whatever voltage through it they want, instead of banking the efficiency gains. So performance improves and heat ends up the same.
 
You also get stupid stuff like this ... it makes it look more impressive than it is. And they charge loads extra for the privilege. It should be around £250.

I wonder if that principle works throughout the range ... if it were possible to make a video card that handled all the current games and a few future ones but was only the size of a modern hard drive, I'm not sure you would want to buy it.
 
If the excessive heat is being caused by ever higher voltage is it possible to explain the relationship between the higher voltage and a better picture ... the two things don’t naturally go together, do they?
 
I wonder if that principle works throughout the range ... if it were possible to make a video card that handled all the current games and a few future ones but was only the size of a modern hard drive, I'm not sure you would want to buy it.

They do, in the past this would have been the GT 1030 and RX 550, but since the low-end was abandoned for a few years, these cards have got a lot slower (relatively speaking) than they should have been at their price point.

Since the low-end is getting attention again now, the RX 6500 (for example) probably won't have the high clocks (and voltage) of the RX 6500 XT, and it will handle most modern games at playable(ish) frame rates while having no power connector.
 
I think you're probably overstating the aesthetic angle. A large number of gamers would happily buy a hard-drive-sized GPU if it played games well. However, it's not possible to match top-end GPU performance with a miniaturised version, as the chip and PCB themselves need a certain amount of room, and the coolers need to be a certain size to dissipate the heat generated. Cooling-wise, the only solution would be a very high-speed fan, which would generate a lot of noise. As technology advances, you can certainly match top-end performance from 5 or 10 years ago with smaller and more efficient cards, though.

There is a relationship between voltage and performance, because higher voltages let the GPU run at higher clock speeds. You reach a point of diminishing returns though, where for a given silicon chip any increase in voltage results in far more power use and heat generation than can be justified by a minor increase in clock speed. Many modern graphics cards sit quite far along that voltage/performance curve, which is why they are so power hungry.
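That diminishing-returns curve follows from the first-order model for switching power, P ≈ C·V²·f: power scales with the square of voltage but only linearly with clock speed. A rough sketch with illustrative numbers (real chips also leak current, and clocks don't scale neatly with voltage):

```python
# First-order CMOS dynamic power model: P is proportional to C * V^2 * f.
# Numbers are illustrative, not measurements of any real GPU.
def relative_power(v_scale: float, f_scale: float) -> float:
    """Power relative to stock when voltage and clock are scaled."""
    return (v_scale ** 2) * f_scale

# A 10% voltage bump that only buys a 5% clock increase:
print(relative_power(1.10, 1.05))  # ~1.27x the power for 1.05x the speed

# Undervolting 10% and giving up 5% of the clock:
print(relative_power(0.90, 0.95))  # ~0.77x the power for 0.95x the speed
```

This is why undervolting is popular: giving up a sliver of performance claws back a large chunk of the power and heat.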
 
If the excessive heat is being caused by ever higher voltage is it possible to explain the relationship between the higher voltage and a better picture ... the two things don’t naturally go together, do they?
The higher-end GPUs are physically larger chips with many more processing units on them, which allows them to process more data faster. Nvidia calls their main processing units CUDA cores: the bottom of the 3000 series, the RTX 3050, has 2,560 of them, while the top-end 3090 has 10,496. There's more complexity than just that (secondary processing units, clock speed, etc.), but that's the biggest difference. Four times as many cores requires roughly four times as much power to run.
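The core counts quoted above give the ratio directly (a one-line check of the arithmetic):

```python
# CUDA core counts from the post: RTX 3050 vs RTX 3090.
cores_3050 = 2560
cores_3090 = 10496

ratio = cores_3090 / cores_3050
print(f"{ratio:.1f}x the cores")  # ~4.1x, hence roughly 4x the power budget
```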

In theory you can turn all the settings on with a low-end card and it will look just as good, but it would be a slideshow. So the higher-end cards allow good performance with all the pretty turned up.

One thing that I don't think anyone has delved into in any detail is frame rates. Console games are tuned to run at set frame rates, usually 30 or 60 frames per second. PC games are not, and PC gaming monitors tend to run at much higher refresh rates, and these days will usually have variable refresh rates. So having more horsepower translates to higher frame rates, which makes for a much more fluid and responsive gaming experience. It's a difficult thing to explain, but playing a game at 100+ fps feels very different to playing at 30 fps, especially in fast-paced games.
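One way to make the fluidity point concrete is frame time, the gap between successive frames; a quick sketch:

```python
# Frame time: the delay between one frame and the next. Lower feels
# smoother and more responsive, which is what high FPS buys you.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 100, 144):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):.1f} ms per frame")
```

Going from 30 to 100 FPS cuts the wait for each new frame from about 33 ms to 10 ms, which is a big part of why fast-paced games feel so different.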
 
If the excessive heat is being caused by ever higher voltage is it possible to explain the relationship between the higher voltage and a better picture ... the two things don’t naturally go together, do they?

It's not just the voltage, it's the whole card. The non-XT versions are usually more efficient than the XT versions, for example (because of their lower clocks and voltage), but the die size, the memory, it's all a factor. As suggested above, they tend to push them harder to compete and that makes them less efficient, but they'd still use a lot of power. 4K computing just needs a lot of hardware to power it right now and that's a fact. In 10 years' time it won't need so much, but then we'll have 8K, so :D
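The "4K needs a lot of hardware" point is mostly raw pixel count; each frame at 4K has four times the pixels of 1080p, and 8K sixteen times:

```python
# Pixels the GPU must shade per frame at common gaming resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
```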

I personally don't think what they're doing right now is sustainable, I wouldn't be surprised if gaming monitors / PCs get locked at 4K by regulation, because this level of power draw is getting attention.
 
Thank you for going to all the trouble to enlighten me ... I feel more enlightened.

The thing that set me off was being informed an acceptable gaming PC now needed a water cooling system for the graphics card, along with a case full of illuminated cooling fans and a PSU using the same energy as a storage radiator.

I couldn’t understand how it related to the actual playing of a game.

While I can see how all the experienced technical people on here get pleasure from building, modifying and fine tuning gaming machines I don't think what they end up with is an example of what a newcomer needs to play games on.

I imagine the technical part is something technical people drift into over time
 
The thing that set me off was being informed an acceptable gaming PC now needed a water cooling system for the graphics card, along with a case full of illuminated cooling fans and a PSU using the same energy as a storage radiator.

You were incorrectly informed by whoever gave you that advice. Here are some rough ballpark figures for how much power the most power-hungry hardware uses.

An overclocked i9-12900K uses up to 400 W when fully stressed doing productivity tasks; it uses far less power when gaming. This level of power can be dissipated by the very best air coolers, but it is getting close to the edge of what is reasonable without very loud fans. 360 mm AIOs cool these chips with relative ease.
You get excellent gaming performance from lower-specced CPUs such as the i5 12600 and Ryzen 5800X, and these use a lot less power.

An RTX 3090 spikes up to 500 W when pushed to the max. It needs a huge heatsink and fan array, and parts of the card do get very hot, but it is certainly possible to cool it with air and thousands of people do just that. There's no need for a water-cooled setup, but once again it is getting close to the reasonable limit.
You can get excellent gaming performance from lower-specced GPUs such as a 3080/3070 Ti/3060, or even AMD 6000-series cards. I have an AMD 6800 and I have never seen it use over 200 W, and it gives very good performance.
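Those component figures are what PSU sizing works from: add up the loads and leave headroom for transient spikes. A back-of-the-envelope sketch (the wattages and the 1.5x margin are illustrative assumptions, not manufacturer guidance):

```python
# Rough system power estimate. All figures are illustrative assumptions.
components_w = {
    "cpu": 200,                   # gaming load, not a full stress test
    "gpu": 300,                   # sustained draw, not peak spikes
    "board_ram_drives_fans": 80,  # everything else, lumped together
}
TRANSIENT_MARGIN = 1.5            # headroom for brief GPU power spikes

load_w = sum(components_w.values())
suggested_psu_w = load_w * TRANSIENT_MARGIN
print(f"Estimated load: {load_w} W, suggested PSU: ~{suggested_psu_w:.0f} W")
```

The margin matters because modern GPUs can briefly spike well above their sustained draw, which is what trips an undersized PSU.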


It is rumoured that the next generation of GPUs will be chiplet designs, be much larger, and be a lot more power hungry and expensive at the top end. If the rumours are true we may see the next-gen GPUs using over 600 W, which may require a rethink in regard to cooling and may need new PSUs with a new-spec connector. Until they are actually released it is all just talk, so wait and see.
Of course you do not need to buy the very top end to get a very good gaming experience and the mid range cards generally use a lot less power.

You do not need watercooling and you do not need any flashy lights if you do not want them. The joy of PC gaming is partly the choice and variety to make the PC how you like it.
 