But isn't this because developers need to design their games around the specs most gamers already have? Developers had to stick to 4c/8t at most because that was the norm, similar to how little VRAM graphics cards used to have - for a long time Nvidia only put up to 1.5 GB on their highest-end cards.
If a developer made a game that required 8 cores and 4 GB of VRAM, hardly anyone would have been able to play it, so it wouldn't have sold. Hardware isn't meant to be built around current software demands; the point is that once better hardware becomes the norm, software developers can start utilizing those capabilities.
Almost all of them were limited by the consoles and their own bare-minimum mentality. Even hardcore PC developers like Bohemia (ArmA series) didn't care that much about their engine. Why bother when you could get away with less?
You could put it that way, I guess. VRAM is a little more complicated. As GPUs have got faster, VRAM has increased accordingly. There isn't much point in adding VRAM unless you have the GPU processing power to actually make use of it. In all my years of PC gaming (my first 'gaming' PC was a 486SX-25) I've never had a problem with a graphics card's VRAM (or lack of it).
Because developers never bother to offer higher-quality textures for their assets.