Is there any point in upgrading anymore?

Surely nvidia and ATI are going to have to persuade the game developers to start making more demanding games, or their sales are going to plummet once everyone realises there's no point in upgrading?


LOL. They do that with drivers.
 
I think the benefits from upgrading (assuming you have something reasonable to start with) aren't too great at the moment, because modern games are starting to be much better optimised in terms of multi-core support. This means that quad-core CPUs have (in gaming terms) a reasonable shelf life. It isn't like the old days, where clock speed (and performance per clock) was king and new games would really chug on old CPUs. In other words, even if modern games are more demanding on the CPU, some of the slack can be picked up by spare capacity on the other cores. Hence an (overclocked) Q6600, for example, can still cut the mustard in most cases.

As mentioned by others the fact we are nearing the end of a console cycle has an impact too, i.e. most games are multi-platform and therefore written with PS3/XB360 hardware in mind (DX9, modest memory/cpu requirements).
 
Once my GTX670 arrives, I won't be upgrading for a good while. Once I can't play new games at high settings above a constant 30 FPS, I'll look at getting another.
 
I upgraded my 5970 to the new 670 SLI, and I love it. I can play BF3 on ultra, and Shogun 2: Total War looks amazing too. I won't need to upgrade for at least 2 years now. But I do like to have all my settings at max, else I get that expensive itch.

Sadly I was the same, but it's a very expensive habit to have! I only justified it by getting a 120Hz monitor lol. Even then my computer turned out to be more of an expensive project than a gaming machine.
 
I think I'm waiting for the retina displays etc. and for cards that are on par to support them.

I hate it when BS marketing works. This guy obviously thinks retina is some kind of new tech instead of just marketing babble that means nothing. Makes me really mad :o
 
I don't think I see the point anymore :(
Got a Phenom X6, 16GB RAM, 2GB 6970.

And all I seem to be playing is ****** console games.

Is there ever going to be a game that uses the hardware I have?

Oh! In ten years... ok lmao

Get a better monitor & yes... there's definitely a need.

Gaming at 2560x1440 needs 2x 680 or a 690
 
I only have a 5770, and I'm not a hardcore gamer, so I probably won't be upgrading any time soon. I'm happy with its performance (high settings at 1080p in most games), and in a couple of years it's probably going to cost me, what, £50 to get a decent second-hand GPU?
In the past 5 years the performance of components has been insane. I did upgrade once, when I built a new PC 2 years ago. But now it has reached a point where daily tasks, applications and most games don't see a huge difference in performance from jumping one generation to the next.
 
Most PC screens are nowhere near that res yet, so he's kinda right.

Maybe so, but you don't need the same pixel density on a big monitor. Text is already crisp as it is. You only benefit from it on a tiny screen, where text (and lines, etc.) have to be drawn much more crisply. Even there, the so-called retina display might be overkill.
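For anyone curious, "retina" is really just shorthand for high pixel density, and that's easy to put a number on. A quick sketch (my own illustration, not from the posts above; the 3.5-inch phone figures are roughly an iPhone 4):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 27" 2560x1440 desktop monitor vs a 3.5" 960x640 phone screen
monitor = ppi(2560, 1440, 27)   # ~109 PPI
phone = ppi(960, 640, 3.5)      # ~330 PPI
print(monitor, phone)
```

Which is the point above: the big monitor is already perfectly readable at ~109 PPI, while a phone held close to your face needs roughly three times the density to look as crisp.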
 
I only upgrade when I consider it to be a big upgrade. For example, last year I had an E8600 and a 260 GTX in SLI; I'd had that rig for 3 years or so, and while it's still a decent setup today, the E8600 was showing its age a bit.

I figured a 2500K and a 480 GTX SOC was a big enough upgrade to warrant it, and it's been worth the money because I've had a good year of gaming out of it, and I feel it will be OK for another year or two at least.
 
I haven't upgraded for a while and don't really see the need.

I personally believe consoles have held back PC game development, and it's purely down to money. Why waste more time than you need making the PC version (presumably the least-purchased version, or the most-pirated one if we are to believe the media) look better than the console versions, when paying the staff depends on the console version selling? Far easier to optimise for the consoles, then shoot out a ropey PC port which may or may not work out.

I'd be interested to see what impact this has had on PC component sales.
 
Sadly I think this is true of so many modern games.

I would say it is only true of the "pop culture" tripe which the so called "AAA" devs market as being "The Next big thing".

The indies and smaller publishers are still producing games which are awesome.
 
If game developers would take the time to optimise their games, instead of throwing them out of the door asap, hardware would last a damn sight longer than it does now.

Quad support, HT support, SLI and Crossfire should all be standard these days.

Developers should even be considering 64-bit games. Blizzard did a 64-bit WoW client and the performance increase was huge, especially during chaotic events in raids.

RAM is cheap and most people are running 64-bit operating systems according to the latest Steam survey, so it's safe to say that most people have plenty of RAM sat doing nothing while gaming.
 
Nice thread :)

I'm in the same boat. I was going to take the plunge with the 7970, then didn't. Was going to take the plunge with the 680, then didn't. Last week I was about to go 670, but didn't. I think the overriding reason has been, like so many others have said, exactly what games would it be for? Even looking at the release schedule for the rest of 2012, there don't seem to be many titles on the horizon that you could guess would need the latest and greatest to play.

I recently got ME3 and Shogun; my 6970 runs them both beautifully at 1920x1200. I really am struggling to see the point in upgrading!
 
As many others have said, it all depends on the resolution.

At 1920x1200... yeah, there's no real need to upgrade from the previous generation, perhaps with the exception of Metro 2033 / Crysis 2.

Anything higher and you need more GPU power to maintain a solid 60 FPS... BF3 is wonderful at 2560x1440 on Ultra with 4xAA, FXAA & 4xSSTAA... but even a single GTX680 wouldn't be able to maintain a good frame rate at those settings...
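Some back-of-the-envelope arithmetic (my own illustration, not from the posts above) shows why that jump in resolution hits the GPU so hard:

```python
# Rough sketch: fill-rate and shading cost scale roughly with the
# number of pixels rendered per frame.
def pixels(width, height):
    return width * height

base = pixels(1920, 1080)   # 2,073,600 pixels (common 1080p)
qhd = pixels(2560, 1440)    # 3,686,400 pixels

# ~78% more pixels to shade every single frame, before any anti-aliasing.
ratio = qhd / base
print(f"{ratio:.2f}x")      # prints "1.78x"

# 4x supersampling (SSAA-style AA) shades each pixel four times over,
# multiplying the per-frame work again.
ssaa_samples = qhd * 4
```

So 2560x1440 with heavy AA isn't a marginal step up from 1080p, it's several times the shading work per frame, which is why it takes two 680s or a 690 to hold 60 FPS.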
 
You ever wonder why games are shorter than they used to be? Ten years ago you could spend days on the campaign, now it's over in six hours. That's because all the time went into the graphics.

Nah, not all games are made on shiny new engines. CoD is using the same old engine that developers are used to working with. UE3 is also a popular one, and all the games on it look pretty much the same. All three Stalker games run on the same engine and use the same technology.

I think a more plausible answer is "because they can afford it". CE2 and CE3 are there for the taking. The first never saw a truly good game besides Crysis.

Anyway, my 6970 struggles (it can't keep a solid 60 FPS at full details) with GTA 4, BF3, ArmA 2, Metro 2033, Skyrim and a couple more, on a single 1680x1050 monitor. I don't even bother to mention Eyefinity. :D
Still, no upgrade, not until AMD and nVIDIA bring the prices down and seriously increase the performance.
 

OP, no, there isn't any point in upgrading if you are using a single 1080p monitor.

But if you are using 2-4 screens and you love to game on all of them, then yes, you will need to push the graphics card every year for next-gen games to run at max settings and full res.

A 2-3 year upgrade cycle is a must if the developers keep on pushing their games with high-end tech.

I read somewhere that Microsoft are working on DX12 for 2013-14, so yeah, a next-gen card with DX12 is a must upgrade!
 