Is Crysis a reason to upgrade?

If you can get a good return on your current hardware, or you can't play at all, then I say yes. Otherwise just play at lower settings to get a smooth framerate.
 
IMO no one game is reason enough to upgrade. I know a lot of people did upgrade before Crysis came out and got sorely disappointed, as the poorly coded POS made the majority of systems look laughable.
 
I play Crysis very smoothly on all High at 1680x1050, so why would I need to upgrade? :)

I think people don't realise there are options other than "very high" in the menu and expect too much.
 
Crysis is the reason nVidia need to upgrade. But as said, lower the settings to high and it plays fine. Or use some custom configs and you can play close to Very High with an "acceptable" framerate.
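
For anyone wondering what those "custom configs" actually are: just a plain text file full of console variable tweaks. The lines below are the sort of thing the tweak guides pass around; I'm going from memory, so treat the exact CVar names and values as illustrative rather than gospel, and the notes after the arrows are for this post only, not for the file itself.

con_restricted = 0     <- unlocks the restricted CVars so the rest will actually take
r_DisplayInfo = 1      <- framerate counter, handy for seeing what each tweak costs
sys_spec = 3           <- start from the stock High preset...
r_sunshafts = 1        <- ...then switch on one of the Very High effects you'll actually notice
r_UseEdgeAA = 1        <- cheap edge anti-aliasing
r_MotionBlur = 0       <- turn off the expensive effects you won't miss
e_view_dist_ratio = 60 <- draw distance; lower numbers buy frames at the cost of pop-in

The whole point is you can mix and match: keep the eye candy you actually see, drop the stuff you don't, and the framerate comes back without spending a penny.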
 
If you're running everything else at the settings you want then no, I wouldn't consider spending hundreds of pounds on one game a good investment. If you were running everything on low and on the verge of upgrading anyway then yes.
 
I play Crysis very smoothly on all High at 1680x1050, so why would I need to upgrade? :)

I think people don't realise there are options other than "very high" in the menu and expect too much.

Indeed, played the game through on my GTX on High at 1680x1050 with no problems at all.

I think that is the main problem: we all knew we couldn't run Crysis maxed out on current tech, Crytek told us that... Yet people still try to run the game on Very High when it just isn't possible with today's hardware, unless you enjoy watching slide shows.
 
Is Crysis a reason to upgrade?

No, and it annoys me every time I see somebody wanting to upgrade just to play Crysis at 'max' settings or at a smoother framerate.
It's not difficult to just turn the settings down or tweak various other graphics settings in the game; it will still look good and run at a decent framerate.
And after all that, the gameplay isn't much good anyway, in my opinion.
 
As an aside, I don't remember any of this drama when Oblivion came out. People just accepted that you would need more grunt to play it, without all the drama and accusations of Bethesda ruining the coding or being liars.

Oblivion came out in March 2006 and no card out could run it at high framerates maxed out, even in Crossfire IIRC. It was only when I went from an X1900XT to an 8800GTS in February of last year that I got both maxed out settings and truly smooth framerates, not one or the other. So come on people, cut the drama.

In fact I suspect this thread was made just so easy could light the fuse and watch the action as usual.
 
As an aside, I don't remember any of this drama when Oblivion came out. People just accepted that you would need more grunt to play it, without all the drama and accusations of Bethesda ruining the coding or being liars.

Oblivion came out in March 2006 and no card out could run it at high framerates maxed out, even in Crossfire IIRC. It was only when I went from an X1900XT to an 8800GTS in February of last year that I got both maxed out settings and truly smooth framerates, not one or the other. So come on people, cut the drama.

In fact I strongly suspect this is why this thread was made: easy lights the firework and watches, as usual.


Great post mate.

respect:)


Last sentence is disappointing.
 
Great post mate.

respect:)


Last sentence is disappointing.
Just seems that way with you lately. You made a thread about Crysis and the CPU which turned into a flame war between the "Crysis fanboys" and the "Crytek haters"; you started another one the other night which got deleted; and now this, which is going to turn into a flame war, guaranteed.

It's a clear pattern, but I'd be the first to apologise if I am truly wrong.
 
Well, I get 20k+ in 3DMark06 and I play Crysis fine with everything on High plus a custom HDR config etc.

But if you think about it logically, spending thousands of pounds on one game is not worth it.
 
Well, I get 20k+ in 3DMark06 and I play Crysis fine with everything on High plus a custom HDR config etc.

This is the thing: people see Crysis running in these tech demo vids and expect it to run that well on their PC out of the box.

When it's tweaked with custom configs you can get optimal performance matched with great graphics, but your average user won't know how to do this, so they go and spend some £ on new graphics in an attempt to get better performance.
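
To be fair, applying one isn't rocket science either. As far as I remember you just save the tweaks as a plain text file in the Crysis install folder and name it autoexec.cfg so it runs at startup, or load it by hand from the in-game console (usually the ~ key) with something like:

exec mytweaks.cfg

Filename and the autoexec behaviour are from memory, so check a current tweak guide if it doesn't take; either way it's a lot cheaper than a new graphics card.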
 
I don't understand all this Crysis rubbish...

I think a better comparison would be Far Cry, not Oblivion.

That game ran like crap when it first came out; give it 6 or 7 months and we'll all be enjoying Crysis at 60fps+ for the most part.

We just need to wait for nVidia to release their proper next-gen cards, which will probably be around the end of summer or something...

There's nothing out yet that can run it at full anyway, so it's a pointless thread.
 
Yeah, I know. Crytek should make a patch with some decent configs or something in it; at the end of the day we are their customers. But you know, they sold the number of copies they wanted and they're happy. That's how the market goes.
 
As an aside, I don't remember any of this drama when Oblivion came out. People just accepted that you would need more grunt to play it, without all the drama and accusations of Bethesda ruining the coding or being liars.

Oblivion came out in March 2006 and no card out could run it at high framerates maxed out, even in Crossfire IIRC. It was only when I went from an X1900XT to an 8800GTS in February of last year that I got both maxed out settings and truly smooth framerates, not one or the other. So come on people, cut the drama.

In fact I suspect this thread was made just so easy could light the fuse and watch the action as usual.



I also don't recall the developers of Oblivion publicly stating that the game was being made with scalability in mind and would make proper use of multi-core CPUs and SLI/CrossFire. It's been proven that throwing GPUs at Crysis does next to nothing for performance; you only have to look at the tri-SLI benchmarks, where a third GPU (three 8800 Ultras in total) added a measly 7% performance gain.

Same with quad CrossFire: anything after two GPUs and the performance scaling just dies... and this is the game that was built with scalability in mind?
 