Soldato
- Joined
- 19 Oct 2002
- Posts
- 6,578
- Location
- Torbay, Devon, UK
Set mine to the highest settings, runs perfectly.
The OP is running SLI 670s, TwsT. Not sure how a 650W unit would cope with those and the CPU overclocked. But for reference, with the CPU in my sig and a GTX 670 @ 1310/7586MHz, my power draw at the wall is 340W in games such as BF3.

Mine runs it on high settings without any optimisations. Didn't know so many people were having problems.
3570k @ 4.5 single 680 @ stock
IRT drtoffnar
A 650W PSU should power any single-card system with a moderate overclock.
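As a rough sanity check of that claim for the SLI case (all figures here are ballpark assumptions, not measurements): taking the GTX 670's 170W reference board power and a generous budget for an overclocked quad-core plus the rest of the system, the headroom on a 650W unit can be sketched like this:

```python
# Rough PSU headroom estimate for SLI GTX 670s on a 650W unit.
# All wattage figures are ballpark assumptions, not measurements.
GPU_TDP_W = 170           # per GTX 670 (reference board power)
NUM_GPUS = 2
CPU_OC_W = 150            # overclocked quad-core, generous estimate
BOARD_DRIVES_FANS_W = 80  # motherboard, RAM, storage, fans

load_w = GPU_TDP_W * NUM_GPUS + CPU_OC_W + BOARD_DRIVES_FANS_W
psu_w = 650
headroom_pct = 100 * (psu_w - load_w) / psu_w

print(f"Estimated load: {load_w}W, headroom: {headroom_pct:.0f}%")
```

On those assumptions the SLI rig lands around 570W, so a quality 650W unit would likely cope, but with little margin once the CPU is overclocked, which matches the caution above for the dual-card case.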
People complain if it's too demanding, but then people also complain if it's not demanding enough!
First of all, I would say that the game has only just come out.
Wait for a few more driver revisions and patches and I'm sure performance will improve a lot.
I'm not even confident it has a fully working SLI profile yet.
Secondly, overclock your cards and CPU! You should be able to get those up to about 680 standards, and that CPU is designed to be overclocked, so get on it!
I run 680 SLI on an Ivy Bridge i7 at 4.4GHz and I'm not getting the performance I'd hoped for, though I am also running at 2560x1440. I'm sure that with new driver releases and patches the performance will improve.
I don't think that's the case. People who've spent thousands of pounds on their rigs should be able to feel comfortable knowing they can run almost anything without the pain of horrific frame drops. It's not a great deal to ask. I know it's one of the risks you take when deciding to be a PC gamer, but still... Sometimes I think the 'PC master race' is so obnoxious that we can't see when we're being ripped off...
True... but I felt the same when GW2 first came out. Yes, it isn't as graphically amazing, but it's far bigger.
Few patches and driver updates later and I run solid at 60fps maxxed.
People just need to give it time... but then they'd probably have completed the game by then!
I remember this being posted on here a while back; haven't used it myself though, as I have CoreTemp set to display load etc. on my G15 keyboard LCD.

First time I played this I just used the default settings; on my second run-through I'm using the Nvidia Experience profile. Haven't noticed much of a difference. I'm using AB to monitor temps and FPS, and all looks good. On my third play, however, I'll have a play around with the settings to see what my cards' and CPU's maximum capabilities are, and how far it goes before crashing.

Can someone tell me the best way to monitor CPU temps and load whilst in game?

I only play the single-player campaign, never online. Am also playing @1920x1080.
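For anyone who'd rather script this than use CoreTemp or AB's overlay, here's a minimal sketch of what those tools are doing under the hood. It uses only the Python standard library and Linux's `/proc`/`/sys` interfaces, so it's Linux-specific (on Windows, CoreTemp or Afterburner's on-screen display remain the usual route), and the `thermal_zone0` path is an assumption that varies by machine:

```python
# Minimal CPU load/temperature reader using Linux's /proc and /sys
# interfaces (stdlib only, Linux-specific).
import time

def cpu_load_percent(sample=0.5):
    """Overall CPU utilisation (%) over a short sample window."""
    def snap():
        # First line of /proc/stat: aggregate jiffies per CPU state.
        with open("/proc/stat") as f:
            fields = [int(x) for x in f.readline().split()[1:]]
        idle = fields[3] + fields[4]  # idle + iowait
        return idle, sum(fields)

    idle1, total1 = snap()
    time.sleep(sample)
    idle2, total2 = snap()
    busy = (total2 - total1) - (idle2 - idle1)
    return 100.0 * busy / max(total2 - total1, 1)

def cpu_temp_c():
    """First thermal-zone reading in Celsius, or None if unavailable."""
    try:
        with open("/sys/class/thermal/thermal_zone0/temp") as f:
            return int(f.read()) / 1000.0  # kernel reports millidegrees
    except (OSError, ValueError):
        return None

print(f"load={cpu_load_percent():.1f}%  temp={cpu_temp_c()}")
```

Run it in a loop in a second terminal (or log to a file) while the game is running; for an in-game overlay you'd still want AB's OSD or the G15 LCD trick mentioned above.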
To answer the OP's original question: no. What Crysis 3 is, is ground-breaking, and the new benchmark in gaming visuals.
A certain well known internet retailer that rhymes with Blamazon has a marketplace seller who has this new for £6. Too good to be true, surely?
Looks like it, think that offer just disappeared.
Either that or I'm blind.
Yes they were, and it was. Crytek admitted this and said Warhead expansion would be better optimized.
The game runs fine on console, but even at 1280x1080 resolution on PC it runs like a dog on high. I'm going to have to tweak a lot of settings to get it right, then eventually be able to push on to ultra, where it should be.
What are people on about, saying it's not poorly optimized? Yes it is. A game of this visual quality isn't going to release perfectly, is it? Patches and driver updates will vastly improve performance in the first few months; nobody could bring out a game this pretty and have it fully optimized out of the box...
As for Crysis 1 not being poorly optimized, are people drunk? It runs badly even on high end hardware.
You feel Crysis 1 runs poorly on your setup? I know it's still a great benchmark, but if you're having issues then something's wrong.