Discussion in 'PC Games' started by Neil79, May 30, 2012.
What are we thinking the specs will be to run this on Minimum and Maximum?
Depends on resolution. Also no doubt the RT features will kill fps.
Hard to say really, still a little early.
The trailers look so good that the in-game graphics will look rubbish in comparison, imo.
A complete guess but I reckon quad core/8GB/580 or 1060 for minimum and powerful octa core/>16GB/2080Ti for maximum.
For maximum, at a 4K/60 target, I'd say nothing short of two 2080 Tis, and ofc 16GB of fast RAM, a 9900K, etc.
Minimum has a lot of leeway. I wouldn't be surprised if something like an RX 570 could do 1080p 60fps with low/medium settings. If you wanted to go full potato, I'd imagine it would run even on a Ryzen APU at >30fps @ 720-900p.
It's being made for the current generation of consoles so I expect it would be similar in requirements to any other modern fps game released this year/early next year.
It is looking very good - I just hope it lives up to the hype.
I only hope they don't pull another Witcher 3: it looked utterly amazing during E3/Gamescom, but the released version had significantly inferior graphics.
Maybe on consoles. On PC you could always push it back up to E3 level - it's just that how many people had a Titan X in SLI to enable that? Most changes were simply stylistic in nature, and in some trailers you also had video editing to contend with, but overall you got the experience advertised - if you had the hardware to run it.
This is not what a downgraded game looks like!
The final version had fewer particle effects than the E3 build, such as smoke blending realistically into fire; a radically less detailed skybox; no volumetric clouds except in one small location; vastly lowered LOD at long distances; less complex meshes on buildings and rocks; and simplified vegetation and flora. You cannot turn these up to E3 levels. They did not release the RedKit editor or the Linux version as promised, either.
I doubt there'll be anything current that can run maximum well.
Just my 2 cents: asking for specs at this stage is entirely pointless, even as we all joke about it needing the biggest card possible. The reality is they don't even know. They likely have a target and will polish and optimise towards it, and that work probably won't be completed until later this year.
Any guesses thrown out here are entirely speculation based on almost no information.
"Cyberpunk 2077 is targeting current gen consoles" should give you an idea on whether it will be hugely demanding or not.
Am I going to need a new GPU if I'm running at 1440p 144Hz?
Depends on how much eye candy you're willing to forego, I'd have thought?
I have a GTX 1060 6GB laptop and I'm not looking to upgrade. If I can simply play it at all, I'll be happy.
I have a 1070 and I fully expect to play this at 100fps @ 1440P UW with all the settings maxed out.
Yeh, check my location.
Probably 1440p with max settings, but at 30fps.
I shouldn't have, but I chuckled way too much at your comment.
Well, the issue is the availability of higher-end GPUs in terms of pricing.
It's all well and good saying a Vega 64 won't cut it at the highest end, but the upgrade choices are pretty crap.
Would be surprised if the RTX 3070 doesn't offer a decent upgrade path, to be honest. It's what I have my eye on, and it might be the last GPU I get for a while.