
Cyberpunk 2077 Ultra performance

I think Atomic Heart will be that game - JHH showed off a scene from the game back in 2018 during the Turing reveal.


The 2020 trailer looks funky:
https://www.youtube.com/watch?v=FJ7cCN-DmFY


https://www.youtube.com/watch?v=kxdWSyoBcH0

I don't know if it will actually be a good game, but if people think Cyberpunk 2077 is the new Crysis, I think Atomic Heart will prove that theory wrong! :p

It's not looking good according to the minimum and recommended specs. Fingers crossed that these are just placeholders.

https://www.dsogaming.com/news/atomic-heart-early-official-pc-system-requirements/#more-149750

Atomic Heart Early Official PC System Requirements

Mundfish has shared some early official PC system requirements for Atomic Heart. According to the PC specs, PC gamers will at least need an Intel Core i5-2500K or AMD FX 6300 with 6GB of RAM.

Mundfish has listed the Nvidia GeForce GTX 780 and the AMD Radeon R9 290 as the minimum GPUs, and the game will require at least 22GB of free hard-disk space.

The team recommends using an Intel Core i7-4770K or Ryzen 5 1500X with 8GB of RAM. Moreover, Mundfish recommends using an Nvidia GeForce GTX 1060 or AMD Radeon RX 580.

Atomic Heart will be using Unreal Engine 4, and will support real-time Ray Tracing effects (there is an RT demo that you can download). As such, these early PC requirements may change. After all, this first-person shooter is not coming soon (at least from what we know so far).

Stay tuned for more.

Atomic Heart Early PC System Requirements
MINIMUM:
  • OS: WINDOWS® 7, 8, 8.1, 10 (64-BIT Required)
  • Processor: Intel Core i5-2500K or AMD FX 6300
  • Memory: 6 GB RAM
  • Graphics: Nvidia GeForce GTX 780 (3 GB) or AMD Radeon R9 290 (4 GB)
  • Storage: 22 GB available space
RECOMMENDED:
  • OS: WINDOWS® 7, 8, 8.1, 10 (64-BIT Required)
  • Processor: Intel Core i7-4770K or Ryzen 5 1500X
  • Memory: 8 GB RAM
  • Graphics: Nvidia GeForce GTX 1060 (6 GB) or AMD Radeon RX 580 (8 GB)
  • Storage: 22 GB available space
 
I assume you are looking at the VRAM requested rather than actually used, because if that is the case I had nearly a full 12GB "used" on my Titan XP in Final Fantasy 15 a few years ago. But that is not the actual usage. I think you need the latest Afterburner with a plug-in or something like that to see actual usage. @PrincessFrosty knows more about it.

Nah, I'm looking at both overall VRAM and the new Afterburner beta VRAM measurements.
 
Games needing more than 10GB is inevitable. We are just waiting to laugh at the people who thought it wasn't possible. Lol

Edit: And laugh at the notion that VRAM is hard tied to performance. Lol
Who thought it was not possible? Don't think anyone said that. But nice try.


Yeah it doesn't actually use much vRAM at all, I think my real vRAM usage was around 6-7GB. The latest MSI Afterburner doesn't need any modding or setting altering now: the real per-process vRAM usage is in the list of metrics, and you can enable both side by side to see real usage vs allocation. The game suffers from the same thing a lot of games do; they just arbitrarily request a large amount of vRAM based on how much you have available on your card, but the actual usage is way lower, especially if you have a high-vRAM card like 16-24GB.

I've not revisited the vRAM issue since the 3080 launch, but to my knowledge nothing is really using more than 10GB.
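If anyone would rather script that check than sit in Afterburner, here's a rough sketch using NVIDIA's NVML Python bindings (pip install nvidia-ml-py). To be clear, this isn't what Afterburner does internally, just an illustration of the card-wide figure vs the per-process figures; the per-process numbers can come back empty on some Windows driver setups.

```python
# Rough sketch: card-wide VRAM in use vs per-process usage via NVML.
# Assumes an NVIDIA card and the nvidia-ml-py package; per-process numbers
# can be unavailable (None) depending on OS/driver, hence the fallback to 0.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetMemoryInfo, nvmlDeviceGetGraphicsRunningProcesses,
)

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

mem = nvmlDeviceGetMemoryInfo(gpu)   # card-wide view, closer to "allocated"
print(f"VRAM in use: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")

# Per-process view, closer to what a game is really using
for proc in nvmlDeviceGetGraphicsRunningProcesses(gpu):
    used_gib = (proc.usedGpuMemory or 0) / 2**30
    print(f"PID {proc.pid}: {used_gib:.1f} GiB")

nvmlShutdown()
```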
Funny how some read what we are saying and what they take away from it is we will never need more than 10GB. Lol.


Nah, I'm looking at both overall VRAM and the new Afterburner beta VRAM measurements.
Cool. So in conclusion, if what you observed is right, Cyberpunk can use more than 10GB at 4K when RT is on and DLSS is disabled; however, that makes the game unplayable anyway. Let's see if any game that comes out this year actually uses more than 10GB before performance tanks like the example above. I would be surprised if it is more than a handful, as I predicted last year. I would actually be somewhat surprised if we even get a single one this year :p
 
Nope. I'm terrified of burn in. My PC is on 24/7 with work, gaming, TV, etc. I have an old 1440p/60Hz IPS panel (DGM 2701 https://www.tftcentral.co.uk/reviews/dgm_ips-2701wph.htm) that I've been meaning to upgrade for almost as long as my 3770k. Input lag is bad, but the panel is very nice. I've recently been playing with Dynamic Contrast Ratio that gives a little pop to colour and I have Corsair's LS100 lighting behind the panel to throw screen colours on to the wall.

I've been after HDR since it was first announced, but it looks like it's only now that it has become a properly supported standard?
Yup, I wouldn't use one for desktop usage; keep a monitor for desktop, work, browsing etc. and then OLED for all media/gaming.

Ooooooof, time to upgrade that display! It's no wonder games look so "cardboard"-like on that :p

Wouldn't bother with dynamic contrast either; you'll end up crushing blacks and/or blowing out bright highlights. Backlighting does help give a perceivable boost to the contrast ratio though, more so for LCDs.

In terms of HDR, pretty much any OLED supports the main ones: Dolby Vision, HDR10, HDR10+ etc. Obviously with LCD you'll need to read more into it, as you get many "fake" HDR monitors, i.e. they only support the bare minimum of the spec, such as brightness. To see HDR's real benefits you need OLED or a really good FALD LCD display.
 
We already know not one game needs over 10GB to date anyway. Trust me, there are a few users out there waiting for the day so they can post in the '10GB is not enough' thread. Lol.

Lol.

Call of Duty Black Ops maximum settings with the HQ Texture pack says hi.

Has anyone tried out Call of Duty Black Ops Cold War? I noticed the game has an optional HQ Texture pack download, and I'm seeing 16GB total video memory usage at 5120x1440, even with 85% max video memory usage set in the game options. The OS is using about 750MB of video memory, so the game appears to be using at least 15GB.
[attached screenshot: 9POGYnx.jpg]

The above is using the Dedicated video memory allocation metric from HWINFO64.

EDIT

I know someone will ask for it, so here's the per-process video memory usage from MSI Afterburner. I'd only been playing for a minute or so, but actual usage was already past 14GB.

[attached screenshot: xkec6Jg.jpg]

EDIT 2

Also, you don't need this per-process monitoring; you can just check what video memory is in use by the OS prior to launching the game, and then subtract that from the dedicated usage shown in HWINFO64. I've validated the results against MSI Afterburner's per-process figure and it's spot on.
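To make that subtraction concrete, here's a tiny worked example using the rough numbers from the screenshots above (treat it as an illustration, not a precise measurement):

```python
# Worked example of the baseline-subtraction method described above.
# The figures are approximate readings, not exact measurements.
baseline_gb = 0.75   # dedicated VRAM used by the OS/background apps before launch
in_game_gb = 16.0    # dedicated VRAM reported (HWINFO64) while the game was running

game_usage_gb = in_game_gb - baseline_gb
print(f"Approximate game VRAM usage: {game_usage_gb:.2f} GB")  # ~15.25 GB
```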

You start alt-tabbing when a game is using that much video memory with a browser window or two open in the background, and it's not a pleasant experience going back to the game afterwards either.

The game has an option where you can limit VRAM usage to a maximum of 70% of what you have, but it starts reducing texture quality and you get some pop-in. Not nice at all if your GPU is lacking girth in the video memory department. :p
 
Who thought it was not possible? Don't think anyone said that. But nice try.

You're right, I exaggerated; it was 8GB. In the early days of the thread, posters were talking like games wouldn't grow in size, but that argument was quickly nipped in the bud. However, there are some people who seem to think games growing in size is a rare occurrence rather than the norm. *Insert pointless comment about selling card*

The irony being that with Nvidia reshuffling their line up to have more VRAM, the size increase in games is going to happen a lot sooner than I expected.

Funny how some read what we are saying and what they take away from it is we will never need more than 10GB. Lol.

I didn't have time to read his(?) essay, I had to go for my morning run ;).
 
The irony being that with Nvidia reshuffling their line up to have more VRAM, the size increase in games is going to happen a lot sooner than I expected.

Does it matter though when the current Ampere GPU is going to be left in the dust with the next gen release, Lovelace/Hopper? Even 1TB of VRAM is not going to help Ampere.
 
Lol.

Call of Duty Black Ops maximum settings with the HQ Texture pack says hi.



You start alt-tabbing when a game is using that much video memory with a browser window or two open in the background, and it's not a pleasant experience going back to the game afterwards either.

The game has an option where you can limit VRAM usage to a maximum of 70% of what you have, but it starts reducing texture quality and you get some pop-in. Not nice at all if your GPU is lacking girth in the video memory department. :p


I play Black Ops with the settings on 90% memory allocation so the game uses 22GB of VRAM, beat that :p

This is now a contest to see who can have the most VRAM allocated by a game
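For what it's worth, that slider is just a percentage of whatever card you have, so the numbers line up if you do the arithmetic. A quick sketch (the 24GB card for the 90% case is my assumption, not something stated above):

```python
# Back-of-the-envelope for the in-game "max video memory usage" slider.
# The card sizes below are assumptions, used purely for illustration.
def allocation_target_gb(card_vram_gb: float, slider_pct: float) -> float:
    """VRAM the game will try to fill for a given card size and slider setting."""
    return card_vram_gb * slider_pct / 100.0

print(allocation_target_gb(24, 90))  # 21.6 -> roughly the ~22GB reported above
print(allocation_target_gb(10, 70))  # 7.0  -> what the 70% option would leave a 10GB card using
```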
 
It's not looking good according to the minimum and recommended specs. Fingers crossed that these are just placeholders.

https://www.dsogaming.com/news/atomic-heart-early-official-pc-system-requirements/#more-149750

Probably placeholders, but it could just be for the version without the RT effects.

I think the same. And I think the big innovations will come from consoles, because there you need to make beautiful games with far fewer resources, so they will have to use a lot of tricks to make good-looking games. Tech companies make money from selling you their latest products, so the games they sponsor will only work well on their latest products. (Nvidia are masters here; hell, CP only works well on the top 3000 series cards, and that is if you enable DLSS :D ). Sony and MS make the most money by selling huge numbers of games; they make almost no money from selling the hardware.

Sony and MS have huge investments in the hardware, so over time they need to get the most out of what they have. With PC, it's more about the period between new hardware releases, which are more frequent.
 
Cyberpunk made me get a 65CX tbh, since I already had a 2080 for RTX. Had the situation been different (a 5700 XT, for example) I would have had to buy a new GPU too, probably postponing the TV.

So yeah, Jensen was right, the more you spend the more you save :D

Just kidding about the Jensen quote, but it's funny to see people get riled up about silly things
 
I play Black Ops with the settings on 90% memory allocation so the game uses 22GB of VRAM, beat that :p

This is now a contest to see who can have the most VRAM allocated by a game
Does not surprise me, as the Texture pack really bumps up the video memory usage. It would be interesting to know what is actually required to maintain the texture quality with no hitching, no degraded textures and no texture pop-in.

Anyway this is the Cyberpunk thread so I'll bow out here to keep it on topic.
 
Looking at these threads and Reddit, it feels like there's a sliding scale of hardware capability vs people having different types of issues.

Stupid cards (3090 etc.): least issues
Very good cards (2080/3070/3080): some issues
Lower cards: more issues depending on features used

Just wondering if the VRAM in the stupid cards stops certain bugs from happening or crippling performance in CP2077. Don't think I've had any of those outside of one location in-game which is hard to render.
 
The only thing I've really said is that the RT in Cyberpunk is not that great, as the game was not developed to take full advantage of the hardware currently available (because it's still based off the non-RT version and reduced in quality to allow RT to work with it). This leaves the game in a funny place where some things are better with it and some are better without.
But that pattern exists throughout the entire game world itself and it's not specific to just RT.

It's not even about AMD vs Nvidia, but some just like to pick up on it. My sig still says Vega 64 and multiple times already I have been told I can't enjoy the game or it looks bad to me because I don't have the "hardware".
A non-issue, and everyone is entitled to their opinion and I respect yours. I am an older gamer, so I tend to look positively at newer techs and accept that there will be instances of error, but I still look at the pros and cons with a leaning toward the pros. I remember TressFX in Tomb Raider and was blown away by how good hair could look and wanted to see more of it. There were times Lara's hair would fall into her shoulders, which could break the immersion, but at the same time I looked at it as a new tech and loved what I was seeing. I do the same with RT in CP, and with over 100 hours played I still stop and look at reflections and mince about moving around to see how they react.

The biggest issue I see today is "you are with us or against us" and that leaves no room for valid discussions/middle ground. RT for me is awesome in CP2077 and I am loving what I am seeing and I want AMD users to experience the same thing that I can.
 
A non-issue, and everyone is entitled to their opinion and I respect yours. I am an older gamer, so I tend to look positively at newer techs and accept that there will be instances of error, but I still look at the pros and cons with a leaning toward the pros. I remember TressFX in Tomb Raider and was blown away by how good hair could look and wanted to see more of it. There were times Lara's hair would fall into her shoulders, which could break the immersion, but at the same time I looked at it as a new tech and loved what I was seeing. I do the same with RT in CP, and with over 100 hours played I still stop and look at reflections and mince about moving around to see how they react.

The biggest issue I see today is "you are with us or against us" and that leaves no room for valid discussions/middle ground. RT for me is awesome in CP2077 and I am loving what I am seeing and I want AMD users to experience the same thing that I can.

I think you forgot the majority of Nvidia users as well as the AMD users. CDPR added RT but forgot that the majority will only see the very average-looking game they created for everyone else.
 
‘Average looking game’ lol. Talk about hyperbole.

Even without RT it's one of the best looking games on the market. I've played all the ‘lookers’ both on console and PC but hey, gotta **** on the product so people see I mean business while supporting my failed narrative.

:D
 
A non-issue, and everyone is entitled to their opinion and I respect yours. I am an older gamer, so I tend to look positively at newer techs and accept that there will be instances of error, but I still look at the pros and cons with a leaning toward the pros. I remember TressFX in Tomb Raider and was blown away by how good hair could look and wanted to see more of it. There were times Lara's hair would fall into her shoulders, which could break the immersion, but at the same time I looked at it as a new tech and loved what I was seeing. I do the same with RT in CP, and with over 100 hours played I still stop and look at reflections and mince about moving around to see how they react.

The biggest issue I see today is "you are with us or against us" and that leaves no room for valid discussions/middle ground. RT for me is awesome in CP2077 and I am loving what I am seeing and I want AMD users to experience the same thing that I can.
Ironic that you replied to my comment; I have a 3070. I think you are part of the "us vs them" since you already decided to create that barrier once again.

Doesn't like the graphics in Cyberpunk? Must be an AMD user.
 
I think you forgot the majority of Nvidia users as well as the AMD users. CDPR added RT but forgot that the majority will only see the very average-looking game they created for everyone else.
Well this is where we disagree and I see it as a graphically impressive game and bugs aside (there are many), I am blown away by the visuals.
 
Ironic that you replied to my comment; I have a 3070. I think you are part of the "us vs them" since you already decided to create that barrier once again.

Doesn't like the graphics in Cyberpunk? Must be an AMD user.
I replied to your post as it fitted my train of thought. There is nothing at all I accused you of and in the first line, I said I respect your opinion. Not much I can say to someone who doesn't even read or digest what someone else has written.
 
I replied to your post as it fitted my train of thought. There is nothing at all I accused you of and in the first line, I said I respect your opinion. Not much I can say to someone who doesn't even read or digest what someone else has written.


I read it and said it was ironic because of the below; even though you may have said this in jest, others will feed off it, and my ability to even try to discuss performance and features in this game is lost.

Maybe if you could run RT, you would actually like it.......

:D :D :D
 
I'm not seeing any "you are with us or against us"; all I'm seeing are RTX and/or Nvidia diehard fans ignoring people's posts simply stating that, without RTX, Cyberpunk is terrible compared to what has been done many times before in other games. As soon as someone posts any screenshots/evidence showing games like Alien Isolation, RDR 2 reflections etc., the RTX PR team come in and go "but but but, RTX looks so much better in Cyberpunk, anything non-RTX sucks massively"....

Then when asked for a simple yes or no, you get a huge TLDR drivel post dissecting different methods/optimisations etc., i.e. not relevant to the simple question that has been asked: "forget RTX, does Cyberpunk's default SSR look as good as what has been done in multiple games over the last few years?"


Like I posted a page or 2 back, it's exactly like monitor PR teams trying to highlight differences:

[attached image: OVA8E1X.png]

When in reality, the difference isn't that big unless the product/image has been sabotaged in order to make the advantages of said new item/tech stand out even more.
 