Cyberpunk 2077 Ultra performance

I assume you are looking at the VRAM requested rather than what is actually used, because if that is the case I had nearly a full 12GB "used" on my Titan XP in Final Fantasy 15 a few years ago. But that is not the actual usage. I think you need the latest Afterburner with a plug-in or something like that to see actual usage. @PrincessFrosty knows more about it.

It could be that it does go above 10GB with RT on and no DLSS as you say, though. But that is pointless, as you run out of performance way before VRAM.

Yes and no. I have certainly seen instances where a memory leak in a game tanked performance, and once it was fixed performance was fine. So running out of VRAM isn't a pleasant experience IMHO! Plus if you do like modding games, extra VRAM is certainly useful.

Having said that, you can manage settings if it's a problem, I suppose.
 

Correct, because you have been repeating the same stuff over and over, yet you still have not realised what people are actually saying and rush to defend.

You are of the opinion that only AMD owners could say anything bad about how Cyberpunk uses RT and non-RT techniques in game, resulting in a mixed bag of better and worse.
Yet you just spam "RT is better, AMD owners can't see it", followed by a random quote from Google that actually backs up everyone else's argument.
 
Cyberpunk 2077 remains popular, taking the number 1 seller spot on Steam for the 7th week running.

Cyberpunk also brought a swathe of new users to the Steam platform. Cyberpunk has been a massive hit in China - millions of new accounts were created to play the game. Steam has reported a 13% increase in active users, most of whom have come from China.

The influx of Chinese gamers also skewed the December hardware survey - AMD CPUs and GPUs took a significant market-share nosedive. This is because AMD remains unpopular in China, and a 13% increase in users (most of whom are using Intel+Nvidia) widened the gap.
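To show what that dilution looks like in numbers, here is a minimal sketch; the starting share and the vendor mix of the new accounts are made-up figures for illustration, not Valve's actual survey data.

```python
# Illustrative only: the starting share and the AMD share among new users
# below are assumptions, not figures from the Steam hardware survey.

def diluted_share(old_share: float, growth: float, new_user_share: float) -> float:
    """Vendor share after the user base grows by `growth` (0.13 = 13%),
    where `new_user_share` is that vendor's share among the new users only."""
    return (old_share + growth * new_user_share) / (1.0 + growth)

# Suppose AMD GPUs held 16% of the survey before the influx, and only 5% of
# the new (mostly Intel+Nvidia) accounts use AMD GPUs.
before = 0.16
after = diluted_share(before, growth=0.13, new_user_share=0.05)
print(f"AMD GPU share: {before:.1%} -> {after:.1%}")  # ~16.0% -> ~14.7%
```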
 
You are of the opinion that only AMD owners could say anything bad about how Cyberpunk uses RT and non-RT techniques in game, resulting in a mixed bag of better and worse.
Yet you just spam "RT is better, AMD owners can't see it", followed by a random quote from Google that actually backs up everyone else's argument.

There is a pattern of certain AMD owners and/or pro-AMD people, several of whom have bought that mysterious AMD RX3090..., who talk RT down. When challenged they slink back to criticising RT in current games, but as soon as they think people aren't paying attention they go back to slinging mud at RT in general. It is very transparently because AMD isn't currently in a good place RT-wise, and you can guarantee that when things change they will suddenly be super pro-RT and hyping it left, right and centre.
 
Correct, because you have been repeating the same stuff over and over, yet you still have not realised what people are actually saying and rush to defend.

You are of the opinion that only AMD owners could say anything bad about how Cyberpunk uses RT and non-RT techniques in game, resulting in a mixed bag of better and worse.
Yet you just spam "RT is better, AMD owners can't see it", followed by a random quote from Google that actually backs up everyone else's argument.

I am working based on the evidence and my personal knowledge. My opinion is that you know you are lying.
 
There is a pattern of certain AMD owners and/or pro-AMD people, several of whom have bought that mysterious AMD RX3090..., who talk RT down. When challenged they slink back to criticising RT in current games, but as soon as they think people aren't paying attention they go back to slinging mud at RT in general. It is very transparently because AMD isn't currently in a good place RT-wise, and you can guarantee that when things change they will suddenly be super pro-RT and hyping it left, right and centre.
The only thing I've really said is that the RT in Cyberpunk is not that great, as the game was not developed to take full advantage of the hardware currently available (because it's still based on non-RT rendering, reduced in quality to allow RT to work with it). This leaves the game in a funny place where some things are better with RT and some are better without.
But that pattern exists throughout the entire game world itself and is not specific to just RT.

It's not even about AMD vs Nvidia, but some just like to pick up on it. My sig still says Vega 64, and multiple times already I have been told I can't enjoy the game, or that it looks bad to me, because I don't have the "hardware".
 
Yes and no. I have certainly seen instances where a memory leak in a game tanked performance, and once it was fixed performance was fine. So running out of VRAM isn't a pleasant experience IMHO! Plus if you do like modding games, extra VRAM is certainly useful.

Having said that, you can manage settings if it's a problem, I suppose.
Which part is no? I am not saying running out of VRAM is a good thing, am I? All I am saying is that there are no games out there yet where you run out of VRAM on a 3080 before performance tanks, outside of maybe modding.

As you say though, it's not hard to manage via settings on the odd occasion, if ever needed, before more powerful next-gen cards are out either later this year or next year :)

Hopefully the next-gen cards have a much bigger boost in RT performance, as I was expecting more from Ampere. We really need to get to a point (at least in the games currently out) where turning RT on maintains fps rather than tanking it.
 
Which part is no? I am not saying running out of VRAM is a good thing, am I? All I am saying is that there are no games out there yet where you run out of VRAM on a 3080 before performance tanks, outside of maybe modding.

As you say though, it's not hard to manage via settings on the odd occasion, if ever needed, before more powerful next-gen cards are out either later this year or next year :)

Hopefully the next-gen cards have a much bigger boost in RT performance, as I was expecting more from Ampere. We really need to get to a point (at least in the games currently out) where turning RT on maintains fps rather than tanking it.

With the 8GB issue we thought the same, but there are times when I have seen it happen. I just feel Nvidia were a bit stingy this generation so far, but AMD is no better, as they want to charge a premium for the extra VRAM despite having the RT performance deficit, etc.

But then I am the kind of person who would keep such an expensive GPU for 3-4 years, so it's probably more of a concern for me than for many here! Maybe I need to wait for the Super refreshes and whatever AMD has to counter with! :p
 
With the 8GB issue we thought the same, but there are times when I have seen it happen. I just feel Nvidia were a bit stingy this generation so far, but AMD is no better, as they want to charge a premium for the extra VRAM despite having the RT performance deficit, etc. But then I am the kind of person who would keep such an expensive GPU for 3-4 years, so it's probably more of a concern for me than for many here! :p
I agree they were stingy. But had they put more on, they would have charged more. I am happy they did not, as otherwise I would probably not have gone for a 3080, so it worked out for me. At the end of the day they have an option for people who want more VRAM with the 3090, and soon a 3080 Ultra or some crap; those people can feel free to cough up the extra money.

I have zero plans on keeping the GPU for that long. As soon as I get a whiff of next-gen cards coming I will likely sell this to break even, or at worst lose £100. By then I will have played all the graphically demanding games and can go back to a cheap GPU for 6 months or so while playing the less demanding games in my Steam library. This is probably when I will buy something like Hades, which I hear is a great game.
 
I agree they were stingy. But had they put more on, they would have charged more. I am happy they did not, as otherwise I would probably not have gone for a 3080, so it worked out for me. At the end of the day they have an option for people who want more VRAM with the 3090, and soon a 3080 Ultra or some crap; those people can feel free to cough up the extra money.

I have zero plans on keeping the GPU for that long. As soon as I get a whiff of next-gen cards coming I will likely sell this to break even, or at worst lose £100. By then I will have played all the graphically demanding games and can go back to a cheap GPU for 6 months or so while playing the less demanding games in my Steam library. This is probably when I will buy something like Hades, which I hear is a great game.

The problem is I need something at least at the level of an RTX2060/RX5700 with 8GB of VRAM minimum, and so far even those GPUs are stupid money! :(

Otherwise I might have tried doing that myself.
 
The problem is I need something at least at the level of an RTX2060/RX5700 with 8GB of VRAM minimum, and so far even those GPUs are stupid money! :(

Otherwise I might have tried doing that myself.
Yeah, not an issue for me. I can go back to an RX 580 or GTX 1070. Most of my Steam library will work just fine with those cards. Works for me, as I don't play online, so I am flexible and can pick which games I want to install and play :)
 
Yeah, not an issue for me. I can go back to an RX 580 or GTX 1070. Most of my Steam library will work just fine with those cards. Works for me, as I don't play online, so I am flexible and can pick which games I want to install and play :)

I mostly play single-player games too, but I do mod games, so I want a certain minimum level of performance and VRAM for the long-term ones.

The few online games I play are probably more CPU-limited and don't really need an uber GPU anyway. However, second-hand GPU pricing is screwed up currently - a GTX1070 goes for between £240 and £300.
 
I mostly play single-player games too, but I do mod games, so I want a certain minimum level of performance and VRAM for the long-term ones.

The few online games I play are probably more CPU-limited and don't really need an uber GPU anyway. However, second-hand GPU pricing is screwed up currently - a GTX1070 goes for between £240 and £300.
RX 580 it is then :p

Besides, we both know you could get a 1070 for under £200 on the members market. One sold for £170 recently, and even that was way overpriced.

If second-hand prices are silly when the time comes, I would easily break even selling, and I could pick up an RTX 3060 brand new for £369, which would not depreciate much if second-hand prices are still that silly at the time.
 
I assume you are looking at the VRAM requested rather than what is actually used, because if that is the case I had nearly a full 12GB "used" on my Titan XP in Final Fantasy 15 a few years ago. But that is not the actual usage. I think you need the latest Afterburner with a plug-in or something like that to see actual usage. @PrincessFrosty knows more about it.

It could be that it does go above 10GB with RT on and no DLSS as you say, though. But that is pointless, as you run out of performance way before VRAM.

Yeah, it doesn't actually use much VRAM at all; I think my real VRAM usage was around 6-7GB. The latest MSI Afterburner doesn't need any modding or setting altering now - the real per-process VRAM usage is in the list of metrics, and you can enable both side by side to see real usage vs allocation. The game suffers from the same thing a lot of games do: they just arbitrarily request a large amount of VRAM based on how much you have available on your card, but the actual usage is way lower, especially if you have a high-VRAM card like 16-24GB.

I've not revisited the VRAM issue since the 3080 launch, but to my knowledge nothing is really using more than 10GB.
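For anyone who wants to poke at the allocation-vs-usage distinction outside of Afterburner, here is a rough sketch using NVIDIA's NVML Python bindings. It assumes the pynvml package is installed, and per-process figures can come back empty on some Windows driver modes, so treat it as a sanity check rather than a definitive tool.

```python
# Rough sketch using the pynvml bindings (pip install pynvml).
# Compares the card-wide memory counter (closer to "allocated/requested")
# with per-process usage, which is nearer to what a single game occupies.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)      # first GPU
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)       # whole-card view
    print(f"Card-wide used: {mem.used / 1024**3:.1f} GB of {mem.total / 1024**3:.1f} GB")

    # Per-process breakdown; usedGpuMemory may be None on some systems.
    for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
        used = proc.usedGpuMemory
        used_gb = f"{used / 1024**3:.1f} GB" if used else "n/a"
        print(f"  pid {proc.pid}: {used_gb}")
finally:
    pynvml.nvmlShutdown()
```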
 
Yes, most are nowhere near 10GB. I've only seen a couple go over 9GB, one being Cyberpunk. Even a graphically awesome game like RDR2 does not use much from what I can see.

We already know not one game needs over 10GB to date anyway. Trust me, there are a few users out there waiting for the day so they can post in the "10GB is not enough" thread. Lol.

Games needing more than 10GB is inevitable. We are just waiting to laugh at the people who thought it wasn't possible. Lol

Edit: And laugh at the notion that VRAM is hard-tied to performance. Lol
 
The consoles already have access to 12GB of VRAM for the game's assets, so why would developers not make games that need at least 12GB of VRAM? The current 3070 and 3080 won't age like fine wine.
 
The consoles already have access to 12GB of VRAM for the game's assets, so why would developers not make games that need at least 12GB of VRAM? The current 3070 and 3080 won't age like fine wine.

Why 12GB? The Xbox memory has a performance split between a faster 10GB pool and a slower 6GB pool. The PS5 has no split, but assuming 6GB for the OS, housekeeping and game code/data, leaving 10GB for graphics, again doesn't seem unreasonable. Plus the big performance gain for graphics data comes from streaming.
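As a quick back-of-the-envelope version of that budget (the 6GB reserved figure is the assumption from the post above, not an official platform number):

```python
# Rough console memory budgets; the reserved figure is an assumption for
# illustration, not an official platform specification.
XSX_FAST_POOL = 10        # GB in the faster pool on Series X (plus 6 GB slower)
PS5_TOTAL = 16            # GB of unified memory
ASSUMED_RESERVED = 6      # GB assumed for OS, housekeeping and game code/data

print(f"Series X fast pool available for graphics data: ~{XSX_FAST_POOL} GB")
print(f"PS5 graphics budget: ~{PS5_TOTAL - ASSUMED_RESERVED} GB "
      f"(assuming ~{ASSUMED_RESERVED} GB reserved)")
```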
 
It's going to be the same with RDNA2 and Ampere. If RDNA3 and Hopper land next year, expect RDNA2 and Ampere to suffer the same fate. Instead of trying to maximise performance, Nvidia and AMD will use the brute-force method to emphasise the improvements in their new GPUs. This is the big thing holding back PC gaming - BOTH Nvidia and AMD need to push new hardware sales, so realistically performance is being left on the table at the altar of new GPU sales.

Many tech companies play these tricks to sell their latest and greatest - even Apple makes sure newer versions of iOS start eating up more RAM, so they can emphasise how much better their new generation is.

I think the same. And I think the big innovations will come from consoles, because there you need to make beautiful games with far fewer resources, so developers will have to use a lot of tricks to make good-looking games. Tech companies make money from selling you their latest products, so the games they sponsor will only work well on their latest products. (Nvidia are masters here - hell, CP2077 only works well on the top 3000-series cards, and that is if you enable DLSS :D). Sony and MS make most of their money by selling huge numbers of games; they make almost no money from selling the hardware.
 