Cyberpunk 2077 Ultra performance

Lol the guy has a 3770K and is endlessly promoting Nvidia on here. No wonder he only has a 3080 in his sig.
Maybe Jensen will send him a brand new 10900K for his effort, since he obviously won't be able to bring himself to buy an AMD CPU. :cool:

I've been talking about the best card between the 3080 and 6800XT which are around the same price. That just happens to be the 3080. I don't care what team makes the better card. If the 6800XT were it then that's what I'd be using right now, but it has no dedicated hardware AI support and garbage RT performance. I upgraded from a 1080Ti just for RT.

I know how these cards perform versus the rest of the system and up until now I have been happy with the performance. Visual Studio, sandboxed web browsers with 200-300 tabs open, Skype calls, MS Office, a TV tuner running in a window (if not on the 2nd monitor) while I work, etc., all performing well together. The 3770K has far outlived expectations, even for 1440p/60Hz gaming.

Right now I'm waiting for the 5950X to become more available and, fingers crossed, a decent motherboard without the chipset fan, though I may just wait until we hit PCIe 5.0/DDR5.
 
Yes actually, and it was a close call, because I had one situation where the 10GB of VRAM was a problem. VR supersampling on my Pimax 8KX in DCS was causing stutter in some scenarios. I had to drop SS down to remove the stutter even though I had the GPU and CPU horsepower to drive higher supersampling. I understand this is an extreme outlier, but joke all you want: 10GB will be a problem soon enough.
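For anyone wondering why a supersampling bump can tip a 10GB card over the edge, here is a rough back-of-envelope sketch. The per-eye resolution, buffer count, bytes-per-pixel figure and fixed texture budget below are my own illustrative assumptions, not numbers from DCS or the post above:

```python
# Rough sketch: how VR supersampling scales render-target VRAM.
# Assumed numbers (illustrative only): 3840x2160 per eye, the SS slider
# scales total pixel count, ~6 full-resolution buffers (colour, depth,
# G-buffer, post chain) at ~12 bytes per pixel each.
def render_target_gib(ss, width=3840, height=2160, eyes=2,
                      buffers=6, bytes_per_pixel=12):
    pixels = width * height * eyes * ss
    return pixels * buffers * bytes_per_pixel / 1024**3

fixed_budget_gib = 8.0  # hypothetical textures/assets already resident
for ss in (1.0, 1.5, 2.0):
    total = fixed_budget_gib + render_target_gib(ss)
    print(f"SS {ss:.1f}x -> ~{total:.1f} GiB vs a 10 GiB card")
```

The exact numbers don't matter; the point is that supersampling grows every full-resolution buffer at once while the texture budget stays put, so the headroom vanishes quickly.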
So how come you downgraded then? :p

I mean, I assume your thought process was similar to mine. By the time 10GB not being enough becomes a regular issue on new releases, you will be rocking a new next-gen card?
 
So how come you downgraded then? :p

I mean, I assume your thought process was similar to mine. By the time 10GB not being enough becomes a regular issue on new releases, you will be rocking a new next-gen card?

I'm not sure about that. 10GB is not holding up well as it is. Doom Eternal already uses over 9GB at 4K; if a game already uses over 9GB, that's not promising for future titles, and it's a safe bet that more than 10GB is going to be needed anyway, since the consoles are using more than 10GB. As far as 1440p goes it's fine, but I'm not so sure about 4K and higher resolutions. A 3080 Ti seems like the better upgrade for 4K over the 3080.
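If anyone wants to check that kind of figure on their own card, here is a quick sketch using the pynvml bindings (assuming an NVIDIA card and that you have the package installed). Bear in mind it reports memory allocated, which many engines pad well above what they strictly need, so "uses 9GB" is not the same as "needs 9GB":

```python
# Poll VRAM use every few seconds while a game runs (NVIDIA only).
import time
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    while True:
        mem = nvmlDeviceGetMemoryInfo(handle)
        print(f"{mem.used / 1024**3:.2f} GiB used of {mem.total / 1024**3:.2f} GiB")
        time.sleep(5)
finally:
    nvmlShutdown()
```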
 
I don't see 10GB being an issue for at least 2 years @ 4K; 2023 is when we'll get proper next-gen games with the old consoles being dropped.

I game at 1440p, even on my TV, as it does 1440p 120Hz or 4K 60Hz.
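(As a side note, those two modes push a similar number of pixels per second, which is roughly why a card that manages 4K 60Hz can usually manage 1440p 120Hz too. Quick arithmetic below, ignoring everything except raw pixel count.)

```python
# Pixel throughput of the two TV modes mentioned above.
modes = {"1440p @ 120 Hz": (2560, 1440, 120), "4K @ 60 Hz": (3840, 2160, 60)}
for name, (w, h, hz) in modes.items():
    print(f"{name}: {w * h * hz / 1e6:.0f} million pixels per second")
```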
 
I think it's a bit of a reach to assume money was an issue when we are talking about the same person buying a ~£700+ GPU. More likely he's just biding his time until he gets the CPU he actually wants.

Fact remains though.

I’m in a similar position too, albeit a bit more balanced. I don’t really want to do a platform upgrade until I really need to or there is a bit more value in the market. I appreciate that time may never come now.
 
So how come you downgraded then? :p

I mean, I assume your thought process was similar to mine. By the time 10GB not being enough becomes a regular issue on new releases, you will be rocking a new next-gen card?

The 6800 non-XT was ~10-15% slower than my 3080, and at MSRP that makes it decent value if RT is not a requirement. Though yes, overall I felt the same really when it came to RAM or RT. I had said post-6800XT release that it was "meh", and to sum up why (to clarify, "meh" does not mean bad, because it is a good GPU; it's just that the release was poor):
  • The 6800XT has good rasterisation performance, about 5-10% slower against the 3080 than AMD claimed (resolution dependent), yet it still trades blows and is close enough to call it a draw.
  • AMD lied about availability.
  • The MSRP is $50 too high because overall it offers less than a 3080 apart from more VRAM. Though AMD are selling every one they make, so what do we know about pricing.
  • Ray tracing is pointless without a DLSS equivalent.
  • Their DLSS equivalent is not ready, and potential means nothing.
 
I'm not sure about that. 10GB is not holding up well as it is. Doom Eternal already uses over 9GB at 4K; if a game already uses over 9GB, that's not promising for future titles, and it's a safe bet that more than 10GB is going to be needed anyway, since the consoles are using more than 10GB. As far as 1440p goes it's fine, but I'm not so sure about 4K and higher resolutions. A 3080 Ti seems like the better upgrade for 4K over the 3080.

By the time what you say happens I will be rocking a 4080 though ;)


The 6800 non-XT was ~10-15% slower than my 3080, and at MSRP that makes it decent value if RT is not a requirement. Though yes, overall I felt the same really when it came to RAM or RT. I had said post-6800XT release that it was "meh", and to sum up why (to clarify, "meh" does not mean bad, because it is a good GPU; it's just that the release was poor):
  • The 6800XT has good rasterisation performance, about 5-10% slower against the 3080 than AMD claimed (resolution dependent), yet it still trades blows and is close enough to call it a draw.
  • AMD lied about availability.
  • The MSRP is $50 too high because overall it offers less than a 3080 apart from more VRAM. Though AMD are selling every one they make, so what do we know about pricing.
  • Ray tracing is pointless without a DLSS equivalent.
  • Their DLSS equivalent is not ready, and potential means nothing.
Yeah, can’t argue with that. But the point is that the VRAM issue, which some blew way out of proportion, is not as big of a deal, or the 3080 would not be as big of a seller. Most people will just upgrade in 12 months or so when the new cards come out. Problem solved. How many games that need more VRAM and that you will want to play will come out between now and then? Not many :D
 
So most people that buy a 3080 will upgrade in 12 months, but those who buy Big Navi will keep their cards for years.
I guess heavy RT performance is not such a big deal anyway, or the 6800XT would not be as big of a seller. Most people will upgrade in 12 months anyway. How many games will Nvidia sponsor before that? Not many. :D
 
So most people that buy a 3080 will upgrade in 12 months, but those who buy Big Navi will keep their cards for years.
I guess heavy RT performance is not such a big deal anyway, or the 6800XT would not be as big of a seller. Most people will upgrade in 12 months anyway. How many games will Nvidia sponsor before that? Not many. :D

For most people it is not a big deal. I know this because I know enough people who really don't give a crap about RT at all. So everyone is different, but for me, when I play Watch Dogs Legion I can't do it without RT enabled. WDL's RT reflections are done far better than those in CP2077 and the game looks far more life-like than the cartoonish CP2077.
 
I think that if AMD has put twice the number of RT cores the PS5 has into their PC flagship card, that is a very good design decision. The 6800XT is probably as powerful as a PS5 Pro will be, when/if it is launched. So the hardware for beautiful games is there; what we're missing is good game developers like those working for Sony.
UE5 will launch this year and it will have Lumen, real-time GI without RT. And if the devs can do good real-time GI at less than half the performance cost, then why do we need a lot of RT cores?
I think the problem with AMD is that they became as greedy as Nvidia, without offering the support Nvidia is offering. Yes, they don't have the resources Nvidia has, but that's not our problem. If you are unable to work with game devs and offer the same amount of quality software Nvidia is offering, then your prices should be much lower than they are.
 
Didn't you all know by now that Nvidia users "have" to upgrade more often, whereas AMD cards age like fine wine..... ;) :D :p

But in all seriousness, AMD does age better; you just have to look at previous cards to see that AMD's cards outcompete their Nvidia counterparts in the long run, especially in newer games. Whether that is Nvidia sabotaging their previous cards to sell new cards, the Nvidia driver team just not optimising to the same level for older cards, AMD's cards not being released with the best optimisation to start off with, or AMD powering the consoles helping out here... take your pick.

For example, my Vega 56 was originally a match for the 1070 on release, with the 1070 winning more than it was losing, whereas now a stock Vega 56 is on the heels of a 1080, and in some cases even beating a 1080 (the most recent titles being Cyberpunk and Assassin's Creed Valhalla); throw in a good undervolt and overclock and you're laughing even more.


This time though, I'm wanting a 3080, and that's for 3 reasons (pretty much the same as idcp):

- ray tracing (if game developers are going to start sabotaging the default/game engine effects to push RTX ray tracing instead, well, you know what they say: if you can't beat them, join them)
- DLSS (still not as good as native but as idcp said, without this, ray tracing is pointless)
- price (mainly for the FE, £650 is good value "nowadays" compared to the competition)

Ultimately, AMD fall too far short with their lack of ray tracing performance and no alternative to DLSS (FidelityFX CAS etc. is not the same).

However, if more games go the way of Assassin's Creed Valhalla, Godfall, Dirt etc. then I'll be kicking myself and trading the 3080 for a 6800 (XT) ASAP :D
 
So most people that buy a 3080 will upgrade in 12 months, but those who buy Big Navi will keep their cards for years.
I guess heavy RT performance is not such a big deal anyway, or the 6800XT would not be as big of a seller. Most people will upgrade in 12 months anyway. How many games will Nvidia sponsor before that? Not many. :D

Big Navi, RDNA2, is already a poor choice. We will be upgrading to Lovelace/RDNA3 for better GPUs. As you can see from the Cyberpunk 2077 thread, RT is a big plus and requires AI SS. RDNA2 has garbage RT performance and no AI SS solution. Sadly, RDNA2 looks to be what it was designed to be: console class. I'm sure many of those who bought a 6800XT bought it thinking the RT performance was there and that the 'we are working on it' for AI SS was good enough. Ask them where that AI SS solution is today, or what the RT performance is like, and we get the 'I don't care about RT' BS, AKA buyer's remorse :D
 
But in all seriousness, AMD does age better

My problem with this is that, unless people are diehards holding on to cards, it is too little too late - my 780 GHz was showing custom 290X models the door on release; it took a few months before they were catching up and 2-3 years before the results were inverted (and a part of that was the Kepler architecture falling down a bit in some newer games) - by which time both cards were old news. Same with my GTX 1070 - the equivalent AMD cards are overhauling it now, but it took nearly 4 years for that to happen - by which point I've moved on, heh.
 
Big Navi, RDNA2, is already a poor choice. We will be upgrading to Lovelace/RDNA3 for better GPUs. As you can see from the Cyberpunk 2077 thread, RT is a big plus and requires AI SS. RDNA2 has garbage RT performance and no AI SS solution. Sadly, RDNA2 looks to be what it was designed to be: console class. I'm sure many of those who bought a 6800XT bought it thinking the RT performance was there and that the 'we are working on it' for AI SS was good enough. Ask them where that AI SS solution is today, or what the RT performance is like, and we get the 'I don't care about RT' BS, AKA buyer's remorse :D
Sounds like you are talking about CP2077 - expectations vs reality. :)
RDNA2 is good enough, it just doesn't run Nvidia's garbage very well.
 
Sounds like you are talking about CP2077 - expectations vs reality. :)
RDNA2 is good enough, it just doesn't run Nvidia's garbage very well.

DirectX Raytracing Feature Test

1 GPU
  1. Score 65.23, GPU 3090 @2250/5512, CPU 10900k @5.3, Post No.0491, Jay-G25 - Link Drivers 460.89
  2. Score 64.34, GPU 3090 @2235/5344, CPU 7820X @4.7, Post No.0489, anihcniedam - Link Drivers 460.89
  3. Score 63.87, GPU 3090 @2205/5328, CPU 6950X @4.401, Post No.0496, FlyingScotsman - Link Drivers 460.89
  4. Score 63.14, GPU 3090 @2265/4876, CPU 5950X @4.8, Post No.0462, OC2000 - Link Drivers 460.79
  5. Score 62.98, GPU 3090 @2205/5328, CPU 9900KF @5.0, Post No.0379, spartapee - Link Drivers 457.09
  6. Score 62.38, GPU 3090 @2160/4976, CPU 9900k @5.0, Post No.0480, Raiden85 - Link Drivers 460.89
  7. Score 61.61, GPU 3090 @2115/5128, CPU 9980XE @4.5, Post No.0487, Greebo - Link Drivers 460.89
  8. Score 60.23, GPU 3090 @2145/5176, CPU 3175X @4.8, Post No.0415, sedy25 - Link Drivers 457.30
  9. Score 59.34, GPU 3090 @2070/4976, CPU 5950X @4.965, Post No.0474, Grim5 - Link Drivers 460.89
  10. Score 58.58, GPU 3090 @2100/5276, CPU 3600X @4.4, Post No.0445, Bickaxe - Link Drivers 457.51
  11. Score 55.57, GPU 3090 @1980/4876, CPU 5950X @4.1, Post No.0429, Kivafck - Link Drivers 457.30
  12. Score 55.57, GPU 3090 @1995/4876, CPU 10900k @5.1, Post No.0357, Sedgey123 - Link Drivers 457.09
  13. Score 55.50, GPU 3090 @2085/5076, CPU 3800X @4.7, Post No.0450, ChrisUK1983 - Link Drivers 457.51
  14. Score 55.47, GPU 3090 @2040/4876, CPU 5900X @3.7, Post No.0423, atomic7431 - Link Drivers 457.30
  15. Score 54.39, GPU 3090 @1905/5176, CPU 10900k @5.2, Post No.0446, kipperthedog - Link Drivers 457.51
  16. Score 52.24, GPU 3080 @2235/5252, CPU 3900X @4.649, Post No.0413, haszek - Link Drivers 457.09
  17. Score 50.56, GPU 3080 @2145/5248, CPU 3600 @4.4, Post No.0411, TNA - Link Drivers 457.30
  18. Score 34.15, GPU 6900XT @2625/4280, CPU 5800X @5.049, Post No.0477, 6900 XT - Link Drivers 20.12.2
  19. Score 33.31, GPU 3070 @2085/4050, CPU 3175X @4.12, Post No.0392, sedy25 - Link Drivers 457.09
  20. Score 32.54, GPU 2080 Ti @2130/3500, CPU 3950X @4.301, Post No.0357, Grim5 - Link Drivers 452.06
  21. Score 29.91, GPU 2080 Ti @1980/3500, CPU 8700 @4.3, Post No.0391, Quartz - Link Drivers 456.55
  22. Score 23.96, GPU 6800 @2295/4220, CPU 3900X @4.541, Post No.0459, Chrisc - Link Drivers 20.12.1
  23. Score 21.36, GPU 2080 @2025/4050, CPU 9900k @5.0, Post No.0365, Cooper - Link Drivers 457.09
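To put those scores in perspective, here is a quick ratio calculation from the list above, taking the best posted score per GPU model (so CPU and clock differences are ignored; the percentages only reflect these particular overclocked submissions, not a general benchmark claim):

```python
# Relative DXR feature-test performance, using the best score per model
# from the leaderboard posted above, normalised to the 3080 entry.
scores = {"3090": 65.23, "3080": 52.24, "6900XT": 34.15,
          "3070": 33.31, "2080 Ti": 32.54, "6800": 23.96}
baseline = scores["3080"]
for gpu, score in scores.items():
    print(f"{gpu}: {score / baseline:.0%} of the 3080 result")
```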

Where are we with AMD's AI SS solution? Is it still the case that AMD are incapable of developing their own solution and are waiting on Microsoft? Does RDNA2 have dedicated hardware to run AI based workloads or will it detract from existing raster/RT workloads as RT already does?
 
The flip-flop is complete. The RDNA2 thread turned into 'Cyberpunk can't be played without ray tracing', and now the Cyberpunk thread is an RDNA2 thread with added 'VRAM is enough' waffle.
 
My problem with this is that, unless people are diehards holding on to cards, it is too little too late - my 780 GHz was showing custom 290X models the door on release; it took a few months before they were catching up and 2-3 years before the results were inverted (and a part of that was the Kepler architecture falling down a bit in some newer games) - by which time both cards were old news. Same with my GTX 1070 - the equivalent AMD cards are overhauling it now, but it took nearly 4 years for that to happen - by which point I've moved on, heh.

Yup, no doubt we are at the stage where the likes of the Vega 56/1070/1080 etc. aren't enough for certain games @1440p or higher and an upgrade is required anyway, e.g. Cyberpunk (unless the developers can do a far better job of optimisation in future patches....)

Obviously if you are someone who upgrades every year regardless then this won't be a factor, but if you tend to keep a card for at least 2-3 years then it can be a pretty big bonus. It didn't take too long for Vega cards to be outperforming their counterparts in most/all titles though (2018 is when people started catching on).




Also, to add to the VRAM debate: with Cyberpunk, I noticed it was a little hitchy/stuttery and that my VRAM was showing maxed out, so going by the advice on Reddit I enabled HBCC, which in theory allowed my GPU to have 11GB of VRAM instead of just 8GB, and this got rid of all the stuttering.
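My reading of how that 11GB figure comes about, as a rough sketch: HBCC lets the driver extend the GPU's memory pool into system RAM, so the reported pool is the physical VRAM plus whatever segment of RAM you hand over. The 3GB segment below is an assumption to match the numbers in the post, not something stated there:

```python
# Assumed HBCC arithmetic: physical VRAM plus a carved-out system RAM segment.
vram_gib = 8           # Vega 56 physical VRAM
hbcc_segment_gib = 3   # hypothetical HBCC slider setting
print(f"Effective pool: {vram_gib + hbcc_segment_gib} GiB "
      "(overflow lives in system RAM, so it is much slower than real VRAM)")
```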

8GB of VRAM defo is not enough going forward; I think 10GB will be enough for the vast majority of games @1440p for the next 2 years at least though.
 
Where are we with AMD's AI SS solution? Is it still the case that AMD are incapable of developing their own solution and are waiting on Microsoft? Does RDNA2 have dedicated hardware to run AI based workloads or will it detract from existing raster/RT workloads as RT already does?
You are missing a lot of points. No one said that Big Navi is better at RT than Nvidia, I only said it is good enough to run beautiful games. It is not good enough to run Nvidia's bloatware and it will never be good enough for that, even if they put in twice the RT cores Nvidia puts in their cards. Then Nvidia will stop sponsoring heavy RT games and move to the next feature that gives them a big advantage. So why should any manufacturer put a crap ton of RT cores in their cards if they never get used outside Nvidia-sponsored games? Why should they do that, since UE5 will come with a much cheaper and good-enough tech to do real-time GI?
AI SS of course depends on Microsoft, just like Nvidia's "RTX I/O" depends on Microsoft. Whether they have dedicated hardware for it or not, we will see if/when the AI SS comes. But do you think Microsoft or Sony would not ask for dedicated hardware in their consoles if it were needed? We will have to see when it comes.
 
Yup, no doubt we are at the stage where the likes of the Vega 56/1070/1080 etc. aren't enough for certain games @1440p or higher and an upgrade is required anyway, e.g. Cyberpunk.

Obviously if you are someone who upgrades every year regardless then this won't be a factor, but if you tend to keep a card for at least 2-3 years then it can be a pretty big bonus. It didn't take too long for Vega cards to be outperforming their counterparts in most/all titles though (2018 is when people started catching on).




Also, to add to the VRAM debate: with Cyberpunk, I noticed it was a little hitchy/stuttery and that my VRAM was showing maxed out, so going by the advice on Reddit I enabled HBCC, which in theory allowed my GPU to have 11GB of VRAM instead of just 8GB, and this got rid of all the stuttering.

8GB of VRAM defo is not enough going forward; I think 10GB will be enough for the vast majority of games @1440p for the next 2 years at least though.
I can also see some fine wine pouring from my poor 5600XT, but tbh what sells a card and what drives the narrative are the launch-day reviews. And AMD is not so great on launch day.
 