Cyberpunk 2077 Ultra performance

You are missing a lot of points. No one said that Big Navi is better at RT than Nvidia, I only said it is good enough to run beautiful games. It is not good enough to run Nvidia's bloatware and it never will be, even if they put in twice the RT cores Nvidia puts in their cards. Then Nvidia will just stop sponsoring heavy RT games and move on to the next feature that gives them a big advantage. So why should any manufacturer put a crap ton of RT cores in their cards if they never get used outside Nvidia-sponsored games? Why should they, when UE5 will come with a much cheaper, good-enough way to do real-time GI?

If AMD want to compete on price then they have to compete on performance and features. Since when was 3DMark's benchmark considered to be Nvidia's garbage? Why would Nvidia want to back away from RT? Do you not understand that RT has long been the holy grail of computer graphics?

DirectX Raytracing Feature Test

1 GPU
  1. Score 65.23, GPU 3090 @2250/5512, CPU 10900k @5.3, Post No.0491, Jay-G25 - Link Drivers 460.89
  2. Score 64.34, GPU 3090 @2235/5344, CPU 7820X @4.7, Post No.0489, anihcniedam - Link Drivers 460.89
  3. Score 63.87, GPU 3090 @2205/5328, CPU 6950X @4.401, Post No.0496, FlyingScotsman - Link Drivers 460.89
  4. Score 63.14, GPU 3090 @2265/4876, CPU 5950X @4.8, Post No.0462, OC2000 - Link Drivers 460.79
  5. Score 62.98, GPU 3090 @2205/5328, CPU 9900KF @5.0, Post No.0379, spartapee - Link Drivers 457.09
  6. Score 62.38, GPU 3090 @2160/4976, CPU 9900k @5.0, Post No.0480, Raiden85 - Link Drivers 460.89
  7. Score 61.61, GPU 3090 @2115/5128, CPU 9980XE @4.5, Post No.0487, Greebo - Link Drivers 460.89
  8. Score 60.23, GPU 3090 @2145/5176, CPU 3175X @4.8, Post No.0415, sedy25 - Link Drivers 457.30
  9. Score 59.34, GPU 3090 @2070/4976, CPU 5950X @4.965, Post No.0474, Grim5 - Link Drivers 460.89
  10. Score 58.58, GPU 3090 @2100/5276, CPU 3600X @4.4, Post No.0445, Bickaxe - Link Drivers 457.51
  11. Score 55.57, GPU 3090 @1980/4876, CPU 5950X @4.1, Post No.0429, Kivafck - Link Drivers 457.30
  12. Score 55.57, GPU 3090 @1995/4876, CPU 10900k @5.1, Post No.0357, Sedgey123 - Link Drivers 457.09
  13. Score 55.50, GPU 3090 @2085/5076, CPU 3800X @4.7, Post No.0450, ChrisUK1983 - Link Drivers 457.51
  14. Score 55.47, GPU 3090 @2040/4876, CPU 5900X @3.7, Post No.0423, atomic7431 - Link Drivers 457.30
  15. Score 54.39, GPU 3090 @1905/5176, CPU 10900k @5.2, Post No.0446, kipperthedog - Link Drivers 457.51
  16. Score 52.24, GPU 3080 @2235/5252, CPU 3900X @4.649, Post No.0413, haszek - Link Drivers 457.09
  17. Score 50.56, GPU 3080 @2145/5248, CPU 3600 @4.4, Post No.0411, TNA - Link Drivers 457.30
  18. Score 34.15, GPU 6900XT @2625/4280, CPU 5800X @5.049, Post No.0477, 6900 XT - Link Drivers 20.12.2
  19. Score 33.31, GPU 3070 @2085/4050, CPU 3175X @4.12, Post No.0392, sedy25 - Link Drivers 457.09
  20. Score 32.54, GPU 2080 Ti @2130/3500, CPU 3950X @4.301, Post No.0357, Grim5 - Link Drivers 452.06
  21. Score 29.91, GPU 2080 Ti @1980/3500, CPU 8700 @4.3, Post No.0391, Quartz - Link Drivers 456.55
  22. Score 23.96, GPU 6800 @2295/4220, CPU 3900X @4.541, Post No.0459, Chrisc - Link Drivers 20.12.1
  23. Score 21.36, GPU 2080 @2025/4050, CPU 9900k @5.0, Post No.0365, Cooper - Link Drivers 457.09

AI SS of course depends on Microsoft, just like Nvidia's "RTX I/O" depends on Microsoft. Whether they have dedicated hardware for it or not, we will see if/when AI SS arrives. But do you think Microsoft or Sony would not ask for dedicated hardware in their consoles if it were needed? We will have to see when it comes.

No, AI SS does not depend on Microsoft, as Nvidia proved years ago. Indeed, Microsoft have been working with Nvidia to develop their DirectML solutions, while AMD relies on Microsoft because it is unable to develop a solution on its own. It's nothing like RTX I/O, which is a feature enhancement that sits on top of Microsoft's DirectStorage. You don't need to wait to see whether AMD have dedicated hardware for ML, as we already know they don't. Microsoft and Sony had proprietary extensions added to their console hardware because AMD's RDNA2 on its own was not good enough. You don't get those performance extensions with your PC RDNA2 card.

AMD need to be more realistic with RDNA2 and price the 6800 XT at £450 to make up for its lackluster feature set.
 
I can also see some fine wine pouring from my poor 5600 XT, but tbh what sells a card and what drives the narrative are the launch-day reviews. And AMD is not so great on launch day.

Yup exactly, but if people are buying based on those day 1 reviews, they are obviously happy with their choice at that time, so ultimately, they are going to be delighted to see their cards still getting some love every year with further performance increases etc. and not being left behind.

Whereas if you buy a card only to see it not getting the same attention just because the new products are out, and to have it outmatched by quite a decent margin compared to its counterpart not long after purchase, well, I personally wouldn't be the most pleased...
 
So most people that buy a 3080 will upgrade in 12 months, but those who buy Big Navi will keep their cards for years.
I guess heavy RT performance is not such a big deal anyway, or the 6800 XT would not be as big a seller. Most people will upgrade in 12 months anyway. How many games will Nvidia sponsor before then? Not many. :D
Why do you have to turn everything into an AMD vs Nvidia thing? :p

I am glad I went 3080 this time around, and in hindsight I would have done the exact same thing. I upgrade all the time as I enjoy doing so. If I had a 6800 XT the same would be happening. Hell, I had a Radeon R9 Nano, a Radeon RX Vega Liquid (both bought from LtMatt), a Vega 56 and a Vega 64 - all same-gen cards - so upgrading only once this gen is actually doing quite well for me. Nothing to do with what you are trying to twist things into ;)
 
If AMD want to compete on price then they have to compete on performance and features. Since when was 3DMark's benchmark considered to be Nvidia's garbage? Why would Nvidia want to back away from RT? Do you not understand that RT has long been the holy grail of computer graphics?
No, you are wrong, RT isn't any holy grail. The holy grail of gaming graphics is realistic visuals. RT can be the best way to the holy grail, but it is not the holy grail itself. And if there is a huge resource cost for real-time realistic graphics done through RT, then you need to find better ways to get closer to realism.
Who cares about the 3DMark benchmarks? In theory you can make a card as big as a truck that eats as much power as a whole town, with one million RT cores, and it will score very high in 3DMark. But we will never see that many RT cores used in a game (unless it is an Nvidia card and Nvidia sponsors that game). :D
We will have to see over the next 3-5 years how the games that come from PS5 will look, because that will tell us whether the RT resources PC cards have are well used, or whether we only get some brute-force RT in PC games, for marketing purposes, with very little or no optimization.
Let's see if we will have any independent developer that will use RT in their games without being sponsored by Nvidia or AMD. And let's see what level of RT that game has. My bet is they will use very limited RT.

But I agree that AMD cards are expensive; hell, I think even the Nvidia cards should cost half what they cost today. AMD should cost like 30% of the prices they have today :D.
 
Oddly enough that system once complete will benefit from hardware RT cores. It reminds me of Nvidia's VXGI and may well be based on it.
Why don't you just say that Nvidia worked with Epic to help them develop it because Epic were unable to do it themselves, just like you think they helped Microsoft with their DirectML? :D
Is there anything in graphics that Nvidia hasn't invented and good-hearted Jensen hasn't shared with the dummies that run the other companies? We shall all be grateful for that.
 
No, you are wrong, RT isn't any holy grail. The holy grail of gaming graphics is realistic visuals.

Techniques like UE5's Lumen are great and all, but they are still limited compared to RT - Lumen uses a limited set of ray tracing techniques against voxels, etc. as a guide, so it has lower fidelity for indirect light, is more limited in which objects can and can't be processed, and can't do some of the more advanced caustics, refraction, etc. that full ray tracing can.

RT is the holy grail for achieving a certain level of realism in visuals.
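
To make that fidelity point concrete, here is a minimal toy sketch (purely illustrative, nothing to do with Epic's actual implementation): a thin wall that an exact ray test against the real geometry catches, but that a coarse, centre-sampled voxel proxy of the scene misses entirely - the classic light leak you get when tracing against a simplified scene representation.

```python
import numpy as np

# Toy 1D scene: a thin wall (0.05 units thick) sits between a shading point
# at x = 0 and a light at x = 1. All numbers are made up for illustration.
WALL_MIN, WALL_MAX = 0.475, 0.525
CELL = 0.25  # coarse voxel size used by the proxy representation

def exact_occluded(x0, x1):
    """Shadow ray against the real geometry: is the segment blocked?"""
    lo, hi = min(x0, x1), max(x0, x1)
    return hi >= WALL_MIN and lo <= WALL_MAX

def voxel_occluded(x0, x1):
    """Shadow ray against the voxel proxy: march cell centres, check occupancy."""
    centres = np.arange(CELL / 2, 1.0, CELL)                  # 0.125, 0.375, 0.625, 0.875
    occupied = (centres >= WALL_MIN) & (centres <= WALL_MAX)  # the thin wall hits no centre
    lo, hi = min(x0, x1), max(x0, x1)
    crossed = (centres >= lo) & (centres <= hi)
    return bool(np.any(occupied & crossed))

shade_point, light = 0.0, 1.0
print("exact geometry says occluded:", exact_occluded(shade_point, light))  # True
print("voxel proxy says occluded:   ", voxel_occluded(shade_point, light))  # False -> light leak
```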
 
Techniques like UE5's Lumen are great and all, but they are still limited compared to RT - Lumen uses a limited set of ray tracing techniques against voxels, etc. as a guide, so it has lower fidelity for indirect light, is more limited in which objects can and can't be processed, and can't do some of the more advanced caustics, refraction, etc. that full ray tracing can.

RT is the holy grail for achieving a certain level of realism in visuals.
But I am talking about the performance cost. RT also can't do a lot of things; that's why we have denoising. Unless Nvidia makes a card as big as a truck, RT is not the way to more realism. If you can do almost the same at half the cost, then you'll do it at half the cost.
And again, realism is the holy grail.
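
As a rough illustration of why denoising is unavoidable at real-time ray counts (toy numbers, not from any real renderer): the error of a Monte Carlo lighting estimate only shrinks with the square root of the ray count, so a handful of rays per pixel is inherently noisy and "just add more rays" gets expensive very fast.

```python
import numpy as np

# Estimate the indirect light at one pixel by averaging n random radiance
# samples whose true mean is 0.5, and measure how noisy the estimate is.
rng = np.random.default_rng(0)
TRUE_VALUE = 0.5

def estimate_pixel(n_rays):
    return rng.uniform(0.0, 1.0, n_rays).mean()

for n in (1, 4, 16, 256, 4096):
    estimates = np.array([estimate_pixel(n) for _ in range(1000)])
    rmse = np.sqrt(np.mean((estimates - TRUE_VALUE) ** 2))
    print(f"{n:5d} rays/pixel -> RMSE ~ {rmse:.3f}")  # error falls roughly as 1/sqrt(n)
```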
 
But I am talking about the performance cost. RT also can't do a lot of things; that's why we have denoising. Unless Nvidia makes a card as big as a truck, RT is not the way to more realism. If you can do almost the same at half the cost, then you'll do it at half the cost.

It is coming - we might not be able to achieve the fidelity of offline ray tracers any time soon, but we can feasibly exceed the quality of something like Lumen with acceptable denoising and performance with a bit more work on the hardware and software side, and with all due respect, the nay-sayers on this haven't been playing with the latest developments.
 
No, you are wrong, RT isn't any holy grail. The holy grail of gaming graphics is realistic visuals. RT can be the best way to the holy grail, but it is not the holy grail itself. And if there is a huge resource cost for real-time realistic graphics done through RT, then you need to find better ways to get closer to realism.

RT is modelled on how our vision works. You don't get any more realistic than that.

Who cares about the 3DMark benchmarks? In theory you can make a card as big as a truck that eats as much power as a whole town, with one million RT cores, and it will score very high in 3DMark. But we will never see that many RT cores used in a game (unless it is an Nvidia card and Nvidia sponsors that game). :D

Anyone interested in buying a graphics card should care about the 3DMark benchmarks, as they give a good idea of how one card will perform against another. Again you are whining about Nvidia, and the only possible reason is that only Nvidia are pushing the tech. AMD are sadly ~2 years behind.

We will have to see over the next 3-5 years how the games that come from PS5 will look, because that will tell us whether the RT resources PC cards have are well used, or whether we only get some brute-force RT in PC games, for marketing purposes, with very little or no optimization.
Let's see if we will have any independent developer that will use RT in their games without being sponsored by Nvidia or AMD. And let's see what level of RT that game has. My bet is they will use very limited RT.

If I wanted console-level graphics I'd have bought a console or an RDNA2 card. As we can see from The Medium, consoles/RDNA2 will be using low-quality RT, while the higher quality settings will only be available on Nvidia cards. This is simply due to RDNA2 lacking the performance.

But I agree that AMD cards are expensive; hell, I think even the Nvidia cards should cost half what they cost today. AMD should cost like 30% of the prices they have today :D.

I think £649 for the 3080 was a decent price. I only upgraded from a 1080 Ti for the new tech; I skipped the 2080 Ti due to its RT performance. The 3090 did tempt me, but I don't see Ampere/RDNA2 lasting more than a generation. Now, with AMD trying to compete with the 3080 while lacking the RT performance and AI SS, does chopping a third off the price seem such a bad idea?
 
It is coming - we might not be able to achieve the fidelity of offline ray tracers any time soon, but we can feasibly exceed the quality of something like Lumen with acceptable denoising and performance with a bit more work on the hardware and software side, and with all due respect, the nay-sayers on this haven't been playing with the latest developments.
I agree with what you say: that you can get far better visuals than you'll get from Lumen. But I am not sure how many games will be well optimized for that. Simply put, I don't trust the games sponsored by Nvidia; I think they are not made to show you what the tech can do, they are made to show you what the new Nvidia cards can do.
I am looking forward to seeing what some independent developers can do with the hardware we have now. I think you could make a much better looking game with the hardware resources they have spent on CP, but I am not sure there will be many who do it in the next 3-4 years.
And I don't expect RT to be pushed too far very soon, because you can only put a certain number of RT cores inside the chip. So I think the focus will be on better denoising and/or RT alternatives. That's why, for me, something like Lumen is more interesting.
RT is modelled on how our vision works. You don't get any more realistic than that.
That is the theory. In practice, RT is limited in PC games by reducing the number of bounces and the number of rays. How much is it reduced? As much as it needs to be to run well only on the latest Nvidia hardware. :D
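
Just as a back-of-the-envelope sketch of why those cuts happen (illustrative numbers only, not from any benchmark): even one ray per pixel per bounce at 4K/60 is an enormous number of rays, so real budgets end up at a few rays per pixel with one or two bounces.

```python
# Rough ray budget at 4K / 60 fps for a few rays-per-pixel and bounce counts.
width, height, fps = 3840, 2160, 60
pixels = width * height

for rays_per_pixel in (1, 2, 4):
    for bounces in (1, 2, 3):
        rays_per_second = pixels * fps * rays_per_pixel * bounces
        print(f"{rays_per_pixel} rpp, {bounces} bounce(s): {rays_per_second / 1e9:.1f} G rays/s")
```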
 
Why don't you just say that Nvidia worked with Epic to help them develop it because Epic were unable to do it themselves, just like you think they helped Microsoft with their DirectML? :D
Is there anything in graphics that Nvidia hasn't invented and good-hearted Jensen hasn't shared with the dummies that run the other companies? We shall all be grateful for that.

Because I don't follow Epic or Unreal Engine. I get the impression you think Nvidia just develop gaming GPUs?
 
I agree with what you say: that you can get far better visuals than you'll get from Lumen. But I am not sure how many games will be well optimized for that. Simply put, I don't trust the games sponsored by Nvidia; I think they are not made to show you what the tech can do, they are made to show you what the new Nvidia cards can do.
I am looking forward to seeing what some independent developers can do with the hardware we have now. I think you could make a much better looking game with the hardware resources they have spent on CP, but I am not sure there will be many who do it in the next 3-4 years.
And I don't expect RT to be pushed too far very soon, because you can only put a certain number of RT cores inside the chip. So I think the focus will be on better denoising and/or RT alternatives. That's why, for me, something like Lumen is more interesting.

What do you think will happen with transistor count when Nvidia move from 8nm to 5nm? Remember AMD is already on 7nm.

That is the theory. In practice, RT is limited in PC games by reducing the number of bounces and the number of rays. How much is it reduced? As much as it needs to be to run well only on the latest Nvidia hardware. :D

RT is limited by affordable hardware (7/8nm today, 5nm next gen), not by the technique itself. It really isn't Nvidia's fault that AMD doesn't have the performance to compete. That was all AMD's doing.
 
That was all AMD's doing.

I'm sure there are more people, but AMD only seem to have one person really working on the RT side (Aaron Hagan), whereas nVidia have several people (visibly) active in that role. It looks like there are a couple of nasty stability issues in the AMD driver as well, which is probably why CP2077 doesn't do it on AMD yet.
 
What do you think will happen with transistor count when Nvidia move from 8nm to 5nm? Remember AMD is already on 7nm.



RT is limited by affordable hardware (7/8nm today, 5nm next gen), not by the technique itself. It really isn't Nvidia's fault that AMD doesn't have the performance to compete. That was all AMD's doing.

I think the Samsung 8nm node Nvidia is using is more like 10nm, just named differently.

Either way, I'm going to sell my 3080 and jump on the next one when it comes. I'm really enjoying ray tracing, but it isn't fast enough right now.

I would like to play at max settings in 4K with quality DLSS and stay locked to 60fps.
 
What do you think will happen with transistor count when Nvidia move from 8nm to 5nm? Remember AMD is already on 7nm.
Not too many good things. The transistor count will increase and so will the RT cores. And Nvidia will sponsor new games that run well only on their latest hardware. But don't you think there is a point where they will have to stop? They already have a huge die compared with the 2080, and that is 8nm vs 12nm. And how many RT cores does the 3080 have compared with the 2080? I think like 30% extra or something.
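
For what it's worth, a quick sanity check on that "30% or so" figure, using the publicly quoted RT core counts (46 on the 2080, 68 on the 3080 - treat them as approximate spec-sheet numbers):

```python
# Spec-sheet RT core counts (approximate, from public sources).
rt_cores_2080 = 46
rt_cores_3080 = 68
increase = (rt_cores_3080 / rt_cores_2080 - 1) * 100
print(f"RT core increase 2080 -> 3080: ~{increase:.0f}%")  # ~48%
```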


RT is limited by affordable hardware (7/8nm today, 5nm next gen), not by the technique itself. It really isn't Nvidia's fault that AMD doesn't have the performance to compete. That was all AMD's doing.
So it is like in real life but limited. It's not God's fault if Jensen can't make his cards good enough to compete. :D

I'm sure there are more people, but AMD only seem to have one person really working on the RT side (Aaron Hagan), whereas nVidia have several people (visibly) active in that role. It looks like there are a couple of nasty stability issues in the AMD driver as well, which is probably why CP2077 doesn't do it on AMD yet.
That's a surprise - there is someone at AMD working on something for their video cards. There is still hope. :)
 
Not too many good things. The transistor count will increase and so will the RT cores. And Nvidia will sponsor new games that run well only on their latest hardware. But don't you think there is a point where they will have to stop? They already have a huge die compared with the 2080, and that is 8nm vs 12nm. And how many RT cores does the 3080 have compared with the 2080? I think like 30% extra or something.

Not too many good things? A massively faster GPU is what we will be getting. We already know Nvidia are working on chiplets and the tech to link them.

So it is like in real life but limited. It's not God's fault if Jensen can't make his cards good enough to compete. :D

Nvidia aren't competing, they are dominating. People who want the best eye-candy still don't have a choice.
 