NVIDIA 4000 Series

Which other raster games are/were AMD massively ahead in? COD is the main one I know of, and then there was Assassin's Creed Valhalla, but Nvidia closed the gap massively there when they released their DX12 optimisation driver update. Maybe a few others like FC6 and The Callisto Protocol, but for everything else it's either Nvidia win some, lose some, or neck and neck.

I am not talking about very recent history but about the whole history of ATI/AMD vs NVIDIA. You can't just look at recent stuff and say they can do this and that and people will buy it - never in the history of these two companies did that happen, no matter how much better ATI/AMD was in a given generation (way before RT, so of course only in raster). I put it down mostly to the lack of proper marketing.

Well, whatever the core issue was, it was an AMD driver update that solved it (sorry, it was 2022, not 2023) - https://videocardz.com/newz/amd-has...he-ground-up-10-better-performance-on-average

Yep, I just described why it took them so long to do it - it wasn't as simple a software change as it was on NVIDIA hardware with its software scheduler.
https://videocardz.com/newz/amd-has...he-ground-up-10-better-performance-on-average
Something has to fund the engineers working on these, otherwise they would be bleeding money, especially if it is a long-term investment/feature.

Pro/Enterprise does. That's where the real money is this time; gaming is just a side business these days. Even if NVIDIA stopped selling gaming cards for good today, they would still see big growth and great income.

That's only if developers do a **** job... we have now seen, in three well-regarded titles, the games running pretty well even on consoles.

Sadly, again, cost cutting and rushed releases mean games released on consoles are often in a horrible state (CP2077 was one of the best examples) and pretty much unplayable. Which is why I never buy games on release anymore, PC or console, and just wait a few months for patches and price drops before I touch any of them.
 
I think Metro EE, Avatar and Spider-Man 2 are good showcases tbf, and I don't think you would be wrong with that statement either. People forget that raster has been around for donkey's years now, so devs know all the cheats and tricks to get the best from it, whereas RT is completely new and, as stated by 4A (the Metro EE devs), there is still so much to learn about how to get the best from it. So I can see the existing RT workloads/graphical effects becoming more efficient over time rather than more demanding (just look at the likes of BF5 and Control etc. - in the beginning they absolutely tanked performance, but nowadays the same effects or more wouldn't hit performance anywhere near as much). Obviously we still need the hardware to be there in the first place, but I think there is a lot left to discover/figure out on the software optimisation and feature front, and of course devs focusing more on RT than raster effects.
RT algorithms haven't changed much at all since BF5 times. Initially it worked horribly because they used too many rays in the scene (especially the snow areas, which are even harder computationally, with a whole bunch of refractions, reflections and semi-transparent objects like ice, etc.) and the hardware couldn't handle it. Which is why, after the initial wave of RT games, we went back mostly to a token RT effect here and there - whatever was cheap and didn't require many rays (like shadows) but looked good in marketing. Then CP2077 happened and they put more effects in and tanked performance, then PT went in and tanked performance even more, but you could easily see how noisy it is, how laggy with changing light, and how little detail there is in the reflections etc. - a small number of rays gives a very noisy output without much detail, and then denoising removes even more detail and adds delay, and it also takes time to properly move light around with GI, etc. Only then did we finally get AI assistance in cleaning up the image, which saved the situation a bit, even though the number of rays calculated per scene is still very small.

Metro EE was a great example of how to do things right - not just showcasing the fanciest reflections but actually making things very playable too. The problem is that it takes time, effort and money. Using PT is just so much cheaper and easier, but it's also really not for today's hardware - it barely works (all the issues I've mentioned, not just FPS - shortcuts and simplifications used to get any sensible FPS) in CP2077 as is, even with AI involvement, upscaling and FG.
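As a rough aside on the "small number of rays gives a very noisy output" point above: the noise of a Monte Carlo lighting estimate only shrinks with the square root of the ray count, which is why one or two rays per pixel lean so heavily on denoisers. A minimal toy sketch - the environment function and all numbers below are invented purely for illustration, nothing from a real engine:

[CODE]
import math
import random

def incoming_light(theta, phi):
    # Toy environment: a small bright "sun" patch plus dim ambient light.
    sun = 20.0 if (theta < 0.5 and abs(phi) < 0.5) else 0.0
    return 0.2 + sun

def estimate_pixel(num_rays):
    # Average incoming light over uniformly sampled hemisphere directions.
    total = 0.0
    for _ in range(num_rays):
        theta = math.acos(random.random())        # polar angle (uniform over the hemisphere)
        phi = random.uniform(-math.pi, math.pi)   # azimuth
        total += incoming_light(theta, phi)
    return total / num_rays

def estimate_noise(num_rays, trials=2000):
    # Standard deviation of the per-pixel estimate across many repeated trials.
    samples = [estimate_pixel(num_rays) for _ in range(trials)]
    mean = sum(samples) / trials
    var = sum((s - mean) ** 2 for s in samples) / trials
    return math.sqrt(var)

for n in (1, 2, 4, 16, 64, 256):
    print(f"{n:4d} ray(s) per pixel -> estimate std dev ~ {estimate_noise(n):.3f}")
[/CODE]

Quadrupling the ray budget only halves the noise, which matches the point that today's tiny ray counts leave a lot of work for the denoiser and the AI cleanup.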
 
Rumours are that the only card they are releasing in 2024 is the 5090.

Have to wait until 2025 for anything below that.
It really wouldn't surprise me. As NVIDIA's CEO said, NVIDIA is now an AI company, not a GPU company. That's their priority and that is where most of the R&D has already moved. Gaming seems to be just an afterthought - whatever they develop for AI that is useful for gaming might trickle down, but I wouldn't expect any other big improvements for as long as the AI FOMO/FOTY is still alive.
 

I only go back as far as the 8800/3450 GPUs (had most AMD/ATI gen GPUs from the 3450 up to Vega 56) and tbh can't really recall AMD having a massive advantage in any particular games until COD. Mantle/Vulkan games ran very well, but then AMD needed this as their DX performance at the time was somewhat lacking. Now, whilst I wasn't really ever affected too badly by drivers, AMD/ATI did appear to have some issues here for the masses - as said, I never really had anything serious, but obviously my case doesn't apply to others. This is what put people off AMD/ATI big time.

Well yes, that is the sole cause of the problem 90% of the time - it's the devs to blame here and not RT. The fact we have some raster titles out there that ran/run worse than RT/PT games shows this perfectly, and that it's not always a case of "RT is just too expensive".


It's been so long since BF5 that I could be wrong, but weren't the RT reflections only being applied to metal, marble, glass and water/puddles? I don't remember snow or ice really having it applied.


I also didn't think RT transparency was a thing until Guardians of the Galaxy, which I believe was the first game to use it.

Obviously it's impossible to get a like-for-like game, but compare BF5's RT reflections to something like Ghostwire Tokyo, Guardians of the Galaxy or Darktide (all of which have additional RT effects too) and they run far better and arguably look better imo. Do you really think that over the last 4-5 years Nvidia/game devs haven't found ways to optimise/get more from the hardware other than just cutting corners? What do you consider things like SER in UE to be, then? What about the driver updates, not just from Nvidia but also Intel and AMD, where they have improved RT performance without cheating/cutting corners on IQ?

The only serious initial investment in terms of time and money would be the first attempt at doing it right (which is pretty normal in development when you're learning new tools and changing your workflows; once that hurdle is done, you see the benefits of things being simpler, cheaper and quicker to do). As stated by 4A and CDPR, the overall investment of doing RT/PT properly, including DLSS, FG etc., would still be far less than that of doing purely raster. Again, the core issue comes down to devs still working on both raster and RT, which is why they aren't seeing the real benefits yet.

The reflections in Metro EE were the least impressive part for me tbh; it was more the lighting, shadows and GI that set it apart.
 
Was thinking of doing it pretty much any time now, but could hold off maybe 6-9 months if we are defo getting new stuff by then.
I would hold off as the new Intel and AMD CPUs and Chipsets are coming later this year along with new AMD and Nvidia GPUs, between October and December, and the 5090 is looking like it will top the 4090 by a good margin.
 

Tricky one, because I'm not unhappy with my 4090 even at 4K, but the temptation of new shiny is ever present. Not that this is new to anyone who owns a PC - the whole "does it make sense" conversation we have before throwing money at something.
 
I do ask myself on a regular basis if I made the right decision on the 4080, seeing what the 4090 can do. It's a difficult one for sure; some days I wish I had got the 4090 for the raw performance, but then I see the price and still just can't justify it to myself. Though I will heavily consider the 5090 when it's out.
 
What just happened?
4070 Super below MSRP weeks after release?

The whole thing is a mess, with regular 4070s more expensive than some 4070 Supers etc., and by and large it seems no one really wants them.

Nvidia have completely stuffed up this generation. They might have done OK financially, I dunno, but they've left a lot of money on the table that they could have picked up.
 
Trouble is, existing 4070 stock will have been bought at a set price, and that price will dictate current pricing. Nvidia probably don't want to refund a discount on what might be a million 4070s sitting in retail inventory; Nvidia probably expect those retailers to take the losses.
 

Yup, just ask the competitor still holding onto 6900 XT stock; they have an Asus LC 6900 XT that's come down in price, but it's still a pointless buy when you can get a much faster 7900 XT for not much more.
 

Not good, is it?

"Here is our new GPU, it's £600 to you if you buy 5,000 of them, RRP is £660, that gives you 10% margins."
Eight months later... "How many of those 5,000 GPUs we sold you for £600 a pop do you have left? They're overpriced and you can't sell them? We know, that's why the RRP on those is now £550, have fun with that..."
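A quick sketch of the arithmetic in that hypothetical, using only the figures quoted above - illustrative numbers, not real Nvidia or retailer pricing:

[CODE]
def markup_pct(cost, rrp):
    # Retailer margin expressed as markup over the buy-in cost.
    return (rrp - cost) / cost * 100

cost = 600.0          # retailer buys each GPU at £600
launch_rrp = 660.0    # launch RRP of £660 -> the "10% margins" in the post
cut_rrp = 550.0       # RRP dropped to £550 eight months later

print(f"At launch RRP: {markup_pct(cost, launch_rrp):+.1f}%")
print(f"After RRP cut: {markup_pct(cost, cut_rrp):+.1f}% (now selling below cost)")
[/CODE]

At the launch RRP the retailer clears the quoted 10% markup; after the cut they would be selling roughly 8% below what they paid.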
 
I'm defo holding out for the 50xx now. The 30xx has been doing pretty well in some of the latest titles I've been playing, including UE5 ones (beating AMD's 7800 XT by a substantial amount too, and that's without RT), and FG + good upscaling IQ has made this even more possible :cool:

Good lad. Wait and you will get a much better deal. Also, having waited so long, you may feel like treating yourself to a 5090/80 rather than a 5070.
 
Exactly.

If the 5090 doesn't take the **** price-wise, i.e. <£1500, I may just go for it, as I can't see Nvidia making the same mistake they did with the 3080 vs 3090 again in terms of perf and price.
 