Ratchet and Clank: Rift Apart RDNA 2 Ray Tracing

I looked into the PS2 player count and it comes with a lot of ifs and buts: developer restrictions, specific custom servers for the highest player counts, etc. But if it was stable at more than 128 players, that just makes it more suspicious that COD and BF haven't brought this setting to PC. I did not mention RT as PC-specific tech because it was already in consoles before it got any decent support on PC, effectively making it console tech from the beginning. You must remember how there weren't even tech demos available for people to run (except for the press) on their 2080 Tis for months after its launch. Also, consoles are/were not necessarily dependent on progress in PC gaming hardware. That might be true now, but the PS3, if I remember correctly, had quite a different architecture from x86/x64 and was in some ways better than PC hardware for some time after its release (in some scenarios). Now you could say that the PS3 had an Nvidia chip, but the point is that it was customized and not a close copy of PC GPU tech.

As for the industry maturing, it's difficult to say how much of that is organic and how much is just deliberate choices by devs/publishers/MS/Sony, etc.

PS2 has a high player-per-instance count because it makes loads of trade-offs versus other multiplayer games, like changing the player-server tickrate depending on your distance from the enemy, and a whole bunch of other things like that. PS1 showed the player ping/tickrate on the HUD if you enabled it, and I think it went up to 2 seconds (2000ms) for idle players or players out of combat. Anyway, the point is that extremely large player counts are possible but require specialist optimization, which is why few devs do it. The resource cost also grows non-linearly; it's roughly a power law, because the more players you have, the more players each individual client has to be updated about, so CPU and bandwidth grow much faster than the player count and prevent huge player counts unless you do a lot of aggressive optimization. We've had 100-player FPS games going back decades, but they're rare and they make a lot of trade-offs to get there.
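To put rough numbers on that scaling argument (a simple sketch, not how any particular engine actually does it; the tick rates and distance buckets below are made-up purely for illustration):

```python
# Rough sketch of why per-instance player counts explode server cost.
# All numbers and thresholds are illustrative, not from any real game.

def naive_updates_per_second(players, tickrate=30):
    # Every tick, every client is told about every other client.
    return players * (players - 1) * tickrate

def interest_managed_updates(players, near=0.1, mid=0.3, tickrate=30):
    # Distance-based tickrate: only a small fraction of other players are
    # "near" (full rate), some are "mid" (reduced rate), and the rest are
    # far/idle and updated very rarely (e.g. every ~2 seconds, i.e. 0.5 Hz).
    others = players - 1
    per_client = (others * near * tickrate            # full-rate updates
                  + others * mid * (tickrate / 3)     # reduced-rate updates
                  + others * (1 - near - mid) * 0.5)  # ~2000 ms updates
    return int(players * per_client)

for n in (64, 128, 300, 1000):
    print(n, naive_updates_per_second(n), interest_managed_updates(n))
# Doubling the player count roughly quadruples the naive cost, which is the
# non-linear growth described above; interest management only softens it.
```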

Advancement in RT is PC tech. Nvidia pioneered this with Microsoft to introduce RT into the DirectX API via DXR, and AMD played catch-up to implement it in their PC hardware LAST gen, and that last-gen PC hardware is the basis for the new console hardware. That is why they're so bad at RT, and why any kind of RT implementation requires a lot of "optimization", or in plain English, lowering of quality to get it to work.
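To give a feel for what "lowering of quality" usually means in practice, here is a back-of-the-envelope ray-budget sketch; the resolutions and ray counts are illustrative assumptions, not figures from any particular game:

```python
# Back-of-the-envelope ray budget. Illustrative numbers only.

def rays_per_frame(width, height, rays_per_pixel, res_scale=1.0):
    # res_scale < 1.0 means the RT effect is rendered at reduced resolution
    # (e.g. quarter-res reflections) and upscaled/denoised afterwards.
    return int(width * height * res_scale * rays_per_pixel)

full = rays_per_frame(3840, 2160, rays_per_pixel=2)                  # "naive" 4K
cut = rays_per_frame(3840, 2160, rays_per_pixel=1, res_scale=0.25)   # trimmed

print(f"{full:,} vs {cut:,} rays per frame")   # ~16.6M vs ~2.1M
# Dropping to 1 ray per pixel at quarter resolution cuts the ray count by
# roughly 8x, which is the kind of trade-off hidden behind "optimization".
```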

Each console lifecycle is about 6-8 years; each PC GPU cycle is about 2 years. So from that alone you know that the cycle of buying hardware and the re-investment back into R&D for graphics is happening on the PC. When the consoles launch, they approach Nvidia and AMD to ask what they can provide, and it's always some slightly customized version of their LAST-gen hardware. The consoles aren't spending loads of money to provide next-gen graphics; they're borrowing from old PC tech. If the consoles wanted to outpace the PC and provide brand-new, improved visuals unique to them, they'd need to charge massive premiums on the console hardware to cover the R&D needed to outpace Nvidia and AMD. But their target audience is budget gamers, so they have to stick with copying old, last-gen hardware.

That's fine, I don't have a problem with it, most casual gamers don't care, but let's at least be honest about it.
 
PS2 has a high player-per-instance count because it makes loads of trade-offs versus other multiplayer games, like changing the player-server tickrate depending on your distance from the enemy, and a whole bunch of other things like that. PS1 showed the player ping/tickrate on the HUD if you enabled it, and I think it went up to 2 seconds (2000ms) for idle players or players out of combat. Anyway, the point is that extremely large player counts are possible but require specialist optimization, which is why few devs do it. The resource cost also grows non-linearly; it's roughly a power law, because the more players you have, the more players each individual client has to be updated about, so CPU and bandwidth grow much faster than the player count and prevent huge player counts unless you do a lot of aggressive optimization. We've had 100-player FPS games going back decades, but they're rare and they make a lot of trade-offs to get there.
The point is that modern PC hardware can do it, but we don't have modern games supporting it. Not all gamers have the best of the best tech, but that has always been true. The fact that almost nobody was able to run Crysis (at Very High settings) when it came out doesn't mean the best PC hardware of the time couldn't do it.
Advancement in RT is PC tech. Nvidia pioneered this with Microsoft to introduce RT into the DirectX API via DXR, and AMD played catch-up to implement it in their PC hardware LAST gen, and that last-gen PC hardware is the basis for the new console hardware. That is why they're so bad at RT, and why any kind of RT implementation requires a lot of "optimization", or in plain English, lowering of quality to get it to work.
Well, almost everything is PC tech if you want to be pedantic about it. BTW, RT as a technique has been around for a long time; it's just that the modern hardware implementation in GPUs was absent. And like I said, it was effectively console tech, as we got nothing out of it on PC for almost two years.
Also, you are wrong about RT being bad on AMD. Taking various benchmarks, which include many Nvidia-optimized RT titles as opposed to AMD ones, Nvidia is around 32% faster on average and around 26% faster in minimums. If RT is bad on AMD, then every card below the RTX 3080 would have to be called bad for RT.
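For what it's worth, these multi-game averages are usually built as a geometric mean of per-game ratios, which stops one outlier title from dominating the result. A minimal sketch of the arithmetic, with made-up FPS numbers purely to show how it works (not real benchmark data):

```python
from math import prod

# Hypothetical per-game RT results (FPS); placeholder numbers only.
results = {
    "Game A": (60, 45),   # (Nvidia, AMD)
    "Game B": (80, 62),
    "Game C": (40, 33),
}

ratios = [nv / amd for nv, amd in results.values()]
geomean = prod(ratios) ** (1 / len(ratios))
print(f"Nvidia ahead by {geomean - 1:.0%} on (geometric) average")
# Whether that overall gap means "AMD is bad at RT" is a separate question
# from whether the gap exists; it also depends heavily on which titles and
# settings go into the average.
```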


Each console lifecycle is about 6-8 years; each PC GPU cycle is about 2 years. So from that alone you know that the cycle of buying hardware and the re-investment back into R&D for graphics is happening on the PC. When the consoles launch, they approach Nvidia and AMD to ask what they can provide, and it's always some slightly customized version of their LAST-gen hardware. The consoles aren't spending loads of money to provide next-gen graphics; they're borrowing from old PC tech. If the consoles wanted to outpace the PC and provide brand-new, improved visuals unique to them, they'd need to charge massive premiums on the console hardware to cover the R&D needed to outpace Nvidia and AMD. But their target audience is budget gamers, so they have to stick with copying old, last-gen hardware.

That's fine, I don't have a problem with it, most casual gamers don't care, but let's at least be honest about it.

There is nothing dishonest about saying that modern PC hardware is not being utilized to its fullest extent. You keep bringing up various reasons why that is, and those are self-evident and something I have acknowledged a few times. None of that changes the fact that modern PC hardware is not being utilized to its full, or even close to its full, capacity in games.
 
Consoles don't use last-generation technology - if you look at unified shaders, they appeared on the Xbox 360 GPU first. It's mostly current-generation technology and maybe even aspects of future GPUs. SMT in the consumer space appeared on consoles first, IIRC. The PS4 GPU was more advanced than the PC GCN GPUs of its era, the Xbox Series X GPU apparently has aspects not found in RDNA2, etc. If you go back further, before the x86 consoles, the N64, etc. were definitely more advanced than the PCs of the era.

Also, if you really want to look at what most gamers have, look at the Steam Hardware Survey. There's only one Ampere GPU, at number 20 (the RTX 3070). The second-fastest GPU in the top 20 is the RTX 2070 Super. That means the consoles are faster and more advanced than most PCs on Steam. Due to miners and lack of capacity, the mainstream is stuck on older and slower GPUs.
 
Consoles don't use last-generation technology - if you look at unified shaders, they appeared on the Xbox 360 GPU first. It's mostly current-generation technology and maybe even aspects of future GPUs. SMT appeared on consoles first, IIRC. The PS4 GPU was more advanced than the PC GCN GPUs of its era, the Xbox Series X GPU apparently has aspects not found in RDNA2, etc. If you go back further, before the x86 consoles, the N64, etc. were definitely more advanced than the PCs of the era.
Exactly. The most we can say is that old consoles had unusual architectures that are difficult to place exactly against PC GPUs/CPUs. I don't know about SMT/Hyper-Threading being introduced on consoles first, though.
 
Exactly. The most we can say is that old consoles had unusual architectures that are difficult to place exactly against PC GPUs/CPUs. I don't know about SMT/Hyper-Threading being introduced on consoles first, though.

The console hardware is semi-custom, and there is evidence that at least part of AMD's GPU R&D is prioritised towards consoles. GCN 1.1 and RDNA2 seem to have been partly console-focused - for example, with RDNA2, AMD talked about engineering the RT aspects for maximum area efficiency. If you look at the GCN 1.1 GPUs, they had very good area efficiency for compute. Console SoCs are engineered for area efficiency to keep die sizes to a minimum.

In the consumer space the consoles introduced SMT years before Intel Nehalem and Intel Atom. It existed in Enterprise hardware first.
 
The point is that modern PC hardware can do it, but we don't have modern games supporting it. Not all gamers have the best of the best tech, but that has always been true. The fact that almost nobody was able to run Crysis (at Very High settings) when it came out doesn't mean the best PC hardware of the time couldn't do it.

.......


There is nothing dishonest about saying that modern PC hardware is not being utilized to its fullest extent. You keep bringing up various reasons why that is, and those are self-evident and something I have acknowledged a few times. None of that changes the fact that modern PC hardware is not being utilized to its full, or even close to its full, capacity in games.

When devs worked with Nvidia on games like Control or Metro, there was no reason at all why they couldn't use PC hardware to the fullest. Control was a PC exclusive for a while, it's a showcase for RT, and it was developed on RTX 2000 series cards. A 2080 Ti is more powerful than the PS5 GPU, yet the first PS5 games like Ratchet and Clank look better than any Nvidia-partnered PC exclusive.
 
Also, you are wrong about RT being bad on AMD. Taking various benchmarks, which include many Nvidia-optimized RT titles as opposed to AMD ones, Nvidia is around 32% faster on average and around 26% faster in minimums. If RT is bad on AMD, then every card below the RTX 3080 would have to be called bad for RT.


With regard to RDNA2 being bad at RT, he was 100% correct. The problem with Tom's 10-game average is that at least 9 of the games use a hybrid engine and 8 of them use only a subset of what RT can do. It's also worth noting that no settings were listed, e.g. we don't know if CP2077 was run with SSR off (which turns RT reflections on) or with Psycho GI.

DF did a decent job of comparing the 3080 with the 6800XT.

Of course you could also consider -
DirectX Raytracing Feature Test

1 GPU
  1. Score 65.23, GPU 3090 @2250/5512, CPU 10900k @5.3, Post No.0491, Jay-G25 - Link Drivers 460.89
  2. Score 65.20, GPU 3090 @2235/5528, CPU 10900k @5.3, Post No.0545, spartapee - Link Drivers 466.27
  3. Score 64.34, GPU 3090 @2235/5344, CPU 7820X @4.7, Post No.0489, anihcniedam - Link Drivers 460.89
  4. Score 63.87, GPU 3090 @2205/5328, CPU 6950X @4.401, Post No.0496, FlyingScotsman - Link Drivers 460.89
  5. Score 63.14, GPU 3090 @2265/4876, CPU 5950X @4.8, Post No.0462, OC2000 - Link Drivers 460.79
  6. Score 62.98, GPU 3090 @2205/5328, CPU 9900KF @5.0, Post No.0379, spartapee - Link Drivers 457.09
  7. Score 62.38, GPU 3090 @2160/4976, CPU 9900k @5.0, Post No.0480, Raiden85 - Link Drivers 460.89
  8. Score 62.24, GPU 3090 @2160/5276, CPU 5950X @4.9, Post No.0542, redkrptonite - Link Drivers 466.27
  9. Score 61.68, GPU 3090 @2130/5076, CPU 5950X @4.949, Post No.0531, Grim5 - Link Drivers 466.11
  10. Score 61.61, GPU 3090 @2115/5128, CPU 9980XE @4.5, Post No.0487, Greebo - Link Drivers 460.89
  11. Score 60.23, GPU 3090 @2145/5176, CPU 3175X @4.8, Post No.0415, sedy25 - Link Drivers 457.30
  12. Score 58.58, GPU 3090 @2100/5276, CPU 3600X @4.4, Post No.0445, Bickaxe - Link Drivers 457.51
  13. Score 55.57, GPU 3090 @1980/4876, CPU 5950X @4.1, Post No.0429, Kivafck - Link Drivers 457.30
  14. Score 55.57, GPU 3090 @1995/4876, CPU 10900k @5.1, Post No.0357, Sedgey123 - Link Drivers 457.09
  15. Score 55.50, GPU 3090 @2085/5076, CPU 3800X @4.7, Post No.0450, ChrisUK1983 - Link Drivers 457.51
  16. Score 55.47, GPU 3090 @2040/4876, CPU 5900X @3.7, Post No.0423, atomic7431 - Link Drivers 457.30
  17. Score 54.39, GPU 3090 @1905/5176, CPU 10900k @5.2, Post No.0446, kipperthedog - Link Drivers 457.51
  18. Score 52.24, GPU 3080 @2235/5252, CPU 3900X @4.649, Post No.0413, haszek - Link Drivers 457.09
  19. Score 50.56, GPU 3080 @2145/5248, CPU 3600 @4.4, Post No.0411, TNA - Link Drivers 457.30
  20. Score 34.15, GPU 6900X @2625/4280, CPU 5800X @5.049, Post No.0477, 6900 XT - Link Drivers 20.12.2
  21. Score 33.31, GPU 3070 @2085/4050, CPU 3175X @4.12, Post No.0392, sedy25 - Link Drivers 457.09
  22. Score 32.54, GPU 2080Ti @2130/3500, CPU 3950X @4.301, Post No.0357, Grim5 - Link Drivers 452.06
  23. Score 29.91, GPU 2080Ti @1980/3500, CPU 8700 @4.3, Post No.0391, Quartz - Link Drivers 456.55
  24. Score 23.96, GPU 6800 @2295/4220, CPU 3900X @4.541, Post No.0459, Chrisc - Link Drivers 20.12.1
  25. Score 21.36, GPU 2080 @2025/4050, CPU 9900k @5.0, Post No.0365, Cooper - Link Drivers 457.09

Tom's Hardware also gave the results with DLSS -
And their recommendation for RT -
Ray Tracing Winner: Nvidia, by a lot
 
With regard to RDNA2 being bad at RT, he was 100% correct. The problem with Tom's 10-game average is that at least 9 of the games use a hybrid engine and 8 of them use only a subset of what RT can do. It's also worth noting that no settings were listed, e.g. we don't know if CP2077 was run with SSR off (which turns RT reflections on) or with Psycho GI.

DF did a decent job of comparing the 3080 with the 6800XT.

Of course you could also consider -


Tom's Hardware also gave the results with DLSS -

And their recommendation for RT -
My comment was a reply to "they're so bad at RT". I never said that RT is better on AMD. Perhaps a bit of careful reading is required on your part before going on the defensive and quoting specifically chosen RT benchmarks and conclusions in bold letters. BTW, do you think questioning Tom's Hardware over their RT benchmarks and then quoting the same article to declare Nvidia the winner in RT speaks of some cognitive dissonance or bias on your part?
 
My comment was a reply to "they're so bad at RT". I never said that RT is better on AMD. Perhaps a bit of careful reading is required on your part before going on the defensive and quoting specifically chosen RT benchmarks and conclusions in bold letters. BTW, do you think questioning Tom's Hardware over their RT benchmarks and then quoting the same article to declare Nvidia the winner in RT speaks of some cognitive dissonance or bias on your part?

They also conveniently missed the DF video, where they estimated that the PS5 had RT performance as good as (or better than) an RX 6800 series GPU. IIRC, even they thought it was an optimisation or driver issue on the AMD side. Everyone knows Nvidia has had two years of developers optimising for their designs; now, as AMD gets its hardware out there, the performance gap will decrease. Then you have instances like WoW and the last RE game, where AMD wasn't doing too badly either. Look at the latest UE5 RT test - AMD is ahead.

But what performs better in the desktop space means nothing when most gaming PCs in existence are weaker than a console in both rasterised and RT performance. Lots of gamers can't even find an RTX-capable GPU of decent performance, let alone find a few £100 extra.

The fact is, PC hardware enthusiasts, having paid more and more for parts, often well above RRP, can't bear to see a cheaper console (even in its scalped form) doing pretty decently for the price. It's more a case of slight buyer's regret on their part, IMHO, so they need to nitpick the performance of systems which cost less than even an RTX 3060!

The thing is, barely 9 months after the launch of the PS5/Xbox Series X, we have games like the one mentioned in the OP. Now consider that the Xbox Series X has more CPU and GPU power. Once MS actually gets off their arse, we are going to see even more impressive-looking titles.

Until GPU RRPs come back to normal, you are probably looking at spending 3x the cost of a console to have something with similar capabilities (a Ryzen 7 3700X and a £700+ RX 6700 XT).
 
My comment was a reply to "they're so bad at RT". I never said that RT is better on AMD. Perhaps a bit of careful reading is required on your part before going on the defensive and quoting specifically chosen RT benchmarks and conclusions in bold letters. BTW, do you think questioning Tom's Hardware over their RT benchmarks and then quoting the same article to declare Nvidia the winner in RT speaks of some cognitive dissonance or bias on your part?

Well your previous comment was what I replied to -
Also, you are wrong about RT being bad on AMD.

I simply posted facts that prove otherwise. There was nothing defensive.
speaks of some cognitive dissonance or bias on your part?

Really? Back to this BS as you have nothing of any relevance to say on the subject? The mods have already spoken about this behaviour.
 
Well your previous comment was what I replied to -
That's why context is important and quote mining is bad.
I simply posted facts that prove otherwise. There was nothing defensive.
Sure, whatever helps you sleep at night.

Really? Back to this BS as you have nothing of any relevance to say on the subject? The mods have already spoken about this behaviour.
Back to what? I never conversed with you before (except one reply on a different topic in a different thread almost two weeks ago). Again, is this a reading comprehension issue, or are you confusing me with someone else? The only one sporting BS and troll behavior is you, hijacking a conversation to spout out-of-context nonsense and trying to back it up with mod threats. You are ignored.
 
Back to what? I never conversed with you before. Again, is this a reading comprehension issue, or are you confusing me with someone else? The only one sporting BS and troll behavior is you, hijacking a conversation to spout out-of-context nonsense and trying to back it up with mod threats. You are ignored.

It was a generalisation based on your reply. Again, you have nothing to say on the subject of RT, so you throw insults followed by a tantrum :cry:
 
They also conveniently missed the DF video, where they estimated that the PS5 had RT performance as good as (or better than) an RX 6800 series GPU. IIRC, even they thought it was an optimisation or driver issue on the AMD side. Everyone knows Nvidia has had two years of developers optimising for their designs; now, as AMD gets its hardware out there, the performance gap will decrease. Then you have instances like WoW and the last RE game, where AMD wasn't doing too badly either. Look at the latest UE5 RT test - AMD is ahead.

But what performs better in the desktop space means nothing when most gaming PCs in existence are weaker than a console in both rasterised and RT performance. Lots of gamers can't even find an RTX-capable GPU of decent performance, let alone find a few £100 extra.

The fact is, PC hardware enthusiasts, having paid more and more for parts, often well above RRP, can't bear to see a cheaper console (even in its scalped form) doing pretty decently for the price. It's more a case of slight buyer's regret on their part, IMHO, so they need to nitpick the performance of systems which cost less than even an RTX 3060!

The thing is, barely 9 months after the launch of the PS5/Xbox Series X, we have games like the one mentioned in the OP. Now consider that the Xbox Series X has more CPU and GPU power. Once MS actually gets off their arse, we are going to see even more impressive-looking titles.

Until GPU RRPs come back to normal, you are probably looking at spending 3x the cost of a console to have something with similar capabilities (a Ryzen 7 3700X and a £700+ RX 6700 XT).
Agreed for the most part. AMD is actually doing quite well in many titles. I was most impressed by Metro EE, an Nvidia-sponsored RT showcase title where the RX 6900 XT almost matches the RTX 3080 in the supplied benchmark. That benchmark is supposed to be heavier than most of the game, and the RX 6900 XT only loses to the RTX 3080 in very RT-limited custom scenarios. The reason is that the whole scene isn't made of RT effects alone, and a combination of RT + raster seems to work better for AMD.
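That intuition can be put roughly into numbers with a simple Amdahl-style sketch; the millisecond figures below are invented purely to illustrate the effect, not measured from Metro EE or any real GPU:

```python
# Why a GPU that is slower at RT can still come close overall when only
# part of the frame is ray traced. All numbers are hypothetical.

def frame_time(raster_ms, rt_ms, rt_slowdown=1.0):
    # rt_slowdown > 1.0 means this GPU takes that much longer on the RT work.
    return raster_ms + rt_ms * rt_slowdown

gpu_a = frame_time(raster_ms=10.0, rt_ms=4.0)                   # baseline
gpu_b = frame_time(raster_ms=10.0, rt_ms=4.0, rt_slowdown=1.5)  # 50% slower RT

print(f"{gpu_a:.1f} ms vs {gpu_b:.1f} ms "
      f"-> only {gpu_b / gpu_a - 1:.0%} slower overall")
# A 50% deficit in the RT portion shrinks to ~14% for the whole frame when
# RT is only ~30% of the frame time; the more RT-heavy the scene, the more
# that deficit shows up in the final frame rate.
```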

I am not bothered by consoles being better price/performance-wise as long as my GPU horsepower is noticeably better than theirs. I am a little ****** out about the lack of optimization on PC and the lack of proper utilization of my $1000 GPU :mad:.
 
Agreed for the most part. AMD is actually doing quite well in many titles. I was most impressed by Metro EE, an Nvidia-sponsored RT showcase title where the RX 6900 XT almost matches the RTX 3080 in the supplied benchmark. That benchmark is supposed to be heavier than most of the game, and the RX 6900 XT only loses to the RTX 3080 in very RT-limited custom scenarios. The reason is that the whole scene isn't made of RT effects alone, and a combination of RT + raster seems to work better for AMD.

I am not bothered by consoles being better price/performance-wise as long as my GPU horsepower is noticeably better than theirs. I am a little ****** out about the lack of optimization on PC and the lack of proper utilization of my $1000 GPU :mad:.

The price/performance aspect is important, as developers will only push effects their target market can run. The problem is, if most gamers can't actually get hold of an RT-capable GPU, let alone one no worse than a console, then most PC games will stay essentially rasterised. Developers want to maximise their sales - a lot of the PC games pushing RT effects are partly down to AMD/Nvidia pushing them to sell more GPUs. It's like tessellation, etc. - you need sufficient market penetration of hardware that can support those features. ATM you can see the issue on Steam: hardly any RT-capable GPUs in the top 20, and most are not going to really beat a console. The GTX 1060 is still top of the chart after two new generations of Nvidia mainstream GPUs.

So even for you, with your high-end GPU, having more mainstream RT-capable GPUs out there actually increases the chance those effects get incorporated into more games. It also means more developers will actually spend time doing it properly.
 
The price/performance aspect is important, as developers will only push effects their target market can run. The problem is, if most gamers can't actually get hold of an RT-capable GPU, let alone one no worse than a console, then most PC games will stay essentially rasterised. Developers want to maximise their sales - a lot of the PC games pushing RT effects are partly down to AMD/Nvidia pushing them to sell more GPUs. It's like tessellation, etc. - you need sufficient market penetration of hardware that can support those features. ATM you can see the issue on Steam: hardly any RT-capable GPUs in the top 20, and most are not going to really beat a console. The GTX 1060 is still top of the chart after two new generations of Nvidia mainstream GPUs.

So even for you, with your high-end GPU, having more mainstream RT-capable GPUs out there actually increases the chance those effects get incorporated into more games. It also means more developers will actually spend time doing it properly.
I know, I know. Those are all logical reasons, which I'm also aware of, but I just lament the loss of creativity and the outlandish steps (for their time) that some companies took in the past to push PC hardware. Advances made in a game like Assassin's Creed Unity, with its great lighting and crowd density, were sacrificed at the altar of consoles (in newer AC games) because they couldn't handle it.
 
I know, I know. Those are all logical reasons, which I'm also aware of, but I just lament the loss of creativity and the outlandish steps (for their time) that some companies took in the past to push PC hardware. Advances made in a game like Assassin's Creed Unity, with its great lighting and crowd density, were sacrificed at the altar of consoles (in newer AC games) because they couldn't handle it.

That is because the big publishers just wanted to push minimal-effort, lowest-common-denominator jobs to attract as many sales as possible. Look at Red Faction and its environmental destruction, or the early Far Cry games with their dynamic weather and fire models. Most modern games have barely improved on this, or still have very static environments. Game AI has barely improved in the last decade either.

Then you had Crytek, who really pushed technology forward, only for PC gamers to complain about Crysis - in the same period where we had excellent-value entry-level and mainstream GPUs such as the HD 3850, 8800 GT, etc.

Then, with Nvidia/AMD deciding to slow down improvements in the mainstream once they became more concerned with the accountants, PCs didn't end up having as big an advantage as they should have. The GTX 1060 is a 5-year-old GPU and shouldn't still be at number one after so long.

I still remember how fantastic mainstream/lower-end enthusiast GPUs were - literally previous-generation high-end performance for a much cheaper price. This is why consoles couldn't compete for many years - the PC mainstream just pushed forward at too fast a rate.

Even look at monitors - resolutions have stagnated. This is because there is stagnation in mainstream improvements, so we still have lots of 1080p monitors even now. 1080p wasn't that special even 10 years ago!

It also didn't help that Intel was partly to blame too. We were stuck on quad cores for a decade, so even as GPU power increased we were held back by a lack of cores. There is only so much you can do if everything from the graphics threads, AI threads, etc. is competing for the same cores. We went from single-core CPUs to dual cores to quad cores in a few years, yet the first mainstream CPU platform to move to 6 cores was AM4. Sure, you had the Core i7 5820K on X99, but that was hardly a mainstream platform. I would argue the only reason we even have multithreaded engines now is because the previous-generation consoles had relatively weak CPUs with lots of threads.
 
That is because the big publishers just wanted to push minimal-effort, lowest-common-denominator jobs to attract as many sales as possible. Then you had Crytek, who really pushed technology forward, only for PC gamers to complain about Crysis - in the same period where we had excellent-value entry-level and mainstream GPUs such as the HD 3850, 8800 GT, etc. Then, with Nvidia/AMD deciding to slow down improvements in the mainstream once they became more concerned with the accountants, PCs didn't end up having as big an advantage as they should have. I still remember how fantastic mainstream/lower-end enthusiast GPUs were - literally previous-generation high-end performance for a much cheaper price. This is why consoles couldn't compete for many years - the PC mainstream just pushed forward at too fast a rate.

It also didn't help that Intel was partly to blame too. We were stuck on quad cores for a decade, so even as GPU power increased massively we were held back by a lack of cores. There is only so much you can do if everything from the graphics threads, AI threads, etc. is competing for the same cores. I would argue the only reason we even have multithreaded engines now is because the previous-generation consoles had relatively weak CPUs with lots of threads.
Tell me about it. The Nvidia FX 5200, 9600 GT, and AMD HD 6850 were all just a tier below the top end yet provided fantastic price/performance. I don't think PC gamers complained about Crysis after the initial shock. When Crysis 2 came out, many PC gamers derided it for being a console game through and through. I remember it was Crytek who called PC gamers pirates and went full throttle for consoles. I don't think Nvidia/AMD stopped improving GPUs - tessellation, PhysX, VRR, high-refresh gaming, more GPU RAM, etc.; they were pushing these things one by one.

About Intel, completely agree. Though I think karma has come down hard on them.
 
Tell me about it. The Nvidia FX 5200, 9600 GT, and AMD HD 6850 were all just a tier below the top end yet provided fantastic price/performance. I don't think PC gamers complained about Crysis after the initial shock. When Crysis 2 came out, many PC gamers derided it for being a console game through and through. I remember it was Crytek who called PC gamers pirates and went full throttle for consoles. I don't think Nvidia/AMD stopped improving GPUs - tessellation, PhysX, VRR, high-refresh gaming, more GPU RAM, etc.; they were pushing these things one by one.

About Intel, completely agree. Though I think karma has come down hard on them.

Crysis didn't sell that well - people just complained they needed to buy new parts and didn't bother. AFAIK, they invested a ton of money in Crysis and never made as much from it as they needed to. Crytek went to consoles because consoles gave them a guaranteed revenue stream at the time. The way PC gamers treated Crytek, who were one of the most PC-focused gaming companies, was the point where more and more developers started exploring console revenue, IMHO. If Crytek couldn't make money from a PC-first technological approach, nobody else could. The only really big AAA, non-crowdfunded, PC-focused titles after that point were MMOs. Even Bethesda, etc. moved their games over to consoles too.

AMD/Nvidia have slowed down improvements at the entry level and mainstream over the last couple of years. The top end has improved, but it has also gone up in cost since the Fermi days - buyers willing to throw lots of money at GPUs haven't noticed it, but mainstream and entry level are far more price-orientated price points. You can see this by looking at the 60-series GPUs and the AMD equivalents: they are basically lower and lower down the stack and at the same time have got more and more expensive, hence things such as the RTX 2060 to RTX 3060 improvement being rubbish. It's why GPUs such as the R9 290/GTX 970, which were lower-end enthusiast parts (but eventually ended up at a modern mainstream price point), lasted so long.
 