There will be meltdowns, but not for the reason you think. Some of you guys are going to have serious meltdowns as UE5.5+ games in coming years will default to hardware ray tracing for Lumen.
They'll upgrade.
Yep, the mid-range cards have always been the most popular by far. People who spend £500+ on a GPU are the minority.
If a game can't run on them, it just won't sell very well.
I've said it before and I'll say it again. Graphics fidelity needs to go on the back burner for a bit, and more focus needs to move to NPC AI, physics and sound, with proper gameplay still holding a clear top spot. Games have looked good for a while now, but all the other areas are advancing rather slowly IMHO. At least slower than I'd like.
AI in games has gone backwards even compared to the 90s. Half-Life 1 still has better AI than modern FPSs. I don't think you could even call it AI in the majority of games; they just stand there and attack or follow set instructions. The last good AI I've seen was probably Alien Isolation, 10 years ago.
With things like ChatGPT etc., you'd think someone would have found a way to integrate that into a game by now. Someone made a Skyrim mod, but that's about as far as it's got.
There doesn't seem to be enough talent left in the games industry to raise the bar now.
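For what it's worth, those Skyrim mods work roughly like this: ship the NPC's persona and the player's line to a language-model endpoint, then pipe the reply into TTS and subtitles. Here's a minimal sketch, assuming a hypothetical OpenAI-compatible server on localhost; the URL, persona and payload details are illustrative, not any specific mod's code:

```python
import requests

# Hypothetical local endpoint in the common OpenAI-compatible chat format;
# the URL and persona below are made up for illustration.
LLM_URL = "http://localhost:8080/v1/chat/completions"

def npc_reply(persona: str, player_line: str) -> str:
    """Ask the model to answer in character; a real mod would feed the
    text on to a TTS voice and the game's subtitle system."""
    payload = {
        "messages": [
            {"role": "system",
             "content": f"You are {persona}. Stay in character and keep replies short."},
            {"role": "user", "content": player_line},
        ],
        "max_tokens": 80,  # spoken dialogue needs to stay brief
    }
    resp = requests.post(LLM_URL, json=payload, timeout=10)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

print(npc_reply("a grumpy blacksmith in Whiterun", "Any work going?"))
```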
As much as I love my graphics, I completely agree with this.
AI would be processed by the CPU and stored in system RAM, not the GPU.
To what exactly? An RTX 5070? They will be buying the same 50 and 60 series cards - many of these will be laptops, so probably not that great. The CPUs are also slower.
Steam proves this, as do all other measurements. There is also the fact more people are buying gaming laptops, which is supported by market numbers too.
The issue with enthusiasts on tech forums is that many of them are absolutely clueless about mainstream cards.
Firstly, half of dGPUs are in laptops, and the laptop versions of these cards perform much worse. Many even have less VRAM: the laptop RTX 3060 had only 6GB.
Secondly, mainstream gamers are price sensitive. Look at the Steam top 10: it's mostly made up of cards under £400. Just because people on this forum might be willing to up their dGPU budget by huge amounts each generation, that's not the reality at all.
Thirdly, this has been the performance jump since 2018:
1.) RTX 2060:
20% over the GTX 1070, which launched at around the same price.
2.) GTX 1660 Ti:
33% over a GTX 1060.
3.) RTX 3060:
20% generational improvement after two years.
4.) RTX 4060:
Under 25% generational improvement. Ray tracing improvements are at 20% to 30% too.
So we are averaging between 20% and 30% improvement with each new generation.
The problem is Nvidia and AMD mainstream cards are not great. The RTX 4060 is only 50% faster in raster and 60% faster in RT than an RTX 2060 after six years. An RX 7600 is probably around the same over an RX 5600 XT.
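As a quick sanity check on how those figures compound (using this post's own rough percentages, not measured benchmarks), here's a minimal sketch:

```python
# Compound the claimed per-generation uplifts from the list above.
# Percentages are the rough figures from this post, not benchmarks.
gains = [
    ("GTX 1070 -> RTX 2060", 0.20),
    ("RTX 2060 -> RTX 3060", 0.20),
    ("RTX 3060 -> RTX 4060", 0.25),
]

total = 1.0
for step, uplift in gains:
    total *= 1.0 + uplift
    print(f"{step}: +{uplift:.0%}, cumulative x{total:.2f}")

# RTX 2060 -> RTX 4060 alone: 1.20 * 1.25 = x1.50, i.e. the "only 50%
# faster after six years" figure quoted above.
```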
That means an RTX 5060, unless it's a new 8800GT, will be slower than an RTX 4070 and have less VRAM.
An RTX 4070 already struggles in newer games with RT, let alone AMD cards.
If the people on here obsessed with RT want more intense RT and PT everywhere, it will need:
1.) Nvidia to actually take the mainstream seriously, especially as they sell over 80% of dGPUs.
2.) AMD to improve its RT.
3.) Intel to step up.
4.) Consoles to step up too.
5.) Devs to actually optimise games properly.
If not, all you will have in 99% of RT games is economy RT implementations using FG for "optimisation" instead of doing a proper job. Wheeling out a few examples of games which might do otherwise won't prove that most games will push things.
Not everyone is CDPR.
ATM we seem to have mostly poorly optimised, mediocre-looking games which need excessive hardware and use FG as a prop.
Considering all three companies would rather sell AI hardware, expect more mainstream stagnation.
Most of the latest interactive NPC AI demos use the graphics card to run some of the calculations, not the CPU.
If this is about general NPC combat movement, the models are not well threaded AFAIK, so they are dependent on single-core performance.
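To illustrate why that bites: if every NPC's decision logic runs in one serial update loop, the tick cost grows linearly with NPC count and only single-core speed can save you. A toy sketch, with the per-NPC "thinking" workload invented for illustration:

```python
import time

def think(npc_id: int) -> int:
    # Stand-in for per-NPC work: pathfinding, target selection, state machine.
    return sum(i * i for i in range(2000))

def ai_tick(npcs: list) -> None:
    # Classic single-threaded AI update: NPCs think one after another,
    # so tick time scales linearly with NPC count on one core.
    for npc in npcs:
        think(npc)

for count in (50, 200, 800):
    start = time.perf_counter()
    ai_tick(list(range(count)))
    print(f"{count} NPCs: {(time.perf_counter() - start) * 1000:.1f} ms per tick")
```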
One of the reasons Crysis was so CPU dependent is that it used two different NPC movement systems: one for walking NPCs and another for the ones which flew about. Nowadays most games take the easy way out and just reuse the same walking model for everything. It's why Crysis 2 reinvented the aliens as walkers.
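Conceptually the split exists because walkers path over a 2D navmesh while flyers plan through open 3D space, which is a different and pricier problem. A toy sketch of the idea, with names and stub logic invented for illustration:

```python
# Why ground and flying NPCs want separate movement systems: walkers are
# constrained to walkable surfaces, flyers search a full 3D volume.
# Everything below is a stub for illustration, not engine code.

def navmesh_path(start, goal):
    # Stub for A*-style search over a 2D navmesh of walkable polygons.
    return [start, goal]

def volume_path(start, goal):
    # Stub for 3D search through open space: altitude, avoidance, more cost.
    return [start, goal]

class WalkingMovement:
    def plan(self, pos, target):
        return navmesh_path(pos, target)

class FlyingMovement:
    def plan(self, pos, target):
        return volume_path(pos, target)

# The cheap route is reusing WalkingMovement for everything - which is the
# explanation above for Crysis 2 reinventing the aliens as walkers.
print(WalkingMovement().plan((0, 0, 0), (10, 5, 0)))
```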
It doesn't help that consoles also have weak CPUs, from Jaguar to roughly Zen+ level cores. Once these games go multi-platform, developers have to balance the CPU requirements, and shinier graphics carry a CPU overhead too.
Unless the next generation of consoles uses better CPUs, we are stuck there. It's one of the weirdest design decisions Sony/Microsoft have made, because even general performance is held back.
So, why do we need so many tensor cores in GPUs again? DLSS barely uses them as is, and they are designed to process AI after all. That aside, AI models have moved on since the good old days, and if you want proper behaviour based on proper models, it will eat tons of RAM and VRAM. Which is cheap enough these days, but Nvidia will just not give people more, because. Even relatively simple "nano" AI models on mobile phones can easily eat a few GB of RAM.
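The RAM claim is easy to sanity-check: a model's resident footprint is roughly parameter count times bytes per parameter, before any context/cache overhead. A back-of-the-envelope sketch, with parameter counts chosen as illustrative examples rather than specific products:

```python
# Rough on-device model footprint: parameters * bytes-per-parameter.
# Parameter counts below are illustrative, not any specific product.
def model_ram_gb(params_billion: float, bits_per_param: int) -> float:
    return params_billion * 1e9 * (bits_per_param / 8) / 2**30

for params, bits in [(1.0, 16), (3.0, 8), (3.0, 4), (7.0, 4)]:
    print(f"{params:.0f}B params @ {bits}-bit: "
          f"~{model_ram_gb(params, bits):.1f} GB before cache overhead")
```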
Certainly are. Good old NVIDIA. Oopsy.
I think they are hinting at the VRAM, mate.
"The Intel Arc A770, for example, which is a budget-focused 1080p graphics card, beats the RTX 3080, which was the 4K champion when it launched. Why? The A770 has 16GB of VRAM, while the RTX 3080 has 10GB."
There's ZERO FSR/XeSS support. I thought NV didn't block AMD/Intel upscaling, or is it only the devs' fault when NV is partnered?
But but, upscaling reduces VRAM requirements… or something.
RTGI in Indiana Jones
4K DLAA, locked at 100fps. Your move, RT h8ers.
I just found this channel on YouTube today.