Looks like buying GPUs with more VRAM for better future-proofing has really paid off then. The 6800 XT is only capable of 1440p High with upscaling for 60 fps.
You missed that the RX 7900 XT, which is slower than an RTX 4080 16GB, is placed in the same tier. Also, you missed that the RTX 3080 Ti 12GB = RX 6800 XT 16GB in the game. The RTX 3080 Ti was nearly £1,000.
The RTX 3080 10GB isn't any slower than the RX 6800 XT in core performance, but apparently you need at least 12GB now. So all those arguments about 10GB vs 16GB aged very well.
People on here were saying for years that UE5 wouldn't use more than 12GB of VRAM at 4K. Then they went into other threads saying an RTX 4070 Ti 12GB would never need 16GB, as it was the perfect amount of VRAM, and started arguing with anyone who said otherwise.
Now one of the first major UE5 games recommends a minimum of 16GB of VRAM at 4K. Even I didn't expect that!
And yet... an RDNA 2 16GB GPU is going to **** the bed at anything above High settings with upscaling at 1440p, and only for 60 fps.
The RX 7900 XT 20GB is slower than an RTX 4080 16GB in all metrics: a £1,200 Nvidia card vs a £700 AMD card. Yet a few days ago people were saying UE5 wouldn't need 16GB at 4K, hence an RTX 4070 Ti 12GB had no issues.
Fast forward a few days, and the RX 7900 XT is being recommended at 4K over an RTX 4070 Ti.
So that means the RTX 4070 Ti 12GB is no longer considered a 4K dGPU, but a QHD dGPU!
The $1,200 RTX 3080 Ti 12GB beats the RX 6800 XT 16GB in performance:
NVIDIA GeForce RTX 3080 Ti Founders Edition Review (www.techpowerup.com): "NVIDIA's GeForce RTX 3080 Ti is the green team's answer to AMD's recent Radeon launches. In our testing, this 12 GB card basically matches the much more expensive RTX 3090 24 GB in performance. The compact dual-slot Founders Edition design looks gorgeous and is of amazing build quality."
So that means a $1,200 12GB Ampere dGPU is ******** the bed, because it's now no better than one which was under $700 at launch? This is a card which sells for under £500 now.
Yet the RX6800XT is being recommended over the RTX3080 10GB which cost the same.
Does that mean the RTX3080 10GB is now a premium 1080p card?
How many edits?
On a phone.
Not watched it yet. But I honestly don't see anything in that game that makes me think, wow, that looks awesome, I want to play that game!
To me it just feels like a dev just learning the engine and not one that has learned to get the most out of it and to optimise the game properly.
How is it you need a hell of a lot less grunt in games that look as good if not better?
I don't think people should look at these recent UE5 games and think all future ones will be resource hogs and look meh like this one.
For the performance this game requires it would need to blow my socks off and it does not even come close.
Lol, "Switching to Performance makes FSR unusable", BUT it uses FSR in Ultra Performance mode on consoles
436p and some nice settings, too.
There will definitely be a learning curve, not just for game devs to get the best from UE5, but also for Epic themselves. They absolutely are going to have to find a way to get more from current hardware, otherwise consoles are completely obsolete now, unless 480p/720p @ 30 fps is the new standard. All very well AMD etc. claiming FSR 3 support for consoles, but that means nothing when chances are it will end up being the same as FSR 2 support, i.e. 2-3 games using it....
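For context on those numbers, a quick sketch of how FSR 2's quality modes map output resolution to internal render resolution. The per-axis scale factors below are the ones AMD documents for FSR 2 (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x); the mapping of the 436p figure to a specific output resolution is my own arithmetic, not something stated in the thread.

```python
# FSR 2 per-axis upscaling factors, as documented by AMD.
FSR2_SCALE = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_height(output_height: int, mode: str) -> int:
    """Internal render height for a given output height and FSR 2 mode."""
    return round(output_height / FSR2_SCALE[mode])

# At Ultra Performance, 4K output renders internally at only 720p,
# and 1440p output renders at 480p.
print(internal_height(2160, "Ultra Performance"))  # 720
print(internal_height(1440, "Ultra Performance"))  # 480

# Going the other way, a ~436p internal resolution at Ultra Performance
# implies roughly 1308p output (436 * 3).
print(436 * FSR2_SCALE["Ultra Performance"])
```

So "Performance mode is unusable" on PC while consoles run an even more aggressive 3x factor is exactly the contrast the post is pointing at.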
It's going to cause issues for a huge number of gaming PCs, so they do need to do something about it. However, the continual upselling of tiers has made mainstream dGPUs relatively weak now, and lots of people use older cards. The top 20 is telling, especially with how many older dGPUs are being used.
So it wouldn't surprise me if Epic expected more of the market to be on better cards by now.
Also, the modern replacements under £400 are poor improvements for the money. The RTX 4060, RTX 4060 Ti and RX 7600 are not raising the bar that much.
The GTX 1650, GTX 1060 and GTX 1050 Ti are probably worse than the GPU in the Xbox Series S (or no better). The issue is the first two are in the top three most commonly used dGPUs on Steam. The consoles (PS5 and Xbox Series X) apparently use something along the lines of an RX 6600 XT~RX 6700 XT in performance. That means the GTX 1660 Ti, GTX 1660 Super, GTX 1070, RTX 2060, RTX 3050 and RTX 3060 are all slower.
Only the RTX 3060 Ti, RTX 3070 and RTX 3080 are as fast as or faster than the consoles, and this is nearly three years after the consoles launched. This sort of situation would have been unheard of before 2020 (the 8800GT thrashed the Xbox 360 for much less money within two years). Just think what we got three years after the PS4 Pro: the RX 5700 XT, RTX 2060 Super, RX 5600 XT, etc., for under £400.
Then if you look at the average CPU:
60% of gamers are still using a 4~6 core CPU running at under 4GHz! It might need devs to start dialling down their actual expectations of what hardware they need for their games.
The game does have issues. It renders your shadow while your body is just a floating hand and a camera (like all the AAA stuff that gets pushed out these days - I guess it's too hard to render your body too!). Shadows for other NPCs are missing or rendered badly, which can leave an obvious gap between them and the ground - i.e. they look like they're levitating.
Some textures look good, others look like they're set on Low or something. I don't know why on Earth they are not using the same higher-quality assets from the cinematics, since the load on the GPU appears to be about the same, but those (of course) look better. In general, the scenes seem to be lit better than your usual stuff, with some problems here and there as well. Distant objects (helped by Nanite, I guess) do look good.
So 40% of 132 million is 52.8 million users with more powerful CPUs than you mentioned - about the same as all "next gen" consoles combined. I don't know about graphics cards, as I don't have the time now to add up all those percentages, but I guess it comes close - and since the Series S sets such a great "quality" bar, the bar isn't that high either way.
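The arithmetic behind that claim is simple enough to check. The 132 million user base and the 40% share are the figures quoted in the posts above, not independent data:

```python
# Back-of-envelope check of the figures quoted above.
steam_users_m = 132.0        # total gamers, millions (figure from the post)
share_above_bracket = 0.40   # share on CPUs above the 4~6 core bracket (from the post)

users_above_m = steam_users_m * share_above_bracket
print(f"~{users_above_m:.1f} million users above the bracket")  # ~52.8 million
```

Whether 40% is the right share is the real point of dispute; the reply below argues the six-core-and-under group is closer to 70%.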
Overall, those old cards are history when it comes to new games. There's zero sense taking them into account unless you purposely want to downgrade your game.
Don't forget that PC gamers (at least those who cough up for Nvidia) have a magical answer: upscaling, which is magically better than native (at least native with very poor AA)!
Actually, those on six-core CPUs and less are 70%, and the 8-core numbers also include 8-core CPUs such as the FX series, Zen/Zen+ 8-core desktop CPUs, and older-generation 8-core CPUs (HEDT ones from the SB, IB and Haswell era) and so on. The fact clock speeds are below 4GHz is also indicative of this. Probably not even running their RAM setup properly, especially as most people are buying prebuilt systems and laptops.
Even Alienware systems have issues.
Plus the PS5/XBox Series X GPUs are around RX6600XT/RX6700XT level performance. These are generally faster than an RTX3060 overall. The XBox Series S is around GTX1650/GTX1060 level performance.
Now look at all dGPUs above an RTX 3060 Ti/RTX 2080 Super/RX 6700 XT (Xbox Series X level). I started with the RTX 4060 Ti/RX 6750 XT and included the laptop RTX 3070 (which is probably slower), and it came to just over 13.5%, so around 18 million. So most of the rest of those 132 million gamers are on console-level performance or less. Even if you include every dGPU with RTX 2060 Super performance and better, it's closer to 41 million players.
ATM, 38 million PS5 consoles have been sold, plus 21 million Xbox Series S and Series X consoles. The reality is most PC gamers don't have systems better than a console. Most of those people won't be on tech forums.
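The installed-base comparison in those two posts can be sketched out directly. All inputs are the figures quoted above (the 13.5% Steam share, the 132 million user base, and the console sales numbers), not independently verified data:

```python
# Rough sketch of the installed-base comparison in the posts above.
steam_users_m = 132.0     # gamers, millions (from the post)
fast_dgpu_share = 0.135   # share at/above RTX 3060 Ti / RX 6700 XT class (from the post)

fast_dgpu_m = steam_users_m * fast_dgpu_share   # ~17.8 million, "around 18 million"

ps5_m = 38.0              # PS5 consoles sold, millions (from the post)
xbox_series_m = 21.0      # Xbox Series S/X sold, millions (from the post)
consoles_m = ps5_m + xbox_series_m

print(f"PCs above console-class GPUs: ~{fast_dgpu_m:.0f} million")
print(f"PS5 + Xbox Series consoles:   {consoles_m:.0f} million")
```

So by these numbers, current-gen consoles alone (~59 million) outnumber PCs with above-console-class dGPUs roughly three to one.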
The problem is that, because of the stagnation in the sub-£400 mainstream segment, we have barely moved past RX 6600 XT/RX 6700 XT performance after THREE years. Consoles have economy-level performance, so this is atrocious.
So many people will be upgrading to cards which are not really any better than a console in many ways. And if consoles are having issues, so will huge numbers of PC players. The 10% to 20% performance improvements after nearly three years are terrible. I find this even more perplexing as RT is being pushed, which is very intensive.
In the past this was hidden by the fact we had large generational improvements at the mainstream too. The PC could brute force things.
Don't forget that PC gamers (at least those who cough up for Nvidia) have a magical answer: upscaling which is magically better than native (at least native with very poor AA)!
Now this is magic AI super super upscaling, not to be confused with that nasty stuff consoles or TVs have had for years, at which the PCMR turns their nose up. No, magic AI super super upscaling is the answer to hardware stagnation. I am all for cleverness, especially for RT, as fully path-traced RT with a halfway realistic number of rays would probably require 10x 4090s and consume close to 5,000W. However, "clever" really is just an excuse to sell the same performance at almost the same price each generation. Yes, silicon nodes have stagnated, but with the record margins Nvidia makes even "only" in their gaming division, we can see it's not just that cost per transistor has almost stagnated, but rather that with a huge patent wall and other barriers there is far too little competition.