
Do AMD provide any benefit to the retail GPU segment?



I created a simple pause menu so you can exit the game; it only takes about 30 minutes to set up - are you listening, all you other UE5 devs creating demos? :P So press Escape to get an exit menu, as you would in any game.
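For anyone curious, here's roughly what that hookup looks like in C++ (the menu itself could just as easily be done in Blueprints; the class and function names below are placeholders of mine, not the demo's actual code):

// DemoPlayerController.h - a rough sketch, not the demo's actual code
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/PlayerController.h"
#include "Kismet/GameplayStatics.h"
#include "Kismet/KismetSystemLibrary.h"
#include "DemoPlayerController.generated.h"

UCLASS()
class ADemoPlayerController : public APlayerController
{
    GENERATED_BODY()

public:
    virtual void SetupInputComponent() override
    {
        Super::SetupInputComponent();
        // Bind Escape directly for a quick demo; a full project would use input mappings.
        InputComponent->BindKey(EKeys::Escape, IE_Pressed, this, &ADemoPlayerController::TogglePauseMenu);
    }

    void TogglePauseMenu()
    {
        // Pause/unpause and show the cursor so the menu widget is clickable.
        const bool bPause = !UGameplayStatics::IsGamePaused(this);
        UGameplayStatics::SetGamePaused(this, bPause);
        bShowMouseCursor = bPause;
        // An "Exit" button on the menu widget would then call:
        // UKismetSystemLibrary::QuitGame(this, this, EQuitPreference::Quit, false);
    }
};

The widget creation is left out for brevity; the important bits are pausing the game and giving Escape something to call.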

About 500,000+ very high-polygon ferns, no LODs, all Nanite. Normally you wouldn't pack this much in, certainly not assets this polygon-dense, and the grass layer is also very polygon-dense, which is half the performance hit. It will melt even high-end GPUs.

Give it about 30 seconds before moving. I didn't build an on-load shader-compile delay, so it will do that after you load in; wait for the textures on the trees to reach full res before moving.
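(For reference, a proper on-load delay for the texture side could be as simple as something like this - an untested sketch, with the API written from memory, so double-check it against your engine version:

#include "ContentStreaming.h"

// Push the texture streamer to finish its initial work (up to ~10 seconds)
// before handing control over to the player.
IStreamingManager::Get().StreamAllResources(10.0f);

The shader/PSO compilation side is a separate problem and needs its own handling.)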

Extract the archive (use 7-Zip) and double-click the old light .exe.

Linux Builds on request.
 
Colorful confirms the 4060 Ti is 8GB


It's going to be $450:
 
Colorful confirms the 4060 Ti is 8GB


Yikes.

This is actually going to be barely faster than the 3060 Ti, with the same VRAM, and it will likely cost more.

:cry:
 
I just hope all the people who bought QLC DRAMless SSDs or SATA SSDs don't start complaining! :o

I always told people to make sure they spent the extra and got a decent SSD in the first place!


I do like the fact UE5 is trying to take a balanced approach and incorporate a mix of different technologies. Epic know very well they need to make it work on a range of PC hardware and consoles too. When Nvidia/AMD get too involved, it's mostly so they can sell more of their higher-end dGPUs, so you can end up with suboptimal results.
That's absolutely true. Epic wants to sell the engine and games, so it needs to be easy to use and work on most hardware with a large array of settings. They don't care about GPU branding or anyone's proprietary tech, hence Lumen can work with hardware RT and without it too (obviously not as good looking, but still very well). Now we just need to wait for games and devs to actually use it, so we can judge it in practice. :)
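For reference, the switch is literally a couple of renderer settings in DefaultEngine.ini (cvar names as I understand them - worth double-checking against the UE5 docs):

[/Script/Engine.RendererSettings]
; 1 = use Lumen for dynamic global illumination and reflections
r.DynamicGlobalIlluminationMethod=1
r.ReflectionMethod=1
; 1 = use hardware ray tracing where the GPU supports it, 0 = software tracing
r.Lumen.HardwareRayTracing=1

With hardware RT off, Lumen falls back to its software tracing path (distance fields), which is what lets the same content run on GPUs with no RT support at all.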
 


About 35 FPS just looking at the distant mountains, on an RTX 4090 at 1440p UW. :) I do see lots of artefacts around the "weapon" when turning the camera - temporal upscaling? Foliage also still pops in a lot when moving around.
 
About 35 FPS just looking at the distant mountains, on an RTX 4090 at 1440p UW. :) I do see lots of artefacts around the "weapon" when turning the camera - temporal upscaling? Foliage also still pops in a lot when moving around.

I get 9 :D RTX 2070 Super.

temporal upscaling?
Yes.

Foliage also still pops in a lot when moving around.

Grass or trees?
 
Grass or trees?
Grass, flowers - stuff on the ground. Trees seem fine, but grass is popping up into existence not that far from the player, so it's very visible. I reckon normally we'd have a LOD here and it wouldn't be so visible, though Nanite was supposed to resolve that for good, I thought? :)
 
Grass, flowers - stuff on the ground. Trees seem fine, but grass is popping up into existence not that far from the player, so it's very visible. I reckon normally we'd have a LOD here and it wouldn't be so visible, though Nanite was supposed to resolve that for good, I thought? :)

There is a view distance set for the grass/flowers; if I didn't set that, it just wouldn't run at all. :)

Nanite is excellent, but it's not magic...
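For anyone who wants to experiment with that pop-in themselves, the trade-off lives in the grass/foliage cull distances: per foliage type it's Instance Settings -> Cull Distance (Min/Max), per landscape grass variety it's Start/End Cull Distance, and as a quick console experiment (cvar name from memory, so double-check it for your engine version):

grass.CullDistanceScale 1.5

Pushing the distances out reduces the pop-in, but the instance count (and the frame-time cost) climbs very quickly with view distance.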
 
I must admit I was half expecting Nvidia to drop the 4080 into the $999 bracket at the 4070 launch, as that would make it nice and neat across the three cards ($599/$799/$999) and probably really boost 4080 sales, but nope, we got nada.
See, I paid the equivalent of about £920 for a 4080 overseas, but to step up to a 4090 would have been another £500. UK pricing and the lack of good sales discounts on the 40 series, and even the 7900s, seems odd. It would have been approximately £650 for an AIB 7900 XT.
 
It's like if the 1050 Ti had come out at $350 instead of $139, in 2016 prices.

It is, and PCMR just makes excuses for this. Have any other electronics gone up in price that way? Not even iPhones have gone up that much, and Apple jumps onto brand-new nodes quickly, partially bankrolls TSMC, and is their biggest customer paying top prices. Apple has much lower margins than Nvidia.
 
It's going to be $450:

Let's compare that to an RTX 3070 (non-Ti).

3070:
Shaders: 5888 at 1725 MHz, ~21 TFLOPS FP32
Memory bandwidth: 448 GB/s

4060 Ti:
Shaders: 4352 at 2535 MHz, ~22 TFLOPS FP32
Memory bandwidth: 288 GB/s

I would say it's going to be marginally faster than the 3070, perhaps between that and a 3070 Ti. But it has little over half the memory bandwidth; maybe, if it's lucky, in some older games that don't have high texture-streaming demands it can keep up with a 3070, but in anything else, and especially at 1440p, it will not!
But that's OK, because with 8GB of memory it's not an AAA 1440p card anyway.
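(For reference, the rough maths behind those TFLOPS figures, assuming 2 FP32 ops per shader per clock: 2 × 5888 × 1.725 GHz ≈ 20.3 TFLOPS for the 3070 (the ~21 figure presumably uses a slightly higher real-world boost clock), and 2 × 4352 × 2.535 GHz ≈ 22.1 TFLOPS for the 4060 Ti - so near-identical raw compute, which leaves the bandwidth gap to do the differentiating.)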
 
The 4060 Ti desktop SKU already exists as the laptop 4070 (4608 CUDA cores, but at lower clocks), so its performance is already relatively well known.

It essentially trades blows for the most part with the mobile 3070 Ti (at the 140W TGP level, anyway). The mobile 3070 Ti sits somewhere between a desktop 3060 Ti and a desktop 3070, so it would be reasonable to extrapolate the 4060 Ti being basically a 3070/Ti-class card with all the same shortcomings (8GB VRAM), but with oodles of DLSS-fweeeeeee marketing making it look like it beats the 3090 across the board...

Meanwhile I will happily carry on gaming on my desktop-3070-class 3080M with 16GB of VRAM, because at least, unlike the desktop 3070, it doesn't run out of VRAM in Hogwarts...* (when I'm sofa gaming and not wanting to fire up my 7900 XT desktop system for 100+ FPS).

*It was genuinely funny watching my wife's G15 with its 8GB 3080M try to run the same settings as my Legion. Ignoring the lower FPS (same core, but ~1500/14000 vs 1950/15500), the constant swapping-in of higher-quality textures, only for them to disappear 3 seconds later, made the experience... interesting. It's almost the perfect comparison, because I effectively have a downclocked 3070 Ti (8GB) and a 3070 Ti with 16GB of VRAM, perfectly highlighting that the core itself is fine from a performance point of view, but it runs out of filing space and new storage boxes are on back order with Nvidia...
 