
OcUK RTX4060 review thread

How do you know they sold 3? It just says "In Stock", no numbers...
 
The main issue with these cards using DLSS as a selling point is that as the cards get weaker, DLSS (and frame gen too) gets worse. So it doesn't turn a bad experience into a good one, but it does turn a good experience into a better one.
For example, I think Daniel Owen mentioned that frame gen is good in his opinion, but only if the normal fps is above 60 (the OK experience), and then you throw frame gen on top to get a better experience. So it kind of cuts the legs off a low-end card, because once you start going below that 60, even if your fps numbers read 80-odd, the experience is rubbish for him.
Normal DLSS is similar: you'd want it to help budget gamers get more, except at lower resolutions it's worse. N.B. I actually use it at 1080p on Quality in Darktide because it does a better job than the AA, and I don't notice the difference bar the lack of screen crawl compared to DLSS off; I also use it in NMS because the difference doesn't bother me enough, tbh.
I didn't use it in Hitman, but I was actually impressed at how 'good' the image looked at 1080p using the worst settings, and it did a better job than FSR - though that might have been FSR 1.
Daniel Owen mentioned in his video that frame gen increases VRAM usage. So for the 8GB cards you're again cutting their legs off by using these extra features, and his Cyberpunk frametimes are more erratic with FG on.

This is longer than I expected, so TL;DR: glitter is good on things and makes them prettier, but glitter on a turd does nothing except that you're now also paying for glitter.

Yep, messing around with it on my 4070 leads to the following:

- CP2077 with RT Overdrive @ 3440x1440 - takes ~40ish FPS to over 60. Feels "OK" to play but is not a true 60 FPS experience (feels like 40 FPS but visually is 60). Kind of worth it to experience the RT elements, but I wouldn't actually play it like that day to day.
- Hogwarts @ 3440x1440 - takes ~70-80ish to over 110 FPS - feels fantastic, and I can't spot any DLSS3 oddities in actual motion. To my eyes it plays the same as it did on a 7900 XT, despite the latter being an outright better performer natively.

Not tried it in anything else, as the FPS I'm getting is high enough not to need it. That's kind of the problem, as others have mentioned already. Where it makes most technical sense to the user - increasing FPS from, say, 30 to 60 - is not where it actually works properly. You ideally need a high base FPS to take full advantage of it (see Hogwarts in my examples above), and then you could argue it isn't needed at all, given the base FPS is already high enough.
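The "feels like 40 but visually is 60" point can be sketched with some quick frametime arithmetic. This is a rough sketch under a simplifying assumption: frame generation inserts one generated frame per rendered frame (doubling displayed fps), but input is only sampled on real frames, so responsiveness still tracks the base rate. The function names here are just illustrative, not anything from an actual API.

```python
# Rough sketch of why frame generation "feels" like the base frame rate.
# Assumption: one interpolated frame per real frame doubles displayed fps,
# but input is only sampled on real (rendered) frames.

def frametime_ms(fps: float) -> float:
    """Time per frame in milliseconds."""
    return 1000.0 / fps

def with_frame_gen(base_fps: float) -> dict:
    displayed_fps = base_fps * 2  # one generated frame per real frame
    return {
        "displayed_fps": displayed_fps,
        "displayed_frametime_ms": frametime_ms(displayed_fps),
        # responsiveness still roughly tied to the real render rate
        "input_frametime_ms": frametime_ms(base_fps),
    }

# The CP2077 example above: ~40 fps base shows as ~80 fps,
# but the input cadence is still a 25 ms (40 fps) one.
print(with_frame_gen(40))
```

So the fps counter reads like a 12.5 ms experience while your inputs still land on a 25 ms cadence, which matches the "feels like 40" impression.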

Saying all that I am currently playing Hogwarts at ~40-50 FPS on my Ally during lunch breaks so the above is somewhat academic. :p

Also, 4060-wise... that's a "meh" from me, Bob.

I don't have a problem with the raw technical aspects (core count etc.); the relative positioning and price is, and always has been, the issue. Call it an RTX 4050 and make it £200 - OK, not as bad. But badging it as the "mid-tier" 4060 when it has only ~18% of the core count of the top-end model feels all manner of wrong.
 
As someone who uses my GPU for games, not for gawking at RT in a few select games, I'm very happy I chose to go with the RX 7600 (with RE4 included) for £259...

It's a very good GPU for the money. :) There's nothing wrong with it; I don't get the hate for it from reviewers, they just look a little biased to me.
 
Because it doesn't have DLSS and has worse RT, AMD should be ashamed that they're not paying him to use it...

I think I covered it :cry:

I sometimes get the impression they don't like it because it makes the Intel cards difficult to recommend - they're all for Intel now. Vs the A770 16GB, it's faster at 1080p, equal at 1440p, and slightly cheaper than the A770 8GB. It's the same way they hated AMD dropping the price from $300 to $270 before launch; you'd think they'd be happy about that, but no, instead they found a way to ridicule AMD for "being indecisive" and then set about trying to recommend the A770 anyway.

They always seem to behave like AMD are a thorn in their side.
 

Yeh I don't get it.

Well, I say that, but I do get it: it's people trying to justify their choices. It's why Nvidia fans always bang on about RT, for example - they'll post a list of 200+ games and scream LOOK, LOOK at the list, YOU ARE WRONG, blah blah.

I've looked at those lists... There are like 2 games I'd maybe play; I don't care about the rest. But apparently they'll play all 200+ games and every one will be a perfect 10/10, because that's the way it's meant to be played...

People just need to buy what they need, not what others think they need. Likewise, people should be recommended items based on the person they're helping, not on what they themselves like or think that person should like.

It's people trying to find justification for their purchasing decisions. I said before I'd be leaning more towards the 7900 XTX, but no, people said I'm stupid because of RT, failing to consider that I'd already said I didn't care for RT.

It's unreal. Because both sides are selling **** they have also now become the mouthpiece for these companies trying to pick away at the scraps because no one is biting.
 
What I think it is:

A lot of them have personal friends at Intel, people like Ryan Shrout. Intel, frankly, are in trouble, and these people are crying for their jobs; Intel recently sacked 30% of their workforce, and the retail GPU segment is always at the top of the list for the chopping block.

They figure AMD have made enough money for who they are, they've caused too much damage to Intel already, and they need to be put back in their rightful place.

It's been the same with Nvidia for years. They never talk about having friends at AMD - probably because AMD have never been rich enough to be the sort of company you'd see any reason or benefit in sucking up to. You do see them talk about friends at Intel and Nvidia.
 