"That I get. 35C playing latest games I do not. Lol."
It's Chill + a massive massive heatsink/cooler, both combine to do a great job tbf.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
"While playing what game?"
Division 2 on almost ultra settings (few non-essentials tuned).
"One of the reasons for which I don't want 3 kids."
Mine were all unplanned. I guess that's the cost of nutting inside.
Gaming GPUs are competing for wafer allocation with CPUs and AI hardware, which means they're squeezed against higher-margin products.
Prices will stay high until AI demand deflates or there's overcapacity on high-density nodes, which sadly isn't very likely...
"Sorry for another off-topic post, but is there a way to delete your own posts? Mine are full of wisdom and helpful information of course, my cat wants to know."
Just edit your post... most just replace it with 'deleted'.
"If they can manage 7900XT and above performance for £400-500, I'll likely get it for my next upgrade. I feel like I'm done with chasing performance at the high end now, my 3070Ti has done more than a half decent job at gaming at 1440p for 3 years. Even if I upgrade to a 4K monitor, 7900XT levels of raster performance and 2x higher RT performance will probably be enough to get me at least 4K 60fps on even RT-heavy titles with FSR."
Not enough of a jump for you tbh. I'd be aiming at XTX or 4080 levels to forget about upgrading a GPU for 4 years.
"Not enough of a jump for you tbh. I'd be aiming at XTX or 4080 levels to forget about upgrading a GPU for 4 years."
Even with those I'm not sure they'll hold up 4 years from now at 4K. Maybe with some serious sacrifice in settings.
"Just edit your post... most just replace it with 'deleted'."
Didn't even think of that. Maybe I'm not the superior entity I thought I was. Regardless, onwards and upwards!
"I have a 4080 and play at 4K, no hope in hell that's lasting 4 years before I need to upgrade. 16GB is already becoming an issue with a couple of games, love my RT and DLSS, but I'm hoping RDNA5 will be my saviour."
Nah you'll be fine with Medium settings for a good while, hell Medium looks damn good today tbf! Anything higher seems to be largely superfluous.
"I have a 4080 and play at 4K, no hope in hell that's lasting 4 years before I need to upgrade..."
That was my experience on the 30 series, first with the 3070 @ 1440p, then the 3080 @ 4K on the other system - but I don't use RTX.
"he doesn't rate DLSS3 at all"
That's because the 4070 doesn't have the power to sustain a minimum of 60fps pre-FG (DLSS 3), so the input latency is considerably higher and the experience is poorer. A minimum of a 4080 is necessary to mitigate this, which gives a 60fps baseline at 1440p when path tracing, so a post-FG framerate of 90+ and input latency that's acceptable, if not 100% perfect. Only a 4090 or above is capable of fully acceptable input latency with FG enabled in a path-traced game at 1440p - unless, of course, DLSS upscaling is set to Performance. Before DLL version 3.7, Performance mode at 1440p would introduce some shimmering on distant specular highlights; that may no longer be the case, but I haven't personally checked yet.
So if that 4070 can use DLSS Performance with tweaked settings and get at least 60fps in Cyberpunk with PT/RR (DLSS 3.5) enabled, the post-FG experience should be mostly excellent - assuming DLL version 3.7 is being used for FG, upscaling, and now RR (3.5) too.
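For illustration, here's a rough back-of-the-envelope sketch of that framerate/latency reasoning in Python. The ~0.85 frame-gen overhead factor and the "one extra base frame of delay" latency model are assumptions chosen to match the figures quoted above, not measurements:

```python
# Rough model of the DLSS 3 frame-gen trade-off described above.
# Assumptions (illustrative, not measured): FG roughly doubles presented
# fps minus some overhead, while input latency stays tied to the *base*
# (pre-FG) frame rate plus about one extra base frame of queueing delay.

def post_fg_fps(base_fps: float, overhead: float = 0.85) -> float:
    """Presented fps after frame generation (~2x minus assumed overhead)."""
    return base_fps * 2 * overhead

def approx_latency_ms(base_fps: float, extra_base_frames: float = 1.0) -> float:
    """Very rough input latency: one base frame time plus FG queueing delay."""
    frame_time_ms = 1000.0 / base_fps
    return frame_time_ms * (1.0 + extra_base_frames)

# Hypothetical pre-FG path-tracing baselines at 1440p (4070/4080/4090-ish).
for base in (45, 60, 90):
    print(f"base {base:>2} fps -> ~{post_fg_fps(base):.0f} fps presented, "
          f"~{approx_latency_ms(base):.0f} ms latency")
```

Under those assumptions, a 60fps baseline lands at roughly 100fps presented and ~33ms, which lines up with the "post-FG framerate of 90+" figure, while a 45fps baseline pushes latency into the mid-40ms range - which is why the 4070 experience is described as poorer.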
Shock horror at capitalism!
Capitalism is just a tool; it just so happens that human beings cannot control their greed. No matter which "ism" is put in place, humans will sadly always find a way to become greedy.
"Do you see the 4090 lasting for about 5 years at 1440p ultra settings? (Presume 4K at >144fps will last for 3 years?)"
If I stay at 3440x1440 then yes. Even if a game in 3-4 years' time comes out with twice the graphical demand of Cyberpunk 2077, I can simply go from DLSS Quality to DLSS Performance, as that takes me over my framecap of 139fps (G-Sync), whereas currently it sits at 120fps. I've already accounted for these variables of future 4090 usage in newer games that use path tracing. The chances of a future game being twice as demanding as Cyberpunk are extremely low though; we've already reached the peak of path tracing, and the rest is just the technology maturing and developers getting comfortable with the toolsets, so I actually see things improving over the years, not getting more demanding.
RDNA5 has a LOT of catching up to do!
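For reference, a minimal sketch of the headroom maths behind the Quality-to-Performance fallback mentioned above. The per-axis render scales (Quality ~0.667, Performance ~0.50) are the standard DLSS factors; the "fps scales with 1/pixels" assumption is a deliberate simplification and only an upper bound:

```python
# DLSS internal render resolutions at 3440x1440, and a naive upper-bound
# fps estimate assuming fps scales inversely with shaded pixel count.
# Real gains are smaller because per-frame fixed costs don't shrink.

MODES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

OUT_W, OUT_H = 3440, 1440
FPS_AT_QUALITY = 120.0  # the path-traced figure quoted above

for name, scale in MODES.items():
    w, h = round(OUT_W * scale), round(OUT_H * scale)
    est_fps = FPS_AT_QUALITY * (MODES["Quality"] / scale) ** 2
    print(f"{name:<11} {w}x{h:<5} -> up to ~{est_fps:.0f} fps")
```

Performance mode shades roughly half the pixels of Quality, so even with fixed per-frame costs eating into the naive upper bound, clearing a 139fps cap from a 120fps starting point looks plausible.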
"If I stay at 3440x1440 then yes. Even if a game in 3-4 years' time comes out with twice the graphical demand of Cyberpunk 2077..."
It's the 1080ti successor then. Best GPU that will transcend 2-3 generations.
Even if I go 5120x2160 in a future monitor upgrade, the answer is still yes. I've only used 4K ultrawide so far via DLDSR, and that has a performance overhead to factor in, yet even that ran reasonably well at 100fps path traced; a native 4K ultrawide without the DSR overhead would be above that, so it should still be good, factoring in the maturity benefits of future path tracing. Non-path-traced games will remain a walk in the park, as they currently are even without frame gen on a 4090.
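The pixel-count arithmetic behind that resolution comparison, as a quick sketch (DLDSR 2.25x on a 3440x1440 panel renders at 5160x2160; these are just megapixel ratios, not benchmarks):

```python
# Megapixel comparison of the resolutions discussed above. A native
# 5120x2160 monitor pushes marginally fewer pixels than DLDSR 2.25x on a
# 3440x1440 panel, and also skips the downscale pass on top.

RESOLUTIONS = {
    "3440x1440 native":        (3440, 1440),
    "5160x2160 (DLDSR 2.25x)": (5160, 2160),
    "5120x2160 native":        (5120, 2160),
}

base_px = 3440 * 1440
for name, (w, h) in RESOLUTIONS.items():
    px = w * h
    print(f"{name:<26} {px / 1e6:5.1f} MP  ({px / base_px:.2f}x the 1440p UW load)")
```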
If I went for a 240Hz or greater display and was dead set on "needing" to reach that framerate cap, then a 5090 might be needed, but I don't see that ever happening because I'm not one of those people who must hit that sort of refresh rate and fps. Past 120fps it's diminishing returns anyway, and I prefer my 175Hz OLED running at 144Hz due to the native 10-bit mode, so it will remain like that.
Couple that with the advances in DLSS each update - DLSS Performance now looks almost as good temporally as Quality, for example - along with ReSTIR GI for Ray Reconstruction, which makes things more efficient and higher quality at the same time. All of this is being baked into Unreal Engine 5 too, as both Nvidia and CDPR are helping it along due to CDPR's 15-year commitment to using UE5 exclusively. We're going to see some big gains in UE5 in the coming years thanks to this.
I think people often forget just how far ahead of everything else Nvidia pushed the 4090 in raw power; it's been untouchable since launch 2 years ago and will remain top tier for years to come. No other card in history has had that large a performance delta over its lower-model siblings.