Is it time to give up and move to consoles?

I am in a constant cycle of "build top end PC, get fed up, buy console, get fed up, build top end PC"

There's no debating that consoles are a much less stressful experience.

For me the only console that existed was PlayStation, as I love the Sony exclusives. This time round, though, I think I'll be staying with PC, as more and more of those Sony exclusives are landing on PC, and the wait for them seems to be getting shorter too.

I can see day-one PC releases being a thing eventually.

Going back to PC, the big issues and frustrations for me were often self-inflicted. I'd end up fiddling more than gaming, whether it be opening the case and experimenting with different fans, coolers, etc. to reduce temps, or being in a cycle of BIOS, blue screen, BIOS whilst trying to overclock.

High-end hardware has pretty much removed the need to overclock. I've power-limited my 4090 to 85% and set a 200W PL1/PL2 on the 13900K, and I'm good to go.
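
(For anyone curious, the same caps can be applied programmatically. Here's a rough Python sketch, assuming Linux, root, and the intel-rapl sysfs interface; the 383W figure is simply ~85% of the 4090's 450W default, and on Windows you'd normally just use Afterburner and the BIOS instead.)

    import subprocess
    from pathlib import Path

    # GPU: cap the 4090 at ~85% of its 450W default board power.
    # nvidia-smi takes an absolute wattage, and 85% of 450W is ~383W.
    subprocess.run(["nvidia-smi", "-i", "0", "-pl", "383"], check=True)

    # CPU: set PL1/PL2 to 200W via the intel-rapl sysfs interface.
    # Values are in microwatts; constraint 0 is the long-term limit (PL1),
    # constraint 1 is the short-term limit (PL2).
    rapl = Path("/sys/class/powercap/intel-rapl:0")
    limit_uw = str(200 * 1_000_000)
    (rapl / "constraint_0_power_limit_uw").write_text(limit_uw)
    (rapl / "constraint_1_power_limit_uw").write_text(limit_uw)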

Prices have increased, but PC gaming is my only vice, so I don't mind spending the money on it.

This is probably the first time in a while that I've actually enjoyed PC gaming, as I'm just playing the games rather than messing about with needless stuff. It feels great being able to play most games at 120fps at 4K. I was really disappointed with my last rig: the 5950X overclocking was a nightmare, and the 3090 was NOT a 4K card! I spent most of my time in the BIOS rather than playing games, and it just soured my experience, which made me sell up!
 
Yeah, I've ended up getting a PS5 for the exclusives (GOWR, upcoming FF games), but then I've been pretty happy with my fairly mid-range PC for the last couple of years (3060, 10400F... nothing special). I tend to run things supersampled to 1440p on a 1080p display (this seems to smooth out jaggies better than any other method for me), but I'm considering getting an HDMI splitter to run it on the TV as well. I probably won't bother upgrading my PC at least until the RTX 5 series or equivalent are out, especially given the state of this gen so far.
 
Your Mac doesn't have the £2,000 GPU performance though, does it? That's a silly comparison.
Your Mac doesn't have the £2,000 GPU performance though, does it? That's a silly comparison.
No, but it's a complete system for just over £2k that is blazing fast, uses a fraction of the power, and has massively better resale value in the future.
 
Mind you, you could have had a 12600K system for £1,000, so you'd have been starting off with £1,000 in your pocket; when you resell, you're already at £1,000 plus whatever the system sells for.
 
Problem is that even the upcoming 12GB RTX 4070 is a $600 card. So in 2023 even a mid-range GPU alone costs more than a PS5/Series X!
 
I am in a constant cycle of "build top end PC, get fed up, buy console, get fed up, build top end PC"

There's no debating that consoles are a much less stressful experience.

For me the only console that existed was PlayStation, as I love the Sony exclusives. This time round, though, I think I'll be staying with PC, as more and more of those Sony exclusives are landing on PC, and the wait for them seems to be getting shorter too.

I can see day-one PC releases being a thing eventually.

Going back to PC, the big issues and frustrations for me were often self-inflicted. I'd end up fiddling more than gaming, whether it be opening the case and experimenting with different fans, coolers, etc. to reduce temps, or being in a cycle of BIOS, blue screen, BIOS whilst trying to overclock.

High-end hardware has pretty much removed the need to overclock. I've power-limited my 4090 to 85% and set a 200W PL1/PL2 on the 13900K, and I'm good to go.

Prices have increased, but PC gaming is my only vice, so I don't mind spending the money on it.

This is probably the first time in a while that I've actually enjoyed PC gaming, as I'm just playing the games rather than messing about with needless stuff. It feels great being able to play most games at 120fps at 4K. I was really disappointed with my last rig: the 5950X overclocking was a nightmare, and the 3090 was NOT a 4K card! I spent most of my time in the BIOS rather than playing games, and it just soured my experience, which made me sell up!
Your frustration with the 5950X/3090 system seems entirely self-inflicted...
There was/is no need to be overclocking or messing in the BIOS... you could run both of those at stock and have a very enjoyable 4K experience.

IF you need to run every game at max settings at 120+ FPS, well, I guess you're going to need to change your system every cycle to keep up.
If you just let GeForce Experience set appropriate settings... you can get a near-console-like experience in terms of setup, and still have much better visuals / higher FPS.

When I initially built my current PC... I too fell into the pit of tweaking, chasing reviewer bench scores, etc. I ended up making the gaming experience worse, getting blue screens, etc. thanks to my aggressive undervolt.
I ended up reverting the CPU to stock (no PBO even) and a very mild undervolt on the GPU (mainly to reduce coil whine). It's been rock-solid stable for the last 2 years, and gave up at best a couple of % in terms of overall FPS.

I don't tend to fiddle with games' graphics settings anymore either. I just let GFE set them. If I then find things are not smooth enough, or look janky, I manually adjust. Usually it's a case of turning RT off/down or changing the quality setting of DLSS.

Your post makes it sound like you absolutely had to sell / upgrade from a 5950X/3090 system... when you really didn't.

Part of the problem, or allure, of the PC platform IS that tweaking... while in my early years of gaming I loved nothing more than a night spent tweaking, overclocking and fine-tuning a game... nowadays I can't be bothered.


On the more general topic... I feel like the gap between consoles and PCs in terms of costs has got too big, and the gap in terms of performance too small to justify it for many.
 
The problem is you think a 4070 is a mid-range GPU. This is the problem: everyone has lost their way.

Mid settings at 1080p is a mid-range GPU.
1440p is upper range.
4K is high end.

Well, technically in terms of GPU, it's a GA106/RTX 3060 replacement. But instead of getting a nice improvement, we got a not-so-nice price increase! :rolleyes:
 
Well, technically in terms of GPU, it's a GA106/RTX 3060 replacement. But instead of getting a nice improvement, we got a not-so-nice price increase! :rolleyes:
Meanwhile Intel is out there chucking out 256-bit 406mm² die GPUs for like £300, not too dissimilar to the 4080 physically (which actually has a smaller die than that!), although Nvidia is using a newer process node. It's far from a perfect comparison, but between GPUs produced at a similar time it's probably a fairly reasonable proxy for manufacturing costs. I think it's reasonable to assume Nvidia is making significantly larger margins than Intel, but I guess that's the reward for creating a much more efficient design. Still, it hurts as a consumer knowing that most of the cash is probably just going into Jensen's fancy coat fund.
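
To put rough numbers on that, here's a quick dies-per-wafer sketch in Python. It uses the classic approximation and a made-up $12k wafer price (real prices aren't public), and it ignores yield and the fact that Nvidia's newer node costs more per wafer, so it's illustrative only:

    import math

    # Classic dies-per-wafer approximation for a round wafer:
    # DPW ~= pi*(d/2)^2 / A  -  pi*d / sqrt(2*A)   (ignores yield/scribe lines)
    def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
        r = wafer_diameter_mm / 2
        return (math.pi * r**2 / die_area_mm2
                - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

    WAFER_COST = 12_000  # illustrative only; actual wafer prices aren't public

    for name, area_mm2 in [("Arc A770 (ACM-G10, ~406mm^2)", 406),
                           ("RTX 4080 (AD103, ~379mm^2)", 379)]:
        dpw = dies_per_wafer(area_mm2)
        print(f"{name}: ~{dpw:.0f} dies/wafer, ~${WAFER_COST / dpw:.0f} each")

Even on those crude assumptions the raw per-die silicon cost comes out within about 10% of each other (~$85 vs ~$79), so most of the retail gap is margin and node pricing rather than silicon.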
 
Meanwhile Intel is out there chucking out 256-bit 406mm² die GPUs for like £300, not too dissimilar to the 4080 physically (which actually has a smaller die than that!), although Nvidia is using a newer process node. It's far from a perfect comparison, but between GPUs produced at a similar time it's probably a fairly reasonable proxy for manufacturing costs. I think it's reasonable to assume Nvidia is making significantly larger margins than Intel, but I guess that's the reward for creating a much more efficient design. Still, it hurts as a consumer knowing that most of the cash is probably just going into Jensen's fancy coat fund.
The 106 series dGPUs have always been a certain size and a certain percentage of the shaders, transistors, TFLOPS, etc. vs the top end. Nvidia basically made a two-node jump from Samsung 8nm to TSMC 4nm. Instead of giving consumers a decent performance uplift, they jacked the price up. This is why the uplift is getting worse as you go down the range.
 
The problem is you think a 4070 is a mid-range GPU. This is the problem: everyone has lost their way.

Mid settings at 1080p is a mid-range GPU.
1440p is upper range.
4K is high end.
The issue is that it is midrange.

4090
4080
4070 <<<<
4060
4050

If a card in the middle of the release stack isn't midrange, what is?
 
The issue is that it is midrange.

4090
4080
4070 <<<<
4060
4050

If a card in the middle of the release stack isn't midrange, what is?
I mean the 4070Ti exists, so surely by that logic the 4070 is below midrange...

The 4000 series isn't completely released yet, so it's tough to call it - but I don't think the 70 series has ever been 'midrange'; it sits as the entry into high-end without being anywhere near 'enthusiast'.

If we look at the 3000 series as a full range, Wikipedia does a surprisingly accurate job of breaking them down.

Entry: 3050 / 3060 8GB
Mid-range: 3060 12GB / 3060Ti
High-end: 3070, 3070Ti, 3080 10GB, 3080 12GB
Enthusiast: 3080Ti, 3090, 3090Ti
 
So you need to spend at least £2k for a decent enough PC.

That's the issue I'm finding. I was tempted to build a PC, but I know I'd want a 13th-gen Intel CPU, DDR5 RAM and an RTX 40xx GPU.

Right now that pretty much sets you back £2k, even if you opt for "mid-range-ish" i5/4070-level parts. Personally I'm happy to wait for prices to drop before building anything!
 
Meanwhile Intel is out there chucking out 256-bit 406mm² die GPUs for like £300, not too dissimilar to the 4080 physically (which actually has a smaller die than that!), although Nvidia is using a newer process node. It's far from a perfect comparison, but between GPUs produced at a similar time it's probably a fairly reasonable proxy for manufacturing costs. I think it's reasonable to assume Nvidia is making significantly larger margins than Intel, but I guess that's the reward for creating a much more efficient design. Still, it hurts as a consumer knowing that most of the cash is probably just going into Jensen's fancy coat fund.

The reality is the third-level chip in the range is the one used for mainstream dGPUs. The AD104 by most historic metrics is a 106 series dGPU, because Nvidia released a 103 series chip for desktop this generation. Also, if you look at relative die size, number of transistors, TFLOPS, etc. compared to the top chip of the generation, the AD104 also fits the same metrics. Nvidia made essentially a two-node jump from Samsung 8nm to TSMC 4nm, but instead of sharing that increase with consumers, just jacked pricing up. So this is why, as you go down the range, the improvements seem to be getting worse. Intel is also using a modified 7nm node, in TSMC 6nm, and I would argue drivers and game support are holding the design back.
 
I mean the 4070Ti exists, so surely by that logic the 4070 is below midrange...

The 4000 series isn't completely released yet, so it's tough to call it - but I don't think the 70 series has ever been 'midrange'; it sits as the entry into high-end without being anywhere near 'enthusiast'.

If we look at the 3000 series as a full range, Wikipedia does a surprisingly accurate job of breaking them down.

Entry: 3050 / 3060 8GB
Mid-range: 3060 12GB / 3060Ti
High-end: 3070, 3070Ti, 3080 10GB, 3080 12GB
Enthusiast: 3080Ti, 3090, 3090Ti

Relative to the real high-end specifications for Ada, the 4080 and 4070 Ti are more like one tier down compared to their name, while costing twice as much as the cards one tier down in the 3000 series stack...

Personally, spending a grand on a GPU doesn't bother me; spending a grand on hardware which would be expensive at half the price does... The notion of spending what is likely £700 or more on a 4070, which is actually more like x50-grade hardware, sickens me.
 
The issue is that it is midrange.

4090
4080
4070 <<<<
4060
4050

If a card in the middle of the release stack isn't midrange, what is?
It's the middle of their range. That doesn't make it a mid-range GPU. Bring AMD into this conversation.

A basic PC would use an iGPU. The 4090 shouldn't be used for gaming; that's not what it was made for. Likewise, the 3080/Ti should be the top end of last gen, and the 3090 shouldn't really be classed as a gaming GPU. I know they get used as gaming cards, but not by the general public.
 