
NVIDIA 4000 Series

I just bought a Gigabyte GeForce RTX 4070 Ti SUPER WindForce for about 750 quid... which by Swedish prices is absolutely "fine" and at RRP. "Fine" not actually being fine; it's a rip-off, but that's as low as it's going to get.

The Swedish krona is dirt-cheap money right now... and the 25% VAT "really is good for greenhouse gases".
You should have spent your money more wisely by supporting local industry. I mean, don't they have IKEA in your country? :D
 
Yup, think I'll hang fire. Maybe go back to the other option of using 2024 to get my core system updated. I could wait until the 15700K lands, but that'll be Q4 '24/Q1 '25 realistically, and then there's always the 16700K around the corner. We'll see. I'm in that dangerous place of wanting a new rig more than I really need one, but I've started seeing my system run out of system RAM in Cyberpunk, causing noticeable stuttering where I assume it's swapping to the page file.
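If you want to confirm the page-file theory rather than guess, one quick way is to log physical RAM and page-file usage while the game runs and see whether swap climbs when the stutter hits. A minimal sketch in Python using the psutil library (assumes psutil is installed; the 5-second sample interval is arbitrary):

```python
# Log physical RAM and page-file usage every few seconds; run this in a
# terminal alongside the game and watch whether swap use climbs when the
# stuttering starts. Requires: pip install psutil
import time

import psutil

while True:
    mem = psutil.virtual_memory()  # physical RAM stats
    swap = psutil.swap_memory()    # page file stats on Windows
    print(
        f"RAM: {mem.percent:5.1f}% used | "
        f"page file: {swap.used / 2**30:6.2f} GiB ({swap.percent:5.1f}%)"
    )
    time.sleep(5)  # arbitrary 5-second sample interval
```

If the page-file figure jumps in step with the stutters, that's the RAM shortage confirmed; if not, the stutter is coming from somewhere else.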

Hmm, maybe a 14700KF build this spring, then a 5080 right before the 5080 Super debacle starts :cry:

I'm skipping this GPU gen, particularly as my 3600/AM4 system needs a platform upgrade more than it needs graphics and will see a better leap from it. Dunno why you're sticking with Intel; you could get a Zen 5, as it should be out at the right time. At the end of the year we can assess how the next-gen GPUs are priced.
 
AMD compete at every tier except the 4090. So unless you're in the market for one of those (and are thus a fraction of a percent of the gaming market), why does not being able to match the 4090 matter? Your lower-end Nvidia card doesn't perform any better because the 4090 exists. AMD are lacking in RT performance, but Nvidia are lacking in raster performance and VRAM at any given price point. The whole "no competition" thing is, and always has been, an excuse for people who will only consider Nvidia cards but can't admit to themselves that they're a fanboy. It's been going on since long before RT and upscaling technologies were a thing. A decade ago it was "drivers" or "efficiency" (which may or may not matter in any given generation, depending on whether Nvidia are ahead or not). There's always a good reason why people HAVE to buy Nvidia. Just like iSheep.
Let’s face it, AMD offer similar performance at a similar price to Nvidia. You may get a 50 quid discount here and there but at the end of the day both companies are ripping consumers off.
 
AMD compete at every tier except the 4090. So unless you're in the market for one of those (and are thus a fraction of a percent of the gaming market), why does not being able to match the 4090 matter? Your lower-end Nvidia card doesn't perform any better because the 4090 exists. AMD are lacking in RT performance, but Nvidia are lacking in raster performance and VRAM at any given price point.

The only reason AMD offer more raster performance per pound is because they're forced to undercut in order to compensate for their lack of RT performance.
The balance between raster and RT is uneven, though. Usually the "equivalent" AMD card will offer slightly more raster performance for significantly worse RT performance, and many consumers will find the NVidia cards a better "balance".

The VRAM argument is also slightly specious. I do agree with you that NVidia have been far too slow/reluctant to increase the VRAM on their products compared to AMD but, in reality, what difference has this actually made so far?
Despite all the doom-mongers saying the 3080 10GB would be struggling in no time, here we are over three years later and 10GB is still enough for the vast majority of titles, with only a very few new ones now starting to cause it problems.

The arguments for increased VRAM are mainly focused around "future-proofing" and the belief that 8/10/12GB VRAM won't be enough for forthcoming titles in the near future. I do agree with this and feel we're on a bit of a threshold with VRAM where 8GB is highly inadvisable and even 12GB is questionable if you want your new purchase to last well in the next few years.

But, if you're arguing that AMD is more future-proof due to their increased VRAM, you can just as easily argue that NVidia is more future-proof due to their superior RT support.
 
Had a look at the 7900XT cards and prices have dropped on some: the Sapphire Pulse at £739.99, the Nitro at £769.99, and the ASRock and PowerColor Hellhound at £749.99.
No change for the 7800XT on pricing.

It'll be interesting to see if the 4080 Super causes the 7900XTX price to fall.
 
Despite all the doom-mongers saying the 3080 10GB would be struggling in no time, here we are over three years later and 10GB is still enough for the vast majority of titles, with only a very few new ones now starting to cause it problems.

The arguments for increased VRAM are mainly focused around "future-proofing" and the belief that 8/10/12GB VRAM won't be enough for forthcoming titles in the near future. I do agree with this and feel we're on a bit of a threshold with VRAM where 8GB is highly inadvisable and even 12GB is questionable if you want your new purchase to last well in the next few years.
I think the argument around VRAM wasn't necessarily targeting the 3080, but was more about the 3070 only getting 8GB (same as the 1070) and that having an impact relatively soon after that card came out.

It also did feel a little odd for Nvidia to refer to the 3080 as their 'flagship' card only to give it less VRAM than the 1080Ti.
 
I think the argument around VRAM wasn't necessarily targeting the 3080, but was more about the 3070 only getting 8GB (same as the 1070) and that having an impact relatively soon after that card came out.
I think people only targeted the 3080 over VRAM because they couldn’t get one and were upset about it.

The truth is it's probably the best high-end GPU to have been released since the 1080Ti.

A 102-die card with top-tier performance for 650 quid is something you can only dream about nowadays, when you're getting second-rate 103-die performance for a grand.
 
I think the argument around VRAM wasn't necessarily targeting the 3080, but was more about the 3070 only getting 8GB (same as the 1070) and that having an impact relatively soon after that card came out.

It also did feel a little odd for Nvidia to refer to the 3080 as their 'flagship' card only to give it less VRAM than the 1080Ti.

Yes they persisted with 8GB far longer than they should have done, that's for sure. My point was that there's a lot of chatter lately around 12GB not being enough for "future" titles and how this puts AMD in a favourable light for offering 16GB on their "equivalent" cards. I was just saying that you can't criticise NVidia's more limited VRAM on the basis of it not being as "future proof" as AMD without also noting that AMD's lacklustre RT performance could be just as limiting down the road :)

As for the 10GB of the 3080, yes it felt very odd compared to the 11GB of the 2080Ti and 1080Ti before it but it was dictated by the 320-bit bus, something that I noted earlier hasn't been repeated on the 40-series.
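For anyone wondering why the bus width dictates the capacity: each GDDR6X chip connects over a 32-bit slice of the memory bus, so the bus width fixes the number of chips, and total VRAM is chip count times per-chip density (1GB or 2GB modules in that era). A rough sketch of the arithmetic:

```python
# Bus width fixes the memory chip count (one chip per 32-bit slice);
# chip count times per-chip density gives the possible VRAM sizes.
def vram_options(bus_width_bits, chip_densities_gb=(1, 2)):
    chips = bus_width_bits // 32
    return {f"{d}GB chips": chips * d for d in chip_densities_gb}

print(vram_options(320))  # 3080:          10 chips -> 10GB or 20GB
print(vram_options(352))  # 1080Ti/2080Ti: 11 chips -> 11GB or 22GB
print(vram_options(384))  # 3090:          12 chips -> 12GB or 24GB
```

Which is why the realistic options on the 3080's 320-bit bus were 10GB or 20GB, with nothing in between.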
 
A 102-die card with top-tier performance for 650 quid is something you can only dream about nowadays, when you're getting second-rate 103-die performance for a grand.

Exactly the point I was making. It sucks and isn't fair but that's just how it is and it's not going to change any time soon.
 
Yes they persisted with 8GB far longer than they should have done, that's for sure. My point was that there's a lot of chatter lately around 12GB not being enough for "future" titles and how this puts AMD in a favourable light for offering 16GB on their "equivalent" cards. I was just saying that you can't criticise NVidia's more limited VRAM on the basis of it not being as "future proof" as AMD without also noting that AMD's lacklustre RT performance could be just as limiting down the road :)

As for the 10GB of the 3080, yes it felt very odd compared to the 11GB of the 2080Ti and 1080Ti before it but it was dictated by the 320-bit bus, something that I noted earlier hasn't been repeated on the 40-series.
The same thing is being said for the 7600XT and its 16GB of VRAM: more future-proof, useless now, but it covers you down the line. Though at the level of the 7600XT/4060 Ti, RT is probably not really a consideration anyway.
 
Correct me if I'm wrong, but it feels like we're sort of missing an obvious price point. Why are we so used to, and willing to accept, huge price hikes, saying "yeah, but it's 5x more powerful and only 3x as expensive"? I paid £450 for a 980Ti when it came out in 2016 (?), and my 3080 is much faster, is actually (sort of) a tier lower than the 980Ti, and was just shy of £900. It's a few years younger and, granted, it's faster, but that's just what technology does. Yes, things get more expensive through inflation etc., but it's like saying the new car model blah mk2 must be more expensive at its release than the mk1 was at its release, just because it's the new model with new features and looks. It just doesn't make sense; both cars were top tier at their respective times.

Yes, I know it's WAAAAY more complicated; you can't just compare a 980Ti to a 3080Ti like for like. Different architecture, die size, core types. But surely the 980Ti and 3080Ti match in terms of tier level for their year. Why do the prices artificially climb 'just because' it's newer and faster? Faster card, better tech, yes, but also easier to manufacture etc.
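As a rough sanity check on the inflation point, here's the arithmetic under an assumed ~3% average annual UK inflation rate (the rate is my assumption; the prices are the ones quoted above):

```python
# Does inflation alone explain the jump from a £450 980Ti (~2015) to a
# ~£900 3080 (2020)? Assumes ~3% average annual inflation, which is a
# rough guess, not an official figure.
def inflate(price, years, rate=0.03):
    return price * (1 + rate) ** years

adjusted = inflate(450, years=5)
print(f"£450 in ~2015 is roughly £{adjusted:.0f} in 2020 money")   # ~£522
print(f"gap left unexplained by inflation: £{900 - adjusted:.0f}")  # ~£378
```

Under those assumptions inflation covers maybe a sixth of the rise; the rest is something else.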

Anyways, it just feels like cards are getting more and more expensive for no obvious reason other than greed and the fact that people will pay it. Before the ban stick is waved at me: I'm referring to Nvidia etc., not OCUK or stores in general.

I feel like the first response is going to be "hey there, first day on earth?" and it's probably apt. It's just that the consumerist pricing greed is getting disgusting everywhere you look. The UK is becoming a bit of a joke here.
You either buy or get out of PC gaming.
My last GPU with a really good price/performance ratio was an R290. The 2xxx series and the Radeon VII I should have skipped because of bad pricing. The 3xxx and 6xxx I skipped due to the crypto bubble and COVID. Now I'm skipping the 4xxx and 7xxx, again thanks to bad pricing. So what options do we have in this profit-is-all-that-matters market? Both companies have increased prices while Intel can't keep up yet - they'll probably join the dance once they can make good enough GPUs.

So... either buy or quit PC gaming. If you buy, it doesn't mean you accept the situation; you're just forced into it.
 
You either buy or get out of PC gaming.
My last GPU with a really good price/performance ratio was an R290. The 2xxx series and the Radeon VII I should have skipped because of bad pricing. The 3xxx and 6xxx I skipped due to the crypto bubble and COVID. Now I'm skipping the 4xxx and 7xxx, again thanks to bad pricing. So what options do we have in this profit-is-all-that-matters market? Both companies have increased prices while Intel can't keep up yet - they'll probably join the dance once they can make good enough GPUs.

So... either buy or quit PC gaming. If you buy, it doesn't mean you accept the situation; you're just forced into it.
If enough people drop out of PC gaming, it should self-correct eventually. If we get to the point where only the rich can afford graphics cards, why would developers bother developing for PC? As it is, it's only Nvidia's money that is paying for game developers to produce the really high-end features like path tracing.
 
You either buy or get out of PC gaming.
My last GPU with a really good price/performance ratio was an R290. The 2xxx series and the Radeon VII I should have skipped because of bad pricing. The 3xxx and 6xxx I skipped due to the crypto bubble and COVID. Now I'm skipping the 4xxx and 7xxx, again thanks to bad pricing. So what options do we have in this profit-is-all-that-matters market? Both companies have increased prices while Intel can't keep up yet - they'll probably join the dance once they can make good enough GPUs.

So... either buy or quit PC gaming. If you buy, it doesn't mean you accept the situation; you're just forced into it.

Thing I find quite funny is that the people who go on about Nvidia pricing and how they're taking the **** etc. are still running old GPUs from before 30xx/RDNA 2 and, even funnier, some are still on Turing. Surely if AMD are coming to the rescue with their options, these people have the upgrade path?

If enough people drop out of PC gaming, it should self-correct eventually. If we get to the point where only the rich can afford graphics cards, why would developers bother developing for PC? As it is, it's only Nvidia's money that is paying for game developers to produce the really high-end features like path tracing.

Nvidia paying for features like RT/PT? Even though AMD-sponsored, non-brand-sponsored and console-only games have such features... Makes sense.
 
Everyone's comparing the new cards to the 3080 at £650 when it first released (I was lucky enough to snag one at that price), but how long did it actually stay at that price? Even brand new at retailers, the price went up by at least £500 after the first month. That never usually happens, does it? For instance, when the PS5 and Switch sold out everywhere and no one could get hold of one, the retailers didn't just bump the prices up.
 
Thing I find quite funny is that the people who go on about Nvidia pricing and how they're taking the **** etc. are still running old GPUs from before 30xx/RDNA 2 and, even funnier, some are still on Turing. Surely if AMD are coming to the rescue with their options, these people have the upgrade path?

Nvidia paying for features like RT/PT? Even though AMD-sponsored, non-brand-sponsored and console-only games have such features... Makes sense.
Cyberpunk and Alan Wake 2 are Nvidia-sponsored games, no? The two most technologically advanced games? Of course Nvidia is throwing money and resources at them to make these features (they'll probably describe it as a "collaboration"); they need to justify their high pricing somehow. Which other games have PT that aren't part of the Nvidia Remix thing?
 
Cyberpunk and Alan Wake 2 are Nvidia-sponsored games, no? The two most technologically advanced games? Of course Nvidia is throwing money and resources at them to make these features (they'll probably describe it as a "collaboration"); they need to justify their high pricing somehow. Which other games have PT that aren't part of the Nvidia Remix thing?

Of course AW2 and CP2077 have had Nvidia involved; that's obvious. But your post insinuated not just PT but other technology:

If enough people drop out of PC gaming, it should self-correct eventually. If we get to the point where only the rich can afford graphics cards, why would developers bother developing for PC? As it is, it's only Nvidia's money that is paying for game developers to produce the really high-end features like path tracing.

If you do just mean PT, then yes, agreed. But given what happened with plain "ray tracing", it probably won't be long until games outside of Nvidia's grasp start to adopt PT, just as happened with RT (which started off with just two Nvidia-sponsored titles, i.e. BF5 and Control). PT/RT is not an "Nvidia" thing, no matter how much they try to make it out to be an RTX/Nvidia thing.

EDIT:

Obviously RTX Remix is different, as that is an Nvidia tool.
 