NVIDIA 4000 Series

What a world we live in...

$399 for a 1.15x improvement over the previous generation

I guess it's better than nothing, considering that the price is the same as the launch price of the RTX 3060 Ti.

There's no reason for people to upgrade though.

I think we knew it was gonna be a bit disappointing, based on the RTX 4070 having the same shader count as the RTX 3070, but being priced $100 higher.

It's really just Ampere Next / Ampere plus, like the roadmap said.
 
With the 4060 Ti 16GB, Nvidia don't have to release a Super range to plug gaps in their range; they can release 16GB versions of the 4070 and 4070 Ti and add £100 to the cost.
 
Personally I don't mind the tech as long as it genuinely makes the game smoother to play. However, I don't like how Nvidia uses it in comparisons with older gen cards that don't support the tech, just to inflate their numbers. Not every game supports it, so I prefer seeing the base numbers for a more realistic comparison.
Yes definitely, the tech should be compared like for like, so DLSS2 vs FSR2, native vs native, etc. PR teams will always do the opposite, however. Either way, the gist of the matter is that frame gen is legitimately good, and Cyberpunk is proof of that.

Framegen on an 8GB card, lol; from my testing it can use anything from 1-2GB depending on the game. $500 for the 16GB is another joke. Hope the press **** all over these $400+ 1080p cards.
:cry:
Yeah, I don't know how useful it will be on an 8GB card. I haven't paid attention to how much extra/less VRAM is used with it on vs off; I'd have to trawl through my RTSS overlay screenshots to compare, but at this point it would be pointless I guess :p

DLSS and FG are game changers, but usually it's AMD proponents that go around forums talking smack. It's been that way for the last 15 years in every damn forum. I still remember the AMD crowd going nuts about how insanely good the FX-8150 was. People never change...
Does seem like the cycle repeats itself, happens with CPU brands as well lol.
 
If the 16GB was the correct price of $399, it'd be not great but acceptable. But no, Jensen has to **** over gamers right down the stack. These won't shift.
It limits its use mostly to 1080p for the 8GB version.

So, I imagine it will sell to people that would've bought an RTX 3060 / RTX 3060 Ti (e.g. people on 1080p monitors).

It puts it in the same class as the RX 7600, but it's presumably going to be more expensive.

It's pretty likely that the RX 7600 is going to cost £300 or more now though.
 
I also have to add that the frame generation jokes surely can't be serious? I was sceptical at first, but in Cyberpunk the generated frames make a big difference. Going from 64fps to 112 or whatever it is isn't just giving higher numbers to fps overlays; you /do/ feel the difference and see it in action.
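For anyone wondering why that jump is perceptible, here's a quick bit of frame-time maths in Python using the fps figures above (the numbers are just from my session; the script is purely illustrative):

```python
# Rough frame-time calculation for the Cyberpunk figures quoted above
# (64 fps rendered -> ~112 fps with frame generation). Illustrative only.

def frame_time_ms(fps: float) -> float:
    """Average time each displayed frame stays on screen, in milliseconds."""
    return 1000.0 / fps

base_fps = 64.0   # rendered frames only
fg_fps = 112.0    # rendered + generated frames

print(f"Without FG: {frame_time_ms(base_fps):.1f} ms per displayed frame")  # ~15.6 ms
print(f"With FG:    {frame_time_ms(fg_fps):.1f} ms per displayed frame")    # ~8.9 ms

# The displayed frame time nearly halves, which is the smoothness you can
# see and feel. Input latency still roughly tracks the rendered rate,
# though, so FG buys smoothness rather than responsiveness.
```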

I do wonder if people slating it have actually used it.
The first iteration of DLSS3 wasn't great with all the artifacts and glitches, but I knew they'd sort them out. The second iteration is already much better.
 
If the 16GB was the correct price of $399, it'd be not great but acceptable. But no, Jensen has to **** over gamers right down the stack. These won't shift.

He's hoping for another crypto boom or some other world event that needs GPUs again. I'm also guessing that by the next release (the 5000 series in 2024) the whole mining rubbish will be back and all this stock sitting collecting dust will move at silly prices. That's what they're hoping for, I'm sure, or the hope was that this BS AI stuff was going to sell all their GPUs. That AI bubble is going to pop soon too, and a lot of companies that believed the hype will suffer.
 
The first iteration of DLSS3 wasn't great with all the artifacts and glitches, but I knew they'd sort them out. The second iteration is already much better.
Yeah, I was playing Cyberpunk looking for the artefacts people mentioned early on and couldn't see any such issue; everything just felt normal. Granted, I joined the party late, but maybe that's a good thing.
 
He's hoping for another crypto boom or some other world event that needs GPUs again. I'm also guessing that by the next release (the 5000 series in 2024) the whole mining rubbish will be back and all this stock sitting collecting dust will move at silly prices. That's what they're hoping for, I'm sure, or the hope was that this BS AI stuff was going to sell all their GPUs. That AI bubble is going to pop soon too, and a lot of companies that believed the hype will suffer.

Nothing stopping Nvidia from making their own meme coin and then locking it to Nvidia GPUs only :cry:, that would be something!
 
People should, and I think will, just wait it out until the 5000 cards; that will be another "safe time to upgrade" moment from Jensen.
Just give your money to AMD instead if you want a card with more VRAM. It's an option.

I kind of think ~£400 would be a fair price for an AMD card with 16GB of VRAM. The RX 6800 needs a price cut.
 
He's hoping for another crypto boom or some other world event that needs GPUs again. I'm also guessing that by the next release (the 5000 series in 2024) the whole mining rubbish will be back and all this stock sitting collecting dust will move at silly prices. That's what they're hoping for, I'm sure, or the hope was that this BS AI stuff was going to sell all their GPUs. That AI bubble is going to pop soon too, and a lot of companies that believed the hype will suffer.
They're not stupid, they know mining isn't coming back, but there's going to be a huge push for local AI generation rather than cloud-based to give users more control over their data security. Or at least that's what they're going to sell it as.
 
Just give your money to AMD instead if you want a card with more VRAM. It's an option.

I kind of think ~£400 would be a fair price for an AMD card with 16GB of VRAM. The RX 6800 needs a price cut.

Not all of us buy GPUs for gaming... what about people that need CUDA?

Also, AMD have joined Nvidia, so don't expect any more price cuts on any of the 6000 series; they will all go out of stock before that happens. AMD will not save you, and I'm shocked that with all the posts you still haven't upgraded by now and realised things are not going to change and will only get worse. The 6950 XT for £589 was a good buy and the only sensible upgrade for most.


Seems it has also gone up £10 now.
 
There's no way I'd buy the RTX 4060 Ti 16GB for $500 / £500, if that price is correct.

That is roughly how much I would pay for the RTX 4070.

AMD will not save you, and I'm shocked you still haven't upgraded by now
Lots of people are waiting for prices to drop. I will probably upgrade in June or July; the Vega 64 isn't doing so great at 1080p and runs too hot under load (I got it as a stopgap in January 2023).

The RX 6950 XT is probably what I'd get right now, if building a new system.

I'd get the RTX 4070 instead if it was priced at £500.

Is a used RTX 3080 Ti a good buy at £600?

I noticed that it provided a reasonable framerate in the Witcher 3 Next Gen (with RT disabled).
 
Is a used RTX 3080 Ti a good buy at £600?

Bit expensive IMO. Not sure I would class it as a ‘good buy’; it should be about £500 or less now, although it's still a capable card. A 4070 has the same amount of VRAM, and it has the genuine benefit of being protected by a warranty…
 
I think this is probably what I'll buy, but gonna wait another month or so.

My only concern with the RTX 3080 Ti is that the VRAM temps (hotspot) on the RTX 3000 series (with GDDR6X) seemed stupidly high to me. I had a used RTX 3080 and the cooler failed after a few months of use.

To Nvidia's credit, they did fix these problems with the RTX 4000 series.
 
Is a used RTX 3080 Ti a good buy at £600?

I noticed that it provided a reasonable framerate in the Witcher 3 Next Gen (with RT disabled).
Keep in mind the benchmarks you see will be from Witcher 3 running before the latest patch, which added big fps boosts. A 3080 Ti can run Witcher 3 well above 60fps now anywhere in the game; outside of Novigrad's central town square you're looking at 85-100fps or so at 1440p with ray tracing and everything on Ultra+ with DLSS enabled.

I had no issues with temps on the FE I had; yes, it does get to 81 degrees, but that was never a problem and the boost clocks were stable too.
 