
NVIDIA 4000 Series

How is that 10GB card holding up now in RT? The 12GB card was the buy, smart people waited...

I bought two cards last gen actually, a 3060ti I bought for £480 during the great card drought, which I later flipped for £1100 to a miner...and then a 6700XT which I bought with the proceeds, that's how I roll...;)

I haven't been on 1080p since God was a lad.

Meh. I got a 3080 FE that worked out under £600 after selling the codes that came with it. Then eventually sold it to CEX for £1620 and now I have a 3080 Ti which I got for £575 from a reputable member here. That's how it's done ;)
 
They did, it was the 3090 at 2x the price but only 10% extra performance. I'm amazed the hate for the 3080 is still so strong. You didn't buy any of that gen, nor this one.

This is a point which is always overlooked, for obvious reasons... I would rather face such issues, where I have to reduce 1-2 settings, than have spent the "extra" £750 just to be able to keep one setting a notch higher. If there were a **** ton of games where this was the case, rather than just 1-3 broken, unoptimised games, or if that one setting higher actually made a significant difference to the visuals, then they would have a point, but in reality, they don't.

How is that 10GB card holding up now in RT? The 12GB card was the buy, smart people waited...

I bought two cards last gen actually, a 3060ti I bought for £480 during the great card drought, which I later flipped for £1100 to a miner...and then a 6700XT which I bought with the proceeds, that's how I roll...;)

I haven't been on 1080p since God was a lad.

You mean when those 3080 12GB/Ti models came out for £1200+? Yup, very smart buy... Or do you mean waiting 2+ years for prices to drop, by which time people had already played said games?

The smart buy was jumping on a 3070, 3080 FE, or a 6800/6800 XT (if you didn't care for RT or upscaling tech) for their MSRP when available; anything else made no sense from a value perspective.

How is that 10GB card holding up now in RT?

Which RT games are you referring to here? Please don't say Hogwarts, Forbidden... :o
(benchmark screenshots attached)
 
Meh. I got a 3080 FE that worked out under £600 after selling the codes that came with it. Then eventually sold it to CEX for £1620 and now I have a 3080 Ti which I got for £575 from a reputable member here. That's how it's done ;)

Really? Because you got robbed by CEX...you left money on the table there...

;)
 
Nope. And that 10GB card has aged like milk in RT, just like we all knew it would. And how long did it take for NV to replace it with the 12GB version? The one they should have released as the bare minimum...

The 10GB card was a bait-and-switch, to catch out all the early adopters, all those who just cannot wait, when we all knew the full-fat (well, NV's full-fat, more like semi-skimmed) version was coming. PS - AMD were already at 16GB as standard, even on the weaker 6800...

;)
What difference does it make whether AMD was already on 16GB? I care about the results, not the spec sheets. A 3080 at 1440p High in TLOU1 matches a 6800 XT at 1440p Medium. Those are the facts. So it aged much better, while still having substantially better RT performance and a better upscaler.
 
What difference does it make whether AMD was already on 16GB? I care about the results, not the spec sheets. A 3080 at 1440p High in TLOU1 matches a 6800 XT at 1440p Medium. Those are the facts. So it aged much better, while still having substantially better RT performance and a better upscaler.

Yeah, on reflection I think I'm going to go with Hub's conclusions over yours...

:D
 
Yeah, on reflection I think I'm going to go with Hub's conclusions over yours...

:D

 
I see you're still missing the point.
I absolutely am, I admit. I don't understand the issue, at all. In fact, I don't see an issue, at all.

Let me put it this way, and probably nobody will answer, but here's hoping. I have a 4090. Should it have had double the RT performance? Because it really, really, REALLY struggles to play, for example, Cyberpunk. We are talking 40 fps or less at native 4K. Is the 4090 DOA? Should Nvidia have released it with 50% more RT cores to actually get a stable 60 in all games?

Say tomorrow I decide to be a game dev and release a game that requires 48GB of VRAM for ultra textures. Is that the 4090's fault?

I just don't understand what the whole back-and-forth is about. You have to drop settings on older/slower cards. I've been gaming on PC for a little over 20 years; that has always been the case. I was dropping settings on day 1 on my 1080 Ti and my 3090. So what is the actual issue here?
 