• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Is the 3090 a 'waste' for gaming?

They are too expensive - but they are there for those who can and want to afford them. And for folk who use GPUs for work and buy them through work - no brainer.

Yes, I game less as each year passes and work or productivity tasks increase. When I do game, however, since I have a display that can showcase 4K I need a card capable of it. My Vega 56 was coping but not able to do full 4K, which is why I was excited for this new gen to finally get there. The only downside for longevity was the 3080 and its low VRAM, whereas the 3090 and the 6800 XT were a bit more forgiving as they would never fall foul of this (regardless of people thinking 10GB is enough).
 
Bigger is better :p;)
I'm wondering how many of us with 3090s would have been happy with the 3080/6800 XT had they been available in numbers at time of purchase.

Or a 6900 XT. And there's the crux of it... yes. Spending this much money is nuts, but I have my hands on something. We've still got an entire winter to get through before supplies normalise according to Nvidia, and a vaccine is just as long a wait. Life's too short.

I'd definitely have been happy with a 3080. I have an FTW3 Ultra on the way through EVGA's Step-Up programme, but it's looking like early next year before it's with me. I'll make a decision then on which card to sell.

We both went from a Vega 56 to a 3090 FE. That's lolz. :D:D:D
I had a Sapphire Pulse 56 with the 64 BIOS.

I did have a 3070 FE in the middle of it, and that was a banging card. Best bang for the buck out of this gen for sure. The FE at £470 is untouchable.

But I wanted to play COD and Watch Dogs - I had codes for them both but couldn't activate them with the 3070. So YOLO, 3090. Just testing it now.

Corsair HX750i seems to be handling it just fine. For now.

Yes, similar PSU too! The jump from Vega was perfect; I would not have bitten if I'd already owned, say, a 5700 XT or a 2070S.
 
Probably depends on what resolution you're trying to game at. Even the 3090 starts to bog down in some games at 4K - you'll have to turn down settings to keep things nice and smooth.
 
3090 FE at 1815 core, 1024.5 GB/s RAM, +14.3% PL.
Can't even hold 400 fps in Overwatch at 1440p max settings.

It's 300-350 fps most of the time, sometimes 400 fps.
It's just over twice as fast as a Sapphire Vega 64 Nitro+.

Fortnite STW is 120-240 fps at 1440p Epic settings, no RT,
and can't even hold my 144 fps refresh.

I certainly wouldn't say it's overkill.
Overpriced, yes.
Overkill, no.
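For anyone wondering where a figure like 1024.5 GB/s comes from, here is a quick sanity check. This is a sketch assuming the 3090's 384-bit memory bus and commonly quoted GDDR6X per-pin rates (the specific clock values below are my assumptions, not from the post):

```python
# Memory bandwidth = effective per-pin rate (Gbit/s) * bus width (bits) / 8 bits-per-byte.
def gddr_bandwidth_gbps(effective_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Return memory bandwidth in GB/s."""
    return effective_gbps_per_pin * bus_width_bits / 8

# 3090 FE stock: 19.5 Gbps effective on a 384-bit bus.
stock = gddr_bandwidth_gbps(19.5, 384)        # 936.0 GB/s
# A memory overclock of roughly +1.8 Gbps effective lands near the quoted figure.
overclocked = gddr_bandwidth_gbps(21.344, 384)  # ~1024.5 GB/s
print(stock, overclocked)
```

So the 1024.5 GB/s number is consistent with a stock 936 GB/s card running a healthy memory overclock.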

I have also noticed Nvidia are clearly lowering texture quality dynamically
in both said games at random when load is high.
It's actually a huge downgrade quality-wise from my Sapphire Vega 64 Nitro+,
and once in a while you will see a texture that is clearly potato setting with no detail.

Neither game supports Variable Rate Shading,
so clearly Nvidia are up to no good to pump up frame rates.
Surprised people haven't noticed this tbh and called Nvidia out on it.

It's not the game doing it, as it never happened once on the old card.

RT = Cheeks
DLSS = I have to give them that one, DLSS looks so good on Fortnite, not that I use it,
but yeah, it's very hard to tell the resolution is lowered - a huge upgrade in upscaling tech.
 
My preload on steam is done. I will be playing as soon as it launches at RT Ultra settings baby!
According to the benchmarks, the 3090 is the only card which can do 60 FPS at RT Ultra at 1440p with DLSS on Quality. At 4K, no card can manage 60 FPS. Seems Nvidia's system requirements article was targeting 30 FPS, not 60.

This game is the new Crysis and would need multiple generations of graphics cards to run smoothly.

Does it also have HairWorks? If not, what's the point?
 
So @TNA are you ready for that 30fps goodness..? :p

 
Jumping in here because I too stupidly bought a 3090 recently.

Last PC I bought was an i5 2500K, GTX 580 (GTX 980 later), 16GB jobby - I'm apparently a "run it until it's on its knees" kind of guy. The primary reason I bought the 3090 was the shortage of 3080s - I didn't want to wait any longer, and annoyingly I could justify the cost (in my head at least).

Plus, as someone who does the odd bit of 3D design, the huge amount of VRAM is a massive bonus.
I personally don't see this card featuring in a game's "minimum spec" for a serious amount of time.
 
Fine by me mate. I am G-Sync ready and, as usual, I play it slow and in a non-twitch style. Also I will gain a little from disabling the usual image-degrading things like motion blur, depth of field, chromatic aberration etc.

As I recall, I played Mankind Divided at similar fps on my 1070 when it came out.

I will be playing this game no fewer than three times, so I can do an RT and a non-RT playthrough if I fancy it.

Motion blur, yes, I totally agree - but depth of field? Come on, it adds depth to the image.

Same. :)
 