3090 is NOT a gaming card - Gamers Nexus DESTROYS the 3090

Ampere is simply not a gaming GPU. Just like Fermi, it has been designed for everything else first. If AMD have taken a step back with Navi 2 and turned it into a pure gaming card, it will be much better behaved: cooler and lower power use, meaning it will overclock very well. Just think of it as the 5000 series vs Fermi all over again. After that, AMD went GCN, trying to kill two birds with one stone (i.e. the power user/server market and gamers). We all know how that ended: great if you are Pixar, rubbish if you are a gamer.

Games have moved on since Pong. Embrace Ampere's 3080 as the first card truly capable of RT at playable speeds. I'm not a fan of DLSS, at least not so far, but it's great if it can be used to further enhance RT. BTW, I'm no Nvidia fanboy; I'd be all over the 6000 series if AMD could beat Ampere's RT. I have enough raster performance from my 1080 Ti.
 

The issue is that I think Nvidia see DLSS as the saviour for giving more performance to PC users, and as the future solution to everything when gaming at 4K and above.

I mean, based on power usage, the 3080 gives hardly any extra brute-force performance over the last generation, but it does have major gains in RT and DLSS. I can see this trend continuing. It's as if graphics manufacturers have basically run out of steam on brute-force performance and are now having to bring in things like DLSS to get playable frame rates at 4K. You can see this in just how much of the 3000 series launch was all about DLSS.
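To put rough numbers on that, here is a quick sketch of how much shading work an upscaler saves at 4K. The per-axis scale factors are the commonly quoted DLSS 2.x ratios, used here purely as illustrative assumptions rather than measured values:

```python
# Rough pixel-count arithmetic: native 4K vs rendering at a lower
# resolution and upscaling. The per-axis scale factors are the commonly
# quoted DLSS 2.x ratios, used here as illustrative assumptions.
NATIVE_4K = (3840, 2160)

upscaler_modes = {
    "Quality":     0.67,  # renders at roughly 2560x1440
    "Balanced":    0.58,
    "Performance": 0.50,  # renders at 1920x1080
}

native_pixels = NATIVE_4K[0] * NATIVE_4K[1]
print(f"Native 4K: {native_pixels:,} pixels shaded per frame")

for mode, scale in upscaler_modes.items():
    w, h = int(NATIVE_4K[0] * scale), int(NATIVE_4K[1] * scale)
    pixels = w * h
    print(f"{mode:<12} renders {w}x{h} = {pixels:,} pixels "
          f"({pixels / native_pixels:.0%} of native shading work)")
```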
 

Turing is perfectly capable of RT with good FPS when using DLSS. If you are not a fan of DLSS, then the 3080 still struggles badly with RT without it.

Especially at its intended resolution of 4K.

However, you are confusing what I said. I said that the 30 series is not good for gaming, and it isn't. As a gaming leap it is worse than Fermi. You remember, the tech with all the memes? It only looks better than it is right now (see also Turing) because AMD have been counting sheep. Let's see what happens when they release something to compare both Turing and Ampere to.

Seriously, you should see how amazing Ampere is at folding. It's incredible. It's also amazing at video rendering and so on. That doesn't make it good at gaming, though.
 

You are wrong if you think gaming stops with rasterisation. There are many sites out there covering RT performance without DLSS, showing playable frame rates at 4K and an excellent experience at 1440p.

I would have jumped on the 2080 Ti at launch if it had 3080-level RT performance.

I see DLSS as a smart solution to use while GPUs move from rasterisation to RT, so it's going to be with us for a long time. If they had left out the RT hardware, the 3080 could have provided higher raster performance; likewise, if they had left out the raster hardware, it could have offered more RT performance.
 
This review is not in English, but the FPS difference in the benchmarks makes an Asus Strix 3090 OC over a 3080 totally worthwhile.

https://www.youtube.com/watch?v=Jmxao7GxzQ0


For double the money? Wouldn't say 'worthwhile' as such. At £1,000 it could have looked better value, but at £1,400-£1,600, whilst the extra performance is there... nah. I still want one, and if some were in stock I might have bought one, as I'm an impulse buyer.
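As a back-of-the-envelope check on that value argument, here is a sketch comparing FPS per pound. The prices and the ~25% uplift are illustrative assumptions loosely based on the figures mentioned in this thread, not benchmark data:

```python
# Back-of-the-envelope value comparison in FPS per pound.
# Prices and the ~25% uplift are illustrative assumptions loosely based
# on figures mentioned in the thread, not benchmark data.
cards = {
    "3080":          {"price_gbp": 700,  "relative_fps": 1.00},
    "3090 Strix OC": {"price_gbp": 1500, "relative_fps": 1.25},  # assumed ~25% faster
}

baseline_value = cards["3080"]["relative_fps"] / cards["3080"]["price_gbp"]
for name, card in cards.items():
    value = card["relative_fps"] / card["price_gbp"]
    print(f"{name:<14} £{card['price_gbp']:>4}  "
          f"relative FPS {card['relative_fps']:.2f}  "
          f"value vs 3080: {value / baseline_value:.0%}")
```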
 
He is either loaded, has many houses generating him money and enjoys financial freedom, or is living a YOLO lifestyle. That's the only way someone would say that. Well, he could be confused too :p
 

Hmm, but that was against a 3080 FE. The question should be: what's it like against a 450W Asus 3080 OC card? That difference might drop to below 10% or even 5% in every game. That's the review comparison I want to see.
 

Well, it's still not good value for money, but yes, if you're going with a 3090 the binned models like the Strix are the way to go - the card is 20-30% faster than the 3080 out of the box (depending on which game is tested).
 

1. The Strix aren't binned chips, just boards that have higher overclocks out of the box, can use 480W, and have great coolers to keep the GPU cool so it doesn't lose 15MHz of boost for every 5°C of temperature (see the rough sketch below).
2. I keep saying this with every review I see of the 3090 Strix where they compare it to a reference two 8-pin 320W 3080. Show me the comparison to the 450W Asus Strix 3080. I suspect the difference will be only 5-10% at best with both at max wattage.
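Here is a minimal sketch of what losing 15MHz of boost for every 5°C implies, assuming a simple linear rule; the base clock and reference temperature are arbitrary assumptions for illustration, not official boost behaviour:

```python
# Minimal sketch of the "15MHz of boost lost per 5C" rule of thumb.
# The linear rule, base clock and reference temperature are assumptions
# for illustration, not official GPU Boost behaviour.
def estimated_boost(temp_c, base_boost_mhz=2000, ref_temp_c=50,
                    mhz_lost=15, per_c=5):
    """Lose mhz_lost for every per_c degrees above ref_temp_c (no gain below it)."""
    steps_over = max(0.0, (temp_c - ref_temp_c) / per_c)
    return base_boost_mhz - steps_over * mhz_lost

for temp in (50, 60, 70, 80):
    print(f"{temp}C -> ~{estimated_boost(temp):.0f} MHz")
```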
 
The FE will have the best bins and also the best cooler IMO, same as always (aside from specials like the Kingpin).

It's a great cooler, but apparently really poor at cooling the memory chips. Nvidia has made the board too small and everything is too cramped.

Good coolers like the TUF and Strix are still much better at keeping the GPU cool than the FE coolers.
 
I do not like coolers that blow air all over the case, so I've always had FEs, though this has half changed now.

Some of the AIB coolers are ridiculous (always have been). I mean, if you cable-tie some case fans onto a GPU it will obviously cool better, but again the air goes everywhere.
 

Aren't all chips for the OC cards binned as A chips at Nvidia?
 

No. Well, they might bin chips for their own FE cards, and Ampere is reported to have three bins - 0, 1 and 2, with 2 being the best chip. But apparently Nvidia doesn't send the chips to the AIBs pre-binned; it's up to the AIB to do any binning if they want to.

Maybe Asus are bin-testing the chips and keeping the 10% of bin 2 chips back for their even higher flagship cards later on.

But it would be much cheaper, especially since the boards are identical between OC and non-OC models, to just put the GPU chip on the board with the OC BIOS and test that it is okay. If it is, it sells as an OC board. Even a bin 0 chip should pass to go into a Strix OC board, although it won't have much headroom for overclocking much further.

After all, the minimum boost criterion for the OC Strix isn't that high, and I have yet to see any 3080 card from any manufacturer not hit the Strix OC boost level. So you will get poor OC Strix cards and good ones. Look at the guy on here who got a cheap, basic non-OC Palit card: clearly it was a bin 2 chip, as he is running at a near-constant 2100MHz with low temps on a reference board with only two power connectors. It's reported that 30% of chips are bin 0, 60% are bin 1 (good) and 10% are bin 2 (exceptional). If Asus needed bin 2 chips for their Strix OC card, they would be making six times more non-OC cards than OC cards. Since they sell more OC than non-OC cards, that would cause them big issues.

Now, cards like the MSI Lightning and EVGA Kingpin obviously do require a chip from the 10% bin 2. But as production continues, the percentage of chips hitting bin 2 will increase as well, so it makes sense not to bother testing for now and to keep any bin 2 chips back until those boards have finished being designed and made.

Especially when you have thousands of pre-orders to fulfil.

So for now, I would say that whatever card you buy, you have a 30% chance it's a lemon (even buying an OC Strix), a 60% chance it's good and a 10% chance you win the silicon lottery.

I have two plain standard Inno3D 2080 Tis that both clock to 2100 on the core and 4000 on the memory, purely from winning the lottery last time.
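Taking the quoted 30/60/10 bin split at face value, here is a quick sketch of the 'silicon lottery' odds it implies. The percentages are the figures quoted above, treated as assumptions rather than confirmed numbers:

```python
import random

# Quick Monte Carlo of the "silicon lottery" using the bin split quoted
# above (30% bin 0, 60% bin 1, 10% bin 2). The percentages are the
# figures quoted in this thread, treated as assumptions.
BINS = ["bin 0 (lemon)", "bin 1 (good)", "bin 2 (exceptional)"]
WEIGHTS = [0.30, 0.60, 0.10]

random.seed(0)
draws = random.choices(BINS, weights=WEIGHTS, k=100_000)
for name in BINS:
    print(f"{name:<22} {draws.count(name) / len(draws):.1%}")

# Chance that at least one of two cards is bin 2 (two independent draws):
print(f"At least one bin 2 in two cards: {1 - 0.90 ** 2:.0%}")
```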
 

Wrong on a couple of fronts.

Out of the box the Strix is 390W; you have to raise the power limit to get 480W, and if you do, the card will bounce off the temperature limit. It cannot cool that, so it's a moot point. The difference between 390W and 480W is not worth discussing, because the card cannot dissipate that heat and so drops performance.

When normalising for power draw, the 3090 Strix runs faster than the FE card, as shown by a couple of reviewers; when normalising for performance, the Strix 3090 draws less power than the FE.
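A minimal sketch of what "normalising for power draw" means in practice: cap both cards at the same power limit and compare frame rates there. The FPS figures below are made up for illustration; only the comparison method is the point.

```python
# Minimal sketch of "normalising for power draw": cap both cards at the
# same power limit and compare frame rates there. FPS figures are made
# up for illustration; only the comparison method is the point.
POWER_LIMIT_W = 350

fps_at_limit = {
    "3090 FE":    100.0,
    "3090 Strix": 103.0,  # hypothetical: slightly ahead at equal power
}

baseline = fps_at_limit["3090 FE"]
for name, fps in fps_at_limit.items():
    print(f"{name:<11} {fps:.0f} fps at {POWER_LIMIT_W} W "
          f"({(fps / baseline - 1):+.1%} vs FE)")
```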
 

But none of that is relevant if you are comparing it to a Strix 3080 OC. There is always going to be a massive difference even at a stock 390W versus a 3080 FE. That's my point, which you seem to have missed.

I'm fed up with seeing comparisons to just a 3080 FE and claims that it's 20-30% faster, etc.
 
The review below, posted in another thread, confirms Greebo's comment about them not binning the GPUs. This review sample tops out with a +15MHz overclock. Hopefully the 3090s will have generally better silicon.

https://www.tweaktown.com/reviews/9...6s-Rn-E2LHbz1E6QruVPnKzLsvFmLRpoa5ZzvA4yno4s8

Indeed, and it proves my point perfectly. There are people on here with Palit cards that have better silicon than that Strix OC card in the review, and his Palit overclocks to 2100.
 