
NVIDIA ‘Ampere’ 8nm Graphics Cards

RTX 3080 VS RTX 3090 4K ULTRA SETTINGS PERFORMANCE & BENCHMARK COMPARISON || TESTED IN 5 GAMES (Only one I have found so far)

(His typed overlay says 1080p, so I'm not sure if he made a typo or is actually running at 1080p; either way it still made a difference (must have a GOOD CPU lol).)



It's rubbish; it's the same footage with a fake overlay.

He even says: "For you all :- It’s just a asumption. When the rtx 3000 gpu will come then I’ll make more depth videos on them"
 
The 3080 Ti would just be a 3090 with a few SMs disabled. Just like the 1080 Ti and 2080 Ti.

The 3080 has 68 SMs whilst the 3090 has 82 SMs.

For comparison:
  • The Titan RTX has 72 SMs and the 2080 Ti has 68 SMs.
  • The Titan Xp has 30 SMs and the 1080 Ti has 28 SMs.
A 3080 Ti would be roughly 95% of the performance of the 3090.
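As a rough sanity check on that figure, here's a minimal sketch (Python) that compares cut-down cards to their full-fat siblings purely on SM count. The published counts are from the comparison above; the 78-SM 3080 Ti is a made-up, hypothetical figure purely for illustration, since no such card has been announced, and real scaling is never quite this linear.

```python
# Rough sketch: how close a cut-down card sits to the full chip, going purely
# by SM count. The hypothetical 3080 Ti figure (78 SMs) is a guess for
# illustration only; the other counts are the published ones.
sm_counts = {
    ("1080 Ti", "Titan Xp"): (28, 30),
    ("2080 Ti", "Titan RTX"): (68, 72),
    ("3080 Ti (hypothetical)", "3090"): (78, 82),
}

for (cut_down, full), (sms_cut, sms_full) in sm_counts.items():
    ratio = sms_cut / sms_full
    print(f"{cut_down}: {sms_cut}/{sms_full} SMs -> ~{ratio:.0%} of the {full}")
```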

VRAM isn't a measure of performance. The 1070/1080 had 8GB, the 1080 Ti had 11GB and the Titan Xp had 12GB. Not that different, but completely different classes of card in price.
I think you're missing the point. It's not about the VRAM. The 3090 can be at most 20% faster than a 3080, and will likely be less, since even 20% would require perfect scaling of on-paper specs, which never happens. Let's look at the 2080 vs the 2080 Ti:

48% more CUDA cores
38% more ROPs
48% more SMs
352-bit vs 256-bit memory bus
Actual performance gain: 28% on average (at 4K, much less below)
https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-founders-edition/33.html

The 3090 has over the 3080:

21% more CUDA cores
17% more ROPs
21% more SMs
384-bit vs 320-bit memory bus
Actual performance gain: ???

The 3080 is relatively MUCH closer to the 3090 than the 2080 was to the 2080 Ti, and there was only a 30%-ish performance gap between those two. It's going to be a lot closer between the 3080 and 3090, because GPUs aren't magic and the specs are known. There's almost zero room for a 3080 Ti in the performance stack even as it stands, unless people are prepared to pay 50% more for 10% more performance. Then you add the existence of the 3080 20GB on top of that, and you don't even have the VRAM factor to differentiate it. That's the relevance VRAM has to the discussion - marketing. You could maybe sell a 10% faster 3080 Ti on the double VRAM. Maybe. But you can't do that if there's a 20GB 3080, which AIBs are privately saying there is.
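To put numbers on that, here's a minimal sketch that just restates the arithmetic behind the on-paper uplifts quoted above, using the published spec counts. As the 2080 Ti shows, the actual gain comes in below even the smallest of these on-paper ratios.

```python
# Minimal sketch: on-paper uplift of the higher-tier card over the lower one,
# using the spec counts quoted above. Real-world gains land below these
# figures because scaling is never perfect.
specs = {
    "RTX 2080":    {"cuda": 2944,  "rops": 64,  "sms": 46, "bus_bits": 256},
    "RTX 2080 Ti": {"cuda": 4352,  "rops": 88,  "sms": 68, "bus_bits": 352},
    "RTX 3080":    {"cuda": 8704,  "rops": 96,  "sms": 68, "bus_bits": 320},
    "RTX 3090":    {"cuda": 10496, "rops": 112, "sms": 82, "bus_bits": 384},
}

def uplift(lower, upper):
    """Percentage increase of each spec going from `lower` to `upper`."""
    return {k: (specs[upper][k] / specs[lower][k] - 1) * 100 for k in specs[lower]}

for lower, upper in (("RTX 2080", "RTX 2080 Ti"), ("RTX 3080", "RTX 3090")):
    print(f"{upper} over {lower}:")
    for spec, pct in uplift(lower, upper).items():
        print(f"  {spec:>8}: +{pct:.0f}%")
```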
 
Has anyone found any info on whether the models with the fan at the back of the GPU make a difference to the core temp?

Curious about the Inno3D X3 and X4, and whether that design is actually useful, other than looking hideous.
 
You know, I'm kind of hoping stocks are delayed. If AMD comes out with a GPU within spitting distance of the 3080, which is being reported but it's too early to tell, it will more than likely have 16GB of VRAM and be 100-150 quid cheaper. I would jump on it simply because the market needs shaking up; everyone loves an underdog.

It will probably use less power too, since AMD are on a much better process (TSMC 7nm). Fingers crossed!
 
Sorry, I think I get it now: you're talking about cherry-picking results to show what you want?

Exactly. You can see in the picture that testing the full suite of games ends up with the 2080 Ti being 30% faster, but if you restrict the games to only the best performers then it jumps to 40%. I use that only as an illustrative example: you can tell the truth and not lie, and yet still mislead about a card's performance. It's really a universal problem, hence all the "fake news" talk today. It's not that people straight up lie so much as that they don't provide the full context, and therefore paint a different picture from what's actually the case. Applies to all areas of life, really. Context is all-important.
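Here's a minimal sketch of that effect with made-up per-game numbers (hypothetical figures chosen only to illustrate the cherry-picking point, not real benchmark results): the same per-game data gives a very different headline average depending on which games you include.

```python
# Illustrative sketch only: the per-game gains below are invented to show how
# restricting the sample shifts the headline average; they are not real data.
per_game_gain_pct = {
    "Game A": 42, "Game B": 39, "Game C": 35, "Game D": 31,
    "Game E": 28, "Game F": 24, "Game G": 21, "Game H": 18,
}

def average(games):
    return sum(per_game_gain_pct[g] for g in games) / len(games)

full_suite = list(per_game_gain_pct)
cherry_picked = ["Game A", "Game B", "Game C", "Game D"]  # best performers only

print(f"Full suite average uplift:    {average(full_suite):.0f}%")    # ~30%
print(f"Cherry-picked average uplift: {average(cherry_picked):.0f}%")  # ~37%
```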
 
It will probably use less power too, since AMD are on a much better process (TSMC 7nm). Fingers crossed!

Yep, it is a much better process, and the one Nvidia would have been on if they hadn't decided to argue over the 20 dollars per chip. It could backfire on them. Hope I don't get itchy fingers Thursday and instead wait for AMD; it's the sensible decision.
 
Indeed. DF come across as just a paid-for preview outlet; I tend to just ignore them.
IIRC they did a preview for the Stadia and didn't even use the same displays in their latency test... :rolleyes:
Indeed. It's one of those situations where, if you see one thing wrong, then all of a sudden you're on the look-out and see lots of other things wrong that were there before but you never noticed. For me the Vega reviews served as a great way to triage tech sites for which had a clue and which were pretending. Sadly very few made the cut. At the end of the day there's no substitute for thinking for yourself. Someone's always gonna try to sell you (on) something, and it's unlikely to be to your benefit.
 
Really?...

[attached graph: Eh-Dp-Fi-HU0-AAWovm.jpg]
Really? This graph is ******. I have a video of what happens when you run out of VRAM in Doom, made after Steve's ****** video.


I found a way to force-load data into VRAM and loaded an EXTRA 6GB out of my 12GB, which dropped frames to the 30s range. When I only forced 4GB of VRAM, the drop was almost non-existent.

It was for my Polish forums, because that 4K bench looked impossible, and it is. The only person who got a 1080 Ti beating a 2080 at 4K is Hardware Unboxed.

But hey, what could you expect from a chap who can't even properly tighten timings on Ryzen and makes a video showing how NOT to use DRAM Calculator, lol.

Look around the benchmarks; only HU has some magic 1080 Ti beating a 2080 at 4K Ultra Nightmare.

[attached chart: performance-3840-2160.png, 3840x2160 performance summary]


The 1080 Ti here scores almost as well as my Titan. No clue what card he used for 83fps, or where he benched.
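For anyone wanting to try that sort of VRAM-pressure test themselves, here's a minimal sketch of the idea, assuming PyTorch and pynvml are installed (both are assumptions, not necessarily what the poster used): allocate a ballast buffer on the GPU before launching the game, then watch reported VRAM usage alongside the in-game framerate.

```python
# Minimal sketch of a VRAM-pressure test (assumes a CUDA GPU plus the torch
# and pynvml packages): pin a "ballast" allocation on the GPU, then launch
# the game and watch how the framerate behaves once its working set no
# longer fits in VRAM.
import time
import torch
import pynvml

BALLAST_GB = 6  # how much VRAM to eat before starting the game

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

# Allocate ~BALLAST_GB of float32 zeros on the GPU and keep a reference
# so it isn't freed while the game runs.
ballast = torch.zeros(int(BALLAST_GB * 1024**3 // 4),
                      dtype=torch.float32, device="cuda")

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 1024**3:.1f} / {mem.total / 1024**3:.1f} GiB")
        time.sleep(5)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```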
 
RTX 3080 VS RTX 3090 4K ULTRA SETTINGS PERFORMANCE & BENCHMARK COMPARISON || TESTED IN 5 GAMES (Only one I have found so far)

(His typed overlay says 1080p, so I'm not sure if he made a typo or is actually running at 1080p; either way it still made a difference (must have a GOOD CPU lol).)



Isn't that the one where it says in the blurb that these are the "expected" framerates of the 3080/90 based on their specs?
 