• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Nvidia now at 90% market share.

These people do keep saying DLSS / RT adds 30 / 40% value, though I do have a problem with that idea.
Yeah, hard to quantify how much difference it makes.
Again, the fact that AMD have bothered with FSR and RT at all shows they feel there is some value to it and the fact they're bothering to improve it just adds more weight to that.
Another thing that probably has some bearing on things is brand recognition, but again quantifying what that's worth is tricky.

I mean, I know nothing about vacuum cleaners, so if I was buying one and the options were a "Dyson" model and a "Werkzeug Vakumes" model, and they both got reasonable reviews at a pretty similar price, I'd probably go with the Dyson. This is probably a poor example as there's likely not that much complexity to what makes a good vacuum, but I don't think you can blame people for going with Nvidia over a brand they're not so aware of.
 
They kind of have to be a little charitable at this point, as they just lost even more market share recently. They need to come out with something that is performance competitive and a LOT cheaper than the competition's equivalent card, or the 8000 series could very well be AMD's last dGPU series.

Nah. Radeon is a premium brand and not a charity mate. They should just wait for Nvidia to release their cards first so they know pricing and can release after. This has worked very well so far, why stop now?
 
Is that due to a CPU bottleneck with the Nvidia driver causing it to run in power saving mode? Because if what he's saying is that AMD GPUs always run at maximum wattage no matter what the load on the card is, he's full of ****... they don't. As soon as they are not loaded above 95% they draw less power.

power-vsync.png


TPU does 1080p @ 60 fps v-sync testing; the latest test for a 7900 XTX is from the summer. It needs roughly 66% more power for the same FPS. Even the 4090 is more efficient than the 7800 XT, with the 7800 XT needing more than 16% extra power despite having less vRAM. Speaking of which, even though the 7900 XTX has 4 GB more vRAM, its power consumption is about the same as its lesser brother's.
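A minimal sketch of the percentage arithmetic behind "needs about 66% more power for the same FPS" (the wattages are made-up placeholders, not TPU's actual v-sync measurements):

```python
# Sketch of the "% more power" comparison from v-sync testing.
# The wattages below are made-up placeholders, NOT TPU's real numbers.

def pct_more_power(card_w: float, baseline_w: float) -> float:
    """How much more power card_w draws than baseline_w, in percent."""
    return (card_w / baseline_w - 1.0) * 100.0

# e.g. a card drawing 83 W vs a baseline drawing 50 W at 1080p/60 v-sync:
print(f"{pct_more_power(83, 50):.0f}% more power")  # -> 66% more power
```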

In the video the numbers/percentages vary up and down, so it depends on the game and scene. I'd say there's no GPU limitation for Nvidia, just AMD being less efficient. Intel would be mocked and crucified for something similar.
Did a few extra comparisons of these...

RT performance:
4080 - 100%
7900XTX - 74%

4070Ti - 100%
7900XT - 82%

4070 - 100%
7800XT - 77%

4060Ti - 100%
7700XT - 94%

4060 - 100%
7600 - 62%


And then TDP:
4080 - 100%
7900XTX - 111%

4070Ti - 100%
7900XT - 105%

4070 - 100%
7800XT - 132%

4060Ti - 100%
7700XT - 153%

4060 - 100%
7600 - 143%

+ DLSS, better FG, etc.
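Dividing the two lists gives a rough perf-per-watt index for each pairing (Nvidia card = 100); this just combines the percentages quoted above, nothing more:

```python
# Relative RT performance / relative TDP, using the figures listed above.
# Result: each AMD card's rough RT perf-per-watt vs its Nvidia counterpart.
rt = {"7900 XTX": 74, "7900 XT": 82, "7800 XT": 77, "7700 XT": 94, "7600": 62}
tdp = {"7900 XTX": 111, "7900 XT": 105, "7800 XT": 132, "7700 XT": 153, "7600": 143}

for card in rt:
    index = rt[card] / tdp[card] * 100
    print(f"{card}: {index:.0f}% of the matching Nvidia card's RT perf/watt")
```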
Nah. Radeon is a premium brand and not a charity mate. They should just wait for Nvidia to release their cards first so they know pricing and can release after. This has worked very well so far, why stop now?

Release after and still have fewer features, but they'll promise to release them at some point that may or may not come in the near future. Surprised Pikachu face when they hit 5% market share.
 
Not allowed to value those things. Plus they are fake :p

People say they don't care about those, they don't care about RT/PT, but they care about plenty of vRAM, cheaper cards, etc. and yet that doesn't show up in the actual sales and market share.

People like to quote the Steam HW Survey; well, the 4080 Super is 61% higher in those charts than the 7900 XTX, despite way less time on the market. The 4080 is almost 73% higher, and the 4080 Super + 4080 together have 2.34x the numbers of the 7900 XTX. The 7900 XT doesn't even show up, while the 4070 Ti Super has sold 52% more than the 7900 XTX...

So, where are those people that vote in polls that they don't care about RT and all that, vRAM is king, etc.? They're sure not spending money on AMD (or on what they preach!). Or... they're just a vocal minority.
 
Is that due to a CPU bottleneck with the Nvidia driver causing it to run in power saving mode? Because if what he's saying is that AMD GPUs always run at maximum wattage no matter what the load on the card is, he's full of ****... they don't. As soon as they are not loaded above 95% they draw less power.

power-vsync.png


TPU does 1080p @ 60 fps v-sync testing.

You misspelled "look here you're right humbug"
 
Reasons... IDK, I never said this wasn't true, I know it's true. Is this a genuine question? I have some theories.
Of course it is.

By the original video I didn't mean that AMD doesn't throttle down in low-demand scenarios at all, just that it doesn't go low enough, or the GPU doesn't have granular enough control over its innards to better "surf" the demand as needed.
 
Of course it is.

By the original video I didn't mean that AMD doesn't throttle down in low-demand scenarios at all, just that it doesn't go low enough, or the GPU doesn't have granular enough control over its innards to better "surf" the demand as needed.

I was referring to the video.

It's likely to be more than one thing: possibly a less efficient architecture, possibly something to do with it being an MCM design, and Nvidia are on newer, more advanced silicon, 4nm vs 5nm.
 
Is anything AMD does really that bad, or do we put too much faith in the obnoxious slop tech journos put out?

My point is that Nvidia don't just have majority market share (and they're not slowing down) but dominant market mindshare, and the latter is very hard to crack unless AMD come out with something immense for a decent amount cheaper, as that's what it will take to sway the usual Nvidia-only customers.

Nah. Radeon is a premium brand and not a charity mate. They should just wait for Nvidia to release their cards first so thry know pricing and can release after. This has worked very well so far, why stop now?

Yeah, continually losing market share is a great business tactic! :D

As an AMD customer I'd like them to succeed. Hopefully the 8000 series comes in at a decent price with decent performance for a mid-range offering.
 
I was referring to the video.

It's likely to be more than one thing: possibly a less efficient architecture, possibly something to do with it being an MCM design, and Nvidia are on newer, more advanced silicon, 4nm vs 5nm.
N4 vs N5 does matter (it's not nm though, that's just marketing BS, hence the real names are N4 and N5), and I've seen a whole analysis from before the 7900 XT(X) even hit the shelves showing MCM seems to be the main culprit behind the high power use. Just the interconnect between the chiplets apparently requires a lot of power, which has nothing to do with generating 3D graphics as such.
 
4080: $1200, 100% raster performance.
7900 XTX: $1000, 105% raster performance, 25% better value per frame. (reviewed bad)

4070 Ti: $800, 100% raster performance.
7900 XT: $900, 110% raster performance, worse value, (reviewed very bad and deservedly so)

4070: $600, 100% raster performance.
7800 XT: $500, 107% raster performance. 27% better value, (reviewed meh to bad)

4060 Ti: $400, 100% raster performance.
7700 XT: $450, 122% raster performance. 10% better value, (reviewed bad)

4060: $300, 100% raster performance.
7600: $270, 89% raster performance. (reviewed bad and deservedly so)
---------------------

4060: $250, 100% raster performance.
B580: $250, 105% raster performance, 5% better value (reviewed like the best thing since sliced bread; tech journos have not been this positive about a GPU since probably something from 3DFX) ¯\_(ツ)_/¯
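For anyone checking the maths, the "% better value" figures fall out of price and relative raster performance like this (small differences vs the quoted numbers are just rounding of the performance percentages):

```python
# How "% better value" can be derived from the price/performance pairs above.

def value_advantage(base_price, base_perf, rival_price, rival_perf):
    """Percent better performance-per-dollar of the rival vs the baseline card."""
    return (rival_perf / rival_price) / (base_perf / base_price) * 100 - 100

# 4080 ($1200, 100%) vs 7900 XTX ($1000, 105%):
print(round(value_advantage(1200, 100, 1000, 105)))  # -> 26, close to the quoted 25%
```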

Is anything AMD does really that bad, or do we put too much faith in the obnoxious slop tech journos put out?
Maybe AMD don't have a significant number of certain people working for them?? Hmmmmm.
 