
Just what is NVIDIA up to?

 
HW Unboxed just retested it and found it's improved by 3%; that doesn't put it within 5% of a 4080.

I have the actual cards in question and can assure you the 7900 XT is on average about 10% slower. When both are overclocked, that lead shrinks to about 5%.

So my own testing and experience need to be thrown out because Joxeon and HUB tell me I'm wrong.

The lengths you are going to are beyond parody now.
 

I doubt HUB is wrong; the guy's been doing it for years and for all to see. It comes down to what games are benchmarked. I'd like to see an average over, say, 30+ games.
 

Exactly, and I said just this a few pages back. The games I tested were Jedi Survivor, God of War, Hogwarts Legacy, Elden Ring, Watch Dogs Legion and Halo Infinite, as well as the obligatory 3DMark Time Spy run.

These were the games I had installed on the test PC. Had I tested different games, I would have got different results, just like HUB. The only game where the 4080 had a decent 20% lead was Halo Infinite, but both GPUs were well over 100 FPS at 1440p.
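To make that concrete, here's a rough sketch of why the game list matters so much. The FPS numbers below are invented for illustration (not from any real benchmark), and the geometric mean of per-game ratios is just one common way reviewers aggregate a suite:

```python
from statistics import geometric_mean

# Hypothetical per-game average FPS for two GPUs (illustrative only).
fps = {
    "Game A": (120, 100),  # (gpu_x, gpu_y)
    "Game B": (95, 92),
    "Game C": (140, 118),
    "Game D": (88, 90),
    "Game E": (105, 104),
}

def average_lead(games):
    # Geometric mean of per-game FPS ratios, expressed as a % lead for gpu_x.
    return (geometric_mean(fps[g][0] / fps[g][1] for g in games) - 1) * 100

print(f"All five games:     {average_lead(fps):+.1f}%")                              # ~ +7.7%
print(f"Only games A and C: {average_lead(['Game A', 'Game C']):+.1f}%")             # ~ +19.3%
print(f"Only games B, D, E: {average_lead(['Game B', 'Game D', 'Game E']):+.1f}%")   # ~ +0.6%
```

Same two cards, same runs; depending on which subset you pick, the "average lead" swings from under 1% to nearly 20%. That's why a 30+ game average is so much more trustworthy than a handful of titles.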
 
If you're not using the game's canned benchmark, it's very easy to make one card look better than the other, and vice versa.
I'm not saying anyone is doing this, but you can find a part of the game where GPU A does much better than GPU B does overall, or in another part of the game you can switch that right around.
Or they might not even do it deliberately. It's just where the cards fall.
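A tiny hypothetical example of that flip (numbers invented): benchmark one pass and GPU B wins, benchmark the other and GPU A wins, and neither run is "wrong":

```python
# Two hypothetical 60-second test passes in the same game (numbers invented).
scene_fps = {
    "city street": {"GPU A": 92, "GPU B": 101},
    "open field":  {"GPU A": 118, "GPU B": 106},
}

for scene, results in scene_fps.items():
    lead = (results["GPU A"] / results["GPU B"] - 1) * 100
    print(f"{scene}: GPU A is {lead:+.1f}% vs GPU B")
# city street: GPU A is -8.9% vs GPU B
# open field:  GPU A is +11.3% vs GPU B
```

Pick your test scene and you've picked your winner, without fabricating a single number.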

To add to that, I am suspicious of GPU vendors wanting reviewers to follow "review guides". You can bet your life those review guides are designed to make their product look the best it can versus the other guy's product. What's more, the hardware vendor might suggest the GPU is up to 5% faster than it actually is, and some reviewers might just copy those numbers down if they can't get there, because it's only 5% and not worth arguing with the vendor over.

I have seen reviewers talk about how these review guides are useful because then they know the product is performing as expected and there isn't a problem; HUB are one of those who have said this. And when it comes to AMD review guides contradicting Nvidia review guides, who do we think these reviewers are likely to agree with?
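Roughly, the sanity check those reviewers describe looks something like the sketch below. The guide and measured numbers are invented; the point is just that a shortfall inside the tolerance band tends to get waved through, while anything bigger triggers a second look:

```python
# Invented vendor-guide FPS vs invented measured FPS (illustration only).
guide_fps    = {"Game A": 112, "Game B": 98, "Game C": 131}
measured_fps = {"Game A": 108, "Game B": 99, "Game C": 122}

TOLERANCE = 0.05  # the ~5% band mentioned above

for game, claimed in guide_fps.items():
    deviation = measured_fps[game] / claimed - 1
    status = "as expected" if abs(deviation) <= TOLERANCE else "investigate"
    print(f"{game}: guide {claimed} FPS, measured {measured_fps[game]} FPS "
          f"({deviation:+.1%}) -> {status}")
# Game A: -3.6% -> as expected
# Game B: +1.0% -> as expected
# Game C: -6.9% -> investigate
```

If the guide numbers are padded by a few percent, a check like this quietly bakes that padding into the review.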
 

Why would HUB agree with Nvidia?
 
Nvidia have 90% market share and have demonstrated that they are happy to cut reviewers off the sampling list if they don't conform; AMD are weak and the path of least resistance to go against.

After the last scandal with them, HUB already said they don't care if Nvidia don't send them review samples; they'll just buy them instead.
 

HUB don't have a channel without them, so I don't believe that for a second.
 

Unless you provide evidence, I'll keep watching. HUB have been upfront about other brands trying to get favourable reviews and have shown the emails. I have no reason to doubt them, and the channel is big enough.
 
What if the reason RDNA4 won't have any high-end GPUs is that AMD wants to save the best wafers for AI products, and Nvidia is planning to do the same?

With the mining boom it didn't matter, because a gaming GPU is great at mining.

But server GPUs can't game, and vice versa. So perhaps both companies are planning to keep all the best chips for AI server GPUs, and gamers will only get mid- and low-end chips, resulting in another poor generation that doesn't really perform much better than what it replaced.


At this stage everyone is gasping for breath at the RDNA4 news, but what everyone should be asking is why AMD is doing that, and the most likely answer is that Nvidia is going to do it too.
 
HUB have been very critical of Nvidia for most of their Ada GPU reviews. Only the 4070 review was more positive than it should have been, and I think that's because of the 12GB of VRAM, but even then it wasn't a "good" review, just an OK one.
 

If AMD and Nvidia both keep their best chips for AI then, ladies and gents, we have a big gap in the niche market for high-end gaming. Someone will need to fill that gap.

I doubt Intel will; they will probably do the same and invest heavily in AI.
 
HUB have been critical of Nvidia in the past, and while I don't always agree with their test methods, by and large they are unbiased IMHO.

The problem with reviews is when they test extreme RT games and declare Nvidia significantly faster. They neglect to inform you that even being significantly faster does not mean playable.

For example, the poster boy of RT is the ever-present CP2077.

4K average FPS, RT on, no upscaling:
4080 - 29 FPS
4070 Ti - 23 FPS
7900 XTX - 20 FPS
7900 XT - 17 FPS

None of these are playable FPS, and even at 1440p the 4080 only averages an on-paper playable 60 FPS, with lows around 45 FPS. Though this is so borderline you end up enabling upscaling anyway.
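Running those numbers makes the point. Using a 60 FPS average as the playability threshold (that threshold is my assumption, pick your own):

```python
# FPS from the 4K RT (no upscaling) results quoted above.
fps_4k_rt = {"4080": 29, "4070 Ti": 23, "7900 XTX": 20, "7900 XT": 17}
PLAYABLE_AVG = 60  # assumed threshold for a smooth experience

baseline = fps_4k_rt["7900 XTX"]
for card, fps in fps_4k_rt.items():
    lead = (fps / baseline - 1) * 100
    verdict = "playable" if fps >= PLAYABLE_AVG else "not playable"
    print(f"{card}: {fps} FPS ({lead:+.0f}% vs 7900 XTX) -> {verdict}")
# The 4080 comes out +45% vs the 7900 XTX and is still nowhere near playable.
```

A 45% lead sounds decisive in a bar chart, but 29 FPS and 20 FPS are the same experience in practice: you turn on upscaling either way.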

Is the 4080 a better GPU? Yes. Is it a £400 better GPU? No. The 4070 Ti with 12GB of VRAM is even worse value right now.
 