Radeon Resizable Bar Benchmark, AMD & Intel Platform Performance

Yup. :)

In other developments, it looks like this possible system error is more widespread than first thought.

Note the computerbase chart regarding the 3080 and HD Textures. :cry:
[image: computerbase chart, 3080 with HD Textures]

PcGamesHardware found the same too.
[image: PcGamesHardware chart]

I like the second image.

The 3080 is a top-of-the-range, very expensive GPU; it should be able to run the latest games at the highest settings no problem, and that includes HD Textures.

IMO every reviewer worth their salt should be brutally frank about these very expensive pieces of equipment: if they can't run the game, say so, or put the performance numbers up. If it's 6 FPS.... there you go, that's the performance of that card in this game.

If everyone did that instead of defending them Nvidia wouldn't get away with it.
 
I must be the only one who can read here :cry:

Possible system error? :p

My takeaway from the above quote is that TPU's W1zzard testing at launch showed even then that "max everything" isn't happening for GPUs with less than 12GB, evident in TPU's performance charts with the HD pack installed and max detail BUT ray tracing disabled.
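
For a rough sense of scale, here is a back-of-envelope sketch of why an HD texture pack can blow past a 10GB budget. The texture format, material count and maps-per-material below are illustrative assumptions, not Far Cry 6's actual asset data:

```cpp
// Back-of-envelope: why an HD texture pack eats VRAM quickly.
// All sizes/counts here are illustrative assumptions, not FC 6's real data.
#include <cstdio>

int main() {
    const double bytesPerTexel = 1.0;        // BC7 block compression: 1 byte per texel
    const double mipOverhead   = 4.0 / 3.0;  // a full mip chain adds roughly 33%
    const double texSide       = 4096.0;     // one "4K" (4096x4096) texture

    double oneTextureMB = texSide * texSide * bytesPerTexel * mipOverhead
                          / (1024.0 * 1024.0);

    // A scene streaming a few hundred unique materials, each with e.g.
    // albedo + normal + roughness maps, climbs past 10 GB fast.
    const int materials = 200, mapsPerMaterial = 3;
    double totalGB = oneTextureMB * materials * mapsPerMaterial / 1024.0;

    std::printf("%.1f MB per texture, ~%.1f GB for %d materials\n",
                oneTextureMB, totalGB, materials);
    return 0;
}
```

Even with block compression, a couple of hundred unique 4K materials comfortably exceed 10GB before geometry, render targets and the OS's own VRAM use are counted.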

Where in TPU's statement do they say they experienced drops to single-digit fps? Also, maybe I missed a bit, but where do they say they tested that "without ray tracing"? As in your quote there: "4K, HD Texture pack, and ray tracing combo" :) Like I said, there's a bit of a difference between "some stuttering" and having your fps plummet to single digits and stay stuck there....

Although funnily enough there was little to no stuttering in my footage even when 10GB VRAM was exceeded (at least not the kind of stuttering you would expect to see), but nope, let's just ignore that :cry:

I hate it when reviewers defend hardware vendors being tight.
I don't even understand what legitimate reason they could have for doing that. Could anyone explain to me why I should be happy to settle for less?

Should we also overlook that about RT for AMD cards too? Bearing in mind AMD shows far more performance issues across several RT games, and all the sites have shown this....

Plenty of sites have already stated VRAM "concerns" on Ampere, more so for the 3070 8GB; as HUB summed up in their video, they have yet to see/encounter any issues with the 3080 10GB.

Yup. :)

In other developments, it looks like this possible system error is more widespread than first thought.

Note the computerbase chart regarding the 3080 and HD Textures. :cry:
[image: computerbase chart, 3080 with HD Textures]

PcGamesHardware found the same too.
[image: PcGamesHardware chart]

Come on matt, don't turn into some other posters now... you used to be able to post arguably good content with reasonable logic and evidence, but you're completely contradicting yourself here and now ignoring valid questions again:

Not sure if you're referring to the 3080 or 3070 but either way.....

As you just said your very self when it comes to "old data":

That's old data from when the product launched and they tested it, I believe, using old BIOS, GPU/chipset drivers etc. Look at the list of games they tested; it's from their launch testing of the feature.

;)

TechPowerUp mentioned this: "does stutter sometimes on the 10 GB GeForce RTX 3080".... no mention of fps drops to single digits? Bit of a difference.

Again, I never really "debated" tommy's 3070 in FC 6 at 4k60 with no FSR or max ray tracing, mainly because I haven't followed that card nor have much interest in it, and most importantly, as I have said all along for the 3070: it hasn't got enough grunt for 4k60 at "max" settings, i.e. settings will have to be reduced and/or FSR/DLSS enabled. That is shown perfectly in that 3070 vs 6700xt HUB video: even though the 3070 is 19% faster on average across 50 games at 4k (and that's not including heavy RT settings/titles either....), its performance is still not good enough, the same way that, even though the 6700xt has 12GB vram, the grunt clearly isn't there..... What I did question with regards to his 3070 was the experience/issues he mentioned in other games like metro and rdr 2 though, as, correct me if I'm wrong, but where are those sites saying the same for those games? :)
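
For what it's worth, a figure like "19% faster on average across 50 games" is usually a geometric mean of per-game relative performance rather than a simple average. A minimal sketch, with made-up ratios standing in for HUB's actual data:

```cpp
// How a "19% faster on average across N games" headline figure is commonly
// derived: a geometric mean of per-game fps ratios. Ratios below are made up.
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    // fps_cardA / fps_cardB per game; 1.19 would mean card A 19% faster there.
    std::vector<double> ratios = {1.25, 1.10, 0.95, 1.32, 1.18, 1.21};

    double logSum = 0.0;
    for (double r : ratios) logSum += std::log(r);
    double geomean = std::exp(logSum / static_cast<double>(ratios.size()));

    std::printf("Average uplift: %+.1f%%\n", (geomean - 1.0) * 100.0);
    return 0;
}
```

The geometric mean is the usual choice for benchmark suites because one outlier game (an FC 6-style result) shifts it far less than an arithmetic mean of ratios would.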

If SAM/Resizable BAR works best when it is properly implemented into a game by the game developer (like FSR being implemented into the game directly rather than using RSR at driver level), shouldn't we be encouraging game developers to properly implement SAM so that we can all get free extra performance? After all, I see many people call out a game for being poorly optimised, so why shouldn't we consider whether a newly released game has been well optimised for both Resizable BAR and FSR, rather than letting game developers get away with not programming for them correctly? For me, the issue with Hardware Unboxed not having tested with SAM/FSR is that it takes pressure away from game developers to optimise for both. I would have thought they could do separate videos per GPU for native, SAM and FSR; they would get extra revenue for doing more videos and it would encourage game developers to get behind FSR and SAM, giving us free extra performance.
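
On the "properly implement" point: with Resizable BAR the whole of VRAM becomes CPU-mappable instead of the classic 256MB window, so an engine can write resources straight into device-local memory and skip a staging copy. A minimal sketch, assuming Vulkan and using the common heuristic of looking for a large DEVICE_LOCAL + HOST_VISIBLE heap (there is no single official "ReBAR enabled" flag, so treat this as indicative only):

```cpp
// Heuristic Resizable BAR / SAM check via Vulkan: look for a memory type
// that is both DEVICE_LOCAL and HOST_VISIBLE whose heap is much larger
// than the classic 256 MB BAR window.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;
    VkInstanceCreateInfo ici{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceMemoryProperties mem{};
        vkGetPhysicalDeviceMemoryProperties(gpu, &mem);

        for (uint32_t i = 0; i < mem.memoryTypeCount; ++i) {
            const VkMemoryPropertyFlags f = mem.memoryTypes[i].propertyFlags;
            const bool cpuVisibleVram =
                (f & VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT) &&
                (f & VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT);
            const VkDeviceSize heapSize =
                mem.memoryHeaps[mem.memoryTypes[i].heapIndex].size;

            // Without ReBAR this CPU-visible VRAM heap is typically capped at
            // 256 MB; with it, (almost) all of VRAM is mappable, so uploads
            // can skip the usual staging-buffer copy.
            if (cpuVisibleVram && heapSize > 256ull * 1024 * 1024)
                std::printf("Large CPU-visible VRAM heap: %llu MB (ReBAR likely on)\n",
                            static_cast<unsigned long long>(heapSize >> 20));
        }
    }
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

Whether a given game sees any gain then depends on whether its upload paths actually use such a heap, which would square with some titles showing big SAM gains and others nothing.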

+1

Seems that people don't want to look further into it, for whatever reason; perhaps because it's not the simple on/off switch some would have us believe? FC 6, being AMD sponsored, is one title I would have expected to see a massive boost in, given other AMD sponsored titles and "vram".....
 
The 3080 is a top-of-the-range, very expensive GPU; it should be able to run the latest games at the highest settings no problem, and that includes HD Textures.

IMO every reviewer worth their salt should be brutally frank about these very expensive pieces of equipment: if they can't run the game, say so, or put the performance numbers up. If it's 6 FPS, there you go, that's the performance of that card in this game.

I mentioned this earlier when quoting Steve from HUB:
..but I feel when spending $500 or let's be honest, a lot more than $500 on a GPU for that premium experience disabling high quality texture packs when available, isn't something most gamers will want to do!
..of course there is still that little matter of VRAM and although 8 gigabytes is still enough for the most part, we are seeing examples where it just isn't enough.

This will be the same for 10GB and the 3080, as danced around on the Far Cry thread.
 
I mentioned this earlier when quoting Steve from HUB:

Even with that quote, all he is doing is acknowledging the obvious while at the same time still saying it's not really an issue; it's about the most milquetoast acknowledgement of a problem you can get.

It's like something an Nvidia rep would say.
 
The 3080 is a top-of-the-range, very expensive GPU; it should be able to run the latest games at the highest settings no problem, and that includes HD Textures.

Yep. And it does, apart from very few outliers. Exactly the same can be said about the 6900XT, which I thought was meant to be a step up from the 3080 yet generally seems to be used as the comparison. They've both been out for about a year and a half and in 99% of cases they are both spectacular. The 3080 with 10GB doesn't quite manage Far Cry 6 with full 4k textures. 6900XT seems to stumble with ray tracing in some scenarios. SAM/reBAR is either amazing/pointless/detrimental.

Neither can play Cyberpunk 2077 at full whack, so essentially both vendors have done us a favour there...
 
Even with that quote, all he is doing is acknowledging the obvious while at the same time still saying it's not really an issue; it's about the most milquetoast acknowledgement of a problem you can get.

You understand this; however, there is a minority of folk who can't acknowledge it and think it will go away, or even worse, deny it's an issue. The more these reviewers dance around it instead of knocking a home run over it, the more accepted it becomes. However, we are now progressing from a few people on forums with no credibility to known reviewers, referenced by the community, mentioning it. Next thing we will be told it's a mid-range card! :cry:
 
Perfect timing ^^^

Yep. And it does, apart from very few outliers. Exactly the same can be said about the 6900XT, which I thought was meant to be a step up from the 3080 yet generally seems to be used as the comparison. They've both been out for about a year and a half and in 99% of cases they are both spectacular. The 3080 with 10GB doesn't quite manage Far Cry 6 with full 4k textures. 6900XT seems to stumble with ray tracing in some scenarios. SAM/reBAR is either amazing/pointless/detrimental.

Neither can play Cyberpunk 2077 at full whack, so essentially both vendors have done us a favour there...

Compromises; as enthusiasts paying through the nose for the best hardware they offer, we just love compromising. Don't we?
 
I mentioned this earlier when quoting Steve from HUB:

You do realise that comment is in reference to the 3070's "8GB" VRAM and not the 3080 10GB, right? :o

No, show me anyone who has.

Emmmm, what :confused:

Have you not seen all the doom and gloom threads on "vram"? Yet those same people are ignoring the lack of RT perf. on RDNA 2....

Again, let's use the same logic as hum:

The 6900XT is a top-of-the-range, very expensive GPU; it should be able to run the latest games at the highest settings no problem, and that includes ray tracing.

Now see how flawed certain people's logic is....

Even with that quote, all he is doing is acknowledging the obvious while at the same time still saying it's not really an issue; it's about the most milquetoast acknowledgement of a problem you can get.

:cry:

The only thing that would please you is if he came out and said "nvidia is ****, 8gb is useless, avoid this card, it should be no more than £200 if that" :cry:

Again, if you had watched the video "fully", you would see the 3070 is on average 19% faster across 50 games at 4k and 13% faster on average at 1440p, and this is not even including most of the RT heavy games/settings.... FC 6 was the only game to have considerably worse perf., and even then, neither card had enough grunt for a locked 4k60; both would need to enable FSR and/or reduce settings further, which would likely resolve the vram issue for the 3070 8GB and possibly even have the 3070 performing better.....

Did you not also see the bit at the end where he recommended the 6700xt over the 3070 because of pricing?

You understand this; however, there is a minority of folk who can't acknowledge it and think it will go away, or even worse, deny it's an issue. The more these reviewers dance around it instead of knocking a home run over it, the more accepted it becomes. However, we are now progressing from a few people on forums with no credibility to known reviewers, referenced by the community, mentioning it. Next thing we will be told it's a mid-range card! :cry:

No one is saying the "potential" issue to do with vram will go away. People with experience in the field (as per your comment in the other thread :cry:) know that there are various other things that will impact what a gpu can do/deliver before vram becomes the "main" issue..... Hence why you are seeing a few games now where no gpu has enough grunt, including the 3090, i.e. we are seeing the issue of grunt not being enough far more than "vram".
 
The 3090 and 6900XT will be no more than mid-range cards in 6 months, only good for 1440p max settings in the new AAA games optimised for the latest, more powerful generation of cards, just like how the 2080 Ti now can't run the latest games at 4K max settings.

IMO people worry too much about VRAM and miss the bigger picture.
 
I remember installing Shadow of War's 4K texture pack without issue while using a 6GB 980 Ti. Today, with AMD's sponsorship and lack of market share, we need 12GB to support 1/4-resolution spot effects :cry:

ALL features should be used in benchmarks, including DLSS and maxed out RT. CP2077 was a prime example where AMD should have been called out for disabling RT at launch, but all these Nvidia shilling sites decided not to?

I hate it when reviewers defend hardware vendors being tight.

Maybe it's the consumer being tight, as you always had the choice of a 24GB card for just a few pounds more.

I don't even understand what legitimate reason they could have for doing that. Could anyone explain to me why I should be happy to settle for less?

Said the same myself, as we are well into our 4th year of hardware-accelerated RT and AI-based supersampling. Why buy a new GPU that doesn't perform well at all with today's tech?
 
Perfect timing ^^^



Compromises; as enthusiasts paying through the nose for the best hardware they offer, we just love compromising. Don't we?

Oh, absolutely not, but compromises seem to be necessary wherever you go. Just a case of picking your poison really.

Just a shame that a discussion relating to SAM and reBAR is back to RAM comparisons and mud-slinging.

I mean, it is fun, but we already had a thread for that.

EDIT: Last point wasn't aimed at you, just the thread in general.
 
You understand this; however, there is a minority of folk who can't acknowledge it and think it will go away, or even worse, deny it's an issue. The more these reviewers dance around it instead of knocking a home run over it, the more accepted it becomes. However, we are now progressing from a few people on forums with no credibility to known reviewers, referenced by the community, mentioning it. Next thing we will be told it's a mid-range card! :cry:

Nvidia have proven time and time again that reviewers who don't follow their editorial line will get removed from the sampling list. If it were AMD it wouldn't really matter so much, as AMD would just be hurting themselves; but because Nvidia have massive reach, it would kill a channel and its business. So even if a large channel dependent on getting those review samples early, along with the rest, actually wanted to hold Nvidia to account, they couldn't.

On the consumer side the problem is brand religion: my religion is green and it can do no wrong.
 
@Nexus18 - Those computerbase charts are new, only just seen them. Small tip for you though: if you weren't in such strong denial about Far Cry 6 and the VRAM requirements, and the multiple users/tech sites documenting the mentioned issues, a lot of this VRAM talk would have died down a long time ago. :p

Regarding HUB, not sure why they are comparing the 6700 XT to the 3070. The 6700 XT is designed to compete against the 3060 Ti; that's why he recommends the 6700 XT, because it's cheaper.

If the 3070 was compared to the 6800 as it should be, that 19% performance difference would be reversed. It's not clear to me why HUB ran that particular comparison.
 
Nvidia have proven time and time again that reviewers who don't follow their editorial line will get removed from the sampling list. If it were AMD it wouldn't really matter so much, as AMD would just be hurting themselves; but because Nvidia have massive reach, it would kill a channel and its business. So even if a large channel dependent on getting those review samples early, along with the rest, actually wanted to hold Nvidia to account, they couldn't.

On the consumer side the problem is brand religion: my religion is green and it can do no wrong.

HW Unboxed conducts their CPU testing with SAM turned on for the 6900XT, yet this makes the AMD CPUs look better, as they perform slightly better with SAM than Intel's do, so it's swings and roundabouts tbh.
 
The 3090 and 6900XT will be no more than mid-range cards in 6 months, only good for 1440p max settings in the new AAA games optimised for the latest, more powerful generation of cards, just like how the 2080 Ti now can't run the latest games at 4K max settings.

Can't wait to see a 4060 destroying a 3090 in RT games and just about matching it in rasterization :cry:

I might throw in a cheeky offer of £350 and a half eaten sandwich for a 3090 and have it as a backup card or for when I want to play FC 6 at 4k or even 8k :cry: ;)

I remember installing Shadow of War's 4K texture pack without issue while using a 6GB 980 Ti. Today, with AMD's sponsorship and lack of market share, we need 12GB to support 1/4-resolution spot effects :cry:

ALL features should be used in benchmarks, including DLSS and maxed out RT. CP2077 was a prime example where AMD should have been called out for disabling RT at launch, but all these Nvidia shilling sites decided not to?

Maybe it's the consumer being tight, as you always had the choice of a 24GB card for just a few pounds more.

Said the same myself, as we are well into our 4th year of hardware-accelerated RT and AI-based supersampling. Why buy a new GPU that doesn't perform well at all with today's tech?

Indeed, it's hilarious.

I can't wait for RDNA 3 or perhaps RDNA 4 to match or beat nvidia in RT, will have to quote some old posts ;)

Oh, absolutely not, but compromises seem to be necessary wherever you go. Just a case of picking your poison really.

Just a shame that a discussion relating to SAM and reBAR is back to RAM comparisons and mud-slinging.

I mean, it is fun, but we already had a thread for that.

EDIT: Last point wasn't aimed at you, just the thread in general.

Always the way on this forum. *tinfoil hat time* Can't help but think it's because HUB have highlighted yet another issue with an amd feature, and people are trying to divert attention away from it? :p :cry:

But in all seriousness matt, this please:

Maybe Matt can get someone in amd to do an in-depth look at SAM/rebar and what's really required to get the best from it, as I would love to know why some games see no difference. Is there something game developers need to do on their side?

@Nexus18 - Those computerbase charts are new, only just seen them. Small tip for you though: if you weren't in such strong denial about Far Cry 6 and the VRAM requirements, and the multiple users/tech sites documenting the mentioned issues, a lot of this VRAM talk would have died down a long time ago. :p

Regarding HUB, not sure why they are comparing the 6700 XT to the 3070. The 6700 XT is designed to compete against the 3060 Ti; that's why he recommends the 6700 XT, because it's cheaper.

If the 3070 was compared to the 6800 as it should be, that 19% performance difference would be reversed. It's not clear to me why HUB ran that particular comparison.

Fair enough if they're new, but explain why I don't see it on my system (unless enabling rebar and not using FSR), as shown in my videos? IIRC, PCGH and computerbase enable rebar? So that might be why; same with that HUB video, they had rebar enabled, which "could" explain the drop. Again, I "only" had the same experience when I "force" enabled rebar too....

EDIT: Also, there's only 2 users to my knowledge posting about it, and 2 sites stating that, at least regarding the 3080. When you have other sites that don't show the issue, and users who have said no issues too... I wouldn't say it was "widespread", the same way I wouldn't say the multiple users with 3090s on the ubi forums mentioning fps drops was "widespread".

Is the 6700xt not priced the same as the 3070 though, therefore they are competing against each other....

MSRP 6700xt = £420?
MSRP 3070 = £460?

Bearing in mind location and the MBA/FE situation, i.e. in the UK there's no chance of getting a 6700xt at that price, at least not atm.....
 
@Nexus18 - Those computerbase charts are new, only just seen them. Small tip for you though: if you weren't in such strong denial about Far Cry 6 and the VRAM requirements, and the multiple users/tech sites documenting the mentioned issues, a lot of this VRAM talk would have died down a long time ago. :p

Regarding HUB, not sure why they are comparing the 6700 XT to the 3070. The 6700 XT is designed to compete against the 3060 Ti; that's why he recommends the 6700 XT, because it's cheaper.

If the 3070 was compared to the 6800 as it should be, that 19% performance difference would be reversed. It's not clear to me why HUB ran that particular comparison.

I hear AMD want to discontinue the RX 6800 and replace it with an RX 6750XT with higher clocked memory.

The suggestion being AMD never really wanted to have a third-level cut-down 6900XT, as yields are very good and they are cutting down healthy dies that could have been 6800XTs or even 6900XTs; the 6700XT, even with faster VRAM, is much cheaper to make and will bring higher margins.

If true, that's a bit crappy; even with faster VRAM the 6700XT isn't anything like as good as the 6800.
 
Just a shame that a discussion relating to SAM and reBAR is back to RAM comparisons and mud-slinging.

I mean, it is fun, but we already had a thread for that.

Basically this Bill:

@Nexus18 - Those computerbase charts are new, only just seen them. Small tip for you though: if you weren't in such strong denial about Far Cry 6 and the VRAM requirements, and the multiple users/tech sites documenting the mentioned issues, a lot of this VRAM talk would have died down a long time ago. :p

Rather than just taking a valid observation and knowing this is indeed a thing to consider (reference the many forum posters who said they didn't buy a 3070/80 card due to worrying about its longevity), we had to wait for tech experts (but not really experts, as they have as much knowledge as most of us and are regular folk that need no worshipping).

I appreciate people trying to be neutral though. :)
 