
Radeon Resizable Bar Benchmark, AMD & Intel Platform Performance

Not that you'll see this since:

But I think a few people are starting to pick up that you're coming across as very bitter in your posts in recent threads (more so than usual), and as a result your posts are starting to come across as even sillier, adding even less to what have at long last been some good discussions.

I know it's a bitter pill to swallow, but move on if you can't add anything worthwhile to a discussion without coming across like you have a grudge. I imagine a couple of people will start reporting posts if it keeps up.
Whilst I agree with the general sentiment of your post, you are absolutely guilty of these things too. Until you acknowledge that and change your ways, I'm not sure much will change, as you love to gloat and goad people.

This is not a one-way street; you even have a forum signature dedicated to it, dismissing opinions that differ from your own as BS.

My advice to you would be to take a step back and think about how you and your behaviour towards others contribute to the many negative interactions you are involved in here. Something I'm sure we should all be doing from time to time, myself included.

I'm sure you wouldn't end up on ignore lists and spend 95% of your time here arguing about VRAM usage requirements if you did that.
Oh, I can indeed answer that honestly, which many people on these forums sadly do not seem able to.
From what you are saying then (and kudos, it's actually the truth), some games are much better programmed than others, and GPU vendors deliberately work with manufacturers to gimp games on the competition. Finally we actually get to the truth, lmao.

PS: the fuss on the other thread about the site deliberately disabling Resizable BAR in testing was great. How short memories are, eh? Having a go at the site for not including all the results in one area, when just a few months back a certain benchmark thread had its RT disabled, with the excuse that it would be added when there was a need for it.

Pathetic really from the usual trio.
Some games are better programmed than others; that's nothing unusual in my experience.

Your question was a prod at Humbug, wasn't it? I was highlighting that by taking your question and making it relevant to the GPU (a 2070 8GB) Humbug was using to test Far Cry 6.

I disagree with this statement: 'GPU vendors deliberately work with manufacturers to gimp games on the competition'.

If you had phrased it more like this, I might be more inclined to agree with you: 'GPU vendors work with game developers to optimise performance for their hardware'.

Regarding SAM, no fuss, just expressing my opinion. I'm guessing you don't agree with it, and that's okay; I've got no problem with you, metalmackey.

My opinion is simply that when testing graphics card performance, I want to see it tested on an optimised system, to showcase the maximum performance available from the graphics products being tested.

To do that, and to reduce the possibility of any CPU-limited scenarios appearing, you need to enable SAM. If you don't, you're just handicapping performance, which is not what I want to see in these types of videos. SAM is purely a potential performance benefit; it does not affect image quality, so it does not skew the results.

I personally think that's a fair and reasonable argument, but of course others will disagree for various reasons and that's fine too.
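As an aside, for anyone who wants to sanity-check whether Resizable BAR/SAM is actually active on their own system, here's a rough sketch. It assumes an NVIDIA card with nvidia-smi on the PATH and a single GPU, and uses the BAR1 aperture size as a heuristic: a small fixed window (typically 256 MiB) means ReBAR is off, while an aperture covering roughly all of VRAM means it's on. AMD users would check the BAR sizes in lspci -vv output instead.

```python
# Rough heuristic for Resizable BAR status on an NVIDIA card: with ReBAR
# on, the BAR1 aperture is sized to cover (roughly) all of VRAM; with it
# off, BAR1 is a small fixed window (typically 256 MiB).
# Assumes nvidia-smi is on PATH and a single GPU.
import re
import subprocess

def section_total_mib(section: str, text: str) -> int:
    """Return the first 'Total : N MiB' value after the named section header."""
    block = text.split(section, 1)[1]
    return int(re.search(r"Total\s*:\s*(\d+)\s*MiB", block).group(1))

out = subprocess.run(["nvidia-smi", "-q", "-d", "MEMORY"],
                     capture_output=True, text=True, check=True).stdout

vram_mib = section_total_mib("FB Memory Usage", out)
bar1_mib = section_total_mib("BAR1 Memory Usage", out)

print(f"VRAM: {vram_mib} MiB, BAR1 aperture: {bar1_mib} MiB")
print("Resizable BAR looks", "ENABLED" if bar1_mib >= 0.9 * vram_mib else "DISABLED")
```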
 
Oh come on now, this is just getting silly. Firstly, do you really think that Nvidia do not tell the developers of games they are sponsoring to dial the RT up to 11, fully knowing that it will cripple AMD cards, and vice versa with AMD making sure games use way more VRAM than is actually needed, knowing some Nvidia cards will suffer? You cannot be that naive. Secondly, you say you want games tested at maximum performance, but then when making benchmark threads you deliberately chose to disable certain options that would make the company you work for look bad, saying you would add them if needed. You cannot have it both ways, Matt, and it's now just coming across as a bit sad. As I told you in an email, I have both AMD and Nvidia in my system, and if I had actually had another option at the time it was built, it may even have ended up as a full AMD system. I have no loyalties to any brand, and no issue with any either, and I wish some on this forum would just be like that.
 
Whilst I agree with the general sentiment of your post, you are absolutely guilty of these things too. Until you acknowledge that and change your ways, I'm not sure much will change, as you love to gloat and goad people.

This is not a one-way street; you even have a forum signature dedicated to it, dismissing opinions that differ from your own as BS.

My advice to you would be to take a step back and think about how you and your behaviour towards others contribute to the many negative interactions you are involved in here. Something I'm sure we should all be doing from time to time, myself included.

I'm sure you wouldn't end up on ignore lists and spend 95% of your time here arguing about VRAM usage requirements if you did that.


Any of my posts containing what you could call "digs" are meant in a humour/jest kind of way, hence the use of emojis and tone. Gpuerrilla was the same at the start, but over the last couple of weeks his posting style has taken a turn and now just comes across as baiting whilst adding literally nothing to the threads/discussion, which then sends the thread further off with kid-like responses (all good fun, mind, but it does get tiring when it's constant and nothing else of use is being added, just a broken record...). My sig is also done in a humour/jest way, hence the use of Saul and his calling card (a comedy-like role).

And that's another point: you seem to think I am dismissing other people's opinions, yet I haven't... I have simply stated "possible" reasons/causes/questions, i.e. again refer back to this:

Except that's the thing... I and others have "noted" it and offered our input as to "potential" reasons why certain people are getting that issue; no one is saying it's not happening for "certain" systems/users...

It's a certain band of people who don't want to look at the bigger picture or "acknowledge" some other "facts", i.e.:

Why are some 3090 users experiencing fps drops?
Why does it seem to be a widespread problem across a whole range of Nvidia GPUs and not AMD?
Why did the likes of tommy experience this issue even at 1440p, using FSR, with NO HD texture pack? (Although he recently stated that it no longer happens, he didn't answer my question as to whether that was from upgrading his CPU and/or adding 16GB more RAM...)
Why is my 3080 not showing the issue (unless I enable ReBAR and don't use FSR)?
Why have the developers stated they are looking into the issues if it's purely VRAM amount and nothing else (even when people are experiencing the issues without the HD texture pack)?
Why do the likes of HUB not note the same experience with the 3080 in their recent 3080 vs 6800 XT video? Perhaps because they didn't enable ReBAR in that testing?
Why does the FPS remain at single digits and not return to higher FPS like it does in every other game when there is a legit shortage of VRAM, i.e. as per my CP2077 video? Maybe a VRAM management issue? Hence why the developers might be looking into it...

These are the questions that no one can or will come back on after all this time... So it's not the clear-cut "VRAM and nothing else" case that a small minority are making it out to be when you look at the whole picture, hence the:

Rather than just taking a valid observation and knowing this is indeed a thing to consider

being very true, but it's one side who aren't doing that...

Not going to go into all the detail on the FC6/VRAM posts again... but until recently, only shank and humbug have actually provided some good info/opinions on the bits that I have asked answers for, which no one else has done despite being asked many times before. Still no one has quite been able to answer why I don't see the same issues as tommy and gerard (perhaps I would need to play FC6 for 3+ hours to run out of VRAM...), nor why HUB don't see the issues in their "recent" 3080 vs 6800 XT video even though they encountered it with the 3070... nor why TPU don't see the single-digit fps drop either...

If people would accept that there is a **** ton more to games and development than sheer specs, and that it's not just a simple case of "VRAM amount and nothing else", we would have saved a lot of posts...

Two other examples: the texture rendering/loading issue, where you and one or two others insisted it was not enough VRAM and couldn't be anything else, yet the developers said they found the issue, and it was fixed for tons of people on this forum and on the Ubi forums.

And the graphical corruption occurring in my video: you insisted it was because of running out of VRAM, yet Nvidia released a driver update that fixed it...

Again, see humbug's take on this, where he has provided some good insight into the behaviour in my and tommy's examples:

Just as you explain with tommy's 3080, it's unrecoverable. I don't know why that is; there could be all sorts of reasons for it, most of which I probably don't understand. Only the devs of the game know what is going on there, and maybe it's something they need to look into. My guess is that once the system RAM has been employed as a buffer, it can't revert back to its previous state without a game restart.

As well as shank's good insight/link, which also shows there is a lot more going on behind the scenes than sheer "VRAM amount".
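For anyone who wants to actually test the "VRAM and nothing else" theory rather than argue about it, here's a minimal sketch of the kind of logging I mean. It assumes an NVIDIA card with nvidia-smi on the PATH and logs only the first GPU; run it in the background while playing and line the timestamps up against where the fps drops happen:

```python
# Minimal VRAM logger: poll nvidia-smi once a second and append used/total
# VRAM to a CSV, so fps drops seen in-game can later be lined up against
# memory pressure. Assumes an NVIDIA card with nvidia-smi on PATH; only
# the first GPU is logged. Stop with Ctrl+C.
import csv
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

with open("vram_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "used_mib", "total_mib"])
    start = time.time()
    while True:
        first_gpu = subprocess.check_output(QUERY, text=True).strip().splitlines()[0]
        used, total = first_gpu.split(", ")
        writer.writerow([round(time.time() - start, 1), used, total])
        f.flush()  # keep the log intact even if the game (or script) dies
        time.sleep(1.0)
```

If the drops line up with used VRAM pinned at the total, the capacity argument holds; if they don't, something else (memory management, ReBAR interaction, etc.) is going on.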
 
Oh come on now, this is just getting silly. Firstly, do you really think that Nvidia do not tell the developers of games they are sponsoring to dial the RT up to 11, fully knowing that it will cripple AMD cards, and vice versa with AMD making sure games use way more VRAM than is actually needed, knowing some Nvidia cards will suffer? You cannot be that naive. Secondly, you say you want games tested at maximum performance, but then when making benchmark threads you deliberately chose to disable certain options that would make the company you work for look bad, saying you would add them if needed. You cannot have it both ways, Matt, and it's now just coming across as a bit sad. As I told you in an email, I have both AMD and Nvidia in my system, and if I had actually had another option at the time it was built, it may even have ended up as a full AMD system. I have no loyalties to any brand, and no issue with any either, and I wish some on this forum would just be like that.
No, I do not think any developer/vendor deliberately sets out to gimp performance on other hardware. However, I will admit to thinking that way years ago. My opinion has changed though, so sorry, but I just don't agree with you there. :)

Wanting to see a hardware reviewer test using an optimised system, to showcase maximum potential graphics card performance, does not seem unreasonable to me, so again, respectfully, I disagree with your comparison.
 

Disagree there.

Nvidia and tessellation, back with The Witcher 3 and Crysis 2 ;) :)
 
Maybe, but that was before my time and a long time ago now. I just don't believe the view that the main aim presently is to hurt a vendor's performance.
 
Jesus, I just give up. These companies are businesses; if Nvidia could put any competition out of business, do you seriously not think they would? No competition means more profits. No company really cares about the consumer; they just want your money. Nvidia are guilty of it and so are AMD, no matter what the defence force for either says. And to say that you want to see an optimised system running a game at maximum graphics is perfectly acceptable, and something I totally agree with, as that's why I lurk on this very forum, but it totally contradicts what you yourself tried to do with the GotG bench thread, and you got rightly called out for it. You cannot have your cake and eat it, bud.

Anyway, I'm done. I haven't got the patience to keep going over the same old stuff only to have it either blatantly ignored or met with a BS excuse (and that isn't aimed at you, Matt, just in general at a lot of forum members).
 
No hard feelings, Mackey.
 
My GPU has the performance to run FC6 at Ultra settings with RT at 1440p with the HD textures, 60+ FPS, no problem.

The limitation is VRAM capacity, and that limitation shouldn't exist: it has the memory architecture of a mid-range $300 card from 2015 on a $500 GPU from 2019, and the latest 3070 Ti still has this memory architecture.

AMD put 2GB memory ICs on a similarly priced GPU. In the last 5 years I have had 4 brand-new GPUs, 3 of them Nvidia, so this criticism is not about fanboying; it's because I want a card that, for $500, runs out of grunt before it runs out of memory capacity, especially when the remedy was as simple as putting 2GB memory ICs on it instead of 1GB, like AMD do. Is that so much to ask, when AMD's gross profit margin is 50%, Intel's 53% and Nvidia's 67%?
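To put numbers on that 1GB-vs-2GB IC point, here's a back-of-envelope sketch (ignoring clamshell layouts, which pair two chips per channel): each GDDR6 IC has a 32-bit interface, so a 256-bit bus carries eight chips, and total capacity follows directly from the IC density.

```python
# Back-of-envelope: each GDDR6 IC has a 32-bit interface, so a 256-bit
# bus hosts 8 chips, and total capacity then depends only on IC density.
# (Ignores clamshell mode, which pairs two chips per 32-bit channel.)
BUS_WIDTH_BITS = 256          # e.g. RTX 2070 / RX 6800 class cards
CHIP_INTERFACE_BITS = 32      # per GDDR6 IC
chips = BUS_WIDTH_BITS // CHIP_INTERFACE_BITS  # = 8 ICs

for density_gb in (1, 2):     # 1GB (8Gb) ICs vs 2GB (16Gb) ICs
    print(f"{density_gb}GB ICs x {chips} chips = {chips * density_gb}GB total VRAM")
```

Same bus, same board, double the density: 8GB becomes 16GB.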
 
@humbug, did you not read the posts/threads where, many a time, it was said that the GPU runs out of grunt, or as someone put it, horsepower, due to ray tracing before anything else...?

I said you won't see this issue running at 1440p.
 
Disagree there.

Nvidia and tessellation, back with The Witcher 3 and Crysis 2 ;) :)

Maybe, but that was before my time and a long time ago now. I just don't believe the view that the main aim presently is to hurt a vendor's performance.

I remember that well. It was horrible. My 5870s became a slideshow. AMD then put a slider in the drivers, I cannot remember what it was called, but it reduced the amount of needless tessellation to get that performance back.

Crazy to think Crysis 2 came out on 22 March 2011, when we were still waiting on the London Olympics.
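For what it's worth, that setting still exists today as the "Maximum Tessellation Level" override in the Radeon drivers. As a back-of-envelope illustration of why capping it recovered so much performance: triangles per tessellated patch grow roughly with the square of the tessellation factor, so dropping from x64 to x16 cuts the geometry work by roughly 16x.

```python
# Back-of-envelope only: for a tessellated patch, triangle count grows
# roughly with the square of the tessellation factor, so a driver cap
# from x64 down to x16 cuts geometry work by roughly a factor of 16.
for factor in (8, 16, 32, 64):
    tris = factor * factor  # approximate triangles per patch at this factor
    print(f"tess factor x{factor:>2}: ~{tris:>4} triangles per patch")
```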

 



In 2010, when Metro 2033 came out, that's when I knew my 1-year-old 5870 was now trash.
 
Jesus, I just give up. These companies are businesses; if Nvidia could put any competition out of business, do you seriously not think they would? No competition means more profits. No company really cares about the consumer; they just want your money. Nvidia are guilty of it and so are AMD, no matter what the defence force for either says. And to say that you want to see an optimised system running a game at maximum graphics is perfectly acceptable, and something I totally agree with, as that's why I lurk on this very forum, but it totally contradicts what you yourself tried to do with the GotG bench thread, and you got rightly called out for it. You cannot have your cake and eat it, bud.

Anyway, I'm done. I haven't got the patience to keep going over the same old stuff only to have it either blatantly ignored or met with a BS excuse (and that isn't aimed at you, Matt, just in general at a lot of forum members).

Quoted for truth. Well said, Mackey. The circular debate about VRAM is a new low for this forum.
 


It goes both ways. Just last month an ex-developer at Ubisoft was quoted as saying that he and his team at Ubisoft were required, on several occasions, to sabotage AMD-sponsored games to run worse on Nvidia, with the most recent example being Assassin's Creed Valhalla.

So no need to pretend Nvidia is the evil guy and AMD is Jesus the saint.
 
You know what, let's try something different: why don't you explain to me why my GPU can't run the HD texture pack?

Your GPU can run the HD texture pack, just not very well. I imagine all of us taking the time to discuss it know why. If you look back, you will also notice I have not once argued against why.

My views on this gen have not changed since launch. We have Nvidia on 8nm and leading with RT and AI, and we have AMD on 7nm competing in rasterisation. While everyone has seal-clapped AMD for a job well done with RDNA2, I have pointed out that if Nvidia had also gone with 7nm, AMD wouldn't even be a consideration for the most hardened fanboy.

Now we come to the current grasp for market share: AMD pushing two titles that require more VRAM than Nvidia's original 'flagship' shipped with. Is it really so hard to understand why AMD have done this? My point is that the visuals do not appear to be that much better, if at all, when compared to similar titles, while also being far behind in modern tech. Remember, 10GB (yes, I know you have 8GB, but that was never the discussion) is a 25% increase over the previous gen and also the magic number chosen not just by Nvidia but also by Microsoft and Sony.

BTW, yes, I know what textures are. You need to progress beyond the yellow book and understand that textures are not just for putting pretty pixels on screen, as demonstrated by their use within FSR2 given the correct hardware.
 
It goes both ways. Just last month an ex-developer at Ubisoft was quoted as saying that he and his team at Ubisoft were required, on several occasions, to sabotage AMD-sponsored games to run worse on Nvidia, with the most recent example being Assassin's Creed Valhalla.

So no need to pretend Nvidia is the evil guy and AMD is Jesus the saint.
Where is the source?
 
Jesus you ****ers realllllllly know how to bang on about some **** :D

Agreed, I thought it had all calmed down a bit and people had just got bored, but this is now the latest AMD vs Nvidia thread.

It's going to get even worse once Intel release their cards.
 