AMD fine wine

TLOU looks one of the best visually to me, as does Red Dead 2, and it runs great on lower-end hardware, but I haven't looked at much of Remnant 2 beyond that YouTube video.
Have you gone back to replay TLOU yet? I need a new SSD lol, but I'd need to remove the GPU and vertical mount to get at it :D
 
People never learn that a narrow bus will always come back to bite users eventually... it's just a matter of time...

If people decide to support Nvidia by paying 4070 Ti money for a 4060 Ti-spec card, then that's on them... :rolleyes:
The narrow bus starves cards like the 4070 Ti and 4060 Ti of bandwidth, effectively building in obsolescence.
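
To put rough numbers on that: peak VRAM bandwidth is just bus width times memory data rate. A quick back-of-the-envelope sketch (figures from the public spec sheets; Ada's much larger L2 cache offsets some of this, so treat it as raw bandwidth only):

```python
# Peak VRAM bandwidth (GB/s) = bus width (bits) x data rate (Gbps) / 8
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

print(f"4060 Ti (128-bit, 18 Gbps GDDR6):  {bandwidth_gbs(128, 18):.0f} GB/s")  # 288
print(f"4070 Ti (192-bit, 21 Gbps GDDR6X): {bandwidth_gbs(192, 21):.0f} GB/s")  # 504
print(f"3060 Ti (256-bit, 14 Gbps GDDR6):  {bandwidth_gbs(256, 14):.0f} GB/s")  # 448
```

Note the 4060 Ti has less raw bandwidth than the 3060 Ti it replaces.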
 
That's precisely what Nvidia are doing.
Yes, but this goes for both Nvidia and AMD; they both have 128-bit cards. The difference is AMD's only go up to £250 while Nvidia's go up to £500. In that sense a 128-bit card is not so bad at £250, but at £500? Yeah, that's pretty bad...

And pathetic amounts of VRAM don't just cause stutter; they also drag down even your consistent frame rates.

I dislike HUB's behaviour at the moment, but they do make good content from time to time. It should also be said that HUB were the ones who drew attention to the problem of not enough VRAM, and they continue to do so. Credit where credit is due.

Most recently they compared the 8GB 4060 Ti to the 16GB one, so the only difference here is the VRAM, and it's not looking good for the "8GB is enough" apologists.

101 vs 59 FPS in this screenshot, which works out to the 16GB card being roughly 70% faster, and it's not just one worst-case image; watch the video, it's quite eye-opening.

I've made this exact argument for years, even put up my own video here as proof and explained exactly what's going on, and I was dismissed as an Nvidia hater; what do I know... I'm not going to proclaim myself an expert, but I have been working with 3D game engines for a decade... I do know a thing or two.
I don't hate Nvidia per se; I hate having the pee taken out of me by companies like Nvidia, as we all should.

So here it is, people: are we listening now? 8GB cards are utter, utter junk. It's like putting a lawnmower engine in a Lotus and then excusing it because it's a lightweight car; well yes, but a lawnmower is even lighter!

Please, please stop defaulting to Nvidia.

[screenshot: ttssE4J.png]
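
If anyone wants to check this on their own card rather than take a screenshot's word for it, here's a minimal sketch that polls reported VRAM usage while you play. It assumes an Nvidia card, a single GPU, and nvidia-smi on the PATH (these query flags are standard nvidia-smi options); note it reports what's allocated, which is not quite the same as what the game strictly needs:

```python
# Poll reported VRAM usage every couple of seconds (Ctrl+C to stop).
import subprocess
import time

def vram_mib() -> tuple[int, int]:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # one line per GPU; single-GPU system assumed here
    used, total = (int(v) for v in out.strip().splitlines()[0].split(", "))
    return used, total

while True:
    used, total = vram_mib()
    print(f"VRAM: {used}/{total} MiB ({100 * used / total:.0f}%)")
    time.sleep(2)
```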


 
:confused: That never happened to me, not even once, when I put up loads of my 3080 VRAM findings! :eek::p:cry:
 

:D

Yeah, I know... :) HUB have been banging on about this for a while, and all credit to them, but for the first time I think they have actually shown its effects in action. They didn't need to wait for the same GPU with different amounts of VRAM to prove it, though I do understand why they did.

This is quite an old game, from 2018, so it's about as old as my 2070S, which at the time was an upper-range GPU. Performance should be pretty good, and it is: no problem running 120Hz at 1440p, maximum settings.

Until I get to the scrapyard, where it drops to about 30. It doesn't stutter; it's still smooth, though laggy.
It's caused by VRAM overspill: the game can't load the scrapyard into VRAM because it's already full, so it spills to the next level available, system RAM. With that, my 2070S effectively becomes an APU; no joke, worse in fact, because the latency between the dGPU and system RAM is much greater than from the CPU socket to system RAM.

If you didn't understand what was going on here, you'd think it was a problem with the game. It's why people keep banging on that this or that game isn't optimised, all while running Nvidia GPUs.
Actually, in this case the game is doing remarkably well to keep things smooth despite the sudden, massive change in latency.
I wonder how many people with 3070s / 3080s have had this sort of thing happen and thought nothing of it beyond "this game needs optimising" or just plain "WTF, oh well..."

At about 3 minutes in, I reduce the graphics settings to about half; VRAM use drops to about 5GB and it's fine again.
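
To show why the drop is a cliff rather than a gradual slide, here's a toy model, nothing more. The numbers are illustrative for a 2070S-class card (~448 GB/s local VRAM, ~16 GB/s over PCIe 3.0 x16), and real engines hide some of the penalty with caching and streaming:

```python
# Toy model of VRAM overspill -- illustrative only, not a profiler.
VRAM_BW_GBS = 448.0  # 2070S-class: 256-bit GDDR6 @ 14 Gbps
PCIE_BW_GBS = 16.0   # roughly PCIe 3.0 x16

def effective_bandwidth(working_set_gb: float, vram_gb: float = 8.0) -> float:
    """Time-weighted (harmonic) mix: the slow PCIe accesses dominate."""
    if working_set_gb <= vram_gb:
        return VRAM_BW_GBS
    resident = vram_gb / working_set_gb   # fraction served from VRAM
    spilled = 1.0 - resident              # fraction fetched over PCIe
    return 1.0 / (resident / VRAM_BW_GBS + spilled / PCIE_BW_GBS)

for ws in (6.0, 8.0, 9.0, 10.0):
    print(f"working set {ws:4.1f} GB -> ~{effective_bandwidth(ws):3.0f} GB/s effective")
```

Spill just 1GB past an 8GB card and effective bandwidth falls from 448 to roughly 112 GB/s; spill 2GB and it's about 70. That's the "suddenly my dGPU is an APU" effect.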

 
@humbug, waiting to see some of the 40 series age like a brick in water. :(

The higher VRAM allocation AMD usually has over NV is where some of the Chianti's coming from. :D

HUB have been dishing it out to Nvidia for ages, but when they dish it out to AMD, people question their behaviour. Yawn.
You're making it sound like not a single NV user questions their behaviour when NV gets it in the neck. :p

Have noticed that in their Q&A discussing Starfield and AMD, they have started saying AMD is "allegedly" blocking DLSS, which was missing from the original discussions.
 
The problem is most people won't even notice. This is one of the things about mindshare: if a game behaves badly on an Nvidia GPU, they blame the game; if it behaves badly on an AMD GPU, they blame the GPU.

There is still some hangover from the Intel mindshare era too; AMD have not completely killed that yet. These P-core / E-core CPUs still show some odd behaviour for some people: a mixture of slow and fast cores and a mixture of separate cache levels, just like AMD's CPUs where one chiplet has the 3D cache and the other doesn't, which also still causes some odd behaviour. The difference is that in AMD's case people decide it's the CPU, which it is; in Intel's case, thanks to the mindset, it's "Windows or something else"... never the CPU.

OK, so should AMD expect software devs to work around AMD's uneven architecture? No, AMD should put the 3D cache on both CCDs.
Same with Intel's mixed-up CPUs; Intel are doing it because it's a way to get high R23 scores, and not much else matters to them.
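
For what it's worth, on Linux you can work around the uneven-CCD problem per-process yourself; a minimal sketch, assuming cores 0-7 are the V-Cache CCD on a 7950X3D (check lscpu on your own system, and AMD's chipset driver attempts something similar automatically on Windows):

```python
# Linux-only sketch: pin this process to one CCD so the scheduler can't
# bounce a game between the 3D V-Cache die and the plain die.
import os

# ASSUMPTION: cores 0-7 live on the V-Cache CCD; verify with lscpu/lstopo.
VCACHE_CCD = set(range(8))

os.sched_setaffinity(0, VCACHE_CCD)  # pid 0 = the calling process
print("pinned to cores:", sorted(os.sched_getaffinity(0)))
```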

I would never buy a 7950X3D, and I would never buy a 7900X, because the 7900X only has 6 cores on each CCD, so a heavily threaded game needs to use both CCDs.
I did buy a 5800X, a single 8-core CCD, and it's a fantastic CPU. I'd love a 7800X3D.

I would never buy any of these new Intel CPUs. In the same way that none of AMD's GPUs appealed to me in the last decade, Nvidia's Ampere GPUs were all just crap too.

I like the RDNA2 GPUs, and I'm waiting to see the rest of RDNA3. The 40 series... all overpriced crap.
 