10GB vram enough for the 3080? Discuss..

Anyone who can, should enable ReBar!

Not always, it depends entirely on one's setup and what game they are playing. The only setup where I would recommend setting it on and forgetting about it would be a Ryzen 5xxx and RDNA 2 setup. I find it helps on the whole, but there are scenarios where performance was slightly reduced and/or I encountered severe issues, such as in FC 6, from my "experience" anyway.
 
Not the right thread for this.... "Basically the same thing"? Had a quick look, they look pretty different to me:



Worth upgrading from a 2070S to a 3070 for the price? That's a different question; imo, not really.

Bhawahahahahaha!!!

Basically the same thing, he says... The gap is bloody bigger than the gap between a 3080 and a 3090, for god's sake. What utter rubbish :D

Here is a screen grab:

[screenshot: Yep-No-difference.png]


Yep, no difference indeed. Hahahahahaha!


+25%? It's a 2070 with mustard up its bottom. What did you have before, and how much did you pay for your over-excited 2070? @TNA

Erm, I paid £460 or thereabouts as I recall. It was essentially free after my 3080 sale, where I made £1000+ profit from selling it to CEX :D

Too funny dude, clutching at straws or what :D


But weren't people holding TPU/wizard comments in high regard because he said something about vram issues? Now all of a sudden it's "clearly never played the game". Goalposts right there, ladies and gentlemen :cry: Just like the Fury X comments of "4GB being enough for 4K" :D



And yet you come to the conclusion "it's basically the same". Do you not see the flaw here? Also, depending on resolution and game, it is more than 25% at times....

Like I said, coming from a 2070S for a potential outlay of £450+? Is it worth it? Nope. Had you sold on your 2070S to bring the 3070 down to, say, <£200, then depending on one's needs/wants, it "might" be worth it.



And if you read through the thread, you will see it was debunked. In fact HU did 2 videos as they got called out for their results showing Nvidia being better than AMD; turns out the testing scene they used had more action/enemies, whereas other sites tested areas with nothing happening :cry:

Exactly. And the usual lot, even without realising it, show their bias by agreeing with such tripe.
 
That's one example; here is another, 14%. I can get that with an overclock.

[screenshot: ItA0Ey1.png]
 
I didn't say there wasn't a difference, I said it's a 2070 with mustard up its bottom. It's like a 2070 but a bit faster, like an overclocked 2070.

You literally said it's basically the same thing....

What you are saying makes no sense.

Here is a somewhat recent review of a 3070; I had a look, and at 1440p it is around 33% better on average. But yeah, basically the same thing :cry:
 
Oh, 33% extra performance does not make a difference to humbug, guys, so that means it's basically the same thing. You couldn't make this up. Clutching at straws, as I said :cry:
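
For what it's worth, part of why different gaps get quoted in this thread is the choice of baseline: the same pair of fps numbers reads as a bigger percentage one way round than the other. A minimal sketch, using made-up fps figures:

```python
# Why quoted gaps differ: the percentage depends on which card is the
# baseline. These fps figures are made-up numbers for illustration.
fps_3070 = 100.0
fps_2070s = 75.0

uplift = (fps_3070 / fps_2070s - 1) * 100    # 3070 measured against the 2070S
deficit = (1 - fps_2070s / fps_3070) * 100   # 2070S measured against the 3070

print(f"3070 vs 2070S: +{uplift:.0f}%")   # +33%
print(f"2070S vs 3070: -{deficit:.0f}%")  # -25%
```

So "+25%" and "+33%" can both describe the same two cards, before game and resolution choice even enter into it.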
 
I play Halo Infinite on a 3840x1080 screen (basically a little more than 1440p) and noticed how it shows 11GB "usage" at ultra settings on my 6800XT. When I saw that I got a chuckle, because I'm sure AMD had some influence there. Nvidia did it to AMD with Crysis and unnecessary tessellation, and now it seems AMD is returning the favor with the titles they are involved with and vram.

That's another thing about vram. It seems to be rather easy to blow out *any size* buffer if a developer chooses to do so.
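
To put rough numbers on how quickly a buffer fills, here is a sketch assuming uncompressed RGBA8 textures with a full mip chain; the figures are illustrative, not taken from any particular game:

```python
# Rough arithmetic for how fast high-res textures fill a VRAM buffer.
# Assumes uncompressed RGBA8 (4 bytes per texel); a full mip chain adds
# roughly a third on top of the base level. Illustrative figures only.

def texture_bytes(width, height, bytes_per_texel=4, mip_overhead=4/3):
    return width * height * bytes_per_texel * mip_overhead

one_4k = texture_bytes(4096, 4096)   # ~85 MiB per texture
budget = 10 * 1024**3                # a 10GB card

print(f"one 4K texture: {one_4k / 1024**2:.0f} MiB")
print(f"4K textures that fit in 10GB: {budget / one_4k:.0f}")  # 120
```

Ship a few hundred of those uncompressed and any 10GB buffer is gone, which is exactly the kind of lever a developer (or sponsor) can pull.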

Yeah, at least you can notice the company shenanigans, which is a good thing. I had this downloaded quite a while back, but now I've completed Odyssey I may give it a whirl.
 
That's not how it works; it either needs those assets in the buffer or it doesn't.

Memory on GPUs is a hierarchy, just as it is on CPUs.

It starts with the fastest memory of all, your L1 cache. If the data doesn't fit in there, it's ejected to L2; if it doesn't fit in there, it gets ejected to L3 cache. RDNA2 GPUs have this, they call it Infinity Cache, a level 3 cache on the GPU. In the case of your Ampere GPUs, the next level down is your GDDR6 memory, and if it doesn't fit in there, it gets ejected to your system RAM, which is very much slower than your GDDR6 buffer. So lower performance, and it can even cause frame stalls. Stutters, on your £700 GPU.
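
As a toy model of that ejection cascade; all capacities and relative latencies below are made-up assumptions for illustration, not measured figures:

```python
# Toy model of the hierarchy described above: a working set is served
# from the first level it fits in, and anything past VRAM spills to
# system RAM over PCIe, i.e. stutter territory. Capacities and relative
# latencies are made-up assumptions for illustration.

LEVELS = [
    ("L1 cache", 128 * 1024, 1),
    ("L2 cache", 4 * 1024**2, 5),
    ("L3 / Infinity Cache", 128 * 1024**2, 15),  # RDNA2 only
    ("GDDR6 VRAM", 10 * 1024**3, 100),           # e.g. a 10GB 3080
    ("System RAM over PCIe", 64 * 1024**3, 1000),
]

def serving_tier(working_set_bytes):
    for name, capacity, relative_latency in LEVELS:
        if working_set_bytes <= capacity:
            return name, relative_latency
    return "paging to disk", 100_000

for gb in (8, 10, 12):
    tier, latency = serving_tier(gb * 1024**3)
    print(f"{gb}GB working set -> {tier} (~{latency}x L1 latency)")
```

(A real GPU moves data per cache line rather than whole working sets, but the cliff once you exceed the VRAM level is the point.)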

Rumour is Nvidia are using this in their Ada lineup... thanks AMD! :cry:
 
What, hardware thread scheduling?

This I think.

I'd forgotten about Halo Infinite.
...
I'd be curious to see how it runs the level shown in this video on your 3080, Twinz, with an overlay showing your frame times at 4K max settings, with the HD Texture pack installed.

Well any 'neutral' 3080 owners would be great. ;)

...having done some research into the game and user feedback from actual owners on external forums.

The approved list endorsed by debunker Matthew? Or just user forum spam obfuscation like he uses selectively?
 

Oh that, yeah, it's not as simple as just increasing the size of the cache though, not without other tweaks and optimisations. Simply growing the cache makes it physically larger, which increases latency, and with that you could lose whatever you gain from having more.

It's no good looking at the 5800X3D and concluding "oh, bigger cache = more performance, well that is simple enough". It isn't. AMD have spent the last few years working on and perfecting not just large-cache latency but also off-die latency, to the point now where you can have a chunk of cache on a completely different CPU die and it not matter one bit.

I'm sure Nvidia will figure it out, they are far more talented than, let's say, Intel, but I would rather they work on their thread scheduling; that is a far more pressing problem, IMO.
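
The latency trade-off in the post above is easy to put numbers on: a bigger cache hits more often, but if every hit also gets slower, the average access time can come out worse. A minimal sketch with made-up hit rates and latencies:

```python
# A bigger cache raises the hit rate, but a physically larger cache is
# slower to hit, so the average access time can end up worse. Hit rates
# and latencies here are made-up assumptions for illustration.

def avg_access_ns(hit_rate, hit_latency_ns, miss_penalty_ns=80.0):
    return hit_rate * hit_latency_ns + (1 - hit_rate) * miss_penalty_ns

small = avg_access_ns(hit_rate=0.90, hit_latency_ns=10.0)  # small, fast cache
big = avg_access_ns(hit_rate=0.95, hit_latency_ns=14.0)    # bigger, slower hit

print(f"small cache: {small:.1f} ns average")  # 17.0 ns
print(f"big cache:   {big:.1f} ns average")    # 17.3 ns -> the bigger one lost
```

Which is why the work AMD put into keeping large and off-die cache latency down matters more than the raw capacity.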
 