
Does AMD need more VRAM to do the same thing?

  • Thread starter: TNA
Honestly, Rowntree's don't hold a candle to Calippos

Exactly. Calippos are so much better. Tried Rowntree's once. Cheap, nasty stuff relatively speaking, imo. Love me an Orange Calippo. But I have cut down on these things a lot.

As for the GPU stuff, I agree with what you said in the top post.


Who the **** do I have to fight to prove that Twisters are da best!!??? SHOW MEEEEEEEEEEEEEE!!!!!!!!!!!!!!!!!!!!

:p

Ooh. Those are very nice too. Tiny things though. Need minimum two. Maybe even the whole box of 6 :cry:
 
IDK, but NV has had years of skimping on VRAM, so they must have learned how to pack suitcases better.

Pretty much what I was trying to say earlier, just much more elegantly put!

For some reason AMD pared down the Infinity Cache size on Navi 32 and Navi 31. Plus Navi 33 still has the tiny Infinity Cache amount Navi 23 had. No doubt Nvidia has hogged most of the GDDR6X supply, but I do wonder whether having to use more MCD chiplets actually makes sense over a larger die size.

I know they want to cut down on die sizes, but sometimes I really think they are going too far with this. If you have to use more MCD chiplets, more memory chips, a bigger PCB, etc., it all adds to more complexity, power and cost.

@KompuKare sometimes talks about this too.
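
Just to put rough numbers on that trade-off: everything below is made up purely for illustration (not real AMD/TSMC figures), but it shows how a cheaper GCD can still end up costing more at board level once you add MCDs, extra memory chips and a bigger PCB:

```python
# Made-up, purely illustrative numbers: a cheaper GCD can still end up costing
# more at the board level once you add MCDs, extra memory chips and a bigger PCB.

def board_cost(gcd_cost, mcd_cost, n_mcds, dram_chip_cost, n_dram_chips, pcb_cost):
    """Rough bill-of-materials for one hypothetical card configuration."""
    return gcd_cost + mcd_cost * n_mcds + dram_chip_cost * n_dram_chips + pcb_cost

# Hypothetical monolithic design: one bigger die, narrower bus, fewer DRAM chips.
monolithic = board_cost(gcd_cost=180, mcd_cost=0, n_mcds=0,
                        dram_chip_cost=12, n_dram_chips=8, pcb_cost=40)

# Hypothetical chiplet design: cheaper GCD, but six MCDs, more DRAM, bigger PCB.
chiplet = board_cost(gcd_cost=120, mcd_cost=15, n_mcds=6,
                     dram_chip_cost=12, n_dram_chips=12, pcb_cost=55)

print(f"monolithic: ${monolithic}  vs  chiplet: ${chiplet}")
```

Obviously the real sums hinge on yields, packaging and wafer pricing that nobody outside AMD actually has.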

The bean counters have taken over!

I know there was a rumour of a larger chiplet GPU above Navi 31, but currently going chiplets seems to have been mostly a waste. Well, unless the aim was to reduce costs so that pitiful quantities remain somewhat viable?

A lot of trouble to go through to yet again snatch defeat from the jaws of victory?
 
You can see why this guy is a high school maths teacher and not a science teacher, as he hasn't got a clue how to test something properly.

Looking forward to seeing your testing to debunk him! :D



Just me, or is it coming across like poor Daniel is no longer a reputable source like back in the day? :p ;)
 
Why would I need to debunk him? If you can't see that his testing methodology isn't at all scientific, then the data gathered isn't useful.
If a job is worth doing, it is worth doing properly, no?

Did you watch the video? What exactly was wrong with the testing? And what should he have done differently?

It was pretty in-depth and well tested all round, and it certainly beats the one-liners we get on here with no data to counter arguments. It also gives a much better insight into what is actually happening when VRAM runs out; without this video, I wouldn't have realised that Nvidia handles VRAM bottlenecks somewhat more gracefully than AMD in certain games (not that that is a saving grace where 8GB is clearly not enough regardless).

As he said himself, he's not an engineer, so he can't explain what these differences are or why they exist, but his video demonstrated that AMD and Nvidia do in fact have different ways of handling VRAM. That, and the video only strengthens the topic, as TPU, PCGH and others have highlighted it before but not really done any further digging.
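
If anyone fancies poking at this themselves, here's a minimal sketch of logging VRAM usage over time while a game runs, by polling nvidia-smi (assumes a single GPU and that nvidia-smi is on your PATH). It only shows allocation pressure on the card, not how the driver spills textures to system RAM, so treat it as a starting point rather than a replica of his testing:

```python
# Minimal sketch: log VRAM usage once a second while a game runs, by polling
# nvidia-smi. Assumes a single GPU. This only shows allocation pressure on the
# card; it can't show how the driver spills/streams to system RAM.
import subprocess
import time

def vram_usage_mib():
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = (int(x) for x in out.strip().split(","))
    return used, total

if __name__ == "__main__":
    while True:
        used, total = vram_usage_mib()
        print(f"{time.strftime('%H:%M:%S')}  {used}/{total} MiB")
        time.sleep(1.0)
```

Pairing a log like that with a frame-time capture is roughly where you'd start, to see whether usage hitting the card's limit lines up with the stutter.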
 
I wouldn't have realised that Nvidia handles VRAM bottlenecks somewhat more gracefully than AMD in certain games (not that that is a saving grace where 8GB is clearly not enough regardless)
The only problem I have with that specific comment is that the NV cards I've owned (3070/3080), which should have had higher VRAM specs, get gracefully rat-faced drunk when it counts.
 
Doesn't Nvidia use some sort of VRAM compression?
Something called delta compression in general, and I think a newer type of compression as well.

What people here seem to be missing is that compression leads to degradation in image quality.

I've always been vocal about this and pointed to it as the main reason why Nvidia's image quality looked a bit softer than AMD's when you swap out GPUs.
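
For anyone wondering what delta compression even means in this context, here's a toy sketch of the general idea (store one base value, then only the differences between neighbouring pixels). It's just the textbook concept and says nothing about how Nvidia's hardware actually implements it or what it does to image quality:

```python
# Toy illustration of delta encoding: store the first pixel, then only the
# differences between neighbouring pixels. Small deltas pack into fewer bits
# than full values. This is the textbook concept, not Nvidia's actual scheme.
def delta_encode(row):
    return [row[0]] + [row[i] - row[i - 1] for i in range(1, len(row))]

def delta_decode(deltas):
    row = [deltas[0]]
    for d in deltas[1:]:
        row.append(row[-1] + d)
    return row

pixels = [120, 121, 121, 123, 122, 122, 125, 130]
encoded = delta_encode(pixels)
print(encoded)                          # [120, 1, 0, 2, -1, 0, 3, 5]
assert delta_decode(encoded) == pixels  # this toy form round-trips exactly
```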
 
Something called delta compression in general, and I think a newer type of compression as well.

What people here seem to be missing is that compression leads to degradation in image quality.

I've always been vocal about this and pointed to it as the main reason why Nvidia's image quality looked a bit softer than AMD's when you swap out GPUs.

AMD have sharpening turned on by default in their control panel iirc.
 
It's not; you have to turn it on. But AMD, and actually Intel, look sharper in general than Nvidia even without that.

I own cards from all 3 vendors.

Can't say I noticed any difference when I switched from AMD to Nvidia on either my 4K OLED or the 34" 3440x1440 IPS screen I had at the time.

Not seen/used intel so can't comment on that.
 