
NVIDIA 4000 Series

Ahh ok, my 4090 is currently wasted; I keep locking everything to 60 FPS so I can have a quiet system XD
Dude you really need a system tweak I think, 4k/120 here and my system is still v quiet at load.

Make sure the 90 is fed with enough cool air and enough going out to get rid of it and set a custom profile if needed and you'll be golden.
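(For anyone curious what a "custom profile" actually boils down to, here's a minimal Python sketch of a temperature-to-fan-duty curve of the kind the fan-control software applies for you. The curve points are made-up illustrative values, not a recommendation, and nothing here talks to real hardware.)

```python
# Minimal sketch of a custom GPU fan curve: temperature (degrees C) -> fan duty (%).
# The points are illustrative only, not recommended values; vendor tools
# (MSI Afterburner, Fan Control, etc.) apply the equivalent curve for you.
CURVE = [(30, 0), (50, 30), (65, 45), (75, 65), (84, 100)]

def fan_duty(temp_c):
    """Linearly interpolate the fan duty between the curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

for t in (40, 60, 70, 80):
    print(f"{t}C -> {fan_duty(t):.0f}% fan")
```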
 
Do you mean these parts of the image:

[image: oN8eyvo.jpg]

I took a closer look. I'd assumed these parts would be missing because it's a qualification sample, or had been deliberately obscured.

Er no, not just those areas. Those were clearly edited, which matches what you posted earlier about how anyone can believe a particular chip MUST belong to a particular card when that's not necessarily the case. Or, in this case, the image may not even be the real thing.

In this particular case there were various aspects that concerned me; I'll go through them:

1. The (R) after nVIDIA is wonky. It's not straight like everything else is, or like it is in other photos of Nvidia chips (where the (R) is straight). I can't imagine Nvidia would print/etch something where the (R) has a slant to it like the Team Rocket R while the rest of the NVIDIA text is normal.
2. The erased parts at the top of the die. It's clearly been edited, which basically means it's impossible to validate that it even IS the chip in question; anyone could have put anything on it (basically similar to your point in the post I replied to). The sharpness of the text could be attributed to the camera, but it seems odd that the main text is so crisp yet the (R) is so wrong.
3. What set me off most, however, was the lighting and shadows, especially the amount of light on the fibres dotted around the chip. Judging by the shadows, the primary light source is coming from the "top" of the image (not necessarily overhead, but that's the direction the light comes from). If we look at the green part of the chip near the bottom left of the die, we can clearly see shadows there cast by the die itself. However, the fibres on the silvery grey part at the bottom left of the die are clearly lit up as if in direct light, with no sign of any shadows, even though the shadows below the die on the green part are clearly being cast by a light source from above (the top of the image). If there were a second light source eliminating the shadow on the silvery grey bit at the bottom left of the die, it would also have eliminated the shadows on the green part right next to it. And if we look at the top of the image, where there is another set of shadows, the fibres there are clearly darker (in shadow), yet the ones on the silvery grey bit in the bottom left portion of the die are not overcast in the slightest. (A rough way to put numbers on this is sketched below.)

That is why the image looks weird to my eyes. It doesn't make sense given the lighting and shadows, which suggests it's been heavily modified. So it would be better to assume it may not even be what it claims to be at this point, since with all the editing we can't verify that anyway.
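(If anyone wants to sanity-check that shadow argument with numbers rather than eyeballs, here's a rough Pillow/NumPy sketch that compares the mean brightness of two hand-picked regions: one that should be in shadow and one that apparently isn't. The filename and the crop boxes are placeholders; you'd pick the real coordinates by eye from the actual image.)

```python
# Rough check of the shadow argument: compare the mean brightness of two regions
# that, under a single light source from the top of the frame, should both be in shadow.
# "chip.jpg" and the crop boxes are placeholders for the real image and coordinates.
import numpy as np
from PIL import Image

def region_brightness(img, box):
    """Mean luminance (0-255) of a (left, upper, right, lower) crop."""
    return float(np.asarray(img.convert("L").crop(box)).mean())

img = Image.open("chip.jpg")
shadowed = region_brightness(img, (100, 40, 180, 90))   # fibres near the top, clearly in shadow
suspect = region_brightness(img, (60, 300, 140, 350))   # bottom-left fibres that look fully lit

print(f"shadowed region: {shadowed:.1f}  suspect region: {suspect:.1f}")
if suspect - shadowed > 40:  # arbitrary threshold for "suspiciously brighter"
    print("Big brightness gap: consistent with local editing or a second light source.")
```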
 
That makes no sense, just set a custom fan profile.

It does if you know the use case.

Dude you really need a system tweak I think, 4k/120 here and my system is still v quiet at load.

Make sure the 90 is fed with enough cool air and enough going out to get rid of it and set a custom profile if needed and you'll be golden.

If I'm playing games like The Witcher 3, Hogwarts, Deus Ex, Jurassic Park, SWTOR, Anno etc... I keep all system fans at 400RPM, GPU is always on a 1:1 fan curve and FPS capped at 60 as I don't need 60+ in those specific games and it keeps the rig quite silent and power use down as a little bonus.

If I'm doing PvP in Destiny 2 or need faster reaction times in games like F1 or other racing/competitive games then I set all fans via iCUE to a saved "60+" profile that puts them all at 1200RPM and also uncap the FPS in the Nvidia control panel.
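(Purely for illustration, the two presets above could be written down like this in Python. The names and RPM values mirror the post; apply_profile is a hypothetical stub that only prints the settings, since in practice they're applied through iCUE and the NVIDIA Control Panel rather than from a script.)

```python
# The two presets described above, written down as data.
# apply_profile() is a hypothetical stub that only prints the settings;
# in reality they're set through iCUE and the NVIDIA Control Panel.
PROFILES = {
    "quiet_60fps": {"case_fans_rpm": 400, "gpu_fan_curve": "1:1", "fps_cap": 60},
    "60_plus": {"case_fans_rpm": 1200, "gpu_fan_curve": "1:1", "fps_cap": None},  # uncapped
}

def apply_profile(name):
    p = PROFILES[name]
    cap = p["fps_cap"] if p["fps_cap"] is not None else "uncapped"
    print(f"{name}: case fans {p['case_fans_rpm']} RPM, GPU curve {p['gpu_fan_curve']}, FPS cap {cap}")

apply_profile("quiet_60fps")   # single-player evening
apply_profile("60_plus")       # Destiny 2 PvP / racing games
```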
 
If I'm playing games like The Witcher 3, Hogwarts, Deus Ex, Jurassic Park, SWTOR, Anno etc... I keep all system fans at 400RPM, GPU is always on a 1:1 fan curve and FPS capped at 60 as I don't need 60+ in those specific games and it keeps the rig quite silent and power use down as a little bonus.
No, that's fair enough; everyone has different preferences for how quiet they like their system (or not!). I'm just glad there's no coil whine with this one, my TUF was horrendous!!!

I seem to have mine set well bar the LT720 AiO, real happy with it atm. Still set to standard (quiet) and could do with a bit of tuning to push more air through the top :)
 


Gigabyte lists GeForce RTX 4070 graphics cards with 10, 12 and 16GB memory

HUMMMM....
 
Gigabyte lists GeForce RTX 4070 graphics cards with 10, 12 and 16GB memory

Definitely a:
[GIF: Jim Carrey peeking through blinds]
 


Gigabyte lists GeForce RTX 4070 graphics cards with 10, 12 and 16GB memory

HUMMMM....
Well, to be fair, Nvidia started it with the 4080 16GB and 4080 12GB :)
 
Apparently the graphics card producers don't know the specs of the cards either

I don't think anyone has got a clue what the spec of RTX 4070 cards is going to be. Except for probably having 12GB VRAM, but that's the least interesting part.
 
It's not the be all and end all. 10GB would be fine for a 1080p monitor, which is still the type most have - 68% according to Steam Hardware Survey:

It's like buying a three-legged horse and proclaiming "Look everyone... I have a horse." Congrats, the horse needs to be put down and sent to the glue factory, but hey everyone, you had a horse... a useless horse.
 
It's not the be all and end all. 10GB would be fine for a 1080p monitor, which is still the type most have - 68% according to Steam Hardware Survey:

Just 2% have a 4k display resolution set.
That survey isn't necessarily super accurate though.

For example, my desktop monitor is 3440x1440, but I do all my gaming on my LG G2 4k TV.

However, if the TV isn't turned on when the survey pops up, the survey doesn't recognise/record it and only records my desktop resolution.
 
Don’t know about anyone else, but with the 10GB 4070 on the way, I’m looking forward to the 12/16 GB 4060 occasionally beating the 10 GB 4070.
With settings where both of them will be getting 7 fps? Sure, it will happen. The question is, does it matter?
 