Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
"1. Don't know yet but I did hear that undervolting really isn't a thing with this card."

I saw the der8auer vid and it seems like a no-brainer (power-limit sketch below).
Two questions:
1 - Are there any long term risks to the card with this?
2 - Would this allow this card to be used with a 750W PSU? (Understandably not 100% the best plan.)
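For anyone wanting to try the der8auer-style experiment themselves, here is a minimal sketch of the power-limit route, assuming the nvidia-ml-py (pynvml) bindings are installed. Note that NVML caps board power only; it does not touch the voltage/frequency curve, so this is power limiting rather than a true undervolt, and setting the limit usually needs admin/root rights:

```python
# Minimal power-limit sketch using NVIDIA's NVML bindings (pip install nvidia-ml-py).
# This caps board power; it is NOT a true undervolt (NVML exposes no V/F curve editing).
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# Current and allowed power limits, reported by NVML in milliwatts.
current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(gpu)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
print(f"Limit: {current_mw / 1000:.0f}W (allowed {min_mw / 1000:.0f}-{max_mw / 1000:.0f}W)")

# Illustration: cap the card at 70% of its maximum board power.
# Requires elevated privileges; NVML raises an error otherwise.
target_mw = max(int(max_mw * 0.70), min_mw)
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)

pynvml.nvmlShutdown()
```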
"That's actually a good point! If you know you don't need that much performance you might as well get a lower-end SKU."

We did this last gen on the 30 series, saves a ton of watts. Pretty sure someone also chimed in and mentioned you should have got a lower SKU if you're running it at such a reduced capacity!
1. Don't know yet but I did hear that undervolting really isn't a thing with this card.
2. I seriously doubt it because of the amount of power required just to turn it on. If Nvidia mandates a higher wattage it's for a legit reason.
"That's a nice start, but the spikes the professional reviewers report on are measured at the 20ms level (well, TPU's W1zzard and Igor's Lab) and it's those which might trip a PSU up."

Whilst true, I've done some basic testing comparing the 3080 to the 3090 Ti.
3080:
- Full system idle: 130W
- Full system load: 510W
- GPU board power draw, idle: 16W
- GPU board power draw, max: 325W

3090 Ti:
- Full system idle: 130W
- Full system load: 680W
- GPU board power draw, idle: 27W
- GPU board power draw, max: 491W
Full system power is measured at the wall using a plug-in power meter and includes the PC, 34" monitor and speakers.
I wouldn't recommend it, but I think most could get away with a 750W PSU. Nvidia will no doubt be covering their asses by saying 850W minimum.
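As a rough sanity check on the 750W question, a back-of-envelope sketch using the wall figures above. The ~90% PSU efficiency and the 2x transient multiplier are illustrative assumptions, not measured values, and the wall reading includes a monitor and speakers, so the PC-only load is somewhat lower:

```python
# Back-of-envelope PSU headroom check using the wall measurements quoted above.
# Assumptions: ~90% PSU efficiency at load, and a 2x transient multiplier for the
# millisecond-scale GPU spikes that TPU / Igor's Lab measure at the 20ms level.
WALL_LOAD_W = 680        # full system at the wall (3090 Ti figures above)
PSU_EFFICIENCY = 0.90    # assumption: typical Gold-rated unit at this load
GPU_SUSTAINED_W = 491    # GPU board power, max, from the figures above
SPIKE_MULTIPLIER = 2.0   # assumption: short transients vs sustained draw

dc_load = WALL_LOAD_W * PSU_EFFICIENCY       # ~612W actually supplied by the PSU
rest_of_system = dc_load - GPU_SUSTAINED_W   # CPU, board, drives, fans, monitor...
spike_load = rest_of_system + GPU_SUSTAINED_W * SPIKE_MULTIPLIER

print(f"Sustained DC load: {dc_load:.0f}W")                    # ~612W, fine on 750W
print(f"Momentary load on a 2x GPU spike: {spike_load:.0f}W")  # ~1103W
# Whether a brief excursion like that trips over-current/over-power protection
# depends entirely on the individual PSU, which is why "most could get away with
# 750W" and "Nvidia says 850W minimum" can both be true.
```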
lol just lol...

"2. I seriously doubt it because of the amount of power required just to turn it on. If Nvidia mandates a higher wattage it's for a legit reason."
Don't mind losing up to ~10% performance if that means being able to run a 300W+ card at 200W or less. With the Chill feature in AMD's drivers the card might even use less.
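That trade is easy to put numbers on. A trivial worked example using the round figures from the post above (300W stock, 200W limited, ~10% performance loss; the numbers are the poster's, not benchmark results):

```python
# Perf-per-watt arithmetic for the trade described above: give up ~10% performance
# to run a 300W+ card at 200W. Figures are the poster's, not measured results.
stock_perf, stock_power = 1.00, 300   # normalised performance, watts
tuned_perf, tuned_power = 0.90, 200   # ~10% slower, capped at 200W

gain = (tuned_perf / tuned_power) / (stock_perf / stock_power)
print(f"Efficiency gain: {gain:.2f}x perf/W")  # 1.35x
```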
"It's not always the titles you expect which actually push the GPU the hardest. Out of the games I've tried, it's A Plague Tale: Innocence of all things which causes by far the highest power draw. Using a 3070 at 1890MHz/900mV, Cyberpunk 2077 with ray tracing enabled pulls 160-170W, versus 195-205W for APTI."

I think with undervolting and power limiting you could go as low as about 80% before performance began to dive on the 3090. 60% is quite a big cut, so it would be interesting to see how that affected ray-traced titles and anything else really pushing the card.
It's not always the titles you expect which actually push the GPU the hardest. Out of the games I've tried, it's A Plague Tale: Innocence of all things which causes by far the highest power draw. Using a 3070 at 1890MHz/900mV, Cyberpunk 2077 with ray tracing enabled pulls 160-170W, versus 195-205W for APTI.
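For anyone wanting to reproduce that kind of per-title comparison, a small polling sketch, again assuming the nvidia-ml-py (pynvml) bindings. NVML's power readout is a smoothed average, so this will show the sustained differences between games discussed here, not the 20ms transients mentioned earlier in the thread:

```python
# Log GPU board power once a second while a game runs, to compare titles the way
# the Cyberpunk-vs-APTI numbers above were gathered. Uses nvidia-ml-py (pynvml).
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

samples = []
try:
    while True:
        watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000  # NVML reports milliwatts
        samples.append(watts)
        print(f"{watts:6.1f}W  (peak so far: {max(samples):.1f}W)")
        time.sleep(1)
except KeyboardInterrupt:
    # Ctrl+C to stop; summarise the run.
    if samples:
        print(f"\n{len(samples)} samples, avg {sum(samples) / len(samples):.1f}W, "
              f"max {max(samples):.1f}W")
finally:
    pynvml.nvmlShutdown()
```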
"I get double my old RTX 3090 in performance with 100 less watts, that is the whole point of this."

That's actually a good point! If you know you don't need that much performance you might as well get a lower-end SKU.
"I get double my old RTX 3090 in performance with 100 less watts, that is the whole point of this."

Why? Performance is worse, and who cares about saving power when you are buying a £2K GPU?
"Almost AMD-like"

The 4090 is incredibly efficient.