It's AMD making the perf/watt gains though, as the 3080 is 25~30% faster than a 2080 Ti while using 25~30% more power.
I had to google, but isn't AMD's 6800 XT at 300W on 7nm, while Nvidia's 3080 on 8nm sits at 320W?
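If both numbers really move by the same 25~30%, perf/watt barely changes; here is the arithmetic as a quick sketch (the percentages and the 300W/320W board figures are just the ones quoted above, not anything I measured):

# Quick perf/watt sanity check using the figures quoted above (approximate).
perf_gain = 1.275    # midpoint of "25~30% faster" for the 3080 vs 2080 Ti
power_gain = 1.275   # midpoint of "25~30% more power"
print("3080 vs 2080 Ti perf/watt ratio:", round(perf_gain / power_gain, 2))  # ~1.0, i.e. roughly flat
print("3080 vs 6800 XT board power ratio:", round(320 / 300, 2))             # ~1.07, about 7% more power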
Look how insanely good RDNA2 can be @perf/watt if you tune it:
https://www.youtube.com/watch?v=rZz6BXXGjVc
I think the reported power usage is for the GPU chip only, but that is under 200W for Big Navi. That means with more optimizations they can add more hardware within the same power budget.
This is from a website friendly to Nvidia:
https://www.techpowerup.com/review/amd-radeon-rx-6800-xt/36.html
135W for the chip, probably around 200W for the whole board. But it is still good, and it shows there is still room for AMD to improve perf/watt. Nvidia has this problem: even if they move to a much smaller node, they will need to improve perf/watt a lot, whereas AMD just needs to add more hardware to their chips.
At 2.05 to 2.1GHz that's downclocked a bit, but not by much; add 10% to that and you're at ~2.3GHz.
130 to 135 watts, that is stupidly good.
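Rough back-of-envelope on those figures; the ~65W board overhead for memory, VRM losses and fans is my own assumption, not something from the review:

# Estimate whole-board power from the ~135 W GPU-chip-only figure.
chip_power_w = 135
board_overhead_w = 65   # assumed GDDR6 + VRM losses + fans, not a measured number
print("estimated board power:", chip_power_w + board_overhead_w, "W")  # ~200 W

# The tuned card sits at ~2.05-2.1 GHz; adding ~10% puts it back around stock clocks.
tuned_clock_ghz = 2.1
print("tuned clock +10%:", round(tuned_clock_ghz * 1.1, 2), "GHz")     # ~2.31 GHz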
I have never seen a GPU accept such a huge undervolt and still be able to maintain stock performance.
Don't forget all of Nvidia's Ampere cards can undervolt extremely well too; 3080s can use at least 100W less power for pretty much the same performance, and I think the 3090 is similar. Of course AMD is still better for power efficiency this round.
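Taking the usual ~320W board power for a 3080 as a given (an assumption on my part, exact figures vary by card), a 100W undervolt at unchanged performance works out to a sizeable perf/watt gain:

# Perf/watt change from a ~100 W undervolt at (roughly) unchanged performance.
stock_power_w = 320                     # typical 3080 board power (assumed)
undervolted_power_w = stock_power_w - 100
gain = stock_power_w / undervolted_power_w - 1
print(f"perf/watt improvement: ~{gain:.0%}")   # ~45%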
Do the tensor cores add to power draw for Nvidia?
Of course every piece of hardware uses power if/when it is used.
Do the tensor cores use more power?
It is not about worrying, it is a hint at how things will evolve in the future. You can't keep increasing the power draw forever; you will need to stop somewhere.
Yes, I run my 3080 at 900mV/1920MHz. That works with CP2077 at 1440p running Psycho RT, DLSS Quality. I can clock it higher when no RT is in use, while keeping the same undervolt, but I have not tried without DLSS.
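A crude way to see why that undervolt saves so much: dynamic power scales very roughly with voltage squared times clock. The stock ~1.05V / 2.0GHz operating point below is my assumption (it varies per card); the 900mV/1920MHz point is the one mentioned above:

# Back-of-envelope dynamic power scaling: P ~ V^2 * f (ignores leakage, memory, fans).
stock_v, stock_f_ghz = 1.05, 2.00   # assumed stock voltage and clock, varies per card
uv_v, uv_f_ghz = 0.90, 1.92         # the 900 mV / 1920 MHz undervolt mentioned above
ratio = (uv_v ** 2 * uv_f_ghz) / (stock_v ** 2 * stock_f_ghz)
print(f"estimated dynamic power vs stock: ~{ratio:.0%}")   # ~70%, i.e. roughly a 30% cut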
Who is buying a high-end GPU though and worrying about power efficiency? Didn't the R9 290 use a fair bit of power? I bought one of those as well.
Look here at 8K in Doom Eternal:
3090: 420W average
6900 XT: 311W average
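Working those two averages out as a ratio (just the numbers above):

# Average board power in Doom Eternal at 8K, per the figures above.
power_3090_w = 420
power_6900xt_w = 311
print(f"3090 draws ~{power_3090_w / power_6900xt_w - 1:.0%} more power")  # ~35%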
So again, AMD does not have a hardware problem. They can easily catch Nvidia by adding a lot more hardware and using the same power Nvidia does. It was probably a bad idea to bet on low power usage while Nvidia went full berserk, but that may help them in the future.
Nvidia needs to work on perf/watt because they can't increase the power usage forever, and the transition to 5nm will not be enough for a good performance increase in the next generation.
Makes no sense, the 3080 arrived as speculated and well in advance of RDNA2.
Do I consider RT more important than raster performance? Yes! Rasterisation has hit levels above and beyond what is required. I'd have been happy with 1080 Ti raster performance and double the RT cores on my 3080. I've already said this before. Rasterisation should be considered legacy by now, something that only iGPUs hold on to.
No? Care to explain that one?
I didn't mention Quake. I did mention Quake 2 RTX, a fully path-traced engine and arguably our best example of ray tracing within a game so far. I'd agree most gamers couldn't care less about it, as they have no understanding of what it is. Ignorance plagues tech forums.
Yes Nvidia offers the better product, which was partly the point of my original post.
I'm not a pro gamer though, so FPS in legacy titles/engines holds no interest for me. Both AMD and Nvidia offer cards that provide more FPS than required in that respect. I simply want new tech that doesn't leave games looking as though they are made from cardboard cutouts.