Where are these reports?
Never knew anyone had released data.
WCCFTech is one such outlet that reported it: https://wccftech.com/nvidia-geforce...k-2077-dlss-3-cuts-gpu-wattage-by-25-percent/
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
"Here's a hot take, maybe we have been looking at this all wrong?"

The reason I'm going to say this theory is unlikely to be true is that, if it was that powerful, they would have been flashing those numbers throughout the presentation without relying on DLSS.
Same, I’m really wanting to step into VR, hopefully get a few extra frames in games like Star Citizen, and I think the 4090 should be perfect for it…
I just need to decide: do I go for an FE and mount it upright in my case, or go for the ZOTAC and vertically mount it?
I'll not be getting one 'til near Christmas, have to save lol.
i can imagine all the vfx and 3d studios dumping their 3090 builds for the new 4090

NVIDIA GeForce RTX 4080 16 GB Graphics Card Benchmarks Leak Out, Up To 29% Faster in 3DMark Tests & 53 TFLOPs Compute
The first benchmarks of NVIDIA's GeForce RTX 4080 16 GB graphics card have leaked online and show over 20% performance gain in 3DMark tests. (wccftech.com)
"Why would you buy a 4090 ahead of getting the VR headset? If you want to get into VR, get the VR headset. Then if you find your 3090 really can't cope... get the 4090? Just seems bizarre to me to be buying a GPU at £2k for something you haven't started doing yet. The VR headsets aren't cheap... why not spend this cash on those?"

Hi mate, good points and they are valid, and I didn't give the whole picture.
Here's a hot take, maybe we have been looking at this all wrong?
The 4090 is so powerful, as it's turning out, that nvidia knew this, hence the price: because it's so powerful, they know that anyone buying one will almost certainly skip the next generation or two, so lower sales numbers. What else can they market for the 5090/5080? I mean, here we have a card that can do 120fps at 4K in the latest games like Cyberpunk (using DLSS obviously). For most people that is peak gaming right there, and 8K monitors aren't a thing, let alone screens with VRR at that res.
I can see how nvidia will have seen the 4090 as a long haul card hence the pricing. I can only see a 4090 owner getting a 50 series if the power usage has dropped dramatically whilst still being more powerful or just as powerful as the 4090 - Otherwise what's the point in upgrading from a 4090 for 2 maybe 3 more generations at least?
Those on 20 series cards getting a 4080 16GB or 4090 will be in it for the long term; those on a 3080 12GB or greater have no real reason to upgrade unless 4K is their desire, it seems.
That £2k saved could bring you far more smiles in other ways, I'd argue...
"Hookers and cocaine? Think the GPU would at least last longer!"

30 seconds of fun, then a few hours waffling on about how I could have been playing Cyberpunk with DLSS 3.0 and getting 90-plus fps.
"
Ada is incredibly energy efficient, over twice the performance at the same power as Ampere and you can really push Ada! We have overclocked Ada past 3 GHz in our labs.
NVIDIA CEO, Jensen Huang"
Sorry but I don't think most have a lab
"We got to lay our eyes on the AIDA64 GPGPU Benchmark which shows that the card offers up to 53.6 TFLOPs of single-precision performance, which is 8% higher than the officially reported figure of 49 TFLOPs. For comparison, the GeForce RTX 3090 Ti produces 40 TFLOPs, so this is a 32.5% improvement in single-precision compute."

That's pretty impressive, so it's cheaper than a 3090 Ti (or the same price depending on where you buy) whilst being a marked step up. Finally some half-decent 4080 16GB info.
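The percentages in the quoted passage are easy to sanity-check from the TFLOPs figures it gives. A quick sketch (the `pct_gain` helper is just for illustration, not from the article):

```python
def pct_gain(new: float, old: float) -> float:
    """Percentage improvement of `new` over `old`."""
    return (new - old) / old * 100

# Figures quoted above: 53.6 TFLOPs measured in AIDA64,
# 49 TFLOPs official spec, 40 TFLOPs for the 3090 Ti.
print(round(pct_gain(53.6, 49.0), 1))  # 9.4  vs the official figure
print(round(pct_gain(53.6, 40.0), 1))  # 34.0 vs the 3090 Ti
```

The article's 8% and 32.5% both line up with an input of roughly 53 TFLOPs rather than 53.6 (53/40 is exactly 32.5%), so the small mismatch looks like rounding in the source rather than a contradiction.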
"Those on 20 series cards getting a 4080 16GB or 4090 will be in it for the long term"

Not on this forum, and don't forget about DLSS 4.0.
"
Ada is incredibly energy efficient, over twice the performance at the same power as Ampere and you can really push Ada! We have overclocked Ada past 3 GHz in our labs.
NVIDIA CEO, Jensen Huang"
Sorry but I don't think most have a lab
"i can imagine all the vfx and 3d studios dumping their 3090 builds for the new 4090"

I doubt it, you need VRAM for that stuff and the lack of NVLink is an issue.
"What's with that leaked 3DMark Time Spy score though of the 4080 16GB? I just logged in to check what my 3080 Ti scored and it is higher than the ones shown above?"

Is it the Extreme version?