
When are the RTX 3090 reviews out

7% on average at 1440p and 10% at 4K going by Hardware Unboxed figures. So half of what you and I think would be poor value.

Buy it for work but 3080 for gaming.

Judging from the benchmarks, a Titan performs better than a 3090 in a lot of the work tasks I'd need it for.
 
Yep. I hope they take the crown and finally get some market share. I want to see a healthy balance so we the customers benefit.

Also would love to see Jensen squirm a bit :D

Cheeky bugger about 8K gaming... Lol.


The customers will not benefit until the market share splits more evenly. Unfortunately most people only want AMD to be competitive to drive down nvidia prices.
 
That's because it's marketed for gaming, but it's not for gaming. Anyone buying this for gaming is just monumentally wasting their money: it's 7% faster than an RTX 3080 in 1440p gaming at double the price, and 10% faster at 4K for twice the price. Its cost per frame is almost as bad as the RTX 2080 Ti's and worse than the Radeon VII's. Lo and behold, its cost per frame is twice that of the RTX 3080.

Yet people will still buy it..... :rolleyes:
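The cost-per-frame point is simple division: price over average FPS. A minimal sketch, using hypothetical round-number prices and frame rates (not actual review data), shows how a ~10% performance gain at over twice the price roughly doubles the cost per frame:

```python
# Illustrative cost-per-frame comparison. Prices and FPS figures below
# are hypothetical round numbers chosen to mirror the argument, not
# measurements from any review.
def cost_per_frame(price_gbp: float, avg_fps: float) -> float:
    """Price divided by average FPS at a given resolution."""
    return price_gbp / avg_fps

rtx_3080 = cost_per_frame(649, 100)   # assume ~100 fps average at 4K
rtx_3090 = cost_per_frame(1399, 110)  # ~10% faster, more than twice the price

print(f"3080: £{rtx_3080:.2f} per frame")
print(f"3090: £{rtx_3090:.2f} per frame")
print(f"3090 costs {rtx_3090 / rtx_3080:.2f}x as much per frame")
```

With these assumed numbers the 3090 comes out at nearly twice the cost per frame, which is the shape of the value argument being made above.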

What about future-proofing though? If you want to game at 4K max settings, 10GB is just about enough... how's that going to fare next year when the console games come out? I really don't think 10GB is going to last very long, which is the only reason I would go for a 3090 now over the 3080.
 

Based on what exactly? GPU-Z VRAM measurements? That only tells you how much VRAM is allocated, not how much is actually used and needed.

Besides that, when you're actually playing a game (particularly a frantic action game) you're not going to be able to tell the difference between ultra and high res textures. Turning a setting down a little is hardly the end of the world?
 
The customers will not benefit until the market share splits more evenly. Unfortunately most people only want AMD to be competitive to drive down nvidia prices.
One of the reasons for this is that Nvidia always make sure they have the fastest card and are known to be the best. If AMD can keep taking the performance crown for 3-4 gens, people will soon change their tune, as when people talk about the best, AMD will come to mind, not Nvidia.
 
Even worse then! (I was just going by the techpowerup comparison)
TPU don't have a FE card, so are comparing customs to the 3080 FE. That Strix has a 390W power limit out of the box (and an insane 480W with manual adjustment), so is getting there through brute force. Presumably the high end 3080s will do the same and close the gap up. I know EVGA said their higher end 3080s have a power limit north of 400W. The Zotac Trinity seems to be the closest to stock of the custom 3090s TPU reviewed, and the gains are more in line with the FE reviews from elsewhere.

[Attached image: relative-performance_j3k1m.png]
 
Gamers Nexus did not hold back in their review. The card is obviously a Titan, a workstation card for the consumer, it's not a Ti, it's barely better than an OC'd 3080. I'm not opposed to spending money on cards, I bought two 2080Ti's, but I could not justify spending £1500 for sub 10% gains at 4k. I'm going to let AMD's launch play out, see if they can shake up the market, but if not I'll wait for the 3080Ti.
A Titan without certain Titan features, to the point that in some workloads it doesn't perform better than the previous-gen RTX Titan while consuming vastly more power.
 

Ah, I see. The 20% figure was based on the post here, but that is for the Strix OC (OK, it is actually 19%):

https://www.overclockers.co.uk/forums/posts/33981800/
 