Soldato · Joined 14 Jul 2005 · Posts 9,154 · Location Birmingham
I think the 3080 Ti would be a killer GPU once it releases at £999.
It would definitely kill the need for a 3090.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
I think the 3080 Ti would be a killer GPU once it releases at £999.
It is unfortunate, but not for the reason you stated. Neither the 3080 nor the 3090 has much overclocking headroom, so going on speculation the 3080 Ti would be nothing more than a 3090 with less VRAM.

Unfortunately, once NVIDIA releases the 3080 Ti, the 6900 XT will be a rather hard sell. When you have that much cash to burn on one GPU, these cards should be aimed at 4K 120 Hz. The 3080 and 6800 XT turned out competitive at 1440p and 1080p when AMD's marketing compared the cards with SAM and Rage Mode disabled, but the 3080 pulled ahead by 7.4% at 4K. The 3090 is about 13% faster than the 3080 at 4K. AMD turned Rage Mode and SAM on when comparing the 6900 XT to the 3090, which suggests it won't be all that competitive without them.
So even in the absence of a 3080 Ti, it's hard to justify paying £300 more than a 3080 for a roughly 10% rasterisation boost at 4K, with ray-tracing performance below that of a 3070 and no DLSS equivalent at the moment.
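For what it's worth, a quick back-of-the-envelope on that price-per-performance point. The £649 3080 price and the relative performance index below are assumptions built from the rough figures quoted in this thread, not official benchmarks:

```python
# Back-of-the-envelope £ per relative 4K performance point.
# Assumed: 3080 at £649, rival at £300 more for ~10% extra raster performance.
gpu_price = {"3080": 649, "6900 XT": 649 + 300}   # £ (thread's rough figures)
gpu_perf = {"3080": 100, "6900 XT": 110}          # relative 4K raster, 3080 = 100

for name in gpu_price:
    cost_per_point = gpu_price[name] / gpu_perf[name]
    print(f"{name}: £{cost_per_point:.2f} per relative performance point")
```

On those (assumed) numbers, every point of 4K performance costs about a third more on the dearer card, which is the whole argument in one line.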
I think the 3080 Ti would be a killer GPU once it releases at £999.
When we start casually talking £900-£1,000 for a GPU, then realise that for that amount you could get a top-end CPU and mobo.

If you are a gamer and don't need to do any heavy professional productivity tasks, it makes much more sense to spend the money on a better GPU than on a mobo and CPU. A B550 board with a 5600X costs £400; you don't need to spend more than that on a CPU and motherboard, and it will max out any high-end GPU while still providing more than enough power for amateur or prosumer photo or video editing. By the time you need more CPU power for gaming, AM5 will be mature.
It is unfortunate, but not for the reason you stated. Neither the 3080 nor the 3090 has much overclocking headroom, so going on speculation the 3080 Ti would be nothing more than a 3090 with less VRAM.
If that turns out to be true, it would make the 3080 Ti nothing more than what people already have.
Which would be a shame as we won't see any competition until Hopper.
The performance of the 3080 Ti depends entirely on the margin by which the 6900 XT beats the 3080 at 4K. If it wins by less than 10%, NVIDIA may well release the 3080 Ti with lower clocks than the 3090, which itself barely manages to eke out a win at 4K. If the 6900 XT beats the 3080 by around 13%, like the 3090 does, they may just release a lower-VRAM 3090 at the same clocks to compete. Either way, the 3080 Ti would only match performance that is already on the market.
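To put those two scenarios in numbers (purely relative figures, using the percentages mentioned earlier in this thread; the scenario margins are assumptions, not benchmarks):

```python
# Hypothetical 4K raster index with the 3080 fixed at 100.
r3080 = 100
r3090 = r3080 * 1.13  # ~13% faster than the 3080, per the thread

# Scenario A: 6900 XT beats the 3080 by under 10% -> a down-clocked
# 3080 Ti would be enough to stay ahead.
scenario_a_6900xt = r3080 * 1.09

# Scenario B: 6900 XT matches the 3090's ~13% margin -> the 3080 Ti
# would need full 3090 clocks just to tie.
scenario_b_6900xt = r3080 * 1.13

print(scenario_a_6900xt < r3090)                  # scenario A: 3090 stays ahead
print(abs(scenario_b_6900xt - r3090) < 1e-9)      # scenario B: a dead heat
```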
It primarily depends on NVIDIA's willingness to throw 3090 owners under the bus; they would be irate to find the 3080 Ti providing the same performance just three months later at a substantially lower price.
Something most won't care about, but which I found interesting in the Linus review: he runs a relatively wide range of rendering benchmarks, and the 6800 XT performed very poorly. Not only did it take much longer to render, the output quality was poor as well, with rendered images appearing very noisy (AMD needs better denoising).
This reminds me of Zen 2 vs Coffee Lake, where Intel was good for straight gaming and nothing else while AMD was good for everything else. Now we have the same with graphics cards, where RDNA 2 is great for straight gaming but if you do anything else on your GPU (whether it be rendering, editing, streaming, AI, simulations etc) you're far better off with the Nvidia option.
Keeping the same specs as the 3090 but lowering the memory amount would make the 3080 Ti a 12GB card.
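That figure drops straight out of the memory layout: the 3090's 384-bit bus is twelve 32-bit channels, populated clamshell-style with two 1 GB GDDR6X modules per channel for 24 GB. Keep the bus but fit one module per channel and you land at 12 GB. A minimal sketch of that arithmetic:

```python
# GDDR6X capacity from bus width and module configuration.
BUS_WIDTH_BITS = 384   # same bus as the 3090
CHANNEL_BITS = 32      # one GDDR6X module sits on each 32-bit channel
MODULE_GB = 1          # 8 Gb (1 GB) GDDR6X modules

channels = BUS_WIDTH_BITS // CHANNEL_BITS  # 12 channels

print(channels * 2 * MODULE_GB)  # 3090: two modules per channel (clamshell) -> 24
print(channels * 1 * MODULE_GB)  # lower-memory variant: one per channel -> 12
```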
Here is what's gonna happen with Star Citizen. Mark my words, call me wrong when we look back.

I have two friends who work for Cloud Imperium Games. I think it's more a case that the design is so expansive that it has taken a long time (and will still require significant development).
Probably not - nearly all current development costs (and a significant period of future costs) have been covered by Kickstarter money, backers, patrons etc.
I think the 6900 XT will follow the same pattern in performance: great at 1080p and 1440p and a huge drop at 4K (at least for a while), and it will offer far better RT performance.

I'm sorry, where is this huge drop at 4K? I see it's slower than the Ampere range, but I don't see a huge drop.
Now, before someone posts cherry-picked slides from an Nvidia-biased game, I'll just post three from an AMD-biased game. Across the board the AMD cards are slower at 4K, but I think "huge" is a tad dramatic!
At 4K the 6800 XT gets almost an 8% spanking from the RTX 3080.
The 6900 XT might just squeak past the 3080 at 4K
with a 10% uplift over the 6800 XT...