
NVIDIA ‘Ampere’ 8nm Graphics Cards

Ah I get you :D. That's definitely not my thought process, before the 1080Ti I had many AMD cards and would happily go with them if it made most sense.

1080Ti is a great card mate, hindsight dictates I should have saved for one but normally top tier cards (>£700) are never on my radar. I had a 1080p monitor for a long time so it would have been filing nails but great to look at! Now I have a 4k monitor..
 
1080Ti is a great card mate, hindsight dictates I should have saved for one but normally top tier cards (>£700) are never on my radar. I had a 1080p monitor for a long time so it would have been filing nails but great to look at! Now I have a 4k monitor..
You saw the light and ditched the potato resolution. I actually clearly see the individual pixels at that resolution. Was never a problem before, but once you see the crispness of 4K, it is hard to go back.
 
1080Ti is a great card mate, hindsight dictates I should have saved for one but normally top tier cards (>£700) are never on my radar. I had a 1080p monitor for a long time so it would have been filing nails but great to look at! Now I have a 4k monitor..

Yeah it has served me well. It's not like Ampere coming out will suddenly make the 1080Ti terrible either so waiting until all cards are on the table makes sense to me.

If AMD can't compete then it'll be easier to buy Ampere anyway, if AMD can compete then I will probably get a better deal one way or another.
 
The whole 3080 10GB thing is actually really clever by Nvidia. It will make the card much cheaper for them to make, but also, when the reviews all come out after release, the card will no doubt perform great, because the reviewers will be testing GTAV, The Witcher, CSGO, etc, the usual suspects, and the VRAM won't all be used up. But when you find a title that does use it (MSFS), or more of them in 12 months' time, the card will be seriously gimped. By then you have already bought it based on the release reviews.
Basically the reviews won't show it gimped by memory saturation, but that is a real threat down the line.
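The saturation point above is easy to check for yourself while playing. Here is a minimal Python sketch that parses the CSV line produced by `nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader` and flags when usage is near the 10GB limit; the sample lines are illustrative, not real telemetry, and the 90% threshold is an assumed cut-off.

```python
# Rough sketch: parse one CSV line of nvidia-smi memory output
# (e.g. "9734 MiB, 10240 MiB" = used, total) and flag saturation.
def vram_saturated(csv_line: str, threshold: float = 0.9) -> bool:
    used_s, total_s = csv_line.split(",")
    used = float(used_s.strip().split()[0])    # "9734 MiB" -> 9734.0
    total = float(total_s.strip().split()[0])  # "10240 MiB" -> 10240.0
    return used / total >= threshold

# Illustrative numbers for a 10GB card:
print(vram_saturated("9734 MiB, 10240 MiB"))  # True  - heavy title at 4K
print(vram_saturated("6100 MiB, 10240 MiB"))  # False - a typical review-suite game
```

Note that many engines cache aggressively, so high reported usage alone isn't proof of a bottleneck; a frame-time spike alongside it is the real tell.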
 
The whole 3080 10GB thing is actually really clever by Nvidia. It will make the card much cheaper for them to make, but also, when the reviews all come out after release, the card will no doubt perform great, because the reviewers will be testing GTAV, The Witcher, CSGO, etc, the usual suspects, and the VRAM won't all be used up. But when you find a title that does use it (MSFS), or more of them in 12 months' time, the card will be seriously gimped. By then you have already bought it based on the release reviews.
Basically the reviews won't show it gimped by memory saturation, but that is a real threat down the line.

yep sounds correct
 
There will probably be higher VRAM versions later on, but I wouldn't buy anything until both companies launch their full ranges.

I'm on an ultrawide G-Sync for the time being that I'm not ready to replace, so what AMD do next won't make too much of a difference to me sadly. Next time, once I've managed to change my monitor, I'd love to have a choice. I'll support AMD with a console buy :D
 
The whole 3080 10GB thing is actually really clever by Nvidia. It will make the card much cheaper for them to make, but also, when the reviews all come out after release, the card will no doubt perform great, because the reviewers will be testing GTAV, The Witcher, CSGO, etc, the usual suspects, and the VRAM won't all be used up. But when you find a title that does use it (MSFS), or more of them in 12 months' time, the card will be seriously gimped. By then you have already bought it based on the release reviews.
Basically the reviews won't show it gimped by memory saturation, but that is a real threat down the line.

Now that reveals one big issue IMO with reviewers, TechTubers and the overall review mentality: it's pretty much release reviews and then forget about the product. Not many go back after 6-12 months and retest, and even fewer actually dive deep into calibrating/tuning the products (that's a separate pet peeve of mine).
 
Gainward just made sure they got a boat load of free marketing by "accidentally" releasing the 3080/3090 full product page. In b4 the other board partners...
 
I'm on a £700 gsync for the time being that I'm not ready to replace so what AMD do next won't make too much of a difference to me sadly. Next time, once I've managed to change my monitor, I'd love to have a choice. I'll support AMD with a console buy :D

That's one way of consoling them! :D
 
I'm on a £700 gsync for the time being that I'm not ready to replace so what AMD do next won't make too much of a difference to me sadly. Next time, once I've managed to change my monitor, I'd love to have a choice. I'll support AMD with a console buy :D

Just look at when Navi launched: the Super range was released and prices dropped. Always wait for both companies to show their hands. A famous example was the release of the HD4870, when Nvidia had to crater prices.

Personally I wouldn't have locked myself into a brand. It would have been better to sell that monitor and get a decent FreeSync one as Nvidia now supports it.
 
I like how this thread went from "I'm not buying these cards at these prices" to complaining about VRAM on cards you have no interest in buying.

So you're interested in buying them then. ;) :D
 
It will if they are competitive and personally I wouldn't have locked myself into a brand. It would have been better to sell that monitor and get a decent FreeSync one as Nvidia now supports it.

Was a few years ago. You didn't really have a choice then - there was no cross support. If you had an AMD card you would've naturally gravitated towards a freesync monitor and Nvidia, g-sync. Much better now with nvidia supporting both. I'd imagine there's loads of us in a similar situation.
 
Yea, I never believed the 2K nonsense anyway. I would be surprised if it is priced much higher than a 2080 Ti, personally. At the end of the day it is very likely going to be on a smaller die, so a higher price just means even more profit in Nvidia's coffers.

I suspect the price is without GST though, so in the UK it could have 20% VAT added on top.

However, it's a huge price increase over the RTX 2080 Ti, which was closer to £1000. So even if £1300 was with VAT, they have pushed up pricing.
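To make the tax arithmetic concrete, here is a quick check of both readings of the £1300 figure under the UK's 20% VAT rate (the £1300 itself is the rumoured price from the posts above, not a confirmed one):

```python
# Two readings of a £1300 rumoured price under 20% UK VAT.
PRICE = 1300.0
VAT = 0.20

with_vat = PRICE * (1 + VAT)       # if £1300 excludes VAT -> 1560.0
ex_vat = round(PRICE / (1 + VAT))  # if £1300 includes VAT -> ~1083

print(with_vat, ex_vat)  # 1560.0 1083
```

Either way, both figures sit above the ~£1000 the 2080 Ti settled at, which is the poster's point.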

Was a few years ago. You didn't really have a choice then - there was no cross support. If you had an AMD card you would've naturally gravitated towards a freesync monitor and Nvidia, g-sync. Much better now with nvidia supporting both. I'd imagine there's loads of us in a similar situation.

I probably would just buy a monitor which supported both. I still remember a few years ago some here were saying FreeSync would lock you in - most of us said Nvidia would eventually support it.
 