NVIDIA ‘Ampere’ 8nm Graphics Cards

They are all going to be big GPUs if they have a 350W TDP. Honestly, it would be better to wait and see how both companies stack up after they release their GPUs.

With the expected performance numbers we have seen and the leaked prices, waiting for AMD's offerings is my current plan. If the leaks hold true, what they are offering just isn't that tempting at the prices they are rumored to be charging.
 
I probably would just buy a monitor which supported both. I still remember a few years ago some here were saying FreeSync would lock you in - most of us said Nvidia would eventually support it.

At that point in 2016, when I bought my first G-Sync monitor, FreeSync had the potential to be supported by both, but there was no sign Nvidia would actually be generous enough to support it, and to be fair they didn't for another three years. In the meantime I've had years of being able to use adaptive sync tech on the only type of monitor I could with an Nvidia GPU.

Buying FreeSync now is a no-brainer. It definitely wasn't back then.
 
With the expected performance numbers we have seen and the leaked prices, waiting for AMD's offerings is my current plan. If the leaks hold true, what they are offering just isn't that tempting at the prices they are rumored to be charging.

TBF, even if Nvidia were to have decent offerings, if AMD is going to have their releases so soon after, I would still wait and see what they have to offer. At least that way, you know you made the best choice!

At that point in 2016, when I bought my first G-Sync monitor, FreeSync had the potential to be supported by both, but there was no sign Nvidia would actually be generous enough to support it, and to be fair they didn't for another three years. In the meantime I've had years of being able to use adaptive sync tech on the only type of monitor I could with an Nvidia GPU.

Buying FreeSync now is a no-brainer. It definitely wasn't back then.

We will need to see how AMD and Nvidia price this generation. It might be worth looking at AMD and a new monitor if they do price competitively (and that monitor would work with Nvidia GPUs too).
 
TBF, even if Nvidia were to have decent offerings, if AMD is going to have their releases so soon after, I would still wait and see what they have to offer. At least that way, you know you made the best choice!

I agree, but with no timeline people will get impatient. If AMD have any sense they will announce when their products will be launched and available as soon as Nvidia's event is over.
 
I looked at sold prices of auction-style listings, and I'm in the UK. £400 +/- £30 was the range on the first couple of pages over the last couple of days.

Might want to check those prices... £400 is for "parts only", "spares or repairs" or faulty cards. On eBay, working 2080 Tis are about £700-750, and they are selling on MM for about £650-700.
 
Actually, I may have exaggerated my framerates in Control at 4K Ultra with RT... :p

So, 15fps might be doable with an RTX 3090 :D


That's not how it works.

And if, with everything you've got enabled, it's basically path tracing - the fact it even runs is amazing. That's about 10 times more demanding a scene than anything AMD has pulled off so far.
 
TBF, even if Nvidia were to have decent offerings, if AMD is going to have their releases so soon after, I would still wait and see what they have to offer. At least that way, you know you made the best choice!

So much that.

Can't believe people already bought new PSUs tbh. Or sold their 2080 Ti.
 
I agree, but with no timeline people will get impatient. If AMD have any sense they will announce when their products will be launched and available as soon as Nvidia's event is over.

TBF, we know they are out this year, and lots of rumours say by November at the latest. It's still summer, so realistically, for the sake of an extra month or two, when people keep their GPUs for a few years, it doesn't seem much of a wait.

So much that.

Can't believe people already bought new PSUs tbh. Or sold their 2080 Ti.

Unless your system is going to go absolutely kaput, I don't understand why people can't wait to evaluate all the options they have.
 
The whole 3080 10GB thing is actually really clever by Nvidia, because it makes the card much cheaper for them to make, and because when the reviews all come out after release, the card will no doubt perform great: the reviewers will be testing GTA V, The Witcher, CS:GO, etc., the usual suspects, and the VRAM won't all be used up. But when you find a title that does use it all (MSFS), or more of them in 12 months' time, the card will be seriously gimped. By then you have already bought it based on the release reviews.
Basically, the reviews won't show it gimped by memory saturation, but that is a real threat down the line.

This just isn't accurate though; there were 5 pages of discussion about this. You can't load GBs of additional assets into VRAM (fill your game with additional high-resolution textures, higher-detailed models, etc.) and keep your frame rate the same. When you do, you create more work for the GPU, the frame rate goes down, and eventually it becomes too low to be playable. The reason you won't see VRAM limitations at launch is that in any game that can come even close to filling 10GB of VRAM (which, from benchmarks and real-world measurements so far, has proven to be FS2020), your frame rate is already completely and utterly unplayable.

Yes, future games will demand more VRAM, but they'll also place higher demands on the GPU itself, and there's no point in having, say, 16GB of VRAM on a card if the GPU is going to choke when you're only using 10. It's just a waste and a way to make the card much more expensive with no actual benefit. And we have benchmarks of games that do actually use >10GB, and that's what we see even on an RTX Titan. Looking at lists of all the top games today and their VRAM usage, most are around 5GB or less; a handful go above that, but they don't get close to using all 8GB of modern cards. 10 is going to be completely fine.
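To put rough numbers on the "GBs of additional assets" point, here's a back-of-envelope sketch in Python. The mip-chain and BC7-vs-RGBA8 ratios are standard texture maths, but the workload itself is purely illustrative and not taken from any specific game:

```python
# Rough texture-memory arithmetic: shows why piling higher-resolution
# assets into VRAM scales memory cost quickly. Illustrative only.

def texture_bytes(width, height, bytes_per_pixel=4, mips=True):
    """Size of one texture; a full mip chain adds roughly 1/3 on top."""
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mips else base

GiB = 1024 ** 3

uncompressed_4k = texture_bytes(4096, 4096)  # RGBA8, ~85 MiB with mips
bc7_4k = uncompressed_4k // 4                # BC7 is 1 byte/pixel (4:1)

budget = 10 * GiB                            # the 3080's 10GB figure
print(f"4K RGBA8 texture: {uncompressed_4k / 2**20:.0f} MiB")
print(f"4K BC7 texture:   {bc7_4k / 2**20:.0f} MiB")
print(f"BC7 4K textures fitting in 10 GiB: {budget // bc7_4k}")
```

Several hundred unique 4K textures fit in that budget, and any scene dense enough to stream meaningfully more than that is also generating far more shading work, which is the point being argued above.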
 
At one stage MLID was posting rumours that Nvidia had created a new memory compression technique using Tensor cores, but I haven't heard anything recently. If that was true and RTX 3000 has better memory compression, then 10GB would be fine.
 
I honestly don't think my 1080 Ti can actually *use* all 11GB of VRAM. I just don't think the GPU and bus can move that much data quickly enough.

We need some sort of metric that compares GPU "power" and VRAM bandwidth, to find the cutoff point for capacity.

We all (instinctively) know there's a point where the GPU can't use an infinite amount of VRAM, but I don't think I've seen a reliable way to calculate where that point is from one card to another.
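For what it's worth, one crude way to put a number on that cutoff is to ask how many bytes the memory bus could even touch once per frame at a target frame rate. Real GPUs cache and reuse data heavily, so this is a lower-bound sketch of one possible heuristic, not an established metric; the bandwidth figures are published specs:

```python
# Crude "usable capacity" heuristic: if every byte of the working set had
# to be read at least once per frame, bandwidth caps how much VRAM is
# touchable at a given frame rate. A sketch, not a real performance model.

def touchable_vram_gb(bandwidth_gb_s, target_fps):
    """Max GB the bus could read once per frame at target_fps."""
    return bandwidth_gb_s / target_fps

cards = {
    # name: (memory bandwidth in GB/s, VRAM capacity in GB)
    "GTX 1080 Ti": (484, 11),
    "RTX 2080 Ti": (616, 11),
    "RTX 3080":    (760, 10),
}

for name, (bw, vram) in cards.items():
    t = touchable_vram_gb(bw, 60)
    print(f"{name}: {t:.1f} GB touchable/frame at 60 fps "
          f"(capacity {vram} GB)")
```

By this measure the 1080 Ti's bus can only touch about 8GB per frame at 60fps, under its 11GB capacity, which lines up with the instinct above; the 3080's faster GDDR6X actually covers its 10GB with room to spare.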
 
I've been to Rio de Janeiro once. I arrived at night, and when I awoke in the morning I looked out of the 20th-floor hotel window. As I looked down, it looked like Crysis, I **** you not. That's the first thing that occurred to me.
If a video game is 'that' good, it's really important in the grand scheme of things.

Crysis was a graphics demo masquerading as a game. It may have been amazing to look at but it wasn't representative of gaming in general at the time and thus benchmarks were merely a curiosity (i.e. the infamous 'can it run Crysis') rather than actually useful to any degree.
 
Why wouldn't you sell your 2080 Ti? You know the value of the card is going to insta-drop when they announce the 3080 for £200-400 less than a 2080 Ti, while being faster, having the most up-to-date support and smashing Turing in its only trump card - RTX.

Sell the 2080 Ti now at £700... or wait a day for that brand-new-car level of depreciation.
 
Yeah, it'd be a good match for a 3080/3090; others may have opinions on other manufacturers. I have an SF750 SFX Platinum unit, on the low side, but it should do; if not, I'll have to opt for a bigger unit.
So I am checking prices, and the RM1000i is only a tenner more than the RM850i; that's how crappy prices are now, I guess. Not sure what's the best choice for the money now tbh.
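For anyone doing the same PSU maths, a quick headroom sketch. The component draws below are rough assumptions for a typical high-end build, not measured figures; the 350W is the rumoured Ampere TDP from earlier in the thread:

```python
# Rough PSU headroom check under assumed component draws (estimates only;
# transient GPU spikes can briefly exceed TDP, so more margin is safer).

ASSUMED_DRAW_W = {
    "GPU (rumoured 350W TDP)": 350,
    "CPU under load":          150,
    "Motherboard/RAM/fans":     60,
    "Storage + peripherals":    40,
}

psu_rating_w = 750                  # e.g. the SF750 mentioned above
total = sum(ASSUMED_DRAW_W.values())
headroom = psu_rating_w - total

print(f"Estimated load: {total} W on a {psu_rating_w} W unit "
      f"({headroom} W / {100 * headroom / psu_rating_w:.0f}% headroom)")
```

On these assumptions a 750W unit lands around 20% headroom, which is why it reads as "on the low side but it should do".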
 