Once it becomes mainstream, the 3000 series will be too old and too slow to run it well.
Raw performance at what? Drawing polygons as fast as possible? So anti-aliasing has been a waste of time? And Variable Refresh-rate tech? And all the other bells and whistles that are now just considered part of that raw performance, but people said similar things about when they were introduced?
My graphics lecturer in the mid 90s was on about hardware accelerated raytracing being the future. It's not just 'candy', it's a major change.
Nah, it's just candy. Doesn't affect gameplay.
Likewise, think about it from both a hardware and a software perspective. For a number of generations you'd expect big leaps forward with this tech. Similar to tessellation (although that wasn't as big a step as RT), it took a few generations and now nobody talks about it. RT will just become the norm at some point. I've said it before, but I reckon it will be the next generation of GPUs (i.e. 2022) when RT really hits its stride.
Not seeing anything in these leaks that will change my mind on skipping the 3xxx series and waiting for 4xxx cards in 2022-23 to replace my 2080, TBH.
I really can't see (Native) 4K 60fps RT being achievable until the 4080/Ti cards arrive. The 3xxx cards still look like they will only be able to do it by relying on 'smoke & mirrors' DLSS & VRS trickery.
I paid £650 for my 2080 in January 2019 and my simple rule for upgrading my GPU is that I won't upgrade until I can get (at least) double the performance of my current GPU for the same price, and that just doesn't look likely with these 3xxx cards.
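That upgrade rule is easy to make concrete as price-per-performance arithmetic. A minimal sketch (the function and the numbers below are purely illustrative, not real benchmark figures):

```python
def worth_upgrading(current_price, current_perf, new_price, new_perf):
    """Return True only if the new card offers at least double the
    performance of the current one at (or below) the same price."""
    return new_perf >= 2 * current_perf and new_price <= current_price

# Hypothetical numbers: a 2080 bought for £650, scored at 100
# relative performance, versus two candidate replacements.
print(worth_upgrading(650, 100, 650, 150))  # only ~50% faster: False
print(worth_upgrading(650, 100, 650, 210))  # 2.1x at same price: True
```

By that yardstick, a typical generational uplift of 30-50% never triggers an upgrade; you end up skipping a generation, which is exactly the conclusion above.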
8GB of VRAM will make you replace your 2080 sooner than you think. Next-gen consoles have 12-14GB reserved for their VRAM allocation.
Playing Half-Life: Alyx on my card regularly shows 10-11GB of VRAM usage, on a Valve Index.
Yeah, well, it's redundant now; at least top-end cards like the 2080 Ti and above can do 4K relatively well, which wasn't the case five years ago when I first went 4K. Replace "dead" with "becoming far less relevant" and it would be more accurate.
I'd say no. I'd want more than 2080 Ti performance for 4K, as you have to dial down settings for the newer games even with a 2080 Ti (e.g. RDR2, KCD, etc.), and hardly any games feature RT, and when they do it's often barely perceptible. Additionally, it has to be 4x RT performance, as 2x the 2080 Ti's RT is simply nowhere near enough unless you're playing at 1080p. If a 3080 were a 2080 Ti with double the RT performance for £750, what would you say?
It's just not nearly good enough yet; 10x the RT performance of a 2080 Ti, plus the 30-40% rasterization increase, would work. Drip-feeding pathetic incremental RT performance improvements won't cut it, simply because it's still at such a low level, and so barely perceptible, for such a massive performance hit. That is, unless we all want to start gaming at 1080p again. Let's agree to disagree.

How is the goal of creating vastly improved lighting in games "not important" to visuals and immersiveness? Improved lighting is one of the most important, if not the most important, visual steps in making games look more realistic and true to life. So far it has been limited by the poor adoption and performance of Turing, but with Ampere and RT-enabled consoles arriving, it will finally start to take off and be developed properly. In a few years, game visuals should improve dramatically as a result.
+1. Once it becomes mainstream, the 3000 series will be too old and too slow to run it well.
RT will eventually be nice, but the early-adopter tax just isn't delivering the goods right now and I seriously doubt it will with the 3000.
Now, if they can get pricing down where the cost of RT to the consumer is buried in the "noise", we can enjoy our equipment while the manufacturers continue advancing RT as best they can.
Yes and no. At 4K 60fps I've yet to see many games come anywhere near the 11GB on my 2080 Ti. Even back in the day when I ran 4K 60 with SLI 980s (4GB each), they didn't run out either, and to be perfectly honest games haven't advanced massively since then.
Obviously more is better, but 8GB should still last a good few years yet.
Also, on a side note, a card with more memory uses more. Your card may show 10-11GB, but an 8GB card will not use the same 10-11GB at the exact same settings; I'd bet it uses between 7 and 8GB.
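The distinction here is allocation versus need: many engines opportunistically cache extra assets when headroom exists, so the reported number tracks the card's capacity rather than the minimum the game requires. A toy model of that behaviour (the figures and the function are hypothetical illustrations, not measurements of any real engine):

```python
def reported_vram_usage(required_gb, card_capacity_gb, headroom_gb=0.5):
    """Toy model: the game needs `required_gb` to run, but the engine
    caches assets opportunistically up to the card's capacity minus a
    small headroom. Reported 'usage' therefore depends on capacity."""
    cached = max(required_gb, card_capacity_gb - headroom_gb)
    return min(cached, card_capacity_gb)

# Same game, same settings, two different cards:
print(reported_vram_usage(7.0, 11.0))  # 11GB card reports 10.5
print(reported_vram_usage(7.0, 8.0))   # 8GB card reports 7.5
```

Under this model, the 11GB card "uses" far more than the 8GB one at identical settings, even though both are comfortably above what the game actually needs.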
The next-generation consoles, if I remember correctly, have 10GB of VRAM, so not massively more than a current 8GB card, and that is going to have to last for a number of years.
I reckon on PC 32GB will be the new 16GB for system RAM, and for graphics cards 16GB will be the new 8GB of VRAM. This will be on Nvidia's 4000 series and whatever AMD's Nvidia-killer is at the time.
64GB of DDR5... not going to be friendly on the wallet.

"...works, in regards to accessing data from storage, as well as the unpredictability for game developers of knowing what hardware a user has, I suspect memory requirements for PC will increase significantly once developers start pushing these new consoles (around 2022). I think we will see gaming systems with 64GB of RAM and..."