NVIDIA ‘Ampere’ 8nm Graphics Cards

Both MS and Sony have confirmed 4K @ 120 fps. While that could be over DisplayPort (which is what I'm currently using), it's almost certainly HDMI 2.1.

I assume you mean for the new Xbox and PS then, which I know sod all about, tbh. Fair enough if they've said it, and if it's built into their APUs, then great.
Still, I don't think it's actually been confirmed for GPUs, has it? Other than the fact that it'd be slightly commercially suicidal not to, if consoles are including it for video playback.

Other than that, I think I'm right in saying TVs just don't come equipped with DisplayPort 1.4, and only the high-end 10xx and 20xx series Nvidia GPUs had a DP 1.4 output? Also, weren't there a load of frame rate and colour output limitations as well? I suppose HDMI 2.1 will solve most of those issues.
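For what it's worth, here's a rough bandwidth check showing why 4K @ 120 Hz is borderline on DP 1.4 but comfortable on HDMI 2.1. The link rates and blanking overhead below are approximate, so treat the output as a back-of-the-envelope figure only:

```python
# Rough data-rate check for 4K @ 120 Hz, uncompressed RGB.
# Blanking overhead and effective link rates are approximations.

def video_data_rate_gbps(width, height, refresh_hz, bits_per_channel,
                         blanking_overhead=0.08):
    """Approximate uncompressed RGB data rate in Gbit/s, with a rough
    allowance for blanking intervals."""
    bits_per_pixel = 3 * bits_per_channel
    active_rate = width * height * refresh_hz * bits_per_pixel
    return active_rate * (1 + blanking_overhead) / 1e9

# Effective (post-encoding) link data rates, approximate:
DP_1_4_GBPS   = 25.92   # HBR3 x4 lanes, 8b/10b encoding
HDMI_2_1_GBPS = 42.67   # FRL 4 lanes x 12 Gbit/s, 16b/18b encoding

for bpc in (8, 10):
    need = video_data_rate_gbps(3840, 2160, 120, bpc)
    print(f"4K120 {bpc}-bit RGB needs ~{need:.1f} Gbit/s -> "
          f"DP 1.4: {'fits' if need <= DP_1_4_GBPS else 'needs DSC/4:2:2'}, "
          f"HDMI 2.1: {'fits' if need <= HDMI_2_1_GBPS else 'no'}")
```

In other words, 8-bit 4K120 only just squeezes onto DP 1.4, and 10-bit HDR doesn't fit at all without compression or chroma subsampling, which is why HDMI 2.1 matters here.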
 
Any game can cripple any GPU. Hairworks and RTX are the two obvious culprits. I've been gaming at 4K since I got a 780 Ti.
Been gaming at 4K since 2014 myself. No problems if you're not a twitch gamer and you optimise your settings. It still looks better than 1440p ever can, on any settings, as far as I'm concerned, hence why I've never been able to go back.
 
"estimated", and people have been doing that estimation by comparing Tflops, which is usless when you are comparing different architectures let alone different architectures and vendors.
FYI, the estimation isn't just based on TFLOPs. It is also based on a Gears of War 5 demo that was shown. Article: https://wccftech.com/gears-5-xbox-series-x-demo-rtx-2080-ti-performance/
As you can see, Gears 5 on Xbox Series X does look noticeably better than the Xbox One X version. On XSX the game basically runs PC Ultra settings (and sometimes beyond Ultra) at 60fps, even in cutscenes. We also see evidence of some advanced techniques, including more natural-looking Screen Space Global Illumination lighting and some impressive volumetric fog. Ray tracing and other new DirectX 12 Ultimate features have not yet been implemented, but there’s a very good chance they’ll be added by the time the Xbox Series X launches later this year (again, this demo was thrown together in only two weeks). The Coalition has also said they hope to get multiplayer running at 120fps.

Digital Foundry also got to see the Gears 5 benchmark running on the Xbox Series X, and performance was almost identical to a PC equipped with an AMD Ryzen Threadripper 2950X and a GeForce RTX 2080 Ti graphics card. So yeah, at least when it comes to running current-gen games, Microsoft’s new console stands up to (almost) the best PC hardware you can get.
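On the TFLOPs point, a quick sanity check with two real cards shows why raw TFLOPs don't compare across architectures. The clocks below are reference boost figures and the gaming gap is a rough review average, so it's only illustrative:

```python
# Theoretical FP32 throughput = 2 ops per FMA x shader count x clock (GHz).
# Reference boost clocks; real cards clock differently in practice.

def tflops(shaders, boost_ghz):
    return 2 * shaders * boost_ghz / 1000

radeon_vii  = tflops(3840, 1.750)  # ~13.4 TFLOPs
rtx_2080_ti = tflops(4352, 1.545)  # ~13.4 TFLOPs

print(f"Radeon VII:  {radeon_vii:.1f} TFLOPs")
print(f"RTX 2080 Ti: {rtx_2080_ti:.1f} TFLOPs")
# Near-identical on paper, yet the 2080 Ti is substantially faster in most
# games -- which is exactly why cross-vendor TFLOPs comparisons mislead.
```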
 
Any game can cripple any GPU. Hairworks and RTX are the two obvious culprits. I've been gaming at 4K since I got a 780 Ti.

Try running RDR 2 at 4K on that 780 Ti at native resolution. :)

I played on triple 1080p screens for a while as well (I prefer the larger FoV and could always fall back to a single 1080p screen when performance wasn't enough or a game didn't support multi-display setups), and that's about 75% of the pixel count of 4K (quick check after this post). There are quite a lot of new and relatively older games that require tweaking to various degrees to keep them stable at 60fps - and by stable I mean no dropping into the 50fps range or even lower, no bouncing between 50 and 70fps and calling it a 60. And no, no "dirty" Hairworks (aka The Witcher 3) or RT of any kind. For instance, RDR 2 has the majority of its settings on low or disabled. Never mind "medium" across the board, as it would dip constantly into the 50s (and perhaps lower).

Anthem, Crysis 3, Deus Ex MD, Just Cause 4, Kingdom Come Deliverance (if I remember correctly), Metro Exodus (without RT), Tom Clancy's The Division 2 (and the first one, I think), Watch Dogs 2 and Quantum Break (I think) are just some that require various adjustments - some minor, some significant (as in "it bothers me that I have to lower settings that much, or the difference is noticeable"). Increase the resolution further to 4K and the drop will be more significant, to the point where it would probably bother me a lot.

Sure, each to their own, and some won't mind 30-40fps or even lower - after all, there are millions of people playing games that drop into the 20fps range and don't have the best image quality (plenty of examples on consoles), or who just drop settings as low as needed, so wanting a constant 60fps at relatively high settings (not maximum, mind you!) may be seen as... ridiculous? :) Still, in my book, if it's X resolution at Y fps, then that Y fps has to be maintained and frame rates should not drop lower. Everything else is just marketing.

Can the next consoles do 4K@60fps now, at native resolution? Sure, for the most part it should be no problem, but in the future... well, that's another story.
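Quick check on the "about 75% of 4K" figure mentioned above, assuming render cost scales roughly with pixel count:

```python
# Pixel-count comparison: triple 1080p vs a single 4K screen.
triple_1080p = 3 * 1920 * 1080   # 6,220,800 pixels
uhd_4k       = 3840 * 2160       # 8,294,400 pixels

print(f"Triple 1080p: {triple_1080p:,} px")
print(f"4K UHD:       {uhd_4k:,} px")
print(f"Ratio:        {triple_1080p / uhd_4k:.0%}")   # ~75%
```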
 
From personal experience, an RTX 2080 has quite a lot of problems even below 4K if you don't want to sacrifice settings.

Which games? I mean, with any game you kind of have to be sensible with settings. Toggle on RTX or Hairworks or some ridiculously intensive process that does very little, and it can cause any GPU a few issues.

If you're insistent on going ultra in every single game... then sure... as with any other gen, for that you'll need the top-of-the-line card.

My RTX 2080 has troubled me with two games: KCD and RDR2. With settings optimisation, RDR2 became very, very playable.

I imagine an RTX 2080-equivalent GPU in a console can easily power games to 60fps, given that developers can optimise for fixed hardware and consoles normally hover around the low-to-medium settings territory for graphics.
 
Don't forget late. There's no point having a competitive product release years after the competition.

Spot on! We're still waiting, nearly 2 years on, for an AMD 2080 Ti competitor.

And I guarantee that when they release it for... £200 less than a 2080 Ti at launch, the AMD crowd will start screaming from the rooftops about how amazing AMD are for value.
 
I think a lot of people (including me) are thinking along those lines.

I recently fired my PS4 up after a couple of years though, and in that period I've gone from a 16:9 monitor to a 21:9 one, and boy, do I miss it when playing the PS4 on a 16:9 TV, which is something to consider.

I thought it was fantastic when I first had the PS4; now it looks squashed!

I also run a 21:9 screen, love it. I’m hoping that pricing isn’t going to be as brutal as some are suggesting. My 1080Ti is still doing a grand job, but I’m definitely out of the super high end card market now. 2080Ti prices already make my eyes water and I can’t see it getting any better.
 
Spot on! We're still waiting, nearly 2 years on, for an AMD 2080 Ti competitor.

And I guarantee that when they release it for... £200 less than a 2080 Ti at launch, the AMD crowd will start screaming from the rooftops about how amazing AMD are for value.

If Nvidia had been able to continue their cadence from previous generations, the 2080 Ti should have cost about what the 1080 Ti launched at and been at least 30% faster. AMD will be very late if they come in at that price/performance.

Nvidia either shot for the moon on margin, or they just couldn't innovate well enough to provide a meaningful performance increase at the old price points.

Either way, it's not my problem as the consumer. They ask whatever price they want for their product and I decide if it's worth my money.
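To put rough numbers on that cadence point: launch MSRPs in USD, and the ~30% uplift is a ballpark 4K average from reviews, so this is indicative only:

```python
# Rough perf-per-dollar comparison at launch.
gtx_1080_ti = {"price_usd": 699,  "rel_perf": 1.00}
rtx_2080_ti = {"price_usd": 1199, "rel_perf": 1.30}  # Founders Edition pricing

ppd_1080ti = gtx_1080_ti["rel_perf"] / gtx_1080_ti["price_usd"]
ppd_2080ti = rtx_2080_ti["rel_perf"] / rtx_2080_ti["price_usd"]
print(f"Perf per dollar change: {ppd_2080ti / ppd_1080ti - 1:+.0%}")  # about -24%
```

By that measure the 2080 Ti actually went backwards on perf per dollar at launch, which is the opposite of what previous generations delivered.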
 

AMD are already late AF. They STILL don't have a 2080 Ti-competitive card. I'm not sure we should even mention them in the high-end GPU conversation now.
 
I think AMD are going to be quite competitive with Big Navi when it releases. Nvidia got complacent with Turing and had to re-release their mid-range cards, else AMD would have been ahead. I fully expect them to surpass the 2080 Ti and be challenging the 3080 Ti, especially if the latter only moves to 10nm.
 

+1

AMD have finally cleared out their technical debt from GCN with a big jump in perf/watt for RDNA, so hopefully they've addressed scaling beyond 40 CUs, where we all know GCN scaled terribly.

The other positive factor is that AMD have moved away from one design for both compute and graphics, which is going to bring huge benefits to gamers.

Architectural improvements plus a doubling of CUs should be able to deliver a 2x performance increase (assuming good scaling), which I'm guessing is roughly where the 3080 Ti is going to come in.
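To show what a doubling of CUs might actually buy, here's a very rough sketch; the scaling efficiency and per-CU uplift figures are pure guesses for illustration, not leaked specs:

```python
# Hypothetical "Big Navi" scaling estimate -- all inputs are assumptions.
base_cus     = 40    # Navi 10 / RX 5700 XT
big_navi_cus = 80    # assumed doubling of CUs
scaling_eff  = 0.85  # assumed: doubling units rarely scales perfectly
arch_uplift  = 1.10  # assumed per-CU/clock improvement for the new architecture

relative_perf = (big_navi_cus / base_cus) * scaling_eff * arch_uplift
print(f"Estimated performance vs 5700 XT: ~{relative_perf:.1f}x")  # ~1.9x
```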
 

Yet a 7nm RDNA1 5700 XT only just about beats/reaches the performance per watt of a 16nm, three-year-old 1080 Ti, or of a 12nm Turing chip (with a bigger die and more features).

So AMD still have a long way to go in that regard.
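Rough numbers behind that perf/watt point; the board power figures are the official TBP/TDP, but the relative performance number is my own ballpark from reviews:

```python
# Approximate perf/watt comparison: RX 5700 XT (7nm RDNA1) vs GTX 1080 Ti (16nm Pascal).
rx_5700_xt  = {"board_power_w": 225, "rel_perf": 0.95}  # vs 1080 Ti = 1.00
gtx_1080_ti = {"board_power_w": 250, "rel_perf": 1.00}

ppw_navi   = rx_5700_xt["rel_perf"] / rx_5700_xt["board_power_w"]
ppw_pascal = gtx_1080_ti["rel_perf"] / gtx_1080_ti["board_power_w"]
print(f"5700 XT perf/watt vs 1080 Ti: {ppw_navi / ppw_pascal - 1:+.0%}")  # ~+6%
```

A single-digit-percent lead despite roughly two node shrinks is exactly the "only just about beats it" situation described above.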
 