really getting fed up with the posts stating RTX/DLSS does not work this gen

Not watched it, but I'm guessing it says it's disappointing but lays the foundation. That's my assessment one year on.

Just hope that now devs have had the tech for a while, we'll see something better next year.

It's a very interesting vid, worth watching imo. It's more on the technical side of what RT means & how it's achieved rather than an impressions video like Hardware Unboxed's. Very illuminating about the future of RT & Turing. I think much the same as he did for the same reasons, hence I'm waiting for a beast with HBM for RT rather than just these meh GDDR6 cards.
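
To put rough numbers on the HBM point (a back-of-envelope comparison from published specs, with Radeon VII used purely as a shipping HBM2 example, not as an RT card):

    # Peak memory bandwidth in GB/s from bus width and per-pin data rate.
    def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
        return bus_width_bits / 8 * data_rate_gbps

    gddr6_2080ti = bandwidth_gb_s(352, 14)   # RTX 2080 Ti: 352-bit GDDR6 @ 14 Gbps
    hbm2_radeon7 = bandwidth_gb_s(4096, 2)   # Radeon VII: 4096-bit HBM2 @ 2 Gbps

    print(f"RTX 2080 Ti (GDDR6): {gddr6_2080ti:.0f} GB/s")  # ~616 GB/s
    print(f"Radeon VII (HBM2):   {hbm2_radeon7:.0f} GB/s")  # ~1024 GB/s

RT workloads do a lot of scattered memory reads during BVH traversal, so that extra bandwidth on a very wide bus is where HBM would plausibly help.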
 
Seems to have missed the point on a few things though. nVidia's current denoising solution is, on balance, far superior to other attempts, and repurposing that hardware as more general-purpose processing units wouldn't be an advantage. Likewise, the RT cores provide a performance benefit you wouldn't get from a more general-purpose use of the die space: currently around 6x the RT performance of Pascal, whereas spending the same space on general-purpose units would give at most roughly a 1.6x improvement over Pascal (probably less).
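
Taking the quoted 6x and 1.6x figures at face value (they're the post's numbers, not measurements), the dedicated-hardware argument works out like this:

    # Back-of-envelope comparison of two ways to spend the same die area,
    # normalised to Pascal's RT throughput = 1.0.
    pascal_rt = 1.0

    rt_cores = 6.0 * pascal_rt   # dedicated RT cores (Turing's approach)
    general  = 1.6 * pascal_rt   # same area spent on extra general shader units

    print(f"Dedicated RT cores: ~{rt_cores / general:.2f}x faster at RT "
          "than spending the area on general-purpose units")  # ~3.75x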
 
Not worth the price at all. The 2080ti is way overkill for current games so it's a waste of money. Hardly anyone is playing in 4k and even mid range cards run RDR2 pretty well.
 
Find a game which needs a £1000+ gpu to run well at 1080 or 1440 (what most are using)...
There is also 1440p UW at 100+ Hz, which requires about the same amount of GPU power as 4K60 (rough numbers after this post).
I did have a 2080 Ti and it struggled at that resolution in many games on high/ultra settings (by struggled, I mean it didn't get anywhere near 100 fps).
I agree that £1000 is overkill to pay, but performance-wise not so much.
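
The pixel-throughput sums behind that claim, assuming 3440x1440 for "1440p UW" (the usual 21:9 ultrawide resolution):

    # Pixels the GPU has to shade per second at each target.
    uw_100 = 3440 * 1440 * 100   # 1440p ultrawide at 100 Hz
    uhd_60 = 3840 * 2160 * 60    # 4K at 60 Hz

    print(f"1440p UW @ 100 Hz: {uw_100 / 1e6:.0f} Mpix/s")  # ~495 Mpix/s
    print(f"4K @ 60 Hz:        {uhd_60 / 1e6:.0f} Mpix/s")  # ~498 Mpix/s

Both land near 500 Mpix/s, so the two targets really are in the same ballpark (ignoring per-frame fixed costs, which slightly favour the 60 Hz target).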
 
But like I said, 4K isn't what most gamers are using. It kills performance on anything, and it doesn't actually look any better than 1440p.
 
Head over to Specsavers. Been using 4K since 2014 and the difference is night and day.

Each to their own, but I tried to go back to 1440p and could not, as you miss out on a lot of clarity in detail. I remember back in 2015 testing both side by side with a FIFA game: the grass looked so much clearer and sharper at 4K, and that's not exactly the most graphically intensive game. Don't get how people don't see the difference.
 
Doesn't look any better? Say what... Maybe if you have a 1440p screen, but putting out 1440p on a 4K screen is not even close to 4K visuals. If it was, I'd be playing all my games at 1440p 120 Hz on my TV, but I give up the refresh to play 4K 60 Hz.

There are games which need a $1000 GPU at 1080/1440 - see RDR2 :p
 
Non-native resolutions never look as good (quick sketch of why after this post).

RDR2 runs fine on mid range cards. In fact, people are having a better experience with it on AMD cards.
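
A minimal illustration of the non-native scaling point (a toy calculation, not a display-quality model):

    # A 4K panel is 2160 pixels tall. Integer scale factors map source
    # pixels cleanly onto panel pixels; fractional ones force interpolation.
    panel_height = 2160
    for src_height in (1080, 1440):
        scale = panel_height / src_height
        kind = "integer (can stay sharp)" if scale.is_integer() else "fractional (interpolated, soft)"
        print(f"{src_height}p on a {panel_height}p panel: {scale:g}x -> {kind}")
    # 1080p is an exact 2x; 1440p is 1.5x, hence the softness at non-native res.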
 
For me, using 4K is far more important than whether a game has ray tracing.

Even when a game supports SLI, like SOTTR, a couple of RTX Titans are only just up to the job with all the settings turned on.

Having said that, you can use almost any card to run 4K if you turn off enough settings.
 
Sometimes people just say that 4K does not look any better to convince themselves they're not missing out on anything, I reckon. Either that or they need to go to Specsavers :p
 
Unless you're spending a lot of money on the higher-end 4K screens, they actually have worse picture quality. If you want something with 120 Hz (more important for gaming) AND a good picture, you need to spend closer to £2000...

Gaming at 1440p with 120 Hz is much nicer than on a sloppy 4K panel.
 