NVIDIA ‘Ampere’ 8nm Graphics Cards

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,511
Location
Greater London
Yes, it's why I was able to play games at 4k on a 780 Ti 5 (6?) years ago.
Same here mate, been gaming at 4K since 2014. Best part is, even with optimised settings it looks better than 1440p at maximum settings. But back then people pretended they couldn't see the difference between 1440p and 4K. Now they are all upgrading in droves and finally admitting that it is better :p :D
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
So I just watched the video again, but this time on my iPhone OLED, then my TV OLED, and then a third time back on the IPS monitor.

On the IPS monitor there is a ton of crushed black detail like you say. On the OLEDs there is way, way more detail to see and it's much clearer.

So it would appear they've redone the lighting in the game at night to give it that Game of Thrones look - basically the game isn't supposed to look pitch black, it's a very low level of black - and unfortunately non-OLED and non-FALD screens have a hard time producing the image correctly, often creating an image that looks darker and crushed.

Good guy CD Projekt Red: not just forcing you to buy a new graphics card but forcing you to buy a high-end TV too. What a time to be alive!
Trouble is you can count on one hand the number of PC monitors with either OLED or FALD.

If the picture only looks right on those devices, that's an issue.
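A rough illustration of why the darker shades clump together on a typical LCD - this is a simplified black-level model with made-up numbers, not measurements of any particular screen:

```python
# Simplified model of "black crush": a scene graded with very dark but
# distinct shades, shown on displays with different native black levels.
# All numbers are illustrative only.

shades = [0.002, 0.005, 0.010, 0.020, 0.040]  # scene luminance, fraction of peak white

def displayed(luminance, black_level):
    """A display can't go below its native black floor, so anything
    darker than that floor collapses into the same shade."""
    return max(luminance, black_level)

oled_black = 0.0     # per-pixel emissive: effectively zero black level
ips_black  = 0.012   # edge-lit LCD: raised black floor (made-up figure)

for shade in shades:
    print(f"scene {shade:.3f} -> OLED {displayed(shade, oled_black):.3f}, "
          f"IPS {displayed(shade, ips_black):.3f}")

# On the OLED all five shades stay distinct; on the LCD the three darkest
# ones all land on the 0.012 floor and read as one block of crushed black.
```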

Simple: the 2018 demo is at night (or the lighting of the scene is faulty), while 2020 is during the day (the sun is up and to the right, judging by the shadow on that wall, so light comes from the front). Also, most likely in the 2020 demo the car is under/inside some sort of passage/bridge. And I guess the back window could turn "solid" if they so desire (see the demo with the "solid" windows: he gets inside, and after he turns the car on they turn "clear"). It's perfectly valid. :)

PS: Just because RT can simulate how natural light works, it doesn't mean the way it behaves in real life (or in the game) is what the artist composing the scene wants. Film studios, or even a single photographer, can need quite a bit of equipment to light a scene properly for the dramatic effect they're after. So if they don't add extra light to the scene somehow while using RT, the lighting may be natural and correct, yet not as dramatic as intended with the original light sources. That's why, I guess, having two ways of doing things (RT + faking it) can make it troublesome to maintain the same overall mood within the same scene.
Speculation aside, at the end of the day I don't want to play a game where everything is so pitch black I can't see a damn thing :p The 2018 version seemed to be on the dark side, but watchable. The 2020 version (on this IPS display) is virtually pitch black.
 
Soldato
Joined
3 Sep 2008
Posts
3,401
So I just watched the video again, but this time on my iPhone OLED, then my TV OLED, and then a third time back on the IPS monitor.

On the IPS monitor there is a ton of crushed black detail like you say. On the OLEDs there is way, way more detail to see and it's much clearer.

So it would appear they've redone the lighting in the game at night to give it that Game of Thrones look - basically the game isn't supposed to look pitch black, it's a very low level of black - and unfortunately non-OLED and non-FALD screens have a hard time producing the image correctly, often creating an image that looks darker and crushed.

Good guy CD Projekt Red: not just forcing you to buy a new graphics card but forcing you to buy a high-end TV too. What a time to be alive!

They aren't forcing you to do anything; ultimately developers have a vision for their game and what it takes for it to look its best. The game looks great to me, and that's coming from watching compressed YouTube on an IPS display. Maybe I have a low tolerance? /shrug
 
Soldato
Joined
24 Aug 2013
Posts
4,549
Location
Lincolnshire
The build people got to play was run on a system containing a 2080 Ti, at 1080p with DLSS (so a lower-than-1080p internal render), with only some of the ray-traced features enabled, and to top it off it still dropped well below 60 fps.

Will need some serious optimisation before launch, I think!

Imagine the hard time the consoles are going to have running it with much lower specs. Even the next-gen consoles are well below a 2080 Ti.

C2077 is going to be Nvidia's flagship RTX game, so I fully expect Ampere to be the generation they showcase it with soon enough.

Just hoping RTX performance is much improved, as that's what's most likely hogging all the power.
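For anyone wondering what "lower than 1080p rendering" actually means, here's a rough calculation using the commonly reported DLSS 2.0 scale factors (the exact mode used in the hands-on isn't confirmed, so treat this as a ballpark):

```python
# Internal render resolution for a 1920x1080 output at the commonly
# reported DLSS 2.0 scale factors; the mode used in the hands-on demo
# isn't stated, so all of these are just for scale.

output = (1920, 1080)
scale_factors = {
    "Quality":     2 / 3,   # ~0.667 per axis
    "Balanced":    0.58,
    "Performance": 0.50,
}

for mode, s in scale_factors.items():
    w, h = round(output[0] * s), round(output[1] * s)
    print(f"{mode:12s}: renders ~{w}x{h}, then upscales to 1920x1080")

# e.g. Quality mode renders roughly 1280x720 internally.
```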
 
Soldato
Joined
10 Oct 2012
Posts
4,421
Location
Denmark
So I just watched the video again, but this time on my iPhone OLED, then my TV OLED, and then a third time back on the IPS monitor.

On the IPS monitor there is a ton of crushed black detail like you say. On the OLEDs there is way, way more detail to see and it's much clearer.

So it would appear they've redone the lighting in the game at night to give it that Game of Thrones look - basically the game isn't supposed to look pitch black, it's a very low level of black - and unfortunately non-OLED and non-FALD screens have a hard time producing the image correctly, often creating an image that looks darker and crushed.

Good guy CD Projekt Red: not just forcing you to buy a new graphics card but forcing you to buy a high-end TV too. What a time to be alive!

Well, this has been a problem on the monitor market for a long time. Gaming monitors are generally ****. Even the very best gaming monitors are barely acceptable when compared to the cheaper OLEDs on the TV market. We need an evolution in the PC monitor market before we start worrying about better fidelity from the game engines themselves. What point is there for a game developer to spend thousands of hours on fine details like that if it's just going to be ruined by ****** TN, VA and IPS panels with their 8-bit colors (if you're lucky, 8-bit + FRC), backlight bleed, glow issues, black crush, bad color banding and gradation, and so on and so forth? The weakest link will always drag down the rest of a system.
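Quick aside on the "8-bit + FRC" bit, since not everyone knows what it means - a rough sketch of the idea (temporal dithering), with simplified numbers:

```python
# Simplified idea behind FRC (temporal dithering): an 8-bit panel fakes a
# 10-bit value by alternating between the two nearest 8-bit levels, so the
# average over a few frames lands on the intended shade.

def frc_frames(target_10bit, n_frames=4):
    """Return the 8-bit level shown on each of n_frames frames."""
    low = target_10bit // 4               # nearest 8-bit level below
    frac = (target_10bit % 4) / 4         # how far toward the next level
    high_frames = round(frac * n_frames)  # frames spent on the higher level
    return [low + 1] * high_frames + [low] * (n_frames - high_frames)

frames = frc_frames(514)          # a 10-bit value between 8-bit 128 and 129
print(frames)                     # [129, 129, 128, 128]
print(sum(frames) / len(frames))  # averages to 128.5, i.e. 514/4
```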
 
Soldato
Joined
9 Nov 2009
Posts
24,824
Location
Planet Earth
Imagine the hard time the consoles are going to have running it with much lower specs. Even the next-gen consoles are well below a 2080 Ti.

C2077 is going to be Nvidia's flagship RTX game, so I fully expect Ampere to be the generation they showcase it with soon enough.

Just hoping RTX performance is much improved, as that's what's most likely hogging all the power.

It won't be unless tons of Pascal and Turing owners complain about performance. We saw what happened with W3: Kepler owners complained for months, AMD owners complained for months, and eventually CDPR patched the game and added extra options. They basically implemented GameWorks features months before release, as it was sponsored by Nvidia.

CDPR is also delaying the "next generation" console version until 2021, so it would be funny if performance were patched months after release... the same as with W3! ;)

It didn't even have all the RT on, and was still dropping under 30 fps in some places at less than 1080p. Ouch! :p

https://wccftech.com/cyberpunk-2077...dlss-2-0-enabled-on-an-rtx-2080ti-powered-pc/

With an RTX 2080 Ti! Looks like Turing won't age well, and I remember a number of people here buying Turing with an eye on Cyberpunk 2077! :(

I told you, Loadsamoney. This game is a vehicle to sell Ampere GPUs. Even the "next generation" version on consoles, which supports RDNA2 properly, won't be out until 2021:
https://www.videogameschronicle.com/news/full-next-gen-cyberpunk-2077-wont-be-a-launch-game/

Hopefully CDPR will see some of the negative feedback WRT performance and actually try to improve it on Turing and Pascal.

Well, this has been a problem on the monitor market for a long time. Gaming monitors are generally ****. Even the very best gaming monitors are barely acceptable when compared to the cheaper OLEDs on the TV market. We need an evolution in the PC monitor market before we start worrying about better fidelity from the game engines themselves. What point is there for a game developer to spend thousands of hours on fine details like that if it's just going to be ruined by ****** TN, VA and IPS panels with their 8-bit colors (if you're lucky, 8-bit + FRC), backlight bleed, glow issues, black crush, bad color banding and gradation, and so on and so forth? The weakest link will always drag down the rest of a system.

Exactly!! Most PC gamers are not going to be spending £500+ on super-duper gaming monitors or £1000+ on OLED TVs either, so even if such displays exist they are a niche. Mainstream monitors are barely improving.
 
Last edited:
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
They were running on a 2080 ti.
2080s are also poor at RT - common knowledge, and a weakness of the Turing series that will become apparent with new games. I would expect this game to run pretty badly on a 2080 Ti and great on a new-gen card, because from what I have seen it is looking truly next-gen and something worth upgrading for.
 
Soldato
Joined
3 Sep 2008
Posts
3,401
2080s are also poor at RT - common knowledge, and a weakness of the Turing series that will become apparent with new games. I would expect this game to run pretty badly on a 2080 Ti and great on a new-gen card, because from what I have seen it is looking truly next-gen and something worth upgrading for.

Well, having the expectation for a current cream-of-the-crop GPU to struggle at 1080p (even with RTX features enabled) does not exactly inspire confidence. Next-gen GPUs will have to absolutely blow the current offerings out of the water, and a lot of optimisation needs to be done to make the jump worth it.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
Unless all the crazy GDDR6X rumours turn out to be true, I don't know how people expect RT performance to skyrocket on the new arch.
They've been very good at keeping us completely in the dark, so who knows.

One thing I'm more sure of is that if there is an increase in perf, there will be an increase in price :p (And in fairness, AMD will be only too happy to follow where nV leads :p)
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,511
Location
Greater London
Trouble is you can count on one hand the number of PC monitors with either OLED or FALD.

If the picture only looks right on those devices, that's an issue.
He does not care, he has an OLED. Nana nana na na! :p

I got one too, but not as a monitor :(

Though I did get a much smaller and lighter PC case recently, so I might just take it downstairs for a while for one playthrough over the Xmas holidays :D
 
Soldato
Joined
14 Aug 2009
Posts
2,755
This is true. Deus Ex: Mankind Divided can be very hard to run, yet if you optimise the settings you can have the game look damn near as good - you need still shots to see the difference - and gain a lot of fps. As I recall the main culprit is Contact Hardening Shadows, which is meant to give more realistic shadows, but is it worth the huge fps drop? As I recall it was a 40% or more drop in fps. That is like the difference between a 1070 and a 1080 Ti. I would rather have the 1070 running cooler and quieter and have that option off :p

I find that in most games going from very high to ultra settings has a huge impact on fps with very little improvement in image quality. There are exceptions, but mostly it is this way. That is why I get away with xx70-class GPUs by tweaking settings in new and demanding games :D

The difference between that on and off on an RTX 2080 is basically 0%, so how each setting is handled depends heavily on the architecture and card. MSAA, on the other hand... :)
Overall the level of detail in these games is quite impressive. Digital Foundry said Cyberpunk has Deus Ex levels of detail but in a huge open world, so that's that!
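Back-of-the-envelope on the quoted 40% figure, assuming a 1080 Ti is roughly 65% faster than a 1070 (a ballpark that varies by game and resolution):

```python
# Sanity check on "a 40% fps drop is like the difference between a 1070
# and a 1080 Ti". The +65% gap is a rough assumption, not a benchmark.

gtx1070_fps = 60.0
gtx1080ti_fps = gtx1070_fps * 1.65           # assumed ~65% faster

setting_cost = 0.40                          # the quoted ~40% hit
gtx1080ti_with_setting = gtx1080ti_fps * (1 - setting_cost)

print(f"1070, setting off:    {gtx1070_fps:.0f} fps")
print(f"1080 Ti, setting off: {gtx1080ti_fps:.0f} fps")
print(f"1080 Ti, setting on:  {gtx1080ti_with_setting:.0f} fps")

# ~99 fps * 0.6 ≈ 59 fps, i.e. the 1080 Ti with the setting on lands
# right back around where the 1070 sits with it off.
```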

Speculation aside, at the end of the day I don't want to play a game where everything is so pitch black I can't see a damn thing :p The 2018 version seemed to be on the dark side, but watchable. The 2020 version (on this IPS display) is virtually pitch black.

Metro Exodus felt a bit dark as well, indeed, but only in some specific places. That's the way it is, I guess. They'll need to add more light to some of the scenes, or more reflective surfaces that bounce the light around in the right way. So it doesn't "just work"! :p ... or maybe we need better HDR monitors.

Anyway, you can always turn it off.
 
Caporegime
Joined
8 Sep 2005
Posts
27,421
Location
Utopia
Well, having the expectation for a current cream-of-the-crop GPU to struggle at 1080p (even with RTX features enabled) does not exactly inspire confidence. Next-gen GPUs will have to absolutely blow the current offerings out of the water, and a lot of optimisation needs to be done to make the jump worth it.
If it truly is a next-gen engine (like Crysis was in its day) with RT properly implemented, then I assume we can expect a heavy penalty on older generation cards.
Unless all the crazy GDDR6X rumours turn out to be true, I don't know how people expect RT performance to skyrocket on the new arch.
According to reports it will have over double the RT cores. As far as I know, RT capability is not dictated by memory bandwidth.
 
Associate
Joined
23 Dec 2018
Posts
1,101
I'll either play Cyberpunk 2077 at decent settings at 1440p/95 Hz on my 2070 Super, or I just won't buy it at all for years.

Many are in the same boat. I'm sure it'll still be decently playable with RTX off and other settings lowered.

I'm of the slightly controversial opinion that CD Projekt have yet to make a truly great or even very good game (Witcher 3: lack of customisation, very poor voice casting, boring, limited combat - and I made it all the way to Skellige, so I gave it a good chance), so I'm definitely not upgrading on the strength of an unreleased game - granted, one that does seem to improve a lot on my Witcher 3 complaints.
 
Soldato
Joined
22 Nov 2003
Posts
2,932
Location
Cardiff
In all honesty I'd rather have a scene lit for dramatic effect than lit accurately. As others have said before, accurate is not always "fun". Many RT scenes just end up looking too dark for me.
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
It won't be unless tons of Pascal and Turing owners complain about performance. We saw what happened with W3: Kepler owners complained for months, AMD owners complained for months, and eventually CDPR patched the game and added extra options. They basically implemented GameWorks features months before release, as it was sponsored by Nvidia.

CDPR is also delaying the "next generation" console version until 2021, so it would be funny if performance were patched months after release... the same as with W3! ;)

Your memory is failing you; that isn't what happened. The Kepler issues were nVidia's doing, and nVidia fixed it with a driver update - 353.06, to be exact. None of that had anything to do with CDPR. The options that were added were related to hair tessellation, something that affected almost everybody, but it's worth noting the tessellation amount was already configurable in an .INI file - it just wasn't exposed in the game at release.

Hopefully CDPR will see some of the negative feedback WRT performance and actually try to improve it on Turing and Pascal.
Yeah CDPR are soo lazy :confused:
 
Soldato
Joined
9 Nov 2009
Posts
24,824
Location
Planet Earth
Your memory is failing you; that isn't what happened. The Kepler issues were nVidia's doing, and nVidia fixed it with a driver update - 353.06, to be exact. None of that had anything to do with CDPR. The options that were added were related to hair tessellation, something that affected almost everybody, but it's worth noting the tessellation amount was already configurable in an .INI file - it just wasn't exposed in the game at release.


Yeah CDPR are soo lazy :confused:

Wrong - my memory isn't failing; yours is (if you want to use terms like that). Because Kepler performance was so poor, CDPR added extra options to the menu system so Kepler users could muck around with settings. If Nvidia had fixed it all, there wouldn't have been a "need" for the options.

It also didn't explain why AMD GPUs had problems too, i.e. when there was a GCN-based console port - AMD called out CDPR publicly for what they were doing. The console port does some of the stuff, such as hair animation, in a different way. At Nvidia's behest they added GameWorks options two months before launch, which replaced the existing implementations.

"We've been working with CD Projeckt Red from the beginning," said Huddy. "We've been giving them detailed feedback all the way through. Around two months before release, or thereabouts, the GameWorks code arrived with HairWorks, and it completely sabotaged our performance as far as we're concerned. We were running well before that... it's wrecked our performance, almost as if it was put in to achieve that goal."

Nobody forced CDPR to use GameWorks, or stopped them from properly adding options at launch so Kepler users could easily adjust settings. They knew very well that most gamers are not dorks on forums who go searching for text files, etc.

I don't blame Nvidia for that, as they wanted to sell Maxwell GPUs; CDPR entered the agreement knowing what was required of them.

Most games companies will test performance on lots of hardware prior to launch - or do you think they make up the technical specs required to run a game?

So CDPR would have known about the performance issues at launch, yet chose to launch it that way. They are both the developer and publisher of the game.

Most Nvidia users were not on Maxwell GPUs by then; they were on Kepler.

Next time, don't try to attack people to defend your love of CDPR. I remember how some reviewers were attacked over bugs/problems in W3 and massively harassed by the community. Then, months later, CDPR actually patched those problems. It is never, ever the fault of CDPR, just because they are "nice" to the community. They have got away with lots of bugs, optimisation problems, etc. that most developers would get criticised for.

The problem is the community isn't doing itself any favours, as lots of other developers started out that way and then morphed into different entities once they knew they could get away with more and more.

Well, who knows - maybe I will be wrong and it won't be a poorly optimised game when it launches. Hopefully I am!
 
Last edited: