
NVIDIA ‘Ampere’ 8nm Graphics Cards

Associate
Joined
16 Jan 2010
Posts
1,415
Location
Earth
That looks... really, really average, TBH.

If the machine & specs hadn't been pointed out, I'd have assumed it was running on a current gen console, albeit the Pro or X.
I must agree I haven't seen anything impressive yet. Maybe it's compression, maybe it's because it's lower res, but I certainly wouldn't be creaming my pants over that. AND it's at less than 1080 ffs, with DLSS wizardry upscaling improving the image quality to boot... with a £1200 GPU lol. It's sort of where I expected video games to be about 5 years ago, although I'd hope that at 'better than native' res it will look better. I remember being in Rio for the first time, looking out of the top-floor hotel window and literally thinking 'Wow, it looks like Crysis'. With this I'm suspecting I'll see a Fallout game with slightly better lighting at 45fps 1080p on a £1200+ GPU. It's really graphically pedestrian and iterative for me.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,570
Location
Greater London
I must agree I haven't seen anything impressive yet. Maybe it's compression, maybe it's because it's lower res, but I certainly wouldn't be creaming my pants over that. AND it's at less than 1080 ffs, with DLSS wizardry upscaling improving the image quality to boot... with a £1200 GPU lol. It's sort of where I expected video games to be about 5 years ago, although I'd hope that at 'better than native' res it will look better. I remember being in Rio for the first time, looking out of the top-floor hotel window and literally thinking 'Wow, it looks like Crysis'. With this I'm suspecting I'll see a Fallout game with slightly better lighting at 45fps 1080p on a £1200+ GPU. It's really graphically pedestrian and iterative for me.
Huh? I thought it was 1080p upscaled to 4K? That is what Digital Foundry said. Skill Up said the same.

I will be testing out many graphical settings before starting to play Cyberpunk 2077. First try native 4K, then DLSS 2.0 1440p and 1080p upscaled to 4K. Then again the settings above with RT on. Then after that I will start turning off depth of field and a bunch of other settings that don't do it for me, one by one, and see how that impacts the image quality. Finally, lower settings from say Ultra to High and see if I can tell the difference without needing to do a screenshot. I would rather put the grunt into settings that have a noticeable impact on image quality :D
 
Soldato
Joined
19 Dec 2010
Posts
12,031
You know exactly what I mean. I'm not getting into tribalism. Intel are also good at it; most successful companies are, especially Apple. Plenty of consumers have bought inferior products by only looking at the headlines.

A good example today: lots of stories about Nvidia being "first" to DX12 Ultimate support. Academic at the moment, but great marketing. The consumer hears "Nvidia first" and "new technology", which keeps a positive vibe around the name of the company regardless of whether the end user understands it.

AMD are getting better at this. Few buyers research things to the depth the posters here go to, so that kind of positive news influences buying decisions. Most sales are made on feelings, not facts, unfortunately.

One problem with your post: Nvidia have had the better products most of the time since the 9700 Pro. And I am not just talking high end.

AMD haven't lived up to their marketing; that's part of the problem. 'Poor Volta', 'overclockers' dream', etc.

Then they have messed up a lot of launches (Hawaii, Fury, Polaris, Vega), which leaves a bad impression.

Nvidia being first to market with DX12 Ultimate isn't academic. If you have an RDNA 1 card you will have to upgrade to an RDNA 2 card to get those features, whereas Turing cards support those features now, despite being out almost a year before Navi.

Based on AMD's performance in the GPU market over the last ten to fifteen years, I would say it's damn lucky for AMD that people buy based on feelings over facts. I have bought AMD cards knowing that the Nvidia card was better, purely because I have always preferred AMD. There have been a few good moments for AMD, but they are few and far between. I hope RDNA 2 is the start of a glorious future for AMD. And I really hope that they give me a reason to switch back.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,147
Huh? I thought it was 1080p upscaled to 4K? That is what Digital Foundry said. Skill Up said the same.

I will be testing out many graphical settings before starting to play Cyberpunk 2077. First try native 4K, then DLSS 2.0 1440p and 1080p upscaled to 4K. Then again the settings above with RT on. Then after that I will start turning off depth of field and a bunch of other settings that don't do it for me, one by one, and see how that impacts the image quality. Finally, lower settings from say Ultra to High and see if I can tell the difference without needing to do a screenshot. I would rather put the grunt into settings that have a noticeable impact on image quality :D

TBH personally I'd only be happy with DLSS in a situation where it made ray tracing (as in being the core of the lighting engine not just incidental effects) feasible at good framerates - I'm not personally a fan of something that isn't 100% faithful to the original, even when the result is consistent. In a multiplayer game for instance, say BF4 kind of stuff, some of the distant details it reproduces could skew your perception of actual events.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,570
Location
Greater London
TBH personally I'd only be happy with DLSS in a situation where it made ray tracing (as in being the core of the lighting engine not just incidental effects) feasible at good framerates - I'm not personally a fan of something that isn't 100% faithful to the original, even when the result is consistent. In a multiplayer game for instance, say BF4 kind of stuff, some of the distant details it reproduces could skew your perception of actual events.
Not a fan myself, but I suppose they have to start somewhere. That said, I still skipped the 2000 series cards and, looking back, I am glad I did. Not one game I am interested in playing got RT/DLSS. Now at least I will have a much-improved implementation of DLSS available to me, and more RT horsepower in the form of a 3070 than even the best Turing could provide.

Going forward I may just upgrade every gen (assuming RT performance keeps improving exponentially) and keep going for the xx70 or whatever AMD's equivalent is. Will be keeping my RX 580 as a spare and just use that for a few months in between transitions, as right now people seem happy to pay near full price for used cards even when new cards are around the corner. 2070 Supers are still selling for over £400 second hand on the members market; bonkers, but works for me :)
 
Man of Honour
Joined
13 Oct 2006
Posts
91,147
Not a fan myself, but I suppose they have to start somewhere. That said, I still skipped the 2000 series cards and, looking back, I am glad I did. Not one game I am interested in playing got RT/DLSS. Now at least I will have a much-improved implementation of DLSS available to me, and more RT horsepower in the form of a 3070 than even the best Turing could provide.

Going forward I may just upgrade every gen (assuming RT performance keeps improving exponentially) and keep going for the xx70 or whatever AMD's equivalent is. Will be keeping my RX 580 as a spare and just use that for a few months in between transitions, as right now people seem happy to pay near full price for used cards even when new cards are around the corner. 2070 Supers are still selling for over £400 second hand on the members market; bonkers, but works for me :)

Yeah, not a single game I've been interested in playing so far has justified a Turing card for me. I do quite a bit with Quake 2 RTX, but on its own it just doesn't justify the price, even if I do have to play it at 800x600 with all RTX settings turned up to get playable framerates on my 1070, LOL.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,570
Location
Greater London
Yeah, not a single game I've been interested in playing so far has justified a Turing card for me. I do quite a bit with Quake 2 RTX, but on its own it just doesn't justify the price, even if I do have to play it at 800x600 with all RTX settings turned up to get playable framerates on my 1070, LOL.
Don’t worry, the 3070 will sort your RT frames right out. Will be a huge upgrade for you. They just need to provide near 2080Ti rasterisation performance and keep price below £499.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,147
Don’t worry, the 3070 will sort your RT frames right out. Will be a huge upgrade for you. They just need to provide near 2080Ti rasterisation performance and keep price below £499.

Personally I will spend whatever money to get what I need/want - just nothing demands or justifies it at the moment for what I do (that and I'd feel like a right mug buying some of the cards when you are paying top money for what is really mid-range hardware).
 
Caporegime
OP
Joined
8 Jul 2003
Posts
30,062
Location
In a house
Huh? I thought it was 1080p upscaled to 4K? That is what Digital Foundry said. Skill Up said the same.

I will be testing out many graphical settings before starting to play Cyberpunk 2077. First try native 4K, then DLSS 2.0 1440p and 1080p upscaled to 4K. Then again the settings above with RT on. Then after that I will start turning off depth of field and a bunch of other settings that don't do it for me, one by one, and see how that impacts the image quality. Finally, lower settings from say Ultra to High and see if I can tell the difference without needing to do a screenshot. I would rather put the grunt into settings that have a noticeable impact on image quality :D

It was upscaled to 1080p :p

"The way it's meant to be paid"?

Nvidia 'The way you're meant to be played'

:D
 

ljt

Soldato
Joined
28 Dec 2002
Posts
4,540
Location
West Midlands, UK
Huh? I thought it was 1080p upscaled to 4K? That is what Digital Foundry said. Skill Up said the same.

I will be testing out many graphical settings before starting to play Cyberpunk 2077. First try native 4K, then DLSS 2.0 1440p and 1080p upscaled to 4K. Then again the settings above with RT on. Then after that I will start turning off depth of field and a bunch of other settings that don't do it for me, one by one, and see how that impacts the image quality. Finally, lower settings from say Ultra to High and see if I can tell the difference without needing to do a screenshot. I would rather put the grunt into settings that have a noticeable impact on image quality :D

The build people got to play was run on a system with a 2080Ti, at 1080p with DLSS (so lower than 1080p internal rendering), with only some of the ray-traced features enabled, and to top it off it still dropped well below 60fps.

Will need some serious optimisation before launch I think!
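To put the "lower than 1080p rendering" point in numbers, here is a rough sketch of the arithmetic. The per-axis scale factors below are the commonly cited ballpark values for the DLSS quality modes; they are assumptions for illustration, and the actual factors can vary by game and DLSS version.

```python
# Rough sketch: internal render resolution for DLSS output modes.
# Scale factors are commonly cited ballparks (assumptions), not
# guaranteed values for any particular game or DLSS version.
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def render_resolution(out_w, out_h, mode):
    """Return the approximate internal render resolution for a
    given output resolution and DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# A 1080p output in Performance mode renders internally at 960x540,
# i.e. well below 1080p before upscaling.
print(render_resolution(1920, 1080, "Performance"))  # (960, 540)
```

By the same arithmetic, 4K output in Quality mode would render internally at roughly 2560x1440.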
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,570
Location
Greater London
Personally I will spend whatever money to get what I need/want - just nothing demands or justifies it at the moment for what I do (that and I'd feel like a right mug buying some of the cards when you are paying top money for what is really mid-range hardware).

Yeah, pretty much feel the same. That is why I don’t fancy going for more than a 3070 as there are huge diminishing returns in price for performance at that point.


It was upscaled to 1080p :p



Nvidia 'The way you're meant to be played'

:D
The build people got to play was run on a system with a 2080Ti, at 1080p with DLSS (so lower than 1080p internal rendering), with only some of the ray-traced features enabled, and to top it off it still dropped well below 60fps.

Will need some serious optimisation before launch I think!
Bloody hell. That's crazy. No wonder they are not near release yet; they must have a lot of optimisations to do. Worst part is, apparently in that video there were many instances where RT was not even on. How the hell will this game run on an Xbone? Going to look rubbish at 480p or something :p
 
Soldato
Joined
6 Aug 2009
Posts
7,071
One problem with your post: Nvidia have had the better products most of the time since the 9700 Pro. And I am not just talking high end.

AMD haven't lived up to their marketing; that's part of the problem. 'Poor Volta', 'overclockers' dream', etc.

Then they have messed up a lot of launches (Hawaii, Fury, Polaris, Vega), which leaves a bad impression.

Nvidia being first to market with DX12 Ultimate isn't academic. If you have an RDNA 1 card you will have to upgrade to an RDNA 2 card to get those features, whereas Turing cards support those features now, despite being out almost a year before Navi.

Based on AMD's performance in the GPU market over the last ten to fifteen years, I would say it's damn lucky for AMD that people buy based on feelings over facts. I have bought AMD cards knowing that the Nvidia card was better, purely because I have always preferred AMD. There have been a few good moments for AMD, but they are few and far between. I hope RDNA 2 is the start of a glorious future for AMD. And I really hope that they give me a reason to switch back.

I have to say I prefer the image on my 5700XT vs my 1070. That said, my point still stands: Nvidia are the better marketeers. People were still buying their cards in large numbers even when AMD were better. I'm quite sure if AMD have a better card this time Nvidia will still outsell them; their mindshare is so strong. It's not just about "better", look at those in this thread that would never buy anything else. It's not either/or, better product or marketing, it's a combination, but when the chips are down the faithful still buy the marketing. That's not a criticism, it's just the way it is.
 
Soldato
Joined
6 Feb 2019
Posts
17,594
Yeah, pretty much feel the same. That is why I don’t fancy going for more than a 3070 as there are huge diminishing returns in price for performance at that point.




Bloody hell. That’s crazy. No wonder they are not near release yet, they must have a lot of optimisations to do. Worst part is apparently in that video there were many instances where RT was not even on? How the hell will this game run on an Xbone? Going to look rubbish at 480p or something :p

The resolution used for the demo is low because the game was using full scene ray tracing. As good as the 2080Ti is, it's too taxing. The game Control has the same issue if you enable all the ray tracing stuff at max: it can't even hold 1440p 60fps.

I really feel sorry for the consoles. The Xbox One is probably gonna have to run at 600p with all low settings, and even the Series X will probably get stuck at 1080p if they give ray tracing to consoles.

Good thing is that the game is future-proofed and the developer should not downgrade anything. In fact the above video shows the game's graphics are constantly getting upgraded.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
Well, I have no intention of buying any new Turing/RDNA2 GPU unless it's a good performance jump for each of the price tiers, and I am happy to wait until CDPR actually again has to do a climb down and improve performance. In the end, looking at what the IMF said yesterday, they said many stocks are very overvalued now and have not kept up with reality, so it makes me wonder how long the current pricing structure is going to stay if things get shaky.
With the way things are looking, brute force doesn't appear to be enough this go around. I'm looking for cards that have the same amount of GDDR6 as well.

We've been clearly told that a 2080Ti isn't capable of playing CP2077 at 1080p. Part of that, IMO, is due to the lack of bandwidth. If it was Pac-Man ray traced it would be fine, but it's those open world environments that become an issue.
 
Soldato
Joined
6 Feb 2019
Posts
17,594
With the way things are looking, brute force doesn't appear to be enough this go around. I'm looking for cards that have the same amount of GDDR6 as well.

We've been clearly told that a 2080Ti isn't capable of playing CP2077 at 1080p. Part of that, IMO, is due to the lack of bandwidth. If it was Pac-Man ray traced it would be fine, but it's those open world environments that become an issue.

Doesn't look like anyone has been able to brute force real time ray tracing. When you introduce a new graphics tool that suddenly increases the GPU load by several hundred percent, that's going to take time to iron out. Most ray traced content in the world today is still movies that get rendered on supercomputers over a period of weeks or months, not even close to real time.
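The offline-versus-real-time gap the post describes can be put in rough numbers. Both figures below are ballpark assumptions for illustration (a 60fps frame budget, and an often-quoted order of magnitude for film rendering), not measurements of any particular production.

```python
# Illustrative comparison (assumed ballpark numbers, not measurements):
# real-time frame budget vs offline film-style rendering.
REALTIME_BUDGET_S = 1 / 60       # ~16.7 ms per frame at 60fps
OFFLINE_HOURS_PER_FRAME = 10     # assumed ballpark for offline film CG

offline_s = OFFLINE_HOURS_PER_FRAME * 3600
ratio = offline_s / REALTIME_BUDGET_S

# With these assumptions, one offline frame costs millions of times
# the real-time budget, which is why real-time ray tracing leans so
# hard on denoising and upscaling rather than brute force.
print(f"offline frame takes ~{ratio:,.0f}x the real-time budget")
```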
 
Soldato
Joined
10 Oct 2012
Posts
4,421
Location
Denmark
The build the people got to play was run on a system containing a 2080Ti, and was run at 1080p with DLSS (so lower than 1080p rendering) with only some of the ray traced features, and to top it off it still dropped well below 60fps.

Will need some serious optimisation before launch I think!

Or maybe the hardware just isn't fast enough?
 