Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
That looks... really, really average, TBH.

I must agree, I haven't seen anything impressive yet. Maybe it's compression, maybe it's the lower res, but I certainly wouldn't be creaming my pants over that. AND it's at less than 1080p, ffs, with DLSS wizardry upscaling improving the image quality to boot... with a £1200 GPU, lol. It's sort of where I expected video games to be about 5 years ago, although I'd hope that at 'better than native' res it will look better. I remember being in Rio for the first time, looking out of the top-floor hotel window and literally thinking 'Wow, it looks like Crysis'. With this I'm suspecting I'll see a Fallout game with slightly better lighting at 45fps and 1080p on a £1200+ GPU. It's really graphically pedestrian and iterative for me.
If the machine & specs hadn't been pointed out, I'd have assumed it was running on a current gen console, albeit the Pro or X.
Huh? I thought it was 1080p running at 4K? That is what Digital Foundry said, and Skill Up said the same.
You know exactly what I mean. I'm not getting into tribalism. Intel is also good at it; most successful companies are, especially Apple. Plenty of consumers have bought inferior products by only looking at the headlines.

A good example today: lots of stories about Nvidia being "first" to DX12 Ultimate support. Academic at the moment, but great marketing. The consumer hears "Nvidia first" and "new technology", which keeps a positive vibe around the name of the company regardless of whether the end user understands it.

AMD are getting better at this. Few buyers research things to the depth the posters here do, so that kind of positive news influences buying decisions. Most sales are made on feelings, not facts, unfortunately.
I will be testing out many graphical settings before starting to play Cyberpunk 2077. First I'll try native 4K, then DLSS 2.0 with 1440p and 1080p upscaled to 4K. Then the same settings again with RT on. After that I'll start turning off depth of field and a bunch of other settings that don't do it for me, one by one, and see how that impacts the image quality. Finally, I'll lower settings from, say, Ultra to High and see if I can tell the difference without needing a screenshot. I would rather put the grunt into settings that have a noticeable impact on image quality!
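For what it's worth, that sweep can be written down as a simple test matrix. A minimal sketch, assuming hypothetical setting names (only depth of field is named in the post above; the other effects are illustrative stand-ins, not Cyberpunk 2077's actual config keys):

```python
# Hypothetical sketch of the settings sweep described above.
from itertools import product

RENDER_MODES = ["native 4K", "DLSS 2.0 1440p -> 4K", "DLSS 2.0 1080p -> 4K"]
RT_STATES = [False, True]
# Effects to switch off one at a time after the baseline passes;
# only depth of field is from the post, the rest are placeholders.
OPTIONAL_EFFECTS = ["depth of field", "motion blur", "film grain", "chromatic aberration"]

def test_plan():
    """Yield test configurations in the order the post describes."""
    # Pass 1: every render mode, first with RT off, then with RT on.
    for rt, mode in product(RT_STATES, RENDER_MODES):
        yield {"mode": mode, "rt": rt, "disabled": []}
    # Pass 2: disable one more optional effect each run and compare image quality.
    for n in range(1, len(OPTIONAL_EFFECTS) + 1):
        yield {"mode": RENDER_MODES[0], "rt": True, "disabled": OPTIONAL_EFFECTS[:n]}

for cfg in test_plan():
    print(cfg)
```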
TBH, personally I'd only be happy with DLSS in a situation where it made ray tracing (as in being the core of the lighting engine, not just incidental effects) feasible at good framerates. I'm not personally a fan of something that isn't 100% faithful to the original, even when the result is consistent. In a multiplayer game, for instance, say BF4 kind of stuff, some of the distant details it reproduces could skew your perception of actual events.
Not a fan myself, but I suppose they have to start somewhere. That said, I still skipped the 2000 series cards, and looking back I am glad I did. Not one game I am interested in playing got RT/DLSS. Now at least I will have a much improved implementation of DLSS available to me, and more RT horsepower in the form of a 3070 than even the best Turing could provide.
Going forward I may just upgrade every gen (assuming RT performance keeps improving exponentially) and keep going for the xx70 or whatever AMD's equivalent is. I will be keeping my RX 580 as a spare and just use that for a few months in between transitions. Right now people seem happy to pay near full price for used cards even when new cards are around the corner; 2070 Supers are still selling for over £400 second-hand on the Members Market. Bonkers, but it works for me!
Yeah, not a single game I've been interested in playing so far has justified a Turing card for me. I do quite a bit with Quake 2 RTX, but on its own it just doesn't justify the price, even if I do have to play it at 800x600 with all RTX settings turned up to get playable framerates on my 1070, LOL.
Don't worry, the 3070 will sort your RT frames right out. It will be a huge upgrade for you. They just need to provide near-2080Ti rasterisation performance and keep the price below £499.
"The way it's meant to be paid"?
Personally I will spend whatever it takes to get what I need/want; it's just that nothing demands or justifies it at the moment for what I do (that, and I'd feel like a right mug buying some of these cards when you are paying top money for what is really mid-range hardware).
It was upscaled to 1080p
Nvidia 'The way you're meant to be played'
One problem with your post: Nvidia have had the better products most of the time since the 9700 Pro, and I'm not just talking about the high end.

AMD haven't lived up to their marketing; that's part of the problem. "Poor Volta", "overclockers' dream", etc.

Then they have messed up a lot of launches (Hawaii, Fury, Polaris, Vega), which leaves a bad impression.

Nvidia being first to market with DX12 Ultimate isn't academic. If you have an RDNA 1 card you will have to upgrade to an RDNA 2 card to get those features, whereas Turing cards support them now, despite being out almost a year before Navi.

Based on AMD's performance in the GPU market over the last ten to fifteen years, I would say it's damn lucky for AMD that people buy based on feelings over facts. I have bought AMD cards knowing that the Nvidia card was better, purely because I have always preferred AMD. There have been a few good moments for AMD, but they have been few and far between. I hope RDNA 2 is the start of a glorious future for AMD, and I really hope they give me a reason to switch back.
Yeah, I pretty much feel the same. That is why I don't fancy going for more than a 3070, as there are huge diminishing returns on price-to-performance beyond that point.
Bloody hell, that's crazy. No wonder they are not near release yet; they must have a lot of optimisation to do. The worst part is that apparently in that video there were many instances where RT was not even on? How the hell will this game run on an Xbone? It's going to look rubbish at 480p or something.
Well, I have no intention of buying any new Turing/RDNA2 GPU unless it's a good performance jump for each of the price tiers, and I am happy to wait until CDPR again has to climb down and improve performance. In the end, looking at what the IMF said yesterday (that many stocks are very overvalued now and have not kept up with reality), it makes me wonder how long the current pricing structure is going to last if things get shaky.
With the way things are looking, brute force doesn't appear to be enough this time around.

I'm looking for cards that have the same amount of GDDR6 as well.

We've been clearly told that a 2080 Ti isn't capable of playing CP2077 at 1080p. Part of that, IMO, is due to the lack of bandwidth. If it was Pac-Man ray traced it would be fine, but it's those open-world environments that become an issue.
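For reference, a card's headline memory bandwidth falls straight out of its data rate and bus width. A quick sketch of the arithmetic (the 2080 Ti figures are its public specs; the formula is the standard one):

```python
# Peak memory bandwidth: data rate per pin (Gbps) * bus width (bits) / 8
# gives GB/s. The RTX 2080 Ti ships 14 Gbps GDDR6 on a 352-bit bus.
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gbs(14, 352))  # 616.0 GB/s for the 2080 Ti
```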
The build people got to play was run on a system containing a 2080 Ti, at 1080p with DLSS (so a lower-than-1080p internal render) and with only some of the ray-traced features enabled, and to top it off it still dropped well below 60fps.

It will need some serious optimisation before launch, I think!
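To put numbers on "lower than 1080p rendering": DLSS upscales from a fixed fraction of the output resolution per axis. A rough sketch (the scale factors are DLSS 2.0's published mode ratios; which mode the demo actually used isn't stated):

```python
# Approximate per-axis render scales for DLSS 2.0's modes (published ratios:
# Quality ~0.67, Balanced ~0.58, Performance 0.50, Ultra Performance ~0.33).
DLSS_SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Internal render resolution feeding a DLSS output of out_w x out_h."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# A 1080p DLSS output, as in the demo described above:
print(internal_resolution(1920, 1080, "quality"))      # (1280, 720)
print(internal_resolution(1920, 1080, "performance"))  # (960, 540)
```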
Or maybe the hardware just isn't fast enough?
Are you saying the question for the next 15 years will be "But can it run Cyberpunk?"?