It will likely be no more than 20% faster on average than an LC imo.
Yup, they're comparing it to their own stuttering Reference version imo.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
It will likely be no more than 20% faster on average than an LC imo.
The move to 7nm gives either ~25% more performance at the same power, or the same performance at roughly half the power consumption.
I think AMD should have gone for the latter route: Vega 64 performance at 150W to compete against the RTX 2060, instead of eking out extra performance with the same high power consumption as Vega 64.
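A rough way to see how one node shrink can back both claims ("25% more performance" and "half the power") is the classic CMOS dynamic-power rule of thumb, P ∝ C·V²·f. The voltage and frequency numbers below are hypothetical, picked only to illustrate the trade-off, not taken from any actual 7nm datasheet:

```python
# Illustrative sketch of the "25% more performance OR half the power" choice
# using the dynamic-power rule of thumb P = C * V^2 * f. All numbers are
# made up to show the shape of the trade-off, not real silicon figures.

def dynamic_power(v, f, c=1.0):
    """Relative dynamic power for supply voltage v and clock f."""
    return c * v * v * f

old = dynamic_power(v=1.0, f=1.0)

# Option A: spend the node gain on clocks (roughly the same power budget).
opt_a = dynamic_power(v=0.90, f=1.25)   # ~25% higher clocks at slightly lower V

# Option B: spend it on efficiency (same clocks, much lower voltage).
opt_b = dynamic_power(v=0.70, f=1.00)   # same performance, ~half the power

print(f"Option A power vs old node: {opt_a / old:.2f}x at 1.25x perf")  # ~1.01x
print(f"Option B power vs old node: {opt_b / old:.2f}x at 1.00x perf")  # ~0.49x
```

Under this toy model, AMD's choice was Option A; the post above argues Option B (a ~150W Vega 64) would have been the better product.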
People won't notice the difference unless examining really closely. There's been one review already that mentioned they could see imperfections, but when they examined the native image, it too had imperfections.

It can't/won't look better than native; it's purely a performance hack, attempting to get as close to the native image as possible with the least power requirement.
People won't notice the difference unless examining really closely. There's been one review already that mentioned they could see imperfections, but when they examined the native image, it too had imperfections.
I don't know the technical reasons, but it's more than simple "upscaling". People are not giving it enough credit IMO, usually those who are still justifying to themselves why they must not buy a 20-series card.

IMO, if DLSS gives a 25-40% increase in performance with little or no easily noticeable degradation in image quality (unless screenshotting and examining closely), it's a winner. If it improves image quality in some cases, even better.
I'm curious if you'd still be praising it if it was an AMD feature? The way I see it, it's very similar to other "cheating" methods of improving performance with little or no noticeable IQ drop; just as an example, AMD reducing tessellation levels. The fact of the matter is, the over-tessellation wasn't providing any IQ benefit, it was just demolishing AMD's performance (perhaps that was the intent, but we'll never truly know). However, AMD still got slammed for "cheating", likely by the same people praising DLSS, which as you say is definitely more than simple upscaling, but upscaling nonetheless.

Personally, I don't mind DLSS. I think it's very clever, and I definitely see a lot more performance hacks coming in the future to provide higher frame rates without affecting IQ. It is definitely interesting to see people's reactions to it, though.
https://www.techspot.com/article/1712-nvidia-dlss/
DLSS does not give better IQ than native 4K, but it certainly gives less of a performance hit than native 4K. Techspot compared it to 1800p upscaled and found similar IQ and almost identical performance.

I have had to correct some people actually mistakenly (and laughably) assuming that DLSS was going to give a free performance boost. I explained what DLSS actually does, provided the link above, and clarified that everything has a trade-off. They looked perplexed, because all the marketing and BS from most of the tech press seems to imply that when it is enabled in games it magically boosts performance for free.
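Some quick pixel arithmetic shows why the 1800p comparison lines up with the 25-40% gains quoted earlier. This is a back-of-envelope sketch that assumes shading cost scales roughly linearly with rendered pixels, which ignores geometry and CPU costs that don't shrink with resolution:

```python
# Back-of-envelope: how much shading work does rendering at 1800p and
# upscaling to 4K save versus native 4K? Assumes cost ~ pixels shaded,
# which is a simplification.

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)   # 8,294,400 pixels
res_1800p = pixels(3200, 1800)   # 5,760,000 pixels

ratio = res_1800p / native_4k
print(f"Pixels shaded at 1800p vs 4K: {ratio:.1%}")       # 69.4%
print(f"Theoretical shading work saved: {1 - ratio:.1%}")  # 30.6%
```

A ~30% reduction in pixels shaded sits right inside the 25-40% performance-uplift range, which is consistent with DLSS behaving like a smart upscaler rather than a free boost.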
I don't care who implemented the feature. I'm happy to buy AMD or Nvidia or Intel products.
I expect it to initially compete with the 2070, and then with driver enhancements it should catch the 1080 Ti, but I also expect the 2070 to catch the Ti.
Don't know if already posted but Radeon 7 is slower than both the 2080 and 1080Ti
https://www.eurogamer.net/articles/...-first-radeon-7-benchmark-results-in-25-games
Where are the cheap 7nm cards AdoredTV promised, dammit!
If you look at @Shaz12's other posts, you'll see he is completely pro-Nvidia. Just another poster who can't be happy in his choice of GPU and has to put down anything else.

They only show nice performance tables for the two games this article chose to highlight, then go on to say AMD's card beats the 2080 in other games in the narrative below the Assassin's Creed table. Definitely showing bias in their article. Why only show graphs of the R7 losing to the 2080?
I voted "wait for more info". If the price is lower than the 2080's for similar performance then I will contemplate it. If the $699 price is converted to UK prices it is ~£650 inc VAT, but of course we will get the inevitable price gouging.
Agreed.

Anyone that believes lowering resolution doesn't lower image quality needs their eyes tested.
DLSS is very much like the checkerboard rendering PlayStation has been using.

It's also just like turning the in-game resolution scale down from 100%, but the main difference is that DLSS does it on the fly, based on demand on the GPU.
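The resolution-scale comparison above can be sketched in a few lines. This is a toy illustration, not code from any real engine; the function name and the round-to-even-pixels choice are my own assumptions:

```python
# Toy model of an in-game resolution-scale slider: the GPU renders to a
# smaller internal target and the result is upscaled to the output size.

def internal_resolution(out_w, out_h, scale_pct):
    """Internal render-target size for a given resolution scale (percent)."""
    s = scale_pct / 100.0
    # Round down to even dimensions, as render targets typically are.
    return (int(out_w * s) // 2 * 2, int(out_h * s) // 2 * 2)

print(internal_resolution(3840, 2160, 100))  # (3840, 2160) - native 4K
print(internal_resolution(3840, 2160, 50))   # (1920, 1080) - quarter the pixels
```

The difference with DLSS (and with checkerboarding) is in the upscale step: instead of a plain bilinear stretch, the reconstruction tries to fill in detail so the output looks closer to native.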