Yup, it was first posted by @docsonic!
Just seen it posted above. It's starting!
AMD need to keep dropping little hints like this right through until October.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Trouble is Nvidia would have cleaned up by then. Lol.

If it's $549 or even $599 for a 16GB 3080-matching RDNA2 using less power, they will surely clean up.
There's so much hype around DLSS and RTX IO, though, that I think people need assurances that AMD will have equivalent features.
I do use the sharpening from 2160p down to 1800p and things look fine. Below that, picture quality suffers too much on my 4K monitor. DLSS can provide an extra 50% performance on top of that; that's still a huge deal.

We already know they do, at least in some form. They have sharpening and IO technology in the consoles. It would be unfathomable of them not to replicate this in RDNA2 cards.
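For anyone wanting the rough numbers behind that trade-off, here's a quick back-of-the-envelope sketch (assuming performance scales linearly with pixel count, which is only approximately true, and taking the ~50% DLSS figure from the post above):

```python
# Pixel math for rendering at 1800p and sharpening/upscaling to 4K.
native = 3840 * 2160   # 2160p: ~8.3M pixels
scaled = 3200 * 1800   # 1800p: ~5.8M pixels

# Fraction of the 4K pixel work actually rendered at 1800p (~69%).
print(f"1800p renders {scaled / native:.0%} of the 4K pixels")

# Implied frame-rate uplift from the resolution drop alone (~44%),
# before whatever a DLSS-style upscaler adds on top of that.
print(f"implied uplift: {native / scaled - 1:.0%}")
```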
This gets trotted out every single time... bear in mind the people claiming yield issues also claimed the 3080/90 would be on TSMC 7nm.
The problem is that a DLSS alternative would still most likely require per-game implementation, and that means it will be excluded from Nvidia games which already feature DLSS. Most importantly, Nvidia will still have an insurmountable advantage in Cyberpunk 2077 (and a host of other upcoming major releases this holiday season), and I don't think most people are going to be happy giving that up. I know I sure as heck wouldn't.
Of course every logical person is waiting, and for many of us the 3080 is dead on arrival with its 10GB VRAM.

FYI I'm still waiting. If AMD drop a 3070-equivalent card (which is what I'm expecting) with 16GB VRAM, it should change the pricing of the 3080 and 3070 a little. Also, we're all buying new GPUs for Cyberpunk... why not wait till it's released?
That's a 3080 competitor.
Nvidia knew this and got in first.
According to Coreteks, it's a 3070 competitor.
Not necessarily; it depends on how the consoles want to handle resolution scaling. I'm sure they will want an API-level version of it.

RDNA1 already does a pretty good job of it without the use of special hardware. I use it globally; it just needs that extra 5% to bring it in line with DLSS 2.
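For the curious, here's roughly what that kind of contrast-adaptive sharpening does, in the spirit of AMD's open-source FidelityFX CAS. This is a simplified numpy sketch of the idea, not the shipped shader; the function name and the strength parameter are my own illustration:

```python
import numpy as np

def contrast_adaptive_sharpen(img, strength=0.8):
    """Simplified CAS-style sharpen. img: float32 array (H, W, C) in [0, 1]."""
    # Pad edges so every pixel has a full neighbourhood.
    p = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    # Centre plus the four cross-shaped neighbours.
    c = p[1:-1, 1:-1]
    n = p[:-2, 1:-1]
    s = p[2:, 1:-1]
    w = p[1:-1, :-2]
    e = p[1:-1, 2:]
    lo = np.minimum.reduce([c, n, s, w, e])
    hi = np.maximum.reduce([c, n, s, w, e])
    # Local contrast term: the sharpening amount shrinks where contrast
    # is already high, which is what keeps this from ringing on edges.
    amp = np.sqrt(np.clip(np.minimum(lo, 1.0 - hi) / np.maximum(hi, 1e-5), 0.0, 1.0))
    # Negative lobe weight for the cross neighbours, scaled by strength.
    w_neg = -amp * (strength / 5.0)
    out = (c + w_neg * (n + s + w + e)) / (1.0 + 4.0 * w_neg)
    return np.clip(out, 0.0, 1.0)

# Usage (hypothetical file name):
# img = np.asarray(Image.open("frame.png"), np.float32) / 255.0
# sharpened = contrast_adaptive_sharpen(img)
```

The key trick is that the sharpening weight adapts to local contrast per pixel, which is why it holds up better than a plain unsharp mask when you drop the render resolution.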
Right. So again, we don't need to reiterate that RDNA2 dGPUs are going to be a nice alternative in the space above the 5700s and up against the 3080. Even better, if people don't want the RTX or IO hype, they can pick up a brand-new card supporting HDMI 2.1 with 16GB VRAM for less.
I don't know that I'd expect uniformity even then. On PlayStation, Sony had their own hardware for checkerboard rendering, which wasn't available on Xbox and wasn't always ported to PC builds. For next-gen, Xbox has sort of talked about ML-assisted upscaling but didn't elaborate, and Sony and AMD have said nothing.
Let's hope they figure out a global solution, though. I'd be happy with that.
I'll pay £400 for a card a bit faster than my 5700XT with 16GB of GDDR6. Ray tracing is a bonus, but it will have that.
On Coreteks: he's clearly not thinking straight; the consoles are a competitor to a 3070. What is he saying? That AMD cannot better their console APUs?