AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

I'm sure when it does, they will remind us how they now beat consoles, failing to understand the most rudimentary difference between the two: price, along with the number of games on hand. :cool:

The Leather Jacket will use every marketing trick he knows, but he's not fighting AMD in this space anymore. I think AMD are just keeping a foot here because there is still some revenue in it for them, and to see if they can turn it around without bankrupting themselves.

Jensen is not going to make a 2080 Ti £500 or less with Ampere, and neither are AMD, except in consoles, where you get a whole gaming box for that. AMD don't care; they will take their $150 for every console sold, probably more than that now, for doing practically nothing per unit. Nvidia can't compete with that. Nvidia are now fighting to keep PC gaming viable because that's all they have; even the Nintendo Switch replacement is rumoured to be moving to a Samsung/AMD SoC.
--------------


Anyway... on DLSS: games at 1080p look like milk on my 32" 1440p IPS screen, that is, they look washed out and blurry.

There are two reasons I don't think DLSS is the magic button that gives you native-resolution image quality with lower-resolution performance: one, it depends on developer implementation, and two, not all screens are equal at upscaling to native resolution.
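For what it's worth on the screen point: 1440p is not an integer multiple of 1080p, so a monitor's scaler has to interpolate, which is one reason 1080p content looks soft on a 1440p panel. A quick Python sketch of that arithmetic (this concerns plain display scaling, separate from whatever DLSS does internally):

# How cleanly does a 1080p image map onto common panels?
# Non-integer ratios force interpolation, one source of the "washed out" look.
panels = {
    "1080p (1920x1080)": 1920,
    "1440p (2560x1440)": 2560,
    "4K (3840x2160)": 3840,
}
src_w = 1920  # a 1080p source image
for name, w in panels.items():
    scale = w / src_w
    kind = "integer, clean pixel mapping" if scale.is_integer() else "non-integer, interpolated and softer"
    print(f"{name}: {scale:.2f}x ({kind})")
# 1440p comes out at 1.33x: every source pixel is smeared across one and a
# third panel pixels, which is why 1080p looks soft on a 1440p screen,
# while 4K is an exact 2.00x.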
 

Jacket man has a real good understanding of how to aim a marketing campaign at disposable income, doesn't he? Cyberpunk 2077 is all it takes for them to spend $1,400: just one game, plus the failure of the 2080 Ti. That's roughly $2,600 of disposable income, for most, in the last two or so years for just two video cards. That's a mind-bending emotional twist that even M. Night Shyamalan has yet to achieve.

Let's not fool ourselves: the majority of games on PC are coming from console. There are very few PC-exclusive games, or in other words games designed exclusively around a particular GPU, and we know Cyberpunk 2077 is going to be one of those few.

The 2080 Ti under the RTX branding looks to me like a complete failure once these new games come out. In my opinion the 2080 Ti should have stayed under the GTX branding, but that's a topic for another discussion. It does make sense why Nvidia stopped production of Turing, though.

As for Ampere, I can see the slogan for it now...
'$1,400 and a few games'
lol
 
I doubt AMD gets $150 per unit unless they make the units themselves, in which case the production cost comes out of that; margins on consoles are very low. They will still make a nice amount considering the volume, but only because they already have the IP from other areas.
 
The Leather Jacket will use every marketing trick he knows, but he's not fighting AMD in this space anymore. I think AMD are just keeping a foot here because there is still some revenue in it for them, and to see if they can turn it around without bankrupting themselves. [...]

Anyway... on DLSS: games at 1080p look like milk on my 32" 1440p IPS screen, that is, they look washed out and blurry. There are two reasons I don't think DLSS is the magic button that gives you native-resolution image quality with lower-resolution performance: one, it depends on developer implementation, and two, not all screens are equal at upscaling to native resolution.

?? to the first paragraph. What are you even saying?

For the second part: if games are looking washed out and blurry using DLSS, then you are either using DLSS 1.0 or something else is messed up in your settings. DLSS 2.0 is not blurry.

The screen has no effect on the upscaling DLSS does; it outputs at your panel's native resolution. As for the developers, it's simple to implement DLSS 2.0 in games, and whatever upscaling technique comes out, it will be up to developers to decide whether to use it, so that's not a negative unique to DLSS.
 
It does make sense why Nvidia stopped production of Turing, though.

Nvidia have stopped production of Turing because Ampere is coming. They do the same before every new release: they stopped producing Pascal GPUs before Turing came out, and they stopped producing Maxwell GPUs before Pascal came out. AMD do the same.
 

MS and Sony would have paid a fixed fee to licence the design, and maybe a very small per-unit royalty (talking $1-2, if that).

That's why Sony/MS have their own branding on the chips and some say on QC, and why any savings on chip cost as the process matures on the same node go into their pockets, not AMD's. AMD don't make that much money on the semi-custom side of the business.
 

$1 or $2 is nothing: over the life of the consoles that would be $150m to $300m total, or $30m to $60m per year over a five-year life, and R&D would cost 10x that. AMD's revenue at the time was about 100x that per year, while they were selling Bulldozer CPUs and Polaris GPUs at almost zero margin.

If they had been getting the full $150 per unit they would have serious $$$, as the last gen sold more than 150m units. I don't think that figure held for the life of the console either; it was probably much less after the first year.
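Sanity-checking those claims with the thread's own numbers, 150m lifetime units and a five-year life, against both the $1-2 licence figure and the $150-per-console figure (all assumptions from the posts above, not confirmed data):

# Per-unit royalty claims vs what they'd imply in revenue.
units_lifetime = 150_000_000  # last-gen consoles sold, per the post above
years = 5                     # assumed console lifetime
for per_unit in (1, 2, 150):
    total = per_unit * units_lifetime
    print(f"${per_unit}/unit -> ${total / 1e6:,.0f}m lifetime, "
          f"${total / years / 1e6:,.0f}m per year")
# $1-2/unit gives $150m-$300m lifetime ($30m-$60m a year), pocket change next
# to R&D; $150/unit would give $22,500m lifetime ($4,500m a year), far above
# AMD's actual semi-custom revenue. The truth has to sit somewhere in between.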
 
In 2016 AMD earned $2.3bn from its Enterprise, Embedded and Semi-Custom segment; that's mostly consoles.

Sony and MS didn't sell 2.3 billion consoles in 2016; they sold maybe fewer than 30 million that year (the Xbox One was released in 2013, so it was already three years old), and if you do the maths AMD got roughly $75 a unit that year.

2016 is as far back as I can find financials for.

https://ir.amd.com/financial-information/quarterly-results
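The division behind that $75 figure, using the post's own assumptions (the segment also covers non-console business, so this is really an upper bound per console):

segment_revenue = 2.3e9  # AMD Enterprise, Embedded and Semi-Custom, FY2016
consoles_sold = 30e6     # rough Sony + MS units that year, per the post
print(f"~${segment_revenue / consoles_sold:.0f} per console")  # ~$77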
 
Margins on consoles are very low for Sony and Microsoft, not for AMD.
 
Yes, Turing production was stopped because of Ampere. It happens every time a new GPU generation is about to come out.

Also, Watch Dogs Legion at 1080p runs RT on "Ultra", which the consoles won't be running (watch the DF video). I'm pretty sure that by the time the game comes out you'll easily be playing it on Turing at 1440p/60fps with DLSS 2.0 enabled and RT on medium/high, the console equivalent.
It's funny to read how some people imagine the next-gen consoles, or even the RDNA2 GPUs, beating Nvidia in RT performance. Everything shown so far regarding the next-gen consoles (PS5, XBX) points to lower RT performance than a 2080, never mind a 2080 Ti. Minecraft is one example: it was running at 30 fps on XBX. All PS5 games that showed RT used it sparingly, occasionally for reflections (not even full reflections, just static objects; dynamic ones didn't reflect, look at Ratchet & Clank), never mind AO/GI/etc.

RT reflections in PS5 games were also completely mirror-like, which is the lightest hit on performance compared to varying degrees of roughness as in Control or Wolfenstein: another bad sign. Even AMD's own tech demo had only mirror-like reflections, which, again, are the cheapest to render.
The other PS5 exclusive, Pragmata, which showed some form of RT GI, had very visible noise, which again is a bad sign.
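On why mirror-like reflections are the cheap case: a perfect mirror needs exactly one deterministic bounce ray per pixel, while a rough surface scatters rays across a lobe, so the renderer traces several stochastic samples and still has noise left to denoise. A minimal Python sketch of the idea (the sample budget is a made-up illustration, not any engine's real numbers):

import math, random

def reflect(d, n):
    # Perfect mirror: r = d - 2(d.n)n, one deterministic ray, no randomness.
    dot = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2 * dot * b for a, b in zip(d, n))

def glossy_sample(d, n, roughness):
    # One stochastic glossy ray: the mirror direction jittered by roughness.
    r = reflect(d, n)
    v = tuple(c + random.uniform(-roughness, roughness) for c in r)
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

view, normal = (0.0, 0.0, -1.0), (0.0, 0.0, 1.0)
print("mirror: 1 ray ->", reflect(view, normal))
for roughness in (0.1, 0.3, 0.6):
    rays = max(1, int(64 * roughness))  # hypothetical budget: rougher = more rays
    sample = glossy_sample(view, normal, roughness)  # direction varies per ray
    print(f"roughness {roughness}: ~{rays} rays/pixel, e.g. {tuple(round(c, 2) for c in sample)}")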

The logical conclusion is that the RT performance hit on RDNA2 is massive and its capabilities are limited, even more so than on Nvidia hardware. Best-case scenario, you get 2080 raster performance and maybe 2060-2070 RT performance on the next-gen consoles. PC GPU? 2080 Ti (or better) raster with 2080 Ti RT performance on their top-end GPU.

The same thing always happens: irrational hype around AMD products, followed by the usual disappointment and denial. Not sure why people keep doing this to themselves instead of looking rationally at the facts that are already known and drawing logical conclusions from them.
 
No one knows yet what RT performance will be like on RDNA2; there is speculation about different solutions, some of which are said to differ from the consoles'.

The overall assumption is that Ampere will be the more powerful RT card, and that will be Nvidia's primary marketing spiel.
 
I think my rational take is that the consoles will release and Nvidia will still be clearing stock of 2080 Tis at £900+. If the console costs £500 then I won't be that offended if it doesn't quite reach the level of the £900 PC GPU, because, well, it costs nearly half the price.

Nvidia will launch the next-gen GPU shortly after and charge £1,400 for their top-tier one. Everyone with money to burn will buy it, then start preaching about how much it's trouncing a next-gen console.

They will then just sit at their PC waiting for the next console port to land, telling themselves that people playing on consoles don't know what they're missing.

All while the majority of console gamers are just buying a game, playing it with friends, and not freaking out about whatever performance they get.

My GTX 1080 has played every game I've thrown at it at 1440p, and I've never felt the slightest envy towards people who spunked over a grand on a 2080 Ti just to play the same games as everyone else with, omg, 30% more frames.
 
I know what you mean; I'll be pre-ordering the PS5 as soon as it gets listed. I'm also happy that for once the consoles will be much closer to the previous tier of high-end PC components, which will mean better-looking games overall that last and compete for longer. It doesn't change anything else I've said, though.
 
If they want to "clear stock" they'll have to do better than £900, more in the region of £700 if not less; otherwise those cards are just going to gather dust. Their hands are far more tied now because SLI is all but dead, so nobody is buying a second card new for SLI. And people who thought a grand for a card was ******** aren't going to be swayed by a £150-£200 discount.
 
That's funny; I haven't come across an official release date putting Ampere within the next 60 days or so. What I do believe is that the 2080 Ti damages the RTX branding at 1080p/30fps for $1,200; compared against the consoles, it looks very weak at its price point. And that's that.

Meanwhile, nothing about AMD GPUs yet. Genius.
 