AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Funny how some only see the posts where I'm critical of AMD and/or defend nVidia.

I won't hide that I don't like AMD, never have hidden it. But I would jump on any other brand if there was a viable alternative to nVidia; I dislike them too.

That's why you are still rocking Haswell :p - You call yourself a tech :p - The door is that way ---->
 
People won't have it, but I honestly believe they massively underestimated demand. The global volume of cards available is slightly down on previous launches, but not by much, and the supply charts someone posted earlier aren't really that different from the first few weeks of previous GPU launches.

Link? I mean there are some models from some manufacturers where nobody in the world except reviewers has the cards, and we are more than 2 weeks after launch.

I read somewhere that Nvidia only had 5000 cards ready for worldwide launch. If that is the same as previous launches then it just proves that they were paper launches as well. It was the same with the 1080 launch. I waited 93 days for my preorder card ordered on launch day. I presume Nvidia miscalculated demand then as well?

Or perhaps, if at every launch they make so few cards that nobody can buy one for three months, it's them not making enough that is the issue rather than demand?

I mean, comparing the 3080 numbers to the appalling three months of 1080 launch shortages and saying you made as many cards as you did with the 1080 barely counts as an excuse for demand being more than expected. You should at least have expected demand as strong as the 1080's and made enough cards accordingly.
 
EDIT: I'm especially looking forward to watching people's attitudes towards ray tracing change over time.
To be fair, people's attitudes towards ray tracing have been purely based on the woeful implementation of real time hardware in Turing. It was never about "ray tracing", it was about price gouging beta-test hardware. And now it seems Ampere hasn't been enough of a forward push after 2 years. With AMD joining the fray, there will be a significantly greater incentive for game devs to implement it because there's just going to be more hardware available to warrant the effort. And therefore there will be a greater incentive for Nvidia, AMD and Intel to ramp up the development of their RT tech because it's getting used.

As I've said before, thank you Nvidia for taking the first step and getting this ball rolling, and thank you AMD for actually popularising it.
 
Funny how some only see the posts where I'm critical of AMD and/or defend nVidia.

I won't hide that I don't like AMD, never have hidden it. But I would jump on any other brand if there was a viable alternative to nVidia; I dislike them too. (EDIT: But I don't go out of my way to hate on AMD either).

It's like modern day politics. They're all useless so you go with what you think is the best of a bad bunch.

What a world.

*Useless is perhaps the wrong word. Useless or completely untrustworthy.
 
If AMD bring the hammer with similar performance at a cheaper price to offset the lack of "Nvidia" features such as DLSS and RTX... (AMD will have inferior ray tracing, we know this, but we expect it to be better than Nvidia's 1st gen ray tracing anyhow, so I'd call that a win).

Elephant in the room.

My take on this is that people who invested in the original RTX hype (I will let the 2080Ti owners slide) also bought into a false promise. People may laugh at "AMD is once again late to the party", but they were actually smart. Ray tracing is only just getting into some (<10) games, so it makes sense to release the cards (with added feature price premiums to manufacture) now. Don't excuse the fools that bought Turing jumping in too early. There are even more fools now queuing up to get hold of 30 series cards, paying over the odds for what should be fixed prices under £700 (all cards below 3090).

I'd absolutely expect that to happen though... other than the most ardent of naysayers, most that speak negatively about ray tracing do so not because they think the technology is fundamentally crap, just that it has not yet matured to the point where they care about it enough to justify the performance impacts.

Correct, see above what I thought on it.
 
I'm assuming AMD's launch will be pretty decent, otherwise I don't think there would have been mention of them pursuing the mantra of being the most power efficient. If they were lagging in performance, surely power efficiency would have been sacrificed.
 
To be fair, people's attitudes towards ray tracing have been purely based on the woeful implementation of real time hardware in Turing. It was never about "ray tracing", it was about price gouging beta-test hardware. And now it seems Ampere hasn't been enough of a forward push after 2 years. With AMD joining the fray, there will be a significantly greater incentive for game devs to implement it because there's just going to be more hardware available to warrant the effort. And therefore there will be a greater incentive for Nvidia, AMD and Intel to ramp up the development of their RT tech because it's getting used.

As I've said before, thank you Nvidia for taking the first step and getting this ball rolling, and thank you AMD for actually popularising it.

IMO ray tracing will be a beta feature in RDNA2 and Ampere... you may also add 2 more generations to that list.
 
Sales volumes are pretty high, but Nvidia could have seen it coming. The RTX3080 launch sales volume was higher than the GTX1080, GTX1080ti, RTX2080 and RTX2080ti combined!

It was. Nvidia has said demand was 4 times what they expected or what they saw at the previous launch. Since all cards sold out worldwide in 2 to 3 minutes, even at a quarter of the demand they would have sold out in about 10 minutes.

You and Nvidia may see having stock for 10 minutes as sufficient stock, but I don't.
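Just to spell out that arithmetic: a minimal back-of-the-envelope sketch in Python, assuming orders arrive at a roughly constant rate. The 2-3 minute sell-out and the quarter-demand figure are the ones quoted above; everything else is illustrative, not anything Nvidia has published.

# Back-of-the-envelope sketch of the sell-out arithmetic above.
# Assumption: orders arrive at a roughly constant rate, so the time to
# sell a fixed amount of stock scales inversely with demand.
def sellout_minutes(observed_minutes: float, demand_scale: float) -> float:
    """Time to sell the same stock if demand were scaled by demand_scale."""
    return observed_minutes / demand_scale

# Cards reportedly sold out in 2-3 minutes at the observed demand.
for observed in (2.0, 3.0):
    # At a quarter of that demand (what Nvidia says it expected),
    # the same stock would last roughly 4x as long.
    print(f"{observed} min observed -> {sellout_minutes(observed, 0.25):.0f} min at 1/4 demand")
# Prints 8 and 12 minutes, i.e. the "about 10 minutes" in the post.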
 
Oh I thought Rroff's signature was for jokes lol

DDR3, 4820k, 250gb sata, 1070 - ouch lol, next gen consoles launching in 4 weeks are several times faster than that PC.

You guys can't let the consoles beat you, it's time to upgrade.
 
Doh, so it is... I see 4XXX and straight away default to Haswell :) I'll get my coat!

I'm in two minds between sticking a Xeon 1650 or 1680 V2 in there and buying a new setup - the X79 platform holds up surprisingly well if you get a good overclocking chip. But only if I can source the CPUs fairly cheap.

Oh I thought Rroff's signature was for jokes lol
DDR3, 4820k, 250gb sata, 1070 - ouch lol, next gen consoles launching in 4 weeks are several times faster than that PC

It holds up surprisingly well - the X79 platform has quad-channel RAM, and with the CPU at 4.6GHz it is only 15-18% behind the latest Ryzen CPUs in many games at 1440p. I've been holding on to the GPU as I don't want to spend money on Turing, and most of the time I keep going back to BF4 and The Division 1, as most of the new games I've tried don't hold my attention and/or I find them clunky to play, etc.
 
I find that a lot around here... unfortunately in the long run it tends to turn out I was right.

EDIT: I'm especially looking forward to watching people's attitudes towards ray tracing change over time.

I think it is not so much about RT itself; it's the fact that it isn't very good at the moment, as you really need DLSS, and people are hyping it up a lot. Almost everyone knows about RT, it has been around for decades.

Consoles will bring it to the masses even if it is weak compared to 3*** series.
 
Also the 4770K is probably better at gaming by quite a bit :D

To be fair my 1700 was a great CPU (3800X now), but as I mainly play ARPGs and MMORPGs, and most of those run on a potato, it was hard to tell any difference.

I mean I still play EverQuest, which is 20+ years old, raiding 3 times a week; I could probably run it on my smartwatch! I think the most demanding thing I've played lately is CoD Modern Warfare, but I am 100% getting Cyberpunk, probably on both my PC and Xbox Series X. So I want a GPU to replace my 5700X, as I know for sure it will struggle at 3440x1440 with decent settings.
 
That's wafers, not chips. We are also then assuming in the posts on this page that CPUs, GPUs, consoles etc. are all produced on the same N7 node rather than mixed over the different N7, N7+, N5 etc. nodes.
This is what I've tried to point out and get some input on (only melmac chimed in). TSMC have 3 7nm nodes and a design-compatible 6nm. We know that Zen 3 isn't using the original N7 that Zen 2 uses, so straight away that's "free". It's entirely likely that AMD are splitting their products across the nodes to maximise throughput. I'd even wager that the consoles are using N6 - it's design compatible with N7, which is what Zen 2 was designed for, and utilises some EUV, which would benefit RDNA, without the need to go to high-power nodes like N7+ and N7P.
 
I'm in two minds between sticking a Xeon 1650 or 1680 V2 in there and buying a new setup - the X79 platform holds up surprisingly well if you get a good overclocking chip. But only if I can source the CPUs fairly cheap.

The Xeon might extend it a few years. The question is, is it worth it? Even the Xeon is going to get slapped about by mainstream stuff now, and the platform still has some value. Personally I'd get shot of it and buy a Celeron :)
 
This is what I've tried to point out and get some input on (only melmac chimed in). TSMC have 3 7nm nodes and a design-compatible 6nm. We know that Zen 3 isn't using the original N7 that Zen 2 uses, so straight away that's "free". It's entirely likely that AMD are splitting their products across the nodes to maximise throughput. I'd even wager that the consoles are using N6 - it's design compatible with N7, which is what Zen 2 was designed for, and utilises some EUV, which would benefit RDNA, without the need to go to high-power nodes like N7+ and N7P.

They have enough capacity that, if managed correctly, supply shouldn't be a problem. It's not down to TSMC to manage that; it's down to AMD to not **** it up.
 