Sales volumes are pretty high, but Nvidia could have seen it coming. The RTX3080 launch sales volume was higher than the GTX1080, GTX1080ti, RTX2080 and RTX2080ti combined!
Data pls?
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
People won't have it but I honestly believe that they massively underestimated demand - global volume of the cards available is slightly down on previous launches but not much - the charts someone posted before in terms of supply aren't really that different from the first few weeks of previous GPU launches.
To be fair, people's attitudes towards ray tracing have been purely based on the woeful implementation of real time hardware in Turing. It was never about "ray tracing", it was about price gouging beta-test hardware. And now it seems Ampere hasn't been enough of a forward push after 2 years. With AMD joining the fray, there will be a significantly greater incentive for game devs to implement it because there's just going to be more hardware available to warrant the effort. And therefore there will be a greater incentive for Nvidia, AMD and Intel to ramp up the development of their RT tech because it's getting used.

EDIT: I'm especially looking forward to watching people's attitudes towards ray tracing change over time.
Funny how some only see the posts where I'm critical of AMD and/or defend nVidia.
Won't hide I don't like AMD, never have hidden it. But I would jump on any other brand if there was a viable alternative to nVidia I dislike them also. (EDIT: But I don't go out of my way to hate on AMD either).
If AMD bring the hammer with similar performance at a cheaper price to offset the lack of "Nvidia" features such as DLSS and RTX (AMD will have inferior ray tracing, we know this, but we expect it to be better than Nvidia's 1st gen ray tracing anyhow, so I'd call that a win).
I'd absolutely expect that to happen though... other than the most ardent of naysayers, most that speak negatively about ray tracing do so not because they think the technology is fundamentally crap, just that it has not yet matured to the point where they care about it enough to justify the performance impacts.
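For anyone wondering where those performance impacts come from, it basically boils down to every pixel turning into a pile of ray-vs-geometry intersection tests. Here's a toy CPU sketch in Python - the scene, resolution and camera are all made up, and it has nothing to do with how RT cores or any vendor's hardware actually schedule this - just to show how quickly the ray counts add up:

import math

def ray_sphere_hit(origin, direction, centre, radius):
    # Solve |origin + t*direction - centre|^2 = radius^2 for t (a simple quadratic).
    ox, oy, oz = (origin[i] - centre[i] for i in range(3))
    b = 2.0 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c  # direction is normalised, so the 'a' term is 1
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# Made-up scene: two spheres in front of a pinhole camera at the origin.
spheres = [((0.0, 0.0, -3.0), 1.0), ((0.5, 1.0, -4.0), 0.5)]
width, height = 320, 180
hits = 0
for y in range(height):
    for x in range(width):
        # Map the pixel to a viewing direction and normalise it.
        dx = (x / width - 0.5) * 1.6
        dy = (0.5 - y / height) * 0.9
        length = math.sqrt(dx * dx + dy * dy + 1.0)
        d = (dx / length, dy / length, -1.0 / length)
        if any(ray_sphere_hit((0.0, 0.0, 0.0), d, c, r) is not None for c, r in spheres):
            hits += 1

# 320x180 is only ~57,600 primary rays against two objects; a real frame at 1440p
# with bounces, shadow rays and thousands of triangles is orders of magnitude more.
print(f"{width * height} primary rays traced, {hits} pixels hit something")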
That's why you are still rocking Haswell - You call yourself a tech?
- The door is that way ---->
Haswell? IvyBridge-E!
Lol I also ditched my 4770k to get a small upgrade to the Ryzen 1700, just to get off Haswell - mine was particularly hot and thirsty for power, and a bad OC'er.
As I've said before, thank you Nvidia for taking the first step and getting this ball rolling, and thank you AMD for actually popularising it.
Doh so it is.. I see 4XXX and straight away default to Haswell. I'll get my coat!
Oh I thought Rroff's signature was for jokes lol
DDR3, 4820k, 250gb sata, 1070 - ouch lol, next gen consoles launching in 4 weeks are several times faster than that PC
I find that a lot around here... unfortunately in the long run it tends to turn out I was right.
Also the 4770k probably better at gaming by quite a bit!
That's wafers not chips, we are also then assuming in the posts on this page that CPUs, GPUs, consoles etc are all produced on the same N7 node rather than mixed over the different N7, N7+, N5 etc nodes.

This is what I've tried to point out and get some input on (only melmac chimed in). TSMC have three 7nm nodes and a design-compatible 6nm. We know that Zen 3 isn't using the original N7 that Zen 2 uses, so straight away that's "free". It's entirely likely that AMD are splitting their products across the nodes to maximise throughput. I'd even wager that the consoles are using N6 - it's design compatible with N7, which is what Zen 2 was designed for, and utilises some EUV which would benefit RDNA, without the need of going to high-power nodes like N7+ and N7P.
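Just to illustrate the splitting point, here's a rough back-of-envelope sketch. Every figure in it is invented (TSMC and AMD don't publish this kind of breakdown, and the product-to-node mapping is anyone's guess) - it's only there to show why "everything comes out of the same N7 wafer pool" and "each product sits on its own N7-family node" give very different pictures:

# All capacities, demands and node assignments below are hypothetical.
capacity = {"N7": 60_000, "N7P": 20_000, "N6": 30_000}   # wafer starts/month per node
demand = {"Zen 3": 18_000, "RDNA2": 30_000, "Console APU": 28_000}  # wafers wanted/month

# Scenario A: assume everything competes for plain N7, as the quoted post describes.
total = sum(demand.values())
print(f"All on N7: {total:,} wafers wanted vs {capacity['N7']:,} available -> "
      f"{'shortfall' if total > capacity['N7'] else 'fits'}")

# Scenario B: each product mapped to its own design-compatible node.
mapping = {"Zen 3": "N7P", "RDNA2": "N7", "Console APU": "N6"}
for product, node in mapping.items():
    fits = demand[product] <= capacity[node]
    print(f"{product:12s} on {node:3s}: wants {demand[product]:,}, node has "
          f"{capacity[node]:,} -> {'fits' if fits else 'short'}")

With the same (made-up) total demand, the shared-node scenario shows a shortfall while the split-node one doesn't, which is the whole shape of the argument.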
I'm in two minds between sticking a Xeon 1650 or 1680 V2 in there and buying a new setup - the X79 platform holds up surprisingly well if you get a good overclocking chip. But only if I can source the CPUs fairly cheap.