NVIDIA ‘Ampere’ 8nm Graphics Cards

then 6 months down the line release a 3090 Super with a factory overclock.

They won't do that until they switch to TSMC next year, IMO. No one would buy it anyway, given it would be the same card.

None of the RT games out now are any good IMO, and most people have probably played them already. Cyberpunk may be decent, but who in their right mind pays £1400 for a GPU for one game?

Games are stale IMO. Making them look better doesn't change the fact that they're all the bloody same.
 
Controlling costs is part of generational innovation.

If Intel were to find a manufacturing method which produced GPUs that beat Nvidia's performance by 30% but cost 500% more to manufacture, it would not make it to market. Intel would have to go back to the drawing board and come up with a better idea.

Now, Turing wasn't *that* ^ bad, but Nvidia should have considered that ray tracing might not be ready for prime time if it was going to cost so much and deliver so little.

Turing was never supposed to happen. It was supposed to be Ampere. Turing was a slightly shrunk Pascal with tensor cores bolted on for the sales pitch and nothing else. How many 20-series cards do you think they would have sold without RT? Who would have paid £1400+ for a GPU that's 15-20% quicker than a 1080Ti? No one.
 
However, look at this pic.

[Image: Yy1rX6g.jpg]

Let me explain that. There's a GPU die, and a tensor-core co-processor.
Not seen this picture before. It makes a little more sense to have the co-processor away from the GPU die for cooling purposes, but even then it would just be a cooling pad and backplate.

The 3090 custom card PCB with the co-processor directly behind the GPU die makes less sense: heat transfer from the GPU, and just a pad and backplate to cool it.
 
Look on the bright side: at least you won't need to turn the heating on this winter. Summer is a different issue; maybe the hose off the air-con unit could be hooked up to the exhaust fan on the PC case and shoved out the window. Global warming? What global warming?!
Goes to show you that silicon chips and associated memory modules etc. are ASTONISHINGLY inefficient. Of a card drawing 200W of power, how much is actually used for computation? I bet 199W of it is wasted as heat. Amazing how far we have yet to go. Look at light bulbs and LEDs. About time computing got similar advances.
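For what it's worth, "efficiency" for a chip is usually counted as work done per joule rather than as the fraction of power that doesn't end up as heat (essentially all of it does). A rough back-of-the-envelope sketch, using the 200W figure above and an assumed 10 TFLOPS of FP32 throughput (an illustrative number, not the spec of any particular card):

```python
# Rough energy-per-operation estimate. The 200 W draw is the figure from the
# post above; the 10 TFLOPS throughput is an illustrative assumption.
board_power_w = 200.0        # total board power (W)
throughput_flops = 10e12     # assumed FP32 throughput: 10 TFLOPS

joules_per_op = board_power_w / throughput_flops
print(f"Energy per FP32 operation: {joules_per_op * 1e12:.0f} pJ")
# -> about 20 pJ per operation; generational progress shows up as this number
#    falling (more operations per joule), much like lumens per watt did for
#    lighting.
```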
 
They won't do that until they switch to TSMC next year, IMO. No one would buy it anyway, given it would be the same card.

None of the RT games out now are any good IMO, and most people have probably played them already. Cyberpunk may be decent, but who in their right mind pays £1400 for a GPU for one game?

Games are stale IMO. Making them look better doesn't change the fact that they're all the bloody same.

People bought the 2080 Super, and it was basically the same as an overclocked 2080 (about a 5% difference), yet people still bought it. Nvidia will always sell something.

I said the same thing as you a few weeks ago and got shouted down with "yes, but look at all the console games with RT that will be ported over". We've gone back to eye candy over substance.

Swings and roundabouts. In a few years' time we will be talking about cards being so much faster than the Xbone X1 or whatever it is. Another thing people forget is that consoles last a few generations of GPUs. Yeah, you could get a £500 console and not touch your PC, but I bet a dollar to a pound people will still buy GPUs over £500.
 
This way they can make GPU dies on one wafer and any tensor-core dies on another wafer. Making the tensor cores smaller in die size means better yields all around. Same goes for the GPU die itself.
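A rough sketch of why that helps, using a simple Poisson defect-density yield model. All the die areas and the defect density below are made-up illustrative numbers, not real Nvidia/TSMC figures:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Fraction of dies expected to be defect-free under a Poisson model."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

# Illustrative numbers only (not real Nvidia/TSMC figures).
defect_density = 0.2  # assumed defects per cm^2

for name, area_mm2 in [("big monolithic die (GPU + tensor)", 600),
                       ("GPU die without tensor cores",      450),
                       ("separate tensor-core die",          150)]:
    y = poisson_yield(area_mm2, defect_density)
    print(f"{name:35s} {area_mm2:4d} mm^2 -> yield ~{y:.0%}")
```

The exact numbers don't matter; the point is that yield falls off roughly exponentially with die area, so two smaller dies can beat one big one.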

And see, that is why Turing was so expensive. I've said it before but people don't listen: Turing was *really* expensive to produce. The dies were honking great monoliths, the failure rates were really high, and the cost was high too (being TSMC). So only part of it was Jen doing you up the Gary Glitter; the rest was solely down to costs. The 2080Ti FE cooler cost them about $60 just to produce.

Alright, let's roll with this for a min...
1. How exactly do failure rates affect Nvidia? As far as I know they pay TSMC per chip, not per wafer; they are a customer, not a partner (a rough cost-per-good-die sketch follows this list).
2. TSMC 12nm is/was a mature node. I don't know what the yields and failure rates were, but it should be well within tolerance for them to manufacture big monolithic dies; they were doing Volta and Tesla parts for years.
3. If Nvidia build something that is more expensive to produce, they could absorb some of that increase; it's a commercial decision to pass it on to the customer.
4. There is no way Nvidia isn't making crazy margin on the 2080+ Turing parts; sales volumes went down and their gross profit went up.
5. So Ampere should be cheaper than Turing then, now that they've split out the tensor cores, done a die shrink and moved to a cheaper manufacturing process?
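On point 1, even if Nvidia nominally pay per chip, the wafer cost still has to be recovered across the dies that actually work, so yield feeds straight into the per-chip price. A rough sketch with made-up numbers (not real TSMC pricing):

```python
# Rough cost-per-good-die sketch for point 1 above. The wafer cost, die count
# and yield are all illustrative assumptions, not real TSMC figures.
wafer_cost_usd = 6000.0   # assumed price of one wafer
dies_per_wafer = 80       # assumed candidate dies for a big GPU
yield_fraction = 0.60     # assumed fraction of dies that work

good_dies = dies_per_wafer * yield_fraction
cost_per_good_die = wafer_cost_usd / good_dies
print(f"Good dies per wafer: {good_dies:.0f}")
print(f"Effective cost per good die: ${cost_per_good_die:.0f}")
# -> ~$125 here; halve the yield and the effective per-die cost doubles,
#    whoever ends up holding the invoice.
```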

aka it was party round Gary's house

I agree that Turing was a bad design for consumers, and the cost of manufacture I'm sure did go up... but it didn't go up that much, and Pascal was hardly a charitable product series either. They might seem like value now, but holy cow they weren't cheap, no sir; that was more pink gin and costumes round Gary's. Don't take my word for it: I think it was Steve from GN who was told by AIBs that they make stoopid margin on the 2080Ti. Don't let the narrative of manufacturing costs lead you away from what is going on.
 
I'm very interested in a cheaper, no-ray-tracing card with the same raster perf, a "GTX 3080/3090" etc. I would definitely buy that over a more expensive RTX card.

I think most people looking for a high-end GPU would happily drop the RTX for a lower price. However, Nvidia are looking to use RTX to make up for shortfalls in traditional performance, and thus they will never provide a non-RTX card for the high-end GPU space (and probably not the mid-range either). You also have to remember that the long game is for Nvidia to saturate the market with RTX hardware such that developers are more willing to utilise it, and in turn create an ecosystem that necessitates RTX. Once this happens we move from RTX as a gimmick or nice-to-have to an essential part of the gaming experience. Nvidia will then be positioned to make new generations of cards off this technology rather than potentially being hamstrung by ever-shrinking generational improvements in traditional performance. Smartphone manufacturers have been doing the same thing with camera sensors, lenses etc. in recent years - it simply provides them with more options to distinguish one generation of products from another.
 
Don't take my word for it: I think it was Steve from GN who was told by AIBs that they make stoopid margin on the 2080Ti. Don't let the narrative of manufacturing costs lead you away from what is going on.
Someone will be along soon to attempt to debunk this :p

Manufacturing costs are up, R&D costs more, etc. As if that was not the case every gen up until Pascal, yet prices never suddenly skyrocketed, did they? And yet we see Nvidia's stock reach record levels; I wonder why?

That is why I don’t get why people defend price increases with nonsense like the above.
 
We've gone back to eye candy over substance.

So in your mind RT is "eye-candy over substance", but all the other techniques are not?
Should we all be happy with Commodore-64 levels of graphics?

Personally I love the idea of more accurate reflections and better modelling of transparent and semi-transparent materials; it can all add to the immersion.
(I also have a lot of time for C64 games, before anyone asks...)
 
That is why I don’t get why people defend price increases with nonsense like the above.

Then maybe you should try and understand how it works.

Pascal dies were tiny. Margins were enormous. Maxwell dies (most of them) were tiny; margins were enormous. Kepler dies were mostly tiny too; margins were slightly less.

Tiny dies with high clocks will not work any more. For RT they need to be enormous, or at least contain a lot of stuff; even shrunk, they are still quite large. It's partially to do with why Turing saw an enormous price hike, to new levels. Were the margins still there for Nvidia? Of course they were! *BUT* the part you are trying to dismiss is very real. Manufacturing costs were higher than ever. The cooler alone cost them $50+ a piece TO MAKE. All of that gets tacked on at the end, then the margin is added.
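As a sketch of that "tacked on at the end, then the margin is added" point: the ~$50 cooler figure comes from the post above, while every other number below is an illustrative assumption, not a real bill of materials:

```python
# Cost-plus sketch of "all of that gets tacked on at the end, then the margin
# is added". Only the ~$50 cooler figure comes from the post above; every
# other number is an illustrative assumption.
bom_usd = {
    "GPU die":       150.0,  # assumed
    "GDDR6 memory":  120.0,  # assumed
    "PCB + VRM":      80.0,  # assumed
    "Cooler":         50.0,  # figure quoted above
    "Assembly/test":  40.0,  # assumed
}
build_cost = sum(bom_usd.values())
gross_margin = 0.55          # assumed margin as a fraction of selling price

price_before_retail = build_cost / (1.0 - gross_margin)
print(f"Build cost: ${build_cost:.0f}")
print(f"Implied price before retail/tax: ${price_before_retail:.0f}")
# A bigger, lower-yielding die pushes the build cost up, and the margin
# multiplies that increase through to the shelf price.
```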

That is why, even if the 3080 is faster than the 2080Ti in *everything*, it "only" costs £800. That is, what, a £600 saving over the 2080Ti? And in RT it will absolutely blow it away.

Whether you like that price? I don't really care, man. I don't like it either and I won't be paying it. BUT. That does not dismiss the facts.

AMD won't do you any favours either. They'll just price slightly below Nvidia and still charge far too much. Get used to it, or buy a console this December and walk away from it.
 
Someone will be along soon to attempt to debunk this :p

Manufacturing costs are up, R&D costs more, etc. As if that was not the case every gen up until Pascal, yet prices never suddenly skyrocketed, did they? And yet we see Nvidia's stock reach record levels; I wonder why?

That is why I don’t get why people defend price increases with nonsense like the above.

Not in any way suggesting the margins aren't going up on consumer cards, but you can't necessarily correlate that with stock price/revenue.

I don't have any facts and figures to back it up, but what other pies do they have fingers in? Thinking about growth through the Tegra stuff, alongside Tesla's growth, or the supercomputer/AI stuff. Suppose it would be interesting to see if they go into such detail in their financial reports...
 
Turing was never supposed to happen. It was supposed to be Ampere. Turing was a slightly shrunk Pascal with tensor cores bolted on for the sales pitch and nothing else. How many 20-series cards do you think they would have sold without RT? Who would have paid £1400+ for a GPU that's 15-20% quicker than a 1080Ti? No one.

If tensor cores are the excuse for the high prices, why would 15-20% more performance cost $1400?

They can't have it both ways. A 2080Ti without ray tracing would have been okay at $700.
 
People bought the 2080 Super, and it was basically the same as an overclocked 2080 (about a 5% difference), yet people still bought it. Nvidia will always sell something.

I said the same thing as you a few weeks ago and got shouted down with "yes, but look at all the console games with RT that will be ported over". We've gone back to eye candy over substance.

Swings and roundabouts. In a few years' time we will be talking about cards being so much faster than the Xbone X1 or whatever it is. Another thing people forget is that consoles last a few generations of GPUs. Yeah, you could get a £500 console and not touch your PC, but I bet a dollar to a pound people will still buy GPUs over £500.

No, it wasn't the same, and that statement is false.

RTX 2080: 2944 CUDA cores, 60T RTX-OPS.
RTX 2080 Super: 3072 CUDA cores, 63T RTX-OPS.
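Just to put a number on the "about 5% difference" claim, here is the uplift implied by those core counts (nothing more than the arithmetic):

```python
# Core-count uplift of the 2080 Super over the 2080, using the figures above.
cores_2080 = 2944
cores_2080_super = 3072

uplift = (cores_2080_super - cores_2080) / cores_2080
print(f"2080 Super CUDA-core uplift over 2080: {uplift:.1%}")  # ~4.3%
```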

So it wasn't basically the same. It also clocked much higher, but that's beside the point. It wasn't just a re-released 2080 with the same spec and a higher clock. So they won't do that with the 3090, because, you know, someone would notice.

They may release a Super with more CUDA cores, but I doubt it. They only launched the Super cards AT ALL because of Navi; otherwise they wouldn't have bothered.
 