NVIDIA GeForce RTX 3080Ti to be "available" on June 3rd, RTX 3070Ti on June 10th

I'm not sure the margins are that bad for nVidia. People tend to look at prices when buying single units or tens of units, but nVidia will have negotiated agreements/commitments and will be purchasing in multiples of thousands, where the price goes way down. For instance, for me or you, buying the parts to build some of the power regulation circuitry would cost something like £50 even assuming we could get the parts at a retail bulk discount, but for a company like nVidia it is only about £2-3.

Likewise with GDDR: nVidia have relationships there, have been involved with supporting development, have taken on risk production at times, etc., and have access to pricing that even many large organisations don't have.
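
To put the volume-pricing point above into numbers, here is a minimal sketch (in Python) of how tiered discounts can turn a roughly £50 retail parts bill into the £2-3 figure quoted. The quantity tiers and discount fractions are made-up illustrative values, not actual NVIDIA or distributor terms.

def unit_cost(retail_price, quantity):
    """Assumed per-unit cost after a volume discount tier (illustrative only)."""
    tiers = [                # (minimum quantity, fraction of retail price paid)
        (1_000_000, 0.05),   # negotiated contract pricing
        (100_000, 0.10),
        (10_000, 0.25),
        (1_000, 0.50),
        (1, 1.00),           # one-off retail purchase
    ]
    for min_qty, fraction in tiers:
        if quantity >= min_qty:
            return retail_price * fraction
    return retail_price

retail_vrm_parts = 50.0                        # ~£50 at retail for the power-regulation parts
print(unit_cost(retail_vrm_parts, 1))          # 50.0 -- what you or I would pay
print(unit_cost(retail_vrm_parts, 1_000_000))  # 2.5  -- roughly the £2-3 range quoted above
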
Yes, they will have negotiated contracts, which will likely have an end date and then be renegotiated. Or there may be termination clauses if the parts manufacturers' costs change by a given percentage.
Yup, bulk discounts are a powerful thing, but they do have limits; again, when manufacturer costs go up, discounts shrink.
Again, yes, Nvidia have relationships, which potentially save them money at this stage, but those relationships cost R&D and seed money earlier in the development cycle, which would need to be recouped.
However, this is all speculation. We are not privy to actual figures, so in the end, who knows if any of us are even close.
 
but those relationships cost R&D and seed money earlier in the development cycle, which would need to be recouped.

An aspect that is often overlooked when people try to estimate margins by working out the BoM, though there are other factors too, such as those costs being spread out over all the product lines where they're applicable, etc.
 
Yes, they will have negotiated contracts, which will likely have an end date and then be renegotiated. Or there may be termination clauses if the parts manufacturers' costs change by a given percentage.
Yup, bulk discounts are a powerful thing, but they do have limits; again, when manufacturer costs go up, discounts shrink.
Again, yes, Nvidia have relationships, which potentially save them money at this stage, but those relationships cost R&D and seed money earlier in the development cycle, which would need to be recouped.
However, this is all speculation. We are not privy to actual figures, so in the end, who knows if any of us are even close.

But we are privy to public figures: https://nvidianews.nvidia.com/news/nvidia-announces-financial-results-for-first-quarter-fiscal-2022

Record revenues for the gaming division, data centre division and overall total. A +106% revenue increase year on year in the gaming division. Gross margin across the business of 64%. That would indicate to me they aren't making a loss on any of their products.
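
For anyone who wants to sanity-check those numbers, here is a quick bit of arithmetic on the figures in that press release (the only inputs are the reported $2.76 billion gaming revenue, +106% year on year, +11% quarter on quarter and 64% gross margin; everything else is just derived):

gaming_rev_q1 = 2.76e9   # gaming revenue, USD (reported)
yoy_growth = 1.06        # +106% year on year (reported)
qoq_growth = 0.11        # +11% quarter on quarter (reported)
gross_margin = 0.64      # gross margin across the whole business (reported)

gaming_rev_year_ago = gaming_rev_q1 / (1 + yoy_growth)   # ~$1.34B
gaming_rev_prev_qtr = gaming_rev_q1 / (1 + qoq_growth)   # ~$2.49B

print(f"Implied gaming revenue a year earlier:   ${gaming_rev_year_ago / 1e9:.2f}B")
print(f"Implied gaming revenue previous quarter: ${gaming_rev_prev_qtr / 1e9:.2f}B")
print(f"Each $1 of revenue carries roughly ${gross_margin:.2f} of gross profit (company-wide, not per product)")

Bear in mind the 64% is a blended figure across the whole business, so it doesn't prove any individual product's margin, but it does make a loss-making gaming line-up look unlikely.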
 
Let me ask you a question: if Nvidia are so wrong, why do they have 70-80% of the consumer market (not mining)?

Why, when I offer AMD systems 10%+ cheaper than equivalent-performance Nvidia systems, do I still sell ten Nvidia systems for every AMD one?

As much as I'd LOVE for AMD to be a realistic competitor, they aren't. Sure, their cards have raw horsepower, but they are about five years behind on the tech side and, as a result, real-world performance doesn't quite stack up enough to generate the demand that we'd like to see. Seriously, if we could get Radeon to a competitive share without having to sell the cards at prices where we lose money, we'd absolutely do it, because we know that certain competitors really struggle when AMD are strong.

When Nvidia pricing started to rise, we had lots of AMD stock at decent pricing (on the SI side I still do) but even WHEN 3070 pricing was close to 50% more than my 6700XT options the split was more than 30:1 :(

According to this, Intel powers most graphics on PCs, followed by AMD, then Nvidia. The reason Nvidia sell more high-end GPUs is that AMD and Intel haven't had the products to compete. • PC GPU market share worldwide by vendor 2020 | Statista
As far as I can see, OcUK has sold every 6800 & 6800 XT that they have had near MSRP. I suspect that the 6700 XT would be selling much better if it was near MSRP. If you have any decent AIB 6800 XT in stock near MSRP, I will gladly take one.

Whatever the merits of either Nvidia or AMD, it seems clear at the moment that both Nvidia and AMD are working to increase profits, e.g. we have yet to see the lower-end 6600 XT from AMD on desktop.
 
An aspect that is often overlooked when people try to estimate margins by working out the BoM, though there are other factors too, such as those costs being spread out over all the product lines where they're applicable, etc.

And interest rate risk, and forex risk and, and, and.
TLDR - too many factors -> who the hell knows.
 
An aspect that is often overlooked when people try to estimate margins by working out the BoM, though there are other factors too, such as those costs being spread out over all the product lines where they're applicable, etc.

Nvidia are doing OK by the looks of it and will likely increase revenue further in Q2, thanks to the higher margin on products like the 3080 Ti.

Gaming.
  • First-quarter revenue was a record $2.76 billion, up 106 percent from a year earlier and up 11 percent from the previous quarter.
 
OK, I oversimplified and maybe overestimated; it won't take AMD five years to catch up, I'm sure, but I was asking a genuine question.

You are talking to a guy who gained an electronic engineering degree before moving into this industry in my early 20s, who has worked on both sides of it, vendor and retail, and I've probably owned more Radeons than GeForces over the years. That said, from my time in vendorland, I know how long Nvidia have been working on technologies to differentiate themselves from their competition, whereas, as perfectly illustrated by your Bulldozer example, AMD's focus was on raw power until Lisa Su stepped in. That said, her focus was firmly on CPU until recently, and they've seen the benefits of that. RDNA is still a very "power"-focused development, and their "technologies" just appear, from the outside, to be reverse engineered from what NV are doing with regard to RT and DLSS. As I said, I wish I could convince more people to buy Radeon rather than fighting over Nvidia, but right now it's a one-sided fight and the winners are sorely aware of that fact.
Out of interest, in your opinion, how much do you think the 3080 should have been priced at launch?

I must admit, when they were doing the launch vid and showing the numbers, I was expecting a grand minimum.
 
My experience with aftermarket cards varies from this. Yes, the money is an issue, especially when compared to launch pricing for the 3080, BUT comparing current pricing for the 3080 with launch pricing for the Ti isn't as much of a shock, and in every game I threw at the cards I was testing, the performance increase over an already top-of-the-range 3080 OC was a few times what you are quoting there.

You're right but I've lost count of the number of times that I've said that the 3080 FE was too cheap and Nvidia were certainly having to eat into their usual profit margins heavily to maintain their MSRP for that card.

Honestly, are we in a position yet to look past launch day prices? I may have missed card sales from OcUK, but I thought launch day orders are still outstanding and people are in a queue for cards at launch day prices, and no cards have subsequently been sold (at current market prices).

Virtually no one has AIB 3080 in stock, therefore 3080 FE cards are actually the easiest to purchase (I'm not saying easy, I'm saying easiest).

Therefore, it is fair to compare 3080 Ti prices vs 3080 launch day prices.

I think the GN review was spot on, the Ti fills a price gap, not a performance gap. If a GA102 chip does not meet spec for a 3090, that's a big price drop to the 3080.
 
Exactly. If going forward the cost of building a gaming PC is going to be over 2K then most people will choose a console instead.

I won't be upgrading for the foreseeable as prices are just silly.

I'm wary of second hand graphics cards, as they'll likely have been thrashed by a miner.

So, I'm thinking that unless I see a reduction in new item prices, consoles will start to become very attractive in the new year.
 
Can anyone hazard a guess, or does anyone know, what time we might be able to fail at buying one of these tomorrow? I've seen 14:00 mentioned; is that official?
Thanks; reading through the pages I may have missed this detail.
 
That is marketing though, which was my point. It's the same reason people still have the same hangover and doubt when looking at Ryzen. Tell me, what are the percentages in system sales for CPU brands, and how have they changed over the last three years? I am sure that is data you collect, or that it's pretty obvious from the sales volume of the SI parts purchased/consumed.
It varies by a few percent month on month, but as an average over the last couple of years it's 50:50.

Interesting take, especially when you consider the about-face Nvidia pulled not so long ago with VRR (cough, I mean G-Sync). I think AMD have always tended to go with a more open approach rather than a proprietary one, and that harms them from a marketing point of view, and maybe sometimes even in performance or best-in-class status at XYZ.
It's funny that you should use that as an example. I was really pessimistic when Nvidia first came to us with G-Sync. I did a LOT of tests at one point on various G-Sync/FreeSync monitors and found G-Sync to be the much better technology, working reliably from as low as 15 FPS. When the premium for the G-Sync module was $150 it was a real no-brainer; then Nvidia realised what they had on their hands and the pricing started to increase.
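
For anyone curious what "working from as low as 15 FPS" means in practice, here is a rough sketch of the frame-repetition idea behind low-framerate VRR handling (the G-Sync module does this in hardware; on the FreeSync side it's called LFC). The 48-144 Hz window below is just a typical example, not a specific monitor:

def vrr_refresh(fps, panel_min_hz=48, panel_max_hz=144):
    """Return (repeat multiple, effective refresh) that keeps the panel inside its VRR window."""
    if fps >= panel_min_hz:
        return 1, fps                       # frame rate already inside the window: 1:1
    multiple = 2
    while fps * multiple < panel_min_hz:    # repeat each frame until we're back in range
        multiple += 1
    if fps * multiple > panel_max_hz:
        return None, None                   # window too narrow to compensate
    return multiple, fps * multiple

print(vrr_refresh(100))                                   # (1, 100)     -> one refresh per frame
print(vrr_refresh(15))                                    # (4, 60)      -> each frame shown four times
print(vrr_refresh(40, panel_min_hz=48, panel_max_hz=75))  # (None, None) -> window narrower than 2:1, no compensation

That roughly 2:1 requirement on the refresh window is broadly why frame compensation worked everywhere on module-based G-Sync monitors but only on FreeSync panels with a wide enough range.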

Not when you look at how the new consoles are priced. Which is an important factor imo.
Consoles are loss leaders with 95% of profits generated after sale. Don't even TRY to make that argument because it demonstrates a massive lack of understanding.
 
I think this is where we have to disagree, but I think that's because I judge things based on performance rather than some arbitrary model number (I find people saying "but X060 cards should cost XXXX" really irritating to be totally honest)
There are two ways of looking at it.
Either the 3080 was too cheap or the 3080 was too fast. Judging from the small performance difference between the 3080 and the 3090, maybe the latter is more viable.
I think you judge this as a businessman who also sells these products, and not as a consumer who must fork out his hard-earned cash to buy them at full price.
 
Consoles are loss leaders with 95% of profits generated after sale. Don't even TRY to make that argument because it demonstrates a massive lack of understanding.

Hence they still sell at MSRP, aside from retailers doing silly bundles like the £100 t-shirt thing :s
 
Just watched some reviews of the 3080Ti.

All I can say is. "HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHHAHAHAHHHHHAAAHHAHAHA Ah ahHah hah HAHAHHhahh H a hhah ha .. ha... haaaaa.. ha. ha."
 
Out of interest, in your opinion, how much do you think the 3080 should have been priced at launch?

I must admit, when they were doing the launch vid and showing the numbers, I was expecting a grand minimum.
I feared somewhere around a grand but I thought about £800.
The biggest issue, though, wasn't the price; it's the unrealistic MSRP for the FE which, based upon NV's GPU cost, didn't give their partners a cat in hell's chance of getting close to it.
The result of this tactic is two-fold: better review results and a bigger direct market share.

It's no secret that, in a usual (not right now, admittedly) Nvidia GPU sale, Nvidia make 65% margin and expect their AIB partner, distributor and reseller to make less than 10% each. That's untenable when costs are so high and supply is so limited.

Considering how little they usually allow their partners to make, tipping the balance so far in their favour as to cut their "partners" out of the majority of sales altogether shows questionable morals IMO.
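
To make the chain maths above concrete, here is a rough sketch of how a 65% / 10% / 10% / 10% split could shake out on a single card. The tier percentages are the ones claimed in this post; the £649 MSRP and the assumption that NVIDIA's kit makes up half of the AIB's cost are purely illustrative, not known figures.

msrp = 649.0              # hypothetical retail price

reseller_margin = 0.10    # each tier keeps ~10% of its own selling price (as claimed above)
distributor_margin = 0.10
aib_margin = 0.10
nvidia_margin = 0.65      # NVIDIA's claimed margin on the kit it sells to the AIB

reseller_cost = msrp * (1 - reseller_margin)                 # what the shop pays the distributor
distributor_cost = reseller_cost * (1 - distributor_margin)  # what the distributor pays the AIB
aib_cost = distributor_cost * (1 - aib_margin)               # the AIB's total cost: GPU kit plus everything else
nvidia_kit_price = aib_cost * 0.5                            # ASSUMPTION: NVIDIA's kit is half the AIB's cost
nvidia_cost = nvidia_kit_price * (1 - nvidia_margin)

print(f"Shop buys at           £{reseller_cost:.2f}")
print(f"Distributor buys at    £{distributor_cost:.2f}")
print(f"AIB's total cost       £{aib_cost:.2f}")
print(f"NVIDIA kit price       £{nvidia_kit_price:.2f} (assumed split)")
print(f"NVIDIA's implied cost  £{nvidia_cost:.2f}")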
 
I feared somewhere around a grand but I thought about £800.
The biggest issue, though, wasn't the price; it's the unrealistic MSRP for the FE which, based upon NV's GPU cost, didn't give their partners a cat in hell's chance of getting close to it.
The result of this tactic is two-fold: better review results and a bigger direct market share.

It's no secret that, in a usual (not right now, admittedly) Nvidia GPU sale, Nvidia make 65% margin and expect their AIB partner, distributor and reseller to make less than 10% each. That's untenable when costs are so high and supply is so limited.

Considering how little they usually allow their partners to make, tipping the balance so far in their favour as to cut their "partners" out of the majority of sales altogether shows questionable morals IMO.
I do wonder how that "normal" 65% breaks down across product lines. Is it 65% for all lines, or an average across the range, with different percentages per product?
 
I think you judge this as a businessman who also sells these products, and not as a consumer who must fork out his hard-earned cash to buy them at full price.

I would agree with him. I think £600-650 is probably the right price range for XX80 class cards, but Nvidia (likely due to the fear of competition) gave away more core than they usually would.

Since the 900 series and the 980, the XX80 cards have always used the second-best gaming core: GM204, GP104, TU104. Now, with the 3080, it got an upgrade to the 'big' core in GA102. That's all well and good, but it rather compromises the high-end product stack. There's far less wriggle room than usual to fit a 3080 Ti in.

It's also no surprise; we knew the core configs back in September last year, so it was already known where a 3080 Ti would fit. The only things we didn't know were the price and VRAM config, and now we do. As a 3090 owner, even I think $1199 is a very steep ask for the 3080 Ti, but this generation's flagship products were always going to be a mess given how they were configured. The mining and stock issues have just made a mockery of what would otherwise still have been a weird and relatively 'poor' 3080 Ti (judged against previous generations).
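
For reference, here is the core progression being described, laid out explicitly (the die names are public; the tiering comment just restates the post above):

xx80_cards = {
    "GTX 980":  ("GM204", "GM200"),   # (xx80 die, biggest gaming die of that generation)
    "GTX 1080": ("GP104", "GP102"),
    "RTX 2080": ("TU104", "TU102"),
    "RTX 3080": ("GA102", "GA102"),   # same die as the 3080 Ti and 3090
}

for card, (die, big_die) in xx80_cards.items():
    tier = "big die" if die == big_die else "second-tier die"
    print(f"{card}: {die} ({tier})")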
 
I do wonder how that "normal" 65% breaks down across product lines. 65% for all lines, or across the range, with different percentages per product.
Oh, their margins are even higher on professional products but, don't forget, when they report 60-70% profits that's after costs such as R&D and marketing.

I'm certain that they're no longer making 60%+ on the 3080 FE, but that's because they are also the only entity in the chain who makes enough to hide the increased costs.
 