NVIDIA 4000 Series


Needs to drop more
It must. The 79xx cards are going to take 4080 sales otherwise. Perhaps they are hesitant to make the first move down to the £1k-and-below price point until the 79xx is out.
A 160-bit bus on a 10GB 4070 is likely going to hobble performance, similar to what we've just seen with the 3060 8GB. Even the unlaunched 4080 12GB didn't look any faster than a 3090, and that had a 192-bit bus and a less cut-down die.
I'll have to consider the price-cut 4080 16GB if it comes down to a reasonable price. If not, then AMD may be the best option, as that will be significantly cheaper. I can afford a 4090, but it's just way too much money.
 
It must. The 79xx cards are going to take 4080 sales otherwise. Perhaps they are hesitant to make the first move down to the £1k-and-below price point until the 79xx is out.

The worst-case scenario is very little stock of the 79xx, so Nvidia won't adjust their pricing, and from the hints I have seen that's going to be the case.
 
It must. The 79xx cards are going to take 4080 sales otherwise.

Nope, not going to happen. Nvidia will still outsell AMD 100 to 1 even if AMD's card is faster and cheaper.

Nvidia is pulling a fast one anyway. They will likely drop the 4080 price to within £100 or so of the 7900 XTX, and then people will think "what a bargain" at £1,000-1,100ish, even though it should be at most a £750 card.
 
Didn't someone point out that margins (averaged across all products) are down to something like 7-8% once you take R&D etc. into account? Halo products will have higher margins for sure, but they have lower volume too. That doesn't justify stupid prices, but it does explain why you can't use the bill of materials alone to set pricing.
I doubt it. It was something like 50-60% gross margin when someone posted the earnings reports for Nvidia and AMD a month or two back.
To keep things simple:
5800X -> 81 mm² (main die) @ $449 MSRP -> $5.54 per mm²
6800 XT -> 520 mm² @ $649 MSRP -> $1.25 per mm²
Sure, you get the I/O die on the CPU (which is on 12nm), but you also get the vRAM on the GPU, much more complicated power delivery, the PCB, the cooler, etc., all of which will probably cost significantly more to make than the CPU. I'd also assume that developing a GPU is more difficult and more expensive than developing a CPU.

Basically, AMD were happy with the profit they made on the 6800 XT, but decided to charge 4.4x more per mm² for the CPU... So I wouldn't put much weight on "costs". It just matters how much you can convince buyers to pay crazy money for your products.
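
For anyone who wants to check those numbers, here's a minimal back-of-the-envelope sketch in Python. The MSRPs and die sizes are the ones quoted above; everything else (vRAM, PCB, cooler, R&D) is deliberately ignored, so treat it as a toy comparison, not a cost model:

```python
# Toy $/mm² comparison using the MSRPs and die sizes quoted in the post above.
parts = {
    "Ryzen 7 5800X": {"msrp_usd": 449, "die_mm2": 81},   # main CCD only; 12nm I/O die excluded
    "RX 6800 XT":    {"msrp_usd": 649, "die_mm2": 520},
}

for name, p in parts.items():
    print(f"{name}: ${p['msrp_usd'] / p['die_mm2']:.2f} per mm²")

ratio = (449 / 81) / (649 / 520)
print(f"The CPU is priced ~{ratio:.1f}x higher per mm² of silicon")  # ~4.4x
```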
 
Just citing wafer costs misses the point of R&D.

To put it another way, if I were to give TSMC whatever amount of money for a wafer, that wafer would be useless since I have no idea how I want the transistors on that wafer laid out.

Organizing those tiny transistors into something useful is its own expense.

I'm not saying the net profit margins are large or small, but without knowing the money that went into designing the various architectures, we don't have an accurate measure of profit or loss.
 
Just citing wafer costs misses the point of R&D.

To put it another way, if I were to give TSMC whatever amount of money for a wafer, that wafer would be useless since I have no idea how I want the transistors on that wafer laid out.

Organizing those tiny transistors into something useful is its own expense.

I'm not saying the net profit margins are large or small, but without knowing the money that went into designing the various architectures, we don't have an accurate measure of profit or loss.
Dave Jones of EEVblog did a very good video a few years ago on this very subject. He said to take the total BOM (Bill of Materials) cost and multiply it by a factor of 3-4 before you can even think about making a profit in electronics.
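
A minimal sketch of that rule of thumb, with a made-up BOM figure purely for illustration:

```python
# EEVblog rule of thumb from the post above: retail needs to be roughly
# 3-4x the bill of materials before a profit is realistic.
bom_usd = 250  # hypothetical BOM for a graphics card; not a real figure

low, high = 3 * bom_usd, 4 * bom_usd
print(f"BOM ${bom_usd} -> viable retail price roughly ${low}-${high}")
```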
 
Colourful shows off the 4070 Ti; it has the exact same specs as the cancelled 4080 12GB

 
Probably an announcement. Nobody in their right mind launches a product right after Christmas.
When is CES? That's the one that happens in January, right?

They would get slated in reviews, as it would be them openly and publicly admitting that they did try to sell us a rebadged 4060 Ti/4070 Ti as a 4080.

...I _think_ the "4080 not 4080" chips were already with AIB partners when the name was pulled, and if there is a new SKU being released in January, it must already have had a lengthy production run at TSMC and the AIB partners to be released in any real volume. If there is a new Nvidia gaming GPU released in early Jan, I'm almost certain it will be the "4080 not 4080".
 
Colourful shows off the 4070 Ti; it has the exact same specs as the cancelled 4080 12GB

Wintermute makes a good point: the chips were in the hands of the AIBs, so they can't recut them into something lower.

Nvidia really shot themselves in the foot in the naming department with all this nonsense. So what price are they going to flog this for? The price bracket they wanted is firmly occupied by AMD. Good luck trying to sell a £700 4070 Ti (actually, I don't think they need luck; it will probably sell), but it will look like terrible value next to the 3080.
 
Just citing wafer costs misses the point of R&D.

To put it another way, if I were to give TSMC whatever amount of money for a wafer, that wafer would be useless since I have no idea how I want the transistors on that wafer laid out.

Organizing those tiny transistors into something useful is its own expense.

I'm not saying the net profit margins are large or small, but without knowing the money that went into designing the various architectures, we don't have an accurate measure of profit or loss.
Ryzen 5xxx was almost 4.5 times more expensive in R&D than Radeon 6xxx was? I doubt it. Also, since R&D would be the "ultimate cost", wouldn't that make redundant any talk about the increasing costs of new processes (the very thing a lot of people liked to throw around in the past to justify the price hikes)? :)
 
Ryzen 5xxx was almost 4.5 times more expensive in R&D than Radeon 6xxx was? I doubt it. Also, since R&D would be the "ultimate cost", wouldn't that make redundant any talk about the increasing costs of new processes (the very thing a lot of people liked to throw around in the past to justify the price hikes)? :)
At the risk of missing the point (and coming across as very boring)... it's all in the 10-Q reports, in excruciating detail.

NVIDIA Q3


AMD Q3



Revenue is everything booked to end customers (which I think for Gaming means chips to AIB partners; even the FE cards are not manufactured by Nvidia). Cost of Goods Sold (COGS) represents all the direct selling costs, including manufacturing/fabs, logistics, marketing etc. Revenue less COGS is Gross Profit, and Gross Profit/Revenue is gross margin.

You then have to deduct selling and general expenses (a whole bunch of costs that can't be directly attached to sales: think central costs, HQ etc.) to arrive at EBITDA, i.e. earnings before interest, taxation, depreciation and amortisation.

Next you deduct depreciation (the cost of investments in property, offices and hardware, with a bit expensed each year as its useful life runs out) and amortisation (the cost of past development/design work, expensed over its useful economic life) to arrive at EBIT, or operating profit. Take off your financing costs, and you're left with what the owners of the company (shareholders) get. It's all in the notes to the accounts.

It's a bit more complicated than the above, but gross margin is only a very small part of the story!
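
To make that waterfall concrete, here's a minimal sketch with entirely made-up numbers (no real 10-Q figures, and tax is ignored for simplicity):

```python
# Income-statement waterfall as described above. All figures are
# hypothetical placeholders, not taken from NVIDIA's or AMD's 10-Q.
revenue = 5900.0       # $m, booked to end customers (e.g. chips to AIBs)
cogs = 2700.0          # direct costs: fabs, logistics, marketing etc.
sga = 2200.0           # selling/general expenses: central costs, HQ etc.
depreciation = 150.0   # property/hardware, expensed over useful life
amortisation = 120.0   # past development/design, expensed over time
financing = 60.0       # interest costs

gross_profit = revenue - cogs
print(f"Gross margin: {gross_profit / revenue:.0%}")  # ~54%, looks great

ebitda = gross_profit - sga
ebit = ebitda - depreciation - amortisation           # operating profit
shareholders = ebit - financing                       # tax ignored here
print(f"EBIT: ${ebit:.0f}m, left for shareholders: ${shareholders:.0f}m")
```

Even with a healthy-looking gross margin at the top, most of it gets consumed further down the statement, which is why gross margin alone tells you very little.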
 