NVIDIA GeForce RTX 3080Ti to be "available" on June 3rd, RTX 3070Ti on June 10th

Yet everyone else thinks I'm nuts for saying that the 3080 FE was too cheap...

I really can't follow your rationale in the slightest. The 20 series was the RT beta-testing generation, with buyers paying for Nvidia's R&D, and the bump in performance this gen was to account for that. The 3080 Ti is just a straight money grab. Some execs read the market and realized they have a license to print money right now.

3070 vs 3080 = 1440p 20% avg FPS increase, 4K 25% avg FPS increase. €200 difference: roughly €10 per %.
3080 vs 3080 Ti = 1440p 10-12%, 4K 10-12%. €500 difference: roughly €50 per %.
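Spelled out as a quick back-of-the-envelope sketch (Python, purely illustrative, using only the figures quoted above):

```python
# Price premium paid per 1% of average FPS gained:
# EUR price gap divided by % FPS increase.
def euros_per_percent(price_delta_eur: float, fps_gain_pct: float) -> float:
    return price_delta_eur / fps_gain_pct

print(euros_per_percent(200, 20))  # 3070 -> 3080, 1440p:        10.0
print(euros_per_percent(200, 25))  # 3070 -> 3080, 4K:            8.0
print(euros_per_percent(500, 10))  # 3080 -> 3080 Ti, 10% gain:  50.0
print(euros_per_percent(500, 12))  # 3080 -> 3080 Ti, 12% gain: ~41.7
```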

If it wasn't for the current unprecedented stock shortages, Nvidia would get ridiculed to oblivion for this utter irrelevance of a GPU. I expect they won't even bother churning out any more 3080s even if they could.

Pretty sure that's the point.
 
I tend to measure not on performance or model number, but on manufacturing and distribution costs.
Pretty much impossible for a non-insider, I know, but I suspect Nvidia made very slim margins on the 3080 FE from the start.
As has been said before, the 3080 price was most likely a reaction to AMD cards' price-to-performance ratios. This likely then came round to bite them when fab costs started rising. If they had insufficient measures to offset or mitigate this, they have likely been treating the 3080 FE as a loss leader.
The 3080 Ti FE is probably priced closer to cost, possibly set more to the side of profit in order to offset losses on the 3080.

If you think NV made losses on any 3080 you are nuts, mate.

I also don't think it was a reaction to AMD cards' price/perf, but rather the launch of the PS5/Xbox Series X with their 4K/RT marketing that changed their thinking on price.
 
I'm not sure the margins are that bad for nVidia - people tend to look at the prices when buying in single units or tens, but they will have negotiated agreements/commitments and be purchasing in multiples of 1000s, where the price goes waaay down. For instance, for me or you, buying the parts to build some of the power regulation circuitry would cost something like £50 even assuming we could get the parts at a retail bulk discount, but for a company like nVidia it is only about £2-3.

Likewise with GDDR: nVidia have relationships there, have been involved with supporting development, have taken risk production at times, etc., and have access to pricing that even many large organisations don't have.
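To put rough numbers on that, here's a toy sketch of tiered volume pricing; the tier breakpoints and discount fractions are hypothetical assumptions for illustration, not real supplier data:

```python
# Purely illustrative: how tiered volume pricing can collapse a ~GBP 50
# retail parts cost to a few pounds per unit. All tiers below are
# hypothetical assumptions, not real supplier data.
RETAIL_BOM_GBP = 50.0

# Tiers sorted by minimum order quantity: (MOQ, fraction of retail paid).
VOLUME_TIERS = [
    (1, 1.00),        # one-off retail purchase
    (1_000, 0.30),    # distributor pricing (assumed)
    (100_000, 0.05),  # negotiated OEM agreement (assumed)
]

def unit_cost(quantity: int) -> float:
    """Per-unit cost at the best tier this order quantity qualifies for."""
    eligible = [fraction for moq, fraction in VOLUME_TIERS if quantity >= moq]
    return RETAIL_BOM_GBP * eligible[-1]  # last eligible tier is cheapest

print(unit_cost(1))        # 50.0 -- what you or I would pay at retail
print(unit_cost(500_000))  # 2.5  -- roughly the GBP 2-3 mentioned above
```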
 
I'm glad that is what you took away from the whole post: a complaint about the level of stock that AMD can offer. Also, 5 years behind? :cry::cry::cry::cry::cry: I've worked in the computing (hardware engineering) industry most of my life, and they were 5 years behind with Bulldozer, but saying they are 5 years behind with RDNA2 just shows either total bias or a lack of real engineering knowledge.

If you don't understand that it is only about marketing, then you are a lost cause I'm afraid; it's mostly bluff and bluster. And no, I am not saying they don't have 'faster' cards, but you are saying FIVE YEARS. That would be 2016, so AMD would only now be making a card as fast as a 980 Ti if that were the case. They are 15 years behind in marketing BS, I'd agree with that. ;)
OK, I oversimplified and maybe overestimated; it won't take AMD five years to catch up, I'm sure, but I was asking a genuine question.

You are talking to a guy who gained an electronic engineering degree before moving into this industry in my early 20s, has worked on both sides of it, vendor and retail, and has probably owned more Radeons than GeForces over the years. That said, from my time in vendorland, I know how long Nvidia have been working on technologies to differentiate themselves from their competition, whereas, as perfectly illustrated by your Bulldozer example, AMD's focus was on raw power until Lisa Su stepped in. Even then, her focus was firmly on CPU until recently, and they've seen the benefits of that. RDNA is still a very "power" focused development, and their "technologies" just appear, from the outside, to be reverse-engineered from what NV are doing with RT and DLSS. As I said, I wish I could convince more people to buy Radeon rather than fighting over Nvidia, but right now it's a one-sided fight and the winners are sorely aware of that fact.
 
If you've got a stash of MSRP Radeon cards lying around, I'll be first in line to buy those instead :cry: no convincing needed
Sorry, AMD allocated them for our SI business (yet another reason why I have NO reason to be biased towards Nvidia); all of the retail stock sold instantly.
 
If the 6800 XT were in stock and available for its £600 MSRP, I think many would go for it over paying double for a 3080 Ti. Unfortunately, since it's also overpriced, people figure that if they're paying ridiculous prices for either, they may as well get an Nvidia card.
 
Also, while Nvidia outsell AMD, games will be written to make full use of Nvidia tech like DLSS before developers think about adding support for whatever AMD add.
 
OK, I oversimplified and maybe overestimated; it won't take AMD five years to catch up, I'm sure, but I was asking a genuine question.

Yes, and a 4x-5x discrepancy in accuracy brings your motives into doubt, but thank you for the clarification; it is appreciated.

I wish I could convince more people to buy Radeon rather than fighting over Nvidia, but right now it's a one-sided fight and the winners are sorely aware of that fact.

That is marketing though, which was my point, and the same reason people still have the same hangover and doubt when looking at Ryzen. Tell me, what are the percentage splits between CPU brands in system sales, and how has that changed over the last 3 years? I am sure that is data you collect, or it is pretty obvious from the sales volume of the SI parts purchased/consumed.

RDNA is still a very "power" focused development, and their "technologies" just appear, from the outside, to be reverse-engineered from what NV are doing with RT and DLSS.

Interesting take, especially when you consider the about-face Nvidia pulled not so long ago with VRR, cough, I mean G-Sync. I think AMD have always tended to go with a more open approach rather than a proprietary one, and that harms them from a marketing point of view, and maybe sometimes even costs them a performance or best-in-class win at XYZ. In the end, though, it is what prevails as the dominant choice that matters, not how fast XYZ is for a couple of years in the middle. I'd say it's a 70-30 split for the systems I design, but then again a huge number of them have no GPU at all, not even on the CPU.
 
Sorry, AMD allocated them for our SI business (yet another reason why I have NO reason to be biased towards Nvidia); all of the retail stock sold instantly.

Can't you "install" them in a t-shirt :p (like some retailers are doing with other stuff).

(Not a serious suggestion)
 
The cards are only cheap if you ignore the distorted market, and Nvidia using smaller and smaller dies at higher and higher price points. For enthusiasts and others who like throwing money at the hobby it has not been as noticeable, but for most gamers you can see how things are moving south at mainstream and entry-level price points. It gets even worse for prebuilt systems, where it's even more evident.

80-series cards tended to use at least a bin of the top gaming GPU, and with Maxwell/Pascal Nvidia quietly pushed the GPUs another tier up, as AMD couldn't compete. People have also conveniently forgotten the price rise with Turing, as it's all relative to one of the worst generational improvements at launch. Even the GTX 1080 Ti wasn't cheap, as the GTX 1080 was at over £500 before that, and the GTX 980 Ti was at that price point before it, but at least the performance was there.

If anything, you can see Nvidia has essentially pushed the 80 Ti price point above £1000, just like they did with Turing, so realistically Turing price points didn't change that much. Also, the FE is a bait and switch: Nvidia can say they hit the review price, but the reality is that the actual street price for aftermarket models will be higher for the lifespan of the product. It's just more noticeable now because of miners.
 
I'm grabbing my popcorn and getting ready for all the people complaining about the price and Nvidia money-grabbing, yet who will still pre-order and then complain for 5 months waiting for the order to be fulfilled :D
 