
Possible Radeon 390X / 390 and 380X Spec / Benchmark (do not hotlink images!!!!!!)

I still remember people insisting that the extra 1GB of VRAM on the 7950/7970, compared to the 2GB 670/680, offered no benefit, with the textbook response of "by the time games use more than 2GB of VRAM, there won't be enough GPU grunt on those cards anyway".

Fast forward to today... Crossfire 7950s or 7970s with 3GB still do quite alright, while SLI 670/680 setups or the GTX 690 have reached the point of being choked by their 2GB of VRAM...

It's still the case now: 99 times out of 100 you'd run out of grunt before VRAM.
When the case was being argued for the 7970 over the 680 (at the time... two new generations of cards have come out since then), the higher specs didn't matter and certainly didn't offer anything over the 680.

Sorry to ruin that for you :o

Or he plays different games than you do :p He didn't say recent games, after all ;)

Must be the case and he's still playing Minesweeper or something. You shouldn't need to say "recent games" when talking about games in a general sense; surely it's implied that recent ones are included, and it's a nonsense statement that 7970s in CF can "max" them out.
 
I would not say max out, but I remember my old 7970s in Crossfire could play Crysis 3 at 2560x1440 on the highest settings apart from lowering the AA a bit. Not bad for GPUs their age.
 
It's still the case now: 99 times out of 100 you'd run out of grunt before VRAM.
I was talking about Crossfire 3GB vs SLI 2GB, not single cards.

I've seen at least a couple of threads where 690 and SLI 680 users state they are "running out of VRAM" and have to keep lowering quite a lot of settings in games to keep VRAM usage in check and avoid stuttering, while I don't really recall seeing a Crossfire 7950/7970/280/280X or 7990 user saying they are lowering settings because of "running out of VRAM" yet.

The GK104 had a good run, but the reality is that it is entering the retirement home sooner than Tahiti.

So I assume there will be zero arguments that the Titan X will last longer than the R9 390X since it has 12GB of VRAM against the 4GB or 8GB the AMD card will have??

:p
Didn't someone already say the Titan X will last longer than the 390X due to having more VRAM? :D

Can't wait for the awesome double-standard comment of "Titan X will last longer than the 390X due to having 4GB more VRAM, but the 390X will not last longer than the 980 Ti (assuming GPU grunt on par with one another) despite having 2GB more VRAM" :D
 
Heh, it's nothing to do with how I feel... I was merely commenting on what I've seen :p

To tell you the truth, I don't really care all that much anymore... like the GameWorks argument, it's quite clear that there will never be an agreement on what is truly responsible for the frequent underperformance of AMD cards in those titles. Since we will never get to the bottom of it, I would just point it out as a "known occurrence that is very likely to happen" to people who are considering buying new cards, and say that if they don't wish to take the gamble on the possibility of AMD cards underperforming in GameWorks titles, they should cough up the extra and go for Nvidia instead.
 
It may not be, if NV stops optimizing for older cards like they did with Kepler.

Agreed, Nvidia's practices just get worse lol. The number of posts I've read about people losing performance on their 7xx-series cards after Maxwell launched...

Oh well, it is what it is: if you want Nvidia you need to pay a lot and pay often...

Where you at, AMD? Hurry up and launch something new!
 
So Nvidia drivers are like iOS... if you don't keep up with their annual release like the iPhone, you should expect the software updates to degrade the performance of your existing hardware? :eek:
 
Unless you're running 640x480, they're not... are they? Unless of course, you're not really maxing out the games ;)

Well, I mainly play Counter-Strike, lol. But I'm also currently playing GTA 5 and it's all maxed out except for lowered AA and advanced settings. Though I'm still in the 100+ fps range, so I can push a little further on the eye candy.
 
This might be what AMD make 14nm chips on:

https://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=7046977

14nm SOI FinFETs, able to handle any die size

This presentation was made by IBM 5 months ago. IBM recently sold their fabs to GF and will provide them with the process, so GF can be ready in time for the Oak Ridge Titan replacement in 2018, which IBM has the CPU contract for.

tl;dr 14nm is viable for big chips, with all the benefits of SOI (more power-efficient than plain bulk silicon)
 
How are people judging the Titan X as overpriced? Because they can't afford it? Is there a cost-to-make breakdown anywhere?

Margin per Titan X is around 60% or more per card, but that 60% is after you have removed the 20% VAT, the 8% retailer margin, and import duty costs.

On a $1000 card purchased in the UK, I would estimate Nvidia are making around $360-$400 per Titan X in profit. For them to make no profit, the purchase price would have to be around $600 inc. VAT (around £380 inc.) per Titan X.

This margin is similar to Intel's, but lower than companies such as those that produce pure software, where markups of 200% are normal.
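
As a rough back-of-the-envelope sketch of the arithmetic above (an illustration only: the import duty rate is assumed, since the post doesn't state one, and any distributor cut between Nvidia and the retailer is ignored), the chain of deductions looks something like this in Python:

# Rough sketch of the margin arithmetic in the post above.
# IMPORT_DUTY is an assumed rate -- the post does not give one.
RETAIL_INC_VAT = 1000.00   # USD retail price, as quoted
VAT = 0.20                 # UK VAT
RETAILER_MARGIN = 0.08     # retailer cut, per the post
IMPORT_DUTY = 0.03         # assumed; not stated in the post
NVIDIA_MARGIN = 0.60       # Nvidia gross margin, per the post

ex_vat = RETAIL_INC_VAT / (1 + VAT)           # strip VAT: ~$833
to_channel = ex_vat * (1 - RETAILER_MARGIN)   # strip retailer cut: ~$767
to_nvidia = to_channel / (1 + IMPORT_DUTY)    # strip assumed duty: ~$744
gross_profit = to_nvidia * NVIDIA_MARGIN      # ~$447 at a 60% margin

print(f"Implied Nvidia revenue per card: ${to_nvidia:.0f}")
print(f"Implied gross profit per card:   ${gross_profit:.0f}")

That lands a bit above the post's $360-$400 estimate, which makes sense given the sketch skips the distributor margin the post presumably folds in.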
 
+1 ^^

Profits will be huge: very mature 28nm (cheaper), and the full chip being sold for way more than it could be if competitors already had competitive products in the market.

The proof is in the pudding >>> http://www.bit-tech.net/news/hardware/2015/02/12/nvidia-record-revenue-2015/1

That was before Titan X :eek:

Imagine the profits now..

 
Margin per Titan X is around 60% or more per card...

Source data?
 
Average gross margin for the Nvidia product range is 55%. This is publicly available information updated on a quarterly basis.

The Titan X will be higher margin than the lower-end products; halo cards always are. It may be higher than 60%.

Intel's margin is 60% across the range: lower at the low end and higher at the top end.

E-tailers which sell such products will operate in a 0-20% bracket, with 8-12% being the norm. I would expect the industry average in computer components is 8% per product, as they operate on volume.
 