> In a year or three, the in-house AI accelerators will hopefully start to reduce GPU sales by a big chunk and AMD/Nvidia will be back waving the "we love gamers" flag.

... dependent on global supply of wafers on advanced nodes - noting that there will be many companies vying for a finite wafer supply.
You're still wrong: even Apple uses TSMC, among other companies, so yes, it's relevant, because it paints a picture where the pre-scalped prices only exist on GPUs. I've already explained that the chain is the same, yet other companies have even achieved price reductions.
> ... dependent on global supply of wafers on advanced nodes - noting that there will be many companies vying for a finite wafer supply.

We need Intel to get its fabs sorted and working on custom designs for other companies.
> We need Intel to get its fabs sorted and working on custom designs for other companies.

... and hope that nothing untoward happens to the world-leading fabs in Taiwan, noting that TSMC has opened two fabs in Arizona already, with at least one more planned.
> ... and hope that nothing untoward happens to the world-leading fabs in Taiwan, noting that TSMC has opened two fabs in Arizona already, with at least one more planned.

The more fabs the better; it should help reduce wafer prices as capacity increases. It won't reduce GPU prices, though - sales will need to plummet for that to happen.
> The more fabs the better; it should help reduce wafer prices as capacity increases. It won't reduce GPU prices, though - sales will need to plummet for that to happen.

If demand were to plummet then supply would likely be reduced (noting the relatively few players in the advanced-node fabrication space) - akin to the way OPEC restricts production to keep the price up.
> Wafers are finite for everyone, yet all other parts are not following the same cost-increase pattern.

Maybe the other parts don't have the same demand? I can't see people clamouring to clear the shelves of MacBooks, honestly.
You can't seem to explain the disparity here.
AMD and Nvidia aren't the only ones that use silicon wafers at TSMC and other manufacturing plants.
I would say the 4080/4090 were the tipping point where it became worth it.
I would say the tipping point would be about 4070 Ti levels of RT performance.
> We need Intel to get its fabs sorted and working on custom designs for other companies.

To be honest, if I was Nvidia/AMD/Apple, I wouldn't trust Intel's fabs with my designs.
My point was that the 4070 Ti was barely better than my 3090, so it didn't quite justify a switch unless it was a 4080 or better, which seem to comfortably (or just about) run RT games at acceptable frame rates. As you can see in your chart @humbug, the 4080 is over 60 fps, so that kind of backs up my point above.
It's also 20% ahead of the 3090 in Cyberpunk.
It's also not worth (or wasn't at the time) the outlay.
The 3090? I think the 3090 was, and still is, a good GPU, but too expensive for what you got; it's no better today in raster than the sub-£500 card I have now.
I am referring to the point in time where I think RT became playable (post #11489). I said the 4080 or stronger cards are that point. My 3090 was almost there, but the 4080 is around 30% faster than it.
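To put the relative-performance numbers being thrown around here in context, here is a minimal back-of-the-envelope sketch in Python. The fps figures are hypothetical placeholders chosen only to illustrate the arithmetic of a ~30% uplift, not measured benchmark results.

```python
def uplift_pct(new_fps: float, old_fps: float) -> float:
    """Percentage by which new_fps exceeds old_fps."""
    return (new_fps / old_fps - 1) * 100

# Hypothetical placeholder figures, NOT real benchmark data:
rtx_3090_fps = 50.0                  # assume the 3090 lands just under a 60 fps target
rtx_4080_fps = rtx_3090_fps * 1.30   # "around 30% faster", per the post above

print(f"4080 vs 3090 uplift: {uplift_pct(rtx_4080_fps, rtx_3090_fps):.0f}%")  # ~30%
print(f"Projected 4080 frame rate: {rtx_4080_fps:.0f} fps")                   # ~65 fps, clearing the 60 fps bar
```

The point of the sketch is simply that a card sitting just below a 60 fps target plus a ~30% uplift comfortably clears it, which is the shape of the argument being made in the thread.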
I think Cyberpunk RT has got more difficult to run over the years because Nvidia had new cards they wanted you to upgrade to.
@CAT-THE-FIFTH said the same in another thread. The RT experts never mentioned this in the dedicated thread, though...
The Turing and Ampere flagships only pretended to be up to it, and the lower SKUs have always lacked the grunt tbh.
GeForce RTX 40 Series users can benefit from all the performance and image quality enhancements that DLSS 3.5 has to offer - including performance-multiplying Frame Generation for the fastest frame rates - in the demanding Ray Tracing: Overdrive Mode.