True, but (from a consumer side) that's irrelevant.
AMD have no money to develop more graphics processors - this is why we've seen basically the same Polaris 10 die in the RX 480 (2016), RX 580 (2017) and RX 590 (2018), three years of rebrands.
While nvidia develops a different processor for basically every card - TU102, TU104, GP102, TU106, GP104, GP106.
6 different dies!
It's not true though, it's a conclusion he has come up with because he didn't like the conclusion of the video in the other thread.
Certainly not a fact, based on his CS:GO images that supposedly proved his point.
The consumer mentality that put AMD where it is today isn't about how many dies you bring to market, but about how those dies perform in terms of price/performance and user experience. You could have one or two Polaris dies plus Vega 56/64 (like it was with the HD 7850/7870 and HD 7950/7970), as long as they offer the performance and perceived quality of use that consumers want. Even three dies would be enough if, at the prices they're sold at, they offer something solid.
The same mentality that killed 3dfx and forced the best manufacturer, Matrox, to stop releasing any new graphics products.
Must be something wrong with the consumer.
AMD have no money to develop more graphics processors - this is why we've seen basically the same Polaris 10 die in the RX 480 (2016), RX 580 (2017) and RX 590 (2018), three years of rebrands.
While nvidia develops a different processor for basically every card - TU102, TU104, GP102, TU106, GP104, GP106.
6 different dies!
I'm not convinced. Lisa Su has injected a hell of a lot of cash into R&D, and we all know they're trying a "Zen" approach in the GPU side of the business.
R&D increased by 25% in 2018 over 2017:
https://www.overclock3d.net/news/cp...s_increased_by_25_since_this_time_last_year/1
R&D increased every year by a good amount since Lisa Su came in in late 2014 (2015 budget probably set before she got into post):
https://www.statista.com/statistics/267873/amds-expenditure-on-research-and-development-since-2001/
Further, AMD paid off a lot of debt a few years back and are making profits for the first time in donkey's years. Lisa Su has said she will keep pushing R&D as a high priority.
She's got a PhD in semiconductors and knows exactly what it takes to make successful products.
Also note that throwing money at the problem isn't a solution on its own. You need capability, and that takes years to build up, hence the gradual but noticeable increase in R&D funding year on year.
Just wondering - Lisa Su hasn't been CEO at AMD for that long, in fairness; isn't Ryzen the first CPU that was developed from the ground up under her?
I'm assuming Polaris and Vega were not developed from the ground up under her, as they seemed to be in development before she arrived.
Long-ass video, but he has some interesting ideas that are at least plausible - whether they're true is yet to be seen.
Look, if "mainstream" Navi is in par of GTX1080Ti performance at mid range prices ( sub £300), is a huge jump in price/perf for the segment.
And being "mid range" is nothing to be shy about tbh. As a GTX1080Ti is perfect card to run 2560x1440 & 3440x1440 144hz.
I think 1080 Ti is a bit too much to ask for. I'm personally expecting 1080 non-Ti perf for ~£300.
Where are all these chips going to be made? Intel are up against the wall, unable to keep up, and they have heaps of fabs. From what I can see we're looking at all the chiplets coming from TSMC and the IO die coming from GloFo.
Sure, the chiplet idea should hopefully give great yields, but still - can TSMC produce enough for AMD to actually take significant share from Intel and Nvidia, even if the chips are better?
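On the yield point, the usual back-of-the-envelope way to see why small chiplets help is a simple Poisson defect model, yield ≈ e^(-defect density × die area). The die areas and defect density below are purely illustrative assumptions, not TSMC numbers:

```python
import math

def poisson_yield(area_mm2: float, defects_per_mm2: float) -> float:
    """Fraction of dies that come out defect-free under a simple Poisson model."""
    return math.exp(-defects_per_mm2 * area_mm2)

D0 = 0.002  # assumed defect density in defects/mm^2 -- purely illustrative

for name, area in [("700 mm^2 monolithic die", 700),
                   ("75 mm^2 chiplet", 75)]:
    print(f"{name}: ~{poisson_yield(area, D0):.0%} of dies come out clean")
```

With those made-up numbers a huge monolithic die yields around 25% while a small chiplet yields around 86%, which is why the chiplet approach stretches a given amount of wafer capacity so much further - though it says nothing about whether TSMC has enough raw wafer volume in the first place.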
Because fill rates do not make a GPU. Vega has plenty of bottlenecks that stop it making use of its other advantages. nVidia, on the other hand, worked out a lean, mean gaming arch without any compute stuff, then added RT circuitry for RTX and called it a day. AMD has to design a single arch for both the server market and the gaming market since they were a bit broke a few years back, so they could not make two archs simultaneously. I'm hoping that now they're in the black AMD will have two designs, one for servers and another for gaming, and if they do, they might kick some major arse.
Radeon RX Vega 64 was released in August 2017. This year AMD skipped launching a Radeon replacement - if they don't do it by August 2019, the Radeon RX Vega 64 will already be two years old.
Question: if the Radeon RX Vega 64's pixel and texture fillrates are 104.3 GPixels/s and 417.3 GTexels/s respectively, while the nvidia RTX 2080 Ti's are 136 GPixels/s and 420.2 GTexels/s, why is the RTX 2080 Ti 100% faster than the RX Vega 64?
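Those figures are just peak theoretical rates - ROPs (for pixels) or TMUs (for texels) multiplied by clock speed. Here's a quick sketch that reproduces the numbers quoted above; the unit counts are the published specs as I understand them, and the clocks are simply whatever the quoted figures imply:

```python
# Peak fill rate is just functional-unit count x clock speed.
# Unit counts are the published specs as I understand them; the clocks are
# the boost clocks implied by the figures quoted in the post above.
cards = {
    # name: (ROPs, TMUs, clock in GHz)
    "RX Vega 64":  (64, 256, 1.630),
    "RTX 2080 Ti": (88, 272, 1.545),
}

for name, (rops, tmus, clock_ghz) in cards.items():
    pixel_rate = rops * clock_ghz    # GPixels/s
    texture_rate = tmus * clock_ghz  # GTexels/s
    print(f"{name}: {pixel_rate:.1f} GPixels/s, {texture_rate:.1f} GTexels/s")
```

They say nothing about shader throughput, memory bandwidth, geometry rate or how well the architecture keeps those units fed in real games, which is where the actual gap opens up - i.e. the "fill rates do not make a GPU" point above.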