Oops, wrong thread
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
LOL, so wishful some of you.
At every release in recent years AMD have set prices for their mid and high-end cards that equal Nvidia's: Fury X slower than the 980 Ti but the same RRP and more expensive on the street; Vega 64 vs the 1080; Radeon VII vs the 2080.
Of course AMD are going to price these at $499 for 2070 performance. The 7nm node is very expensive.
If the next gen arch next year is a flop then I'll be fully in agreement. I get what AMD were trying to do when they bought ATi, but with the CPU division falling on its ass as well it just messed the company up too much to make good on their plans in any reasonable time frame.
Give Lisa Su her props though: she seems to have worked some magic and gotten momentum back. But "Next Gen" will be entirely on her watch, not leftovers from an RTG still trying to be independent. If her magic touch doesn't extend to graphics as well come Q1 2021, then it might be time to cut it loose again.
Unless, of course, graphics is still sustainable and doesn't require propping up with CPU profits.
One of the issues is that the benefits of combining GPU + CPU just haven't been realised, and it's not really clear where this will go. AMD did score the console deals because of it, although that is unlikely to have paid off the cost of buying ATI in the first place (especially given indirect costs). And it's not as if AMD would be left on the sidelines without GPUs if they had a good CPU offering. Many years later the market still favours an Intel CPU with an IGP, or an Intel CPU + discrete Nvidia GPU. As great as Ryzen is, people can't buy a Ryzen APU that makes a discrete GPU obsolete, so Ryzen owners still buy Nvidia GPUs. Nvidia and Intel are as strong as ever despite servicing only CPUs or GPUs, not both.
The failure of AMD to push APUs up to the mid-level is puzzling to me. I really expected that cards at the 1060/580 level would just cease to exist because you could get that performance in an APU for less money. If the Intel rumours are true then it will be Intel that causes the death of low-end discrete GPUs, not AMD. I think this even took Nvidia by surprise: they expected low-end discrete GPUs to die off and concentrated on mid and high end along with data centre/HPC, but low-end GPUs still thrive.
AMD do need a next-generation architecture, and 2021 is the earliest that will happen. I still feel there will be limitations in R&D for such a product, especially since AMD will need to divide R&D between HPC, deep learning, embedded/autonomous-vehicle, and consumer gaming parts. Nvidia has been splitting up their GPU designs since Pascal.
I haven't read the thread, but AMD seem to have pretty much not gone anywhere in the last 3 years (in terms of GPUs). Just the same performance under different names. At least they are making progress with their CPUs.
Nvidia are pretty much no better either: also the same performance under different names, with just one better card (excluding the Titan) for twice as much. How Nvidia are charging over Titan prices for their standard high-end card is staggering.
I feel I keep beating the same drum, but I'll continue to moan about this because in 19 years of owning a PC we have hit the point where it doesn't seem worth it/fun anymore. I miss the days of reasonable prices for decent upgrades and crazy fun overclocking (especially with CPUs: getting those old 1.6GHz Intel chips to, what was it, 2.8GHz+, big AMD Opteron clocks, 3DMark2001, 2003, 2005, 2006, oh and Counter-Strike of course/uni... I have nostalgia). It's not like that anymore. Hell, it's been over 10 years since Crysis and we've hardly moved on from that graphically. Maybe I'm just getting old...
Interesting theory about future APUs killing off the mid-range GPU. Why aren't there PS4 Pro-level APUs? I assume it's because everything would have to be engineered on a board, from the CPU and APU to the RAM, rather than just the chip. And even if you could get such an APU you'd have the RAM bottleneck, right? People have talked about HBM2 being on the actual chip to get around that problem.
The APU on Zen 2 will be quite interesting
Shared memory and memory bandwidth probably. Buying in 16GB of RAM just to give the graphics portion sufficient leg room, and still have enough left over to run the system nicely, defeats the point of getting a cheap APU to begin with. And even then, we always see the graphics held back because system RAM is never fast enough to keep the APU fed.
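Back-of-the-envelope numbers show how big that bandwidth gap is. The figures below are the standard spec values for dual-channel DDR4-3200 and for the GDDR5 on 1060/580-class cards; the script itself is just an illustrative sketch:

```python
# Rough peak-bandwidth comparison: dual-channel DDR4 feeding an APU
# versus the dedicated GDDR5 on a 1060/580-class discrete card.
# bandwidth (GB/s) = transfer rate (MT/s) * bus width (bytes) * channels / 1000

def bandwidth_gbs(mega_transfers, bus_width_bits, channels=1):
    """Peak bandwidth in GB/s from transfer rate (MT/s) and bus width (bits)."""
    return mega_transfers * (bus_width_bits / 8) * channels / 1000

ddr4_3200_dual = bandwidth_gbs(3200, 64, channels=2)  # typical desktop APU setup
gtx1060_gddr5  = bandwidth_gbs(8000, 192)             # 8 Gbps effective, 192-bit bus
rx580_gddr5    = bandwidth_gbs(8000, 256)             # 8 Gbps effective, 256-bit bus

print(f"DDR4-3200 dual channel: {ddr4_3200_dual:.1f} GB/s")  # 51.2 GB/s
print(f"GTX 1060 GDDR5:         {gtx1060_gddr5:.1f} GB/s")   # 192.0 GB/s
print(f"RX 580 GDDR5:           {rx580_gddr5:.1f} GB/s")     # 256.0 GB/s
```

So even the fastest common system RAM gives an APU roughly a quarter of the bandwidth a mid-range discrete card gets to itself, and the CPU cores are eating into that same pool.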
Memory bandwidth, plus the power required to get that going, and maybe some deal between AMD and Sony/Microsoft not to have competing products on the market in the same performance bracket? But, besides the form factor, I don't see why, as a gamer, you'd go with an APU instead of a dedicated CPU+GPU, which you can mix and match as you want, upgrade as per your needs, etc. So perhaps the market isn't big enough just yet.
I remember when the Kaveri APUs came out and I was toying with building a tiny HTPC with an A8 or A10. I commented on an AMD Facebook post about it casually saying "but what we really want is dedicated HBM strapped to the APU, guys, to let that GPU core fly!". They actually replied with "all in good time ". That was 2014 I think?
I do genuinely believe there is a crazy master plan at work, but it's so ambitious it was always going to be a long term endeavour and then further compounded by AMD's financial woes. 5 years after a meaningless Facebook exchange we get an AdoredTV video speculating on a custom Zen 3 EPYC with 22 CPU and GPU chiplets, and HBM2 strapped directly to the I/O die. Give it another 5 years and that Facebook exchange might just come to pass on the desktop.
You remember Hybrid Crossfire and SidePort memory on Phenom II boards? Bring that pupper back. If you're not going to put all the RAM into a single package, just slap the GDDR6 and DDR4 onto the motherboard. OK, it's not going to be upgradeable, but if you're designing a system for pre-determined purposes then you can just spec the amount of RAM you'd need.

Well, I assumed it would be possible to have 8GB of GDDR5/6 for the APU but a separate DDR3/4 system RAM as a hybrid solution. If there is a fundamental reason this isn't achievable then it makes the whole purchase of ATI by AMD somewhat limited in scope.
It's not what I necessarily want; it's more the direction mainstream gaming is expected to go if you listen to the analysts. If you suffer from upgraditis you may want to move to 4K, need a GPU to match like a 2080 Ti, and you'll upgrade more often than the gamer who's happy at 1080p for now, has a 290 and an i7 2700K, and will upgrade when he has to. The latter gamer would be very happy with a custom APU that offered Vega 56 and Ryzen 3600 performance and let him move up to 1440p while he's at it. That would be a compelling mass-market product, but the memory bandwidth would need to be worked around.
That's why 7nm Zen 2 should be fascinating from an APU perspective: how many compute units can they fit in at 7nm, and how fast can it go? The last-gen APUs had 11 compute units. There was supposedly a part tested with 20 compute units that hit Vega 56 performance. Now that rumour could well be fabricated, and I highly doubt Zen 2 hits that, both on the size of the APU and with the memory bottleneck.
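For a sense of scale, GCN peak FP32 compute goes as CUs × 64 shaders × 2 ops per clock (FMA) × clock speed. The clocks below are assumed, illustrative values, not specs for any confirmed part:

```python
# Peak FP32 throughput for a GCN part:
#   TFLOPS = CUs * 64 shaders/CU * 2 ops/clock (FMA) * clock (GHz) / 1000
def gcn_tflops(cus, clock_ghz):
    """Peak single-precision TFLOPS for a GCN GPU with the given CU count and clock."""
    return cus * 64 * 2 * clock_ghz / 1000

print(f"{gcn_tflops(11, 1.25):.2f} TFLOPS")  # 11-CU Vega APU at ~1.25 GHz -> 1.76
print(f"{gcn_tflops(20, 1.30):.2f} TFLOPS")  # rumoured 20-CU part at ~1.3 GHz -> 3.33
print(f"{gcn_tflops(56, 1.47):.2f} TFLOPS")  # Vega 56 at ~1.47 GHz boost -> 10.54
```

On raw compute alone a 20-CU part sits around a third of Vega 56, which is why the "hit Vega 56 performance" claim looks so optimistic before you even get to the memory bottleneck.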
I seem to recall the whole idea behind the acquisition was to usher in an era of Fusion and heterogeneous computing, where a CPU core and GPU could be used interchangeably by an application. As with most things AMD, it was an approach that was ahead of its time, for a market that wasn't quite ready for such a fundamental shift in technology. IMO AMD would have been better off just licensing the technology from ATI; ATI needed the money after the disastrous launch of their DX10 cards and I can't see them saying no to such an offer (ah, the beauty of hindsight).
Knowing AMD? Immature BIOSes and drivers, like always.
What's the after-the-fact answer to this thread then? Price?

Yes. Definitely price.
Price this time around: a bit high, combined with that ****** blower cooler. It actually blows, lol.

AMD drivers are better than Nvidia's these days.