Apple M1 Pro and M1 Max

Soldato
Joined
12 May 2014
Posts
5,236
Decided to post here rather than make a new thread.

Considering what Apple has done with their M1 chips, is there a chance that future CPUs will have significantly larger "RAM"/cache on, or next to, the processor? This would be alongside the standard DDR4/5 arrangement we have now.
I'm talking 1GB on a Ryzen 3 and more the higher up you go: say a Ryzen 7 sporting 4GB, and Threadrippers with 16GB at the low end and 64GB at the high end. How much would computational speed benefit from such an approach? Would we need to change how software is programmed to take advantage of it?

What about accelerators? Will we see more hardware accelerators for professional work?
 
Soldato
OP
Joined
6 Oct 2009
Posts
3,998
Location
London
Decided to post here rather than make a new thread.

Considering what Apple has done with their M1 chips, is there a chance that future CPUs will have significantly larger "RAM"/cache on, or next to, the processor? This would be alongside the standard DDR4/5 arrangement we have now.
I'm talking 1GB on a Ryzen 3 and more the higher up you go: say a Ryzen 7 sporting 4GB, and Threadrippers with 16GB at the low end and 64GB at the high end. How much would computational speed benefit from such an approach? Would we need to change how software is programmed to take advantage of it?

What about accelerators? Will we see more hardware accelerators for professional work?

It's generally expected that cache will keep increasing in future CPUs; that's quite uncontroversial and has been the trend for decades. If a computation task is memory-intensive or requires core-to-core communication, it could benefit from more and/or faster cache. If it isn't, extra cache makes little difference.

It will be quite a while before we see many GBs of cache though; that requires a huge amount of die area and would be incredibly expensive. To put it in context, the highest-end EPYC has 256MB of cache. We're a very long way off from the likes of 1GB of cache in a low-end CPU, likely decades away.
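To give a rough feel for what "memory-intensive" means in practice, here's a minimal C sketch (sizes are illustrative, not a rigorous benchmark): the same summation over a 128MB matrix, once walking it row by row so consecutive accesses hit the same cache lines, and once column by column so almost every access misses. On typical hardware the strided walk comes out several times slower even though the arithmetic is identical, and that's exactly the kind of workload where bigger and faster caches pay off.

[CODE]
/* A minimal sketch of cache sensitivity: the same summation over a
 * 128MB matrix, walked row by row (cache-friendly) and column by
 * column (cache-hostile). Sizes are illustrative, not a benchmark. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 4096  /* 4096 x 4096 doubles = 128MB, far bigger than any cache */

static double ms(struct timespec a, struct timespec b)
{
    return (b.tv_sec - a.tv_sec) * 1e3 + (b.tv_nsec - a.tv_nsec) / 1e6;
}

int main(void)
{
    double *m = malloc((size_t)N * N * sizeof *m);
    if (!m) return 1;
    for (size_t i = 0; i < (size_t)N * N; i++) m[i] = 1.0;

    struct timespec t0, t1;
    double sum = 0.0;

    /* Row-major walk: consecutive addresses, one cache miss per line. */
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t i = 0; i < N; i++)
        for (size_t j = 0; j < N; j++)
            sum += m[i * N + j];
    clock_gettime(CLOCK_MONOTONIC, &t1);
    printf("row-major:    %.1f ms (sum %.0f)\n", ms(t0, t1), sum);

    /* Column-major walk: stride of N doubles, a miss on nearly every access. */
    sum = 0.0;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t j = 0; j < N; j++)
        for (size_t i = 0; i < N; i++)
            sum += m[i * N + j];
    clock_gettime(CLOCK_MONOTONIC, &t1);
    printf("column-major: %.1f ms (sum %.0f)\n", ms(t0, t1), sum);

    free(m);
    return 0;
}
[/CODE]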
 
Soldato
Joined
12 May 2014
Posts
5,236
It's generally expected that cache will keep increasing in future CPUs; that's quite uncontroversial and has been the trend for decades. If a computation task is memory-intensive or requires core-to-core communication, it could benefit from more and/or faster cache. If it isn't, extra cache makes little difference.

It will be quite a while before we see many GBs of cache though; that requires a huge amount of die area and would be incredibly expensive. To put it in context, the highest-end EPYC has 256MB of cache. We're a very long way off from the likes of 1GB of cache in a low-end CPU, likely decades away.
Just had a proper look, and apparently the RAM on the M1 is a separate chip on the same package, so I worded my question wrong.

Will we see RAM added onto the same package as the rest of the CPU?

Let's take AMD: stick RAM off to the side and attach it with Infinity Fabric or something else. For desktop purposes the on-package RAM would be in addition to your standard DDR4/5, not a replacement. As an example, you would have a Ryzen 7 with 8GB of onboard RAM and 32GB of DDR4.

Is this something that could happen? Would Windows and/or software need to be coded differently to take advantage of it?
 
Soldato
OP
Joined
6 Oct 2009
Posts
3,998
Location
London
Just had a proper look, and apparently the RAM on the M1 is a separate chip on the same package, so I worded my question wrong.

Will we see RAM added onto the same package as the rest of the CPU?

Let's take AMD: stick RAM off to the side and attach it with Infinity Fabric or something else. For desktop purposes the on-package RAM would be in addition to your standard DDR4/5, not a replacement. As an example, you would have a Ryzen 7 with 8GB of onboard RAM and 32GB of DDR4.

Is this something that could happen? Would Windows and/or software need to be coded differently to take advantage of it?

No, we won't see that, because it makes no sense and it wouldn't make any difference anyway. Putting DRAM on the same package as a SoC just adds manufacturing complexity and takes flexibility away from OEMs, and Intel and AMD don't make DRAM. Some laptops have soldered RAM as well as SO-DIMM slots, but that RAM is not on the same package as the SoC itself. Being on the package makes no difference to performance or software anyway.

The only reason Apple does this is that it's space-efficient and reduces board complexity, as they make both the CPU and the end product, something Intel/AMD don't do. There are no performance benefits to this approach for Intel/AMD, but for Apple it makes perfect sense.
 
Associate
Joined
22 Jun 2018
Posts
1,583
Location
Doon the watah ... Scotland
As an example, you would have a Ryzen 7 with 8GB of onboard RAM and 32GB of DDR4.

Is this something that could happen? Would Windows and/or software need to be coded differently to take advantage of it?

It could; it's technically possible, but you would need a lot of control logic over what data you store in that local RAM as opposed to the sticks, and that cost/complexity likely outweighs the benefits.
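To make that concrete, here's a rough sketch of what choosing where data lives could look like from the software side, assuming the on-package RAM were exposed to the OS as its own NUMA node (that's my assumption; it's roughly how HBM is presented on some server parts). It uses libnuma on Linux, and the node number is made up:

[CODE]
/* A rough sketch, assuming the hypothetical on-package RAM shows up as
 * its own NUMA node (node 1 here is made up). Hot data is placed there
 * explicitly via libnuma; bulk data stays in ordinary DDR.
 * Build with: gcc place.c -lnuma */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <numa.h>

#define FAST_NODE 1              /* hypothetical node backed by on-package RAM */
#define HOT_BYTES (64UL << 20)   /* 64MB of frequently touched data */

int main(void)
{
    if (numa_available() < 0) {
        fprintf(stderr, "no NUMA support on this system\n");
        return 1;
    }

    /* Hot working set goes to the fast node... */
    double *hot = numa_alloc_onnode(HOT_BYTES, FAST_NODE);
    /* ...while bulk data stays wherever the default policy puts it. */
    double *bulk = malloc(1UL << 30);
    if (!hot || !bulk) {
        fprintf(stderr, "allocation failed\n");
        return 1;
    }

    memset(hot, 0, HOT_BYTES);   /* touch the pages so they actually get placed */

    /* ...do the memory-bound work on 'hot' here... */

    numa_free(hot, HOT_BYTES);
    free(bulk);
    return 0;
}
[/CODE]

Without something like this, or the OS transparently migrating hot pages for you, ordinary software wouldn't automatically end up with its working set in the fast pool, which is why I suspect the answer to "would software need to be coded differently" is yes if you wanted the full benefit.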

As fantastic as the M1 stuff is, I can't help thinking that the Apple route will plateau due to die size unless they can manage to split it towards the chiplet-type route.
 
Soldato
OP
Joined
6 Oct 2009
Posts
3,998
Location
London
It could; it's technically possible, but you would need a lot of control logic over what data you store in that local RAM as opposed to the sticks, and that cost/complexity likely outweighs the benefits.

As fantastic as the M1 stuff is, I can't help thinking that the Apple route will plateau due to die size unless they can manage to split it towards the chiplet-type route.

The M1's DRAM is not on the die and makes no difference to die size.
 
Soldato
Joined
12 May 2014
Posts
5,236
Funnily enough, it seems my suggestion has been done before. I happened to come across this review of a Broadwell chip (released in 2015) that was fitted with 128MB of eDRAM, pitted against a number of chips, one of them being the i7-10700 (released in 2020).

For non-gaming workloads the eDRAM didn't help with performance, and in some instances caused a regression, but it showed some impressive gaming numbers.

https://www.anandtech.com/show/1619...ective-review-in-2020-is-edram-still-worth-it
 
Associate
Joined
22 Jun 2018
Posts
1,583
Location
Doon the watah ... Scotland
I do laugh at this gaming criticism. Is it designed as a gaming platform? Nope. Is it marketed as a gaming platform? Nope. Is there an ecosystem around it for gaming? Nope. Is it good at gaming? Nope.

Is it designed as a production platform? Yup. Is it marketed as a production platform? Yup. Is there an ecosystem around it for production? Yup. Is it good at production? Yup.

So why there is a fixation on gaming is completely beyond me.
 

Deleted member 209350

But Jigger has told us that the M1X is going to dominate the gaming laptop market :cry:

Hilarious tbh; some people really blow things WAY out of proportion. There were certain individuals on this very forum claiming that these chips are more powerful than the latest consoles and would perform better, whilst also giving Nvidia's desktop GPUs a run for their money. What utter nonsense, and I hope they were hit hard by reality.

The truth is that whilst these are very good for video/photo editors who specifically use Apple apps, for almost everything outside of that you'll be able to find something that performs better for less money. Except the battery life, of course, as that is unrivalled.
 

Deleted member 209350

I do laugh at this gaming criticism. Is it designed as a gaming platform? Nope. Is it marketed as a gaming platform? Nope. Is there an ecosystem around it for gaming? Nope. Is it good at gaming? Nope.

Is it designed as a production platform? Yup. Is it marketed as a production platform? Yup. Is there an ecosystem around it for production? Yup. Is it good at production? Yup.

So why there is a fixation on gaming is completely beyond me.

Yeah, exactly. It's made for Apple app users who are into photo and video editing, not gaming... but people just love to make the comparison for whatever reason.

I hope Apple never gets into any gaming stuff and just sticks with their productivity stuff.
 
Soldato
OP
Joined
6 Oct 2009
Posts
3,998
Location
London
I do laugh at this gaming criticism. Is it designed as a gaming platform? Nope. Is it marketed as a gaming platform? Nope. Is there an ecosystem around it for gaming? Nope. Is it good at gaming? Nope.

Is it designed as a production platform? Yup. Is it marketed as a production platform? Yup. Is there an ecosystem around it for production? Yup. Is it good at production? Yup.

So why there is a fixation on gaming is completely beyond me.

Certain idiots blew the expectations for these chips way out of proportion, specifically for gaming, despite, as you say, Apple making no claims about gaming. There are some people whose understanding of a GPU is "game game game", so when they heard "powerful GPU" their minds automatically went to "must be awesome for gaming". Then there's a user here who made asinine claims about these being the best gaming laptops and has been quiet ever since the reviews came out :D

There's also another crew who basically hate everything Apple does or makes, so they got their bone and can't stop talking about how uncompetitive these are at gaming. They literally run Windows through Parallels, without any GPU drivers, then use Windows' emulation to run x86 Windows games, again without drivers, to show that these chips can't get good fps even at 720p, all so they can say they suck at gaming and that the GPU is the same as a 2050 Ti or something.

Now both sides blame the other for their own stupidity and claim all they're doing is refuting the other side, so they've got their excuses to keep going.

For some reason Apple really brings out the worst in people, fans and haters alike. It brings the idiot inside all of us to the surface :cry:
 

Deleted member 209350

Certain idiots blew the expectations for these chips way out of proportion, specifically for gaming, despite, as you say, Apple making no claims about gaming. There are some people whose understanding of a GPU is "game game game", so when they heard "powerful GPU" their minds automatically went to "must be awesome for gaming". Then there's a user here who made asinine claims about these being the best gaming laptops and has been quiet ever since the reviews came out :D

There's also another crew who basically hate everything Apple does or makes, so they got their bone and can't stop talking about how uncompetitive these are at gaming. They literally run Windows through Parallels, without any GPU drivers, then use Windows' emulation to run x86 Windows games, again without drivers, to show that these chips can't get good fps even at 720p, all so they can say they suck at gaming and that the GPU is the same as a 2050 Ti or something.

Now both sides blame the other for their own stupidity and claim all they're doing is refuting the other side, so they've got their excuses to keep going.

For some reason Apple really brings out the worst in people, fans and haters alike. It brings the idiot inside all of us to the surface :cry:

The main problem with Apple seems not to be Apple themselves, but the people who are massive Apple fanboys! A lot of them talk so much rubbish and are usually very condescending, which in turn provokes people into trying to knock Apple down as their way of arguing, and vice versa.

If people were just normal and didn't fanboy over specific companies (not just Apple), the tech world would be a better place. It would be much better in general if people were just fans of specific products and technologies, regardless of where they come from or who made them, but sadly I don't think that's ever going to happen.
 
Soldato
OP
Joined
6 Oct 2009
Posts
3,998
Location
London
The main problem with Apple seems not to be Apple themselves, but the people who are massive Apple fanboys! A lot of them talk so much rubbish and are usually very condescending, which in turn provokes people into trying to knock Apple down as their way of arguing, and vice versa.

If people were just normal and didn't fanboy over specific companies (not just Apple), the tech world would be a better place. It would be much better in general if people were just fans of specific products and technologies, regardless of where they come from or who made them, but sadly I don't think that's ever going to happen.

It was always obvious to anyone who uses Macs that these chips wouldn't be any good for gaming, for the simple reason that there are no AAA games to even play on them, no matter the hardware! It doesn't get simpler than that. Apple has not taken Mac gaming seriously in any way for a very long time, and made no mention of it during the launch anyway.

Hopefully this gaming meme will die off now that every reviewer has been clear that nobody should buy these chips for gaming, as if it wasn't obvious to begin with. You can spend half the price, get twice the gaming performance, and have access to hundreds of titles that you can't even play on the MacBooks regardless of performance.
 