Apple M1 Pro and M1 Max

Pretty decent performance from the M1 Max on the games it can run:

Native games look OK-ish, but a long way off the higher-end Windows gaming laptop GPUs. Even a mobile 3060 equivalent might be generous when it comes to real games: some native titles are a generation or more behind on performance, and anything emulated looks two to three generations behind.

Yeah, totally gonna kill gaming laptops, even if we ignore the lack of a comprehensive gaming software ecosystem, and the fact that people do more than just play games: they also run companion applications while gaming, which often don't have non-Windows variants.
 

It’s totally going to kill sales of gaming laptops with the performance and battery life on offer. The installed base of Mac users and the ability to run Windows will be a tempting market for games developers, and that will drive more sales of the M1.
 

Lol, at this point I think you're trolling. GPU, games availability, emulation, etc. aside, the screens on these MacBooks have awful response times and ~4-5 frames of trailing, as mentioned by AnandTech's reviewer on Twitter. Very similar to previous Apple IPS screens: they heavily prioritised accuracy over speed, for good reason given their use case.
 

It’s interesting you can’t see the potential with this chip. It’s simply miles ahead of anything else on the market.
 
You can’t see developers wanting to reach a new market with a large user base and top-level hardware?

Not when the hardware has already been optimised for a specific market (productivity), with fixed-function acceleration to suit that rather than games.
 

That doesn’t seem to be the situation, though. This new Apple chip looks very similar to existing hardware to write for.
 

Honestly, you're embarrassing yourself.
 

I don't know why I'm bothering, maybe for the benefit of someone who's reading. But here it is:

First, it takes several years for this user base to be meaningful in numbers. It's not a large user base now, and it won't be in a couple of years either. Apple is still selling high-end Intel Macs with AMD GPUs and will continue to do so into 2022. What you describe (a large Apple silicon user base with powerful GPUs) is several years away. By then, these current products will be 3-5 years old and quite likely obsolete.

Second, Apple will need to develop the software side of things to allow this to happen. So far they haven't done so, and they've shown no indication that they're even interested. What they've done for gaming in Metal has been heavily focused on iOS gaming rather than desktop AAA games for Macs. Maybe this changes in the future, but it won't change overnight. It will take years, and again, by then the current generation of products will be old and obsolete.

Third, the rest of the hardware is just not gaming-optimised. For example, the screens have very high response times, because Apple cares about colour accuracy and sharpness rather than speed. So to compete with gaming laptops, Apple needs new hardware too, and by then there will be new chips.

Fourth, assuming we get feature parity between Metal and Vulkan/DX12, these GPUs are, in their microarchitecture, heavily optimised for compute. There's a reason AMD can match Nvidia in gaming but not in compute, and there's a reason Apple is easily catching up to AMD in compute but not in gaming. The microarchitectural priorities are different. That means even if all of the above happens, Apple will still need to prioritise gaming in future GPU microarchitectures, i.e. in products that haven't been released yet and won't be for a few years.

So to make it clear: these current products, the M1 Pro/Max, won't make a difference to the gaming industry and won't be gaming laptops. Whether things change in the future is irrelevant; these products won't be part of that future even if Apple becomes the king of gaming.
 
I'm really looking forward to the M2. I can see Apple adding even more dedicated processing units, who knows what for, but it's cool to see them have so much control over what they deem important.

I've just gotten into Final Cut a bit more seriously, and would love some hardware-accelerated ProRes encode/decode, but for now my M1 MBP is still pretty snappy!
 

Assuming the M2 uses the A15's uarch and node, we should expect a ~10% faster CPU with 20% better power efficiency, and a ~25% faster GPU.
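Those generational multipliers are easy to turn into rough projections. A minimal sketch, where the M1 baseline figures are purely illustrative placeholders (not real benchmark numbers):

```python
# Project hypothetical M2 figures from M1 baselines using the
# multipliers above: ~10% CPU perf, ~20% CPU efficiency, ~25% GPU.
# The baseline values are illustrative assumptions, not measurements.

m1_baseline = {
    "cpu_score": 1000.0,          # arbitrary performance index
    "cpu_perf_per_watt": 100.0,   # arbitrary efficiency index
    "gpu_score": 1000.0,
}

gains = {
    "cpu_score": 1.10,
    "cpu_perf_per_watt": 1.20,
    "gpu_score": 1.25,
}

m2_projected = {k: round(v * gains[k], 1) for k, v in m1_baseline.items()}
print(m2_projected)
```

On those assumed baselines, the projection works out to 1100 CPU, 120 perf/W, and 1250 GPU; the point is only the relative uplift, not the absolute numbers.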
 

All of which would be appreciated, although it still won't let us play Cyberpunk at 4K ultra just yet... That being said, what if in the background they're working on ray-tracing cores? Or DLSS-type acceleration? Maybe to also put into Apple TVs and iPads? I guess if these cores can also be helpful outside gaming, that's more likely...

In a more practical sense, I could see them adding some sort of AR-boosting capabilities (though they might already be covered by the general ML cores). I can see Apple's next push into AR expanding what we expect computers can do. Something about processing spatial info in real time, but extremely power-efficiently?
 

Well, Cyberpunk isn't even released for macOS anyway; even if they shipped a supercomputer GPU it would make no difference. AMD's FidelityFX works on M1 GPUs. As for ray tracing, there is no hardware support, only software ray tracing on Metal.

Maybe Apple is doing stuff behind the scenes, who knows. There's potential here, but that's all there is for now.


These chips are already a winner for Apple in wearables; just look at how successful the Apple Watch has been compared to the competition, and the chip is a big part of it. For AR wearables, it will be even more so. There's a reason Apple is focusing so much on the efficiency cores of their silicon, which has gone under the radar: this year they made them 25% faster, and the A15's efficiency cores are now 3.5x faster than ARM's A55 efficiency cores. This performance-per-watt advantage in both CPU and GPU is critical for AR devices, which have to be lightweight and battery-operated to become mainstream products.

So no doubt their AR chips will have dedicated silicon in there to boost AR-specific features, e.g. greater spatial awareness. The ML cores are effectively TPUs, so any tensor computations can be offloaded there, which again is helpful for AR.
 

There is zero chance that the M1 GPU software and chip hardware teams aren't at least talking to each other about ray tracing. It can also be used for special-effects generation for film and TV. I can see a big announcement in a few years about how fast it is in Motion or another effects program, with improvements in games tacked on at the end. Who knows, maybe the calculations used for ray tracing are also super useful elsewhere, like modelling how Wi-Fi or sound waves bounce through a room, paired with LiDAR scanning to give new options.

Wish I could jump in that time machine to see where Apple is in ten years... I guess until then I will entertain myself watching people get mad about benchmark charts.
 

Always more exciting to see it in action every year than jump 10 years and see the result :D
 

In ten years the juggernaut that is Intel will probably be at full steam.

The next 5 years are probably all about Zen and M1, with APUs pushing into new areas. 5 years after that, I think Intel will be back and furiously determined to take over those new markets.
 
News time: if you take the M1 Max chip, turn it upside down, and photograph it (with the right tools), you can see that the chip has an interconnect on one side of the die, a detail that was never shown in any of the marketing materials.

So Apple has already been playing with interconnects, and it's possible the M2/M3 will be MCM CPUs capable of performance 2x to 4x greater than the M1 Max.
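Where that 2x-4x range could come from is easy to sketch: multi-die scaling is rarely perfectly linear. A toy model, where the 90% scaling efficiency is a pure assumption for illustration:

```python
# Toy model: aggregate throughput of a multi-chip module built from
# M1 Max-class dies, assuming each extra die contributes less than
# its full standalone performance. The efficiency figure is an
# illustrative assumption, not a measured value.

def mcm_speedup(dies: int, scaling_efficiency: float = 0.9) -> float:
    """Speedup over one die if each additional die contributes only
    `scaling_efficiency` of its standalone performance."""
    return 1.0 + (dies - 1) * scaling_efficiency

for dies in (2, 4):
    print(f"{dies} dies -> {mcm_speedup(dies):.1f}x a single M1 Max")
```

Under that assumption a 2-die package lands near 1.9x and a 4-die package near 3.7x, which is roughly the shape of the 2x-4x claim.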
 

Just catching up on this thread. Amazing chips from Apple. They'll be the go-to choice for many looking at video editing and other specific use cases.

A side note:
I love watching people make a complete **** of themselves. Cheers jigger.
 