Apple M3

I skimmed that site a while ago, and - no offense to Blue Cat - I wasn't sure how worried to be today.
For the past few months, I've been looking for evidence that Apple have screwed up - as that article implies - but haven't found any so far.

In 2020, Apple said, "you must do X* to protect performance." There's a good reason** they did this. Apple usually gives two years' notice of breaking changes, so it seems a bit rich for developers to do nothing about it and then go all Pikachu-faced in 2023 when performance suffers.

The linked article says "X looked like an optional feature" immediately below the video that explicitly says it isn't optional. I found documentation for the things it said weren't documented. Others claim the API's buggy, but nobody has yet been willing/able to say what the bugs are; and I've seen working implementations of all the main operating modes.

Not saying that it's totally trivial, BTW, especially for plugins written once and compiled to many formats/platforms. Looks like JUCE only got updated at the end of last year, so it will be a while before that code makes it to developers who couldn't/didn't want to address the issue directly. After that, it still has to get through developers' regular patch and release cycles (some more regular than others!)... so, could be a while before it reaches users.

* where X = "add additional real-time audio processing threads to the audio device's workgroup". macOS does this automatically for the 'default' thread, so most(?) plugins with only one real-time thread required no action. Some of the issues are a bit subtle, but anyone writing a plugin with multiple r/t threads must already know what they're doing: otherwise, it's like giving a grenade launcher to a chimp. Apple warns developers of this (admittedly, using different terminology.)

** if you have lots of real-time threads potentially in different processes (e.g. in a multi-architecture DAW with lots of plugins!), the OS needs to know how those threads cooperate to produce their audio. It can't work this out automatically; priority and QoS flags aren't expressive enough; and letting the scheduling order emerge organically from the usual concurrency primitives is far too slow. "Audio workgroups" is a mechanism that allows developers of multi-RT-threaded plugins to tell the OS what it needs to know.
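
(For anyone wondering what "X" looks like in practice, here's a minimal sketch - assuming a recent macOS SDK, with error handling trimmed and `deviceID` obtained elsewhere. You fetch the device's workgroup via the kAudioDevicePropertyIOThreadOSWorkgroup property, then have each additional real-time thread join it:)

```c
#include <CoreAudio/CoreAudio.h>
#include <os/workgroup.h>

// Fetch the os_workgroup_t for the device's IO thread.
static os_workgroup_t copy_device_workgroup(AudioObjectID deviceID) {
    AudioObjectPropertyAddress addr = {
        kAudioDevicePropertyIOThreadOSWorkgroup,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMain
    };
    os_workgroup_t wg = NULL;
    UInt32 size = sizeof(wg);
    OSStatus err = AudioObjectGetPropertyData(deviceID, &addr, 0, NULL,
                                              &size, &wg);
    return (err == noErr) ? wg : NULL; // caller releases with os_release()
}

// Body of each additional real-time audio thread.
static void aux_rt_thread(os_workgroup_t wg) {
    os_workgroup_join_token_s token;
    if (os_workgroup_join(wg, &token) == 0) {
        // ... real-time DSP here; the scheduler now knows this thread
        // cooperates with the device's IO thread to meet its deadline ...
        os_workgroup_leave(wg, &token);
    }
}
```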
 
I don't know. These VR/AR headsets are cool but have so many drawbacks. The weight on your head, poor battery life, cost, etc.

At some point they will make them work for gaming, experiences, etc., but I seriously doubt they will ever work as general-use computers. Back in 2010 everyone predicted tablets would replace desktop and laptop computers... and yet here we are.
I'm interested in seeing how the AVP works more from a "I'm a professional and need a lot of screen space to get the job done" perspective.

Like, can I have Logic and Premiere open simultaneously with unlimited space for the windows, expanding and contracting them easily as needed for the tracks I'm juggling?

How about Excel?

Less worried about battery life, surely we could just keep it plugged in for desk duty?

Neck pain may be a thing (cf. carpal tunnel, for example.)
 
I'm interested in seeing how the AVP works more from a "I'm a professional and need a lot of screen space to get the job done" perspective.

Like, can I have Logic and Premiere open simultaneously with unlimited space for the windows, expanding and contracting them easily as needed for the tracks I'm juggling?

How about Excel?

Less worried about battery life, surely we could just keep it plugged in for desk duty?

Neck pain may be a thing (cf. carpal tunnel, for example.)
Currently, the Vision Pro can only provide one virtual 4K display for a Mac, so you could have Premiere on the Mac running that way and the iPad version of Logic running separately. I believe that Excel is going to be available as a native Vision Pro app, so you'd probably have more flexibility there.
 
I'm interested in seeing how the AVP works more from a "I'm a professional and need a lot of screen space to get the job done" perspective.
They'd need to use a TB cable connected to a Mac for that to work. Even Wi-Fi 7 only gets up to 40Gbps in the best of conditions.

Thunderbolt 5 goes up to 120Gbps (in its asymmetric Bandwidth Boost mode; 80Gbps symmetric) and you can get up to three 4K monitors at high refresh rate.
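
(Back-of-the-envelope, assuming uncompressed 10-bit RGB and ignoring blanking intervals and DSC - so real link budgets will differ - three 4K streams at 120Hz do fit inside that:)

```c
// Rough sanity check: do three uncompressed 4K/120 streams fit in 120Gbps?
#include <stdio.h>

int main(void) {
    const double px   = 3840.0 * 2160.0;      // pixels per 4K frame
    const double bpp  = 30.0;                 // 10-bit RGB
    const double hz   = 120.0;                // "high refresh rate"
    const double gbps = px * bpp * hz / 1e9;  // per display
    printf("one display:    %.1f Gb/s\n", gbps);      // ~29.9
    printf("three displays: %.1f Gb/s\n", 3 * gbps);  // ~89.6, under 120
    return 0;
}
```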
 
They'd need to use a TB cable connected to a Mac for that to work. Even Wi-Fi 7 only gets up to 40Gbps in the best of conditions.

Thunderbolt 5 goes up to 120Gbps (in its asymmetric Bandwidth Boost mode; 80Gbps symmetric) and you can get up to three 4K monitors at high refresh rate.
Apple does have a trick WiFi Remote Desktop High Bandwidth mode that, on a fast network (wired or wireless), can stream like you're sitting in front of the remote computer. It actually turns off the remote computer's display when it's connected in this mode, and the Vision Pro is almost certainly using something similar (or identical) to stream the Mac display to the headset.

That said, you'll have a lot more flexibility with native Vision Pro apps - those can be arrayed all around you, however you like. Due to the small market size at release, native apps are probably going to be rather thin on the ground - most apps will be iPad apps that developers are allowing to run on the Vision Pro and, while those are apparently very well-supported, they won't be the same as native apps.
 
Apple does have a trick WiFi Remote Desktop High Bandwidth mode that, on a fast network (wired or wireless), can stream like you're sitting in front of the remote computer. It actually turns off the remote computer's display when it's connected in this mode, and the Vision Pro is almost certainly using something similar (or identical) to stream the Mac display to the headset.

That said, you'll have a lot more flexibility with native Vision Pro apps - those can be arrayed all around you, however you like. Due to the small market size at release, native apps are probably going to be rather thin on the ground - most apps will be iPad apps that developers are allowing to run on the Vision Pro and, while those are apparently very well-supported, they won't be the same as native apps.
Wonder if that will include Logic Pro...?
 
I've been waiting for years now for Apple to step up with a new Mac Pro that's upgradable. I would like to see AI capability integrations, which to me says stacks of GPUs.
 
The weight didn't bother me, but I'm not bothered by the weight of AirPods Max either.

And cost is why I said "offspring." The first products are always expensive, and then they come down later. That happens with everything, from Synclaviers to EVs (getting in the obligatory car analogy).

Battery life isn't an insurmountable issue, any more than it is with power tools, i.e. you just swap them for a charged one.



The FF I don't give about gaming is huge, but I think they're going to be a big deal - eventually. iPads may not have replaced real computers, but iPhones have done pretty well even though they haven't replaced computers.

At first I wasn't especially excited, but as I said, the Van Gogh Experience convinced me otherwise - just as the Palm Treo made it obvious that someone (Apple) was going to come in with a far more advanced version and clean up.

Whether they replace computers is an open question, but I don't think this is going to be another Google Glass/Glasshole technology.
Can you explain to me why the goggles need to contain the entire computer? Wouldn't a simple wired or wireless goggle monitor with good speakers built in be the same experience? In fact, since headphones are such a big deal for people like us, it seems like those should be separate so you can get what you want. I know it would be missing Apple extras like people fading in and out as they approach, but I have to assume others will eventually adopt any features that are found to be popular.

Why spend so much on Mac goggles when attaching existing goggles and headphones to my M2 Ultra Mac Studio would get me the same experience? Or am I missing something?
 
Can you explain to me why the goggles need to contain the entire computer? Wouldn't a simple wired or wireless goggle monitor with good speakers built in be the same experience? In fact, since headphones are such a big deal for people like us, it seems like those should be separate so you can get what you want. I know it would be missing Apple extras like people fading in and out as they approach, but I have to assume others will eventually adopt any features that are found to be popular.

Why spend so much on Mac goggles when attaching existing goggles and headphones to my M2 Ultra Mac Studio would get me the same experience? Or am I missing something?
They don't want something that tethers you to a computer, so while the Vision Pro can act as a large display for a Mac (via WiFi), it's really more designed to be used while sitting on a couch/chair. Having its own OS allows them to tailor the interface and interaction to the device - if it was just a Mac accessory, you'd still probably be using it at a desk with a keyboard and mouse/trackpad, just as you'd use a Mac with a normal display, and since it's not designed primarily as a gaming device, I don't know what the compelling use case would be.
 
It's worth thinking about the very first iPhone, and comparing that very basic phone (with no real apps to speak of, but a whole new interface paradigm to adjust to) with how the early reviewers seem to be taking to the AVP.

That is to say, it's cool tech, but nobody really knows what the killer app is yet.

To me, the iPhone didn't really take its first step toward its final form until the 6 Plus, which was released in 2014, seven years after the first iPhone.

However, that is a point - the entire goggle assembly could be attached to a base unit with a cable.

There must be good reasons they didn't want to do that.

I think it's because AVP is not meant to be a computer accessory. They don't want you to think of it as "VR". It's meant to be the future of the iPhone.
 
However, that is a point - the entire goggle assembly could be attached to a base unit with a cable.

There must be good reasons they didn't want to do that.
I think it's because the final form of this device is closer to eyeglasses than what we see now, and the competition (largely Meta) has already moved not only the computing power, but also the battery, into the device. Given the much higher quality of everything in Vision Pro, they were not able to get the battery internal, but given Apple's control over their silicon and where they want this tech to go, the computing power had to be internal, and this was the point where they had a minimum releasable product.

That is to say, it's cool tech, but nobody really knows what the killer app is yet.

To me, the iPhone didn't really take its first step toward its final form until the 6 Plus, which was released in 2014, seven years after the first iPhone.
I agree with the first part, but only partially with the second - the App Store allowed the iPhone to become a general-purpose computing device, and allowed users to turn it into whatever they wanted/needed it to be (within what Apple allows, of course).

The release of the 6/6 Plus is probably where it became something that, for many users, could be used as your only computer (due to screen real estate and battery life), rather than a companion device to the other tech in your life. This is where the smart phone became something that everyone needed to have, because it could do just about anything you previously needed a traditional computer to access - instead of getting an unreliable, cheap laptop and WiFi at home to access digital services, you could now just get a smartphone instead.

Apple saw a huge spike in sales in the iPhone 6 cycle that would, paradoxically, cause them some pain over the following five or so years, as that sales spike turned out to be a one-time thing and not a harbinger of continued iPhone growth (Apple Watch, AirPods, and AppleTV+ are all part of their need to create growth as iPhone sales flattened out). This was also probably the point where the rest of the industry started to see mobile as the central market, which spurred massive growth for a number of other businesses, but this was all enabled by the birth of the App Store in 2008. Without the App Store, a larger iPhone doesn't really matter.
 
@rnb_2 There was a time when Steve Jobs thought "web apps" were gonna be the thing. I guess Apple should be glad that Scott Forstall eventually got his way!

Although AVP has several/many/most iOS apps available in 2D, it's almost comparable to the iPhone's web-app dark ages. The WSJ AVP review showed how cooking timers can be "floated" above the various pots and pans you might have on a stove while cooking - this is the sort of thing that will make a certain demographic say "woooahhhh". There will be many more of these woah moments, covering many more demographics. (And just wait until there is a larger installed userbase of AVP tech; the floating timers from the pots-and-pans demo will be "mood indicators" above your friends' and family members' heads...)

My point about AVP being the future of the iPhone is... well, stop for a minute and think, what *is* an iPhone? As you noted, for many people it's their only computing device, and I think more importantly from Apple's perspective, it's a device where all of a person's digital life comes together.

What's the first thing that happens when you get a new iPhone? You log in and it brings in all your mail, messages, contacts, photos....

What's the first thing that happens when you get a new iPad? You log in and it brings in all your mail, messages, contacts, photos....

What's the first thing that happens when you get an AVP? It's a rhetorical question. As you also noted, Apple have been struggling to sell iPhones for a while, because after the 6/plus, people kinda realised you don't *really* need anything more. How do they sell more iPhones? By completely re-inventing how you interact with it. And what will the killer app for AVP eventually be?

You log in and it brings in all your mail, messages, contacts, photos....
 
given Apple's control over their silicon and where they want this tech to go, the computing power had to be internal
I don't quite follow that, i.e. why it makes a difference where the computing power is. The length of the wire does make some difference, but is that significant in this case?
 
Apple saw a huge spike in sales in the iPhone 6 cycle that would, paradoxically, cause them some pain over the following five or so years, as that sales spike turned out to be a one-time thing
Yup, I used an iPhone 6 Plus for five or six years - although of course it wasn't my main computer.
 
I don't quite follow that, i.e. why it makes a difference where the computing power is. The length of the wire does make some difference, but is that significant in this case?
In addition to what I described earlier (minimum viable product is something that lets you move around freely, competitive issues with what is already on the market, etc), from a technical perspective, I think it could just be a combination of two things:

1) M2 + R1 give enough performance to provide the experience they're looking for, and removing them from the headset might not save much size/weight; more performance just reduces battery life.

2) Given the amount of data involved, that cable connecting the headset to the external processor would have to be pretty special, since it would need to carry the data for two 4K+ screens, plus all of the cameras and eye sensors in the headset, to and from the processors in real time, all the time. Much better to have everything located together, in close proximity to each other in the device (much like Apple Silicon gets some of its performance by having its RAM close by, on a wide data bus).
 
2) Given the amount of data involved, that cable connecting the headset to the external processor would have to be pretty special, since it would need to carry the data for two 4K+ screens, plus all of the cameras and eye sensors in the headset, to and from the processors in real time, all the time.
A tether's a tether, but on the technical front I think that's well within current capabilities: VirtualLink was meant to be an alternative mode of USB-C that would do this. I didn't really follow it, but it all looked technically plausible... it sounds like it failed partly because it messed with pins outside of the usual alt-mode sandbox. Now there's USB4, and that certainly has the bandwidth. Latency estimates for older USB standards were around 100μs, so I doubt it'll be a round-trip over USB that creates a latency problem.

As a bonus, USB also delivers power, which is handy while batteries remain stubbornly heavy.
 
A tether's a tether, but on the technical front I think that's well within current capabilities: VirtualLink was meant to be an alternative mode of USB-C that would do this. I didn't really follow it, but it all looked technically plausible... it sounds like it failed partly because it messed with pins outside of the usual alt-mode sandbox. Now there's USB4, and that certainly has the bandwidth. Latency estimates for older USB standards were around 100μs, so I doubt it'll be a round-trip over USB that creates a latency problem.

As a bonus, USB also delivers power, which is handy while batteries remain stubbornly heavy.
That was a seat-of-the-pants hunch from late last night, but you inspired me to look up the actual resolution (~3800x3000, so between 4K and 5K per eye) and calculate things out. The Vision Pro will top out at either 90 or 96Hz (the latter for 24fps media), so without compression (unknown at this point), 30-bit-per-pixel video data alone comes out to 61-65Gb/s, or 50%+ more than TB4/USB4's 40Gb/s.
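
(The arithmetic behind that estimate, as a quick sketch - uncompressed, 10 bits per channel, both eyes:)

```c
// Reproducing the estimate above: two ~3800x3000 eye buffers at 30bpp,
// uncompressed, at the Vision Pro's 90/96Hz refresh rates.
#include <stdio.h>

int main(void) {
    const double px_per_eye     = 3800.0 * 3000.0;
    const double bits_per_frame = px_per_eye * 2 * 30; // both eyes, 30bpp
    printf("90 Hz: %.1f Gb/s\n", bits_per_frame * 90 / 1e9); // ~61.6
    printf("96 Hz: %.1f Gb/s\n", bits_per_frame * 96 / 1e9); // ~65.7
    return 0;
}
```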

I'm sure there is some compression going on, but the video data isn't all that would need to move back and forth. While it might be technically possible to do this over USB4, it would be a very heavy lift - better to keep everything in close proximity.
 