Vision Pro Notes: Media Consumption

This is the third part of my series on early notes from the Vision Pro. This one is focused on media consumption—earlier entries covered the hardware and interface and productivity.

  • Media consumption with photos and videos is fantastic. There’s nothing like it. Watching movies on it is even better. I’ve never had a particularly good home theater system. Now I do.
  • I started by watching Moana in the Disney+ app in their theater environment. It is like having your own movie theater. I got so absorbed in the movie’s climax that I teared up a bit. Since I couldn’t wipe my eyes with the Vision Pro on, my light seal cushions got wet, which was kind of funny.
  • 3D videos are impressive, but at this point they feel more like a demo. Once I have older videos of my family to look back on, they’ll start ruining light seal cushions the way Moana did.
  • Panoramas look great. I will be shooting a lot more of them going forward. I can tell newer vs. older panoramic photos based on their fidelity. I want to be able to make some of them the equivalent of a background wallpaper so I can put apps in front of them. My guess is Apple is more focused on Environments.
  • I watched a Netflix show in Safari. It was also great, but app-specific media is better.
  • The big asterisk with media consumption is that it is a solitary experience. There are shows I watch without my family, and it’s great for that. The device does not enable any joint viewing experience.

Vision Pro Notes: Productivity

Yesterday, I wrote my notes about the Vision Pro hardware and its interface. Here are my notes on productivity:

  • visionOS has roots in iPadOS, and it shows. You’ll be disappointed if you are looking for a Vision Pro to replace a Mac.
  • Instead, I’ve focused on ways Vision Pro is superior to the Mac for productivity, like my writing cabin.
  • Vision Pro is very good at keeping me isolated for focused work. I can already be productive with the device where that focus matters.
  • We don’t have enough environments to get the most out of that last point.
  • I found a connected Bluetooth keyboard a big help. I use a connected trackpad much less, but it also can come in handy.
  • That said, dictation is much better than it used to be; don’t forget to use it on the Vision Pro.
  • Fantastical is a stand-out app. Putting up your calendar and making it the size of a wall is pretty damn cool. It works particularly well for big-picture monthly, quarterly, and yearly views. I’ve got a massive version of my monthly calendar installed on my ceiling. As I think about next month, I can look at the ceiling to see what’s on deck.
  • MindNode Next is also an interesting early entry. It’s a mind-mapping app but also a brainstorming app where you can locate ideas in space.
  • Ideation development (like MindNode) is an excellent use case for Vision Pro. Apple’s Freeform could also serve in this capacity, but it’s not yet there. My experiments continue.
  • If you want to capture a lot of text, try Aiko, an AI-based transcription tool. You just hit the record button, which converts the recording to text with the Whisper AI engine. I checked with the developer, who reports all work is done on-device.
  • Mac display mode acts as an escape hatch, but I don’t see it replacing monitors for extended work sessions. It makes tons of sense to have a big display attached to a laptop in a hotel room or to give you the ability to move your desktop Mac display to a different room, though.
  • We are in early days for the productivity question on Vision Pro. There are still many workflows to be explored and apps to be released.

Vision Pro Notes: The Hardware and Interface

Now that I’ve logged some serious hours in the Vision Pro, I thought I’d share some thoughts about it. This post focuses on the hardware and interface:

  • Strapping into the Vision Pro does feel a little bit like getting ready for a spacewalk. I generally charge the battery with it disconnected, letting me store the hardware (along with a keyboard) in a drawer. When it’s time to go into the device, I put the battery in a pocket and run the cable under my shirt to my neck to avoid getting tangled in things if I go mobile.
  • For productivity work, a keyboard is necessary. I had an extra keyboard and trackpad. I’ve combined them into one unit using this gizmo from Amazon. Twelve South also makes one that looks a little nicer.
  • The screens are excellent, and anything rendered by them (apps, icons, environments) is entirely believable. The pass-through cameras, however, are darker and grainier than I expected.
  • The pre-release discussion of it being too heavy was overblown. I’ve worn it for hours without much trouble.
  • The Dual Loop Band is more comfortable for me than the Solo Knit Band, but the Solo Knit Band is more straightforward to configure. I use the Solo Knit band for short sessions and the Dual Loop band for longer ones, like watching movies.
  • The audio on the Vision Pro is much better than I expected. I connected my AirPods earlier today to confirm they work, but I’ve been using the built-in speakers exclusively thus far for everything (including watching movies), and they seem fine to me.
  • You must train yourself to avoid picking it up by the light seal. It’s a light magnetic connection, and it is easy to drop the device.
  • Touch targets on iPad apps are too small. The eye tracking works great with native apps but is sometimes tricky with iPad apps.
  • One of the nice touches: when you grab the handle of a window, it automatically rotates to align with where you’re standing in the room. There are so many subtle details in the way it renders windows. The shadows cast on real-world objects are another of my favorites.
  • If you’re having trouble with tracking, make the object bigger by stretching it or bringing it closer to you. I kept forgetting about that.
  • You can rotate a window by rotating your head.
  • The pinch gesture only works with your palm facing down. I never got it to work with my palm up.
  • You can long-press the pinch gesture, and you get right-click options. I’d like to know how many other ideas they have for gestures as this product matures.
  • Strangely, I think I feel things when I touch them: virtual keyboard keys, butterflies, and the like.
  • I struggle a little bit with app management. There aren’t any options except to go through the alphabetical list.
  • It seems silly that you can’t set favorites, have a dock, or otherwise arrange your applications beyond the main screen.
  • With a device so dependent on dictation, there should be an easier way to trigger dictation without resorting to the virtual keyboard.

Contextual Computing with Vision Pro: My Writing Cabin

[Image: Apple’s Notes app with all panes open against a virtual Yosemite Valley background, viewed through a Vision Pro]
Looking at Yosemite Valley while writing in Apple Notes

This entire post was composed on Apple Vision Pro with dictation and a Bluetooth Apple Keyboard attached…in virtual Yosemite Valley.

One of my interests in the visionOS platform is whether I can use it to get work done. Apple thinks so and has made that a big part of the marketing push for this device. However, it is a new platform with a fledgling App Store and many questions surrounding whether it is useful for productive work.

Moreover, the types of workflows that lend themselves to the platform are also in question. Don’t forget the Vision Pro operating system is based on the iPad, not the Mac. It’s easy to strap on this new device, thinking you can turn it into a Mac. (The fact that you can mirror a Mac display makes it even more tempting.) That’s the mistake I made with the iPad, and I spent years drilling empty wells, looking for productivity workflows that would allow me to duplicate Mac workflows. It was only after I accepted the iPad as an iPad that it became productive for me.

I’m not going to make that mistake with the Vision Pro. I’m going into this thing with open eyes and a sense of curiosity for where it can be used to get work done.

This is not a Macintosh. It is something else. And that is where the opportunity lies. While Mac workflows don’t work here in visionOS, are there things in visionOS that don’t work on a Mac? That is where we should be looking.

And for me, that starts with the idea of contextual computing. I have long felt that computers put too much interference between you and your work.

If you want to write an email, you need to open an email application, which will show you a bunch of new emails, but not a compose window where you can write that email. So many times, you’ll start with that task to write that important email but never actually find your way to the compose window. If you want to work on your task list, you often have to wade through screens and screens of existing tasks before you can get to the ones you need. Put simply, computers need to put you in the context of the work with as little interference as possible.

Sadly, most modern software doesn’t do that. Instead, it does the exact opposite. This is partly due to bad design and partly because tech companies have figured out ways to monetize your attention. They are intentionally trying to divert you from the work. That’s how they keep the lights on. One of the easiest ways to be more productive on any platform is to find quick ways to get yourself in the context of the work you seek to do with as little interference as possible.

This is where visionOS and Vision Pro come in. It’s a new platform tightly controlled by one of the only big tech companies interested in user privacy. This new visionOS is where you can work if you are smart about it.

I’m still experimenting and figuring out my workflows, but here’s an easy one I’ve been using in visionOS for several days: my context-based writing space.

It starts in Yosemite Valley. Using the visionOS Environments feature, I’ve placed myself in an immersive rendition of Yosemite Valley in winter. There’s snow on the ground, but I’m sitting there right now, comfortably, with just my socks on … which is nice.

The main screen in front of me has Apple Notes, where I’m writing this article. To my left is a floating research pane with Safari in it. That’s it. A little research. A place to write. Yosemite Valley. I’ve written about 3,000 words here in the valley over the last few days, which is very comforting. I’ve got a focus mode on, so I don’t get interrupted, and I genuinely feel alone with my words. That’s important. For this to work, I need to be off the grid. This is my cabin in the woods, where I do my writing.

When I’m not writing, I don’t go to Yosemite to watch a visionOS movie, or check email, or play with some other aspect of visionOS. My brain is already figuring out that Yosemite Valley equals writing. My Mac is far away, back at my studio, along with the cognitive load that comes with the work I do on my Mac. That’s all a distant memory here in Yosemite Valley. My brain is successfully duped.

As the context sticks, the work gets easier. This is a form of contextual computing that I’ve never experienced before. I’ve tried it with other headsets, but the poor-quality screens made it unbearable. I expect this writing context will only get easier over time. As the habit sticks and more writing apps and tools show up, I’ll consider bringing the better ones with me on future trips to the valley.

When I’m done writing, I leave this place, knowing Yosemite Valley will be there the next time I want to write.

This immersive context is not possible while sitting at a Mac. And for me, it is just the beginning of these explorations. I’m considering building a similar workflow in some other environment for journaling. And I’ve got more ideas after that.

This started simply as a proof-of-concept experiment, but now it’s set for me. I’ll return here the next time I need to do some serious writing. It’s already working: the valley appears, and my brain says, “Okay. Let’s get to it. Let’s start moving that cursor.”

This is a digitally created, distraction-free environment made possible by visionOS. And this is the productivity story for Vision Pro: I’m not looking to replace an existing platform but to find new workflows that are only possible on the new platform. The valley proves it’s possible. Now I need to see what else it can do. visionOS isn’t at a place where it can become my only operating system, but that doesn’t mean it can’t be an essential tool in helping me get my work done.

Apple Vision Pro Thoughts

It’s a big week for those contemplating buying a Vision Pro. Apple has always prided itself on only releasing products when they are “done.” While I have no doubt that the Vision Pro is done, I also think the use case for the product is far from done…This is a post for MacSparky Labs Members only.

The Vision Pro Software Question

One of the bigger questions around the looming release of the Vision Pro is software. Specifically, will there be any, and will it be any good? We don’t know yet. We’ve seen some offerings from Apple and some limited offerings from third parties, but now that we have a shipping date on the hardware, announcements are starting to roll out. The Omni Group announced OmniPlan will be on the new hardware. I suspect they’ll be announcing more. This is where the companies that adopted SwiftUI will get their payoff.

But it’s too early to tell whether a healthy software lineup will be available to us on day one. The device could skew toward enterprise-style software, given its cost. But I could equally see many of the better developers getting apps on it despite its small initial adoption, just to plant their flag. We’ll see.

Spatial Video Demonstrations

John Gruber spent more time with Vision Pro, focusing on the Photos app, including Spatial Video and panoramic photos. In short, John was impressed, and this is just the first iteration of this stuff.

These things are hard to predict. (It took a pandemic for video chat to get legs.) Nevertheless, as families and friends are spread to the four winds, this holodeck-like experience could be a big deal. Moreover, I’ve lost enough people to appreciate how memories fade. My dad died over 30 years ago, and I’d give a lot to be able to feel his presence again, even if just part of a silly spatial video file.

If this takes off, it could become a killer feature for Apple’s future Vision products. And as John explains, when iOS 17.2 is released, you’ll be able to start recording those spatial videos immediately with your iPhone 15 Pro, even if you don’t yet own a Vision Pro headset.

Apple’s Successful Beta Season

Hooray! It’s macOS Sonoma release day. This new operating system has some great features, including better dictation, small (but nice) improvements to the Apple productivity apps (Kanban in Reminders!), a bunch of messaging and video conferencing updates, PDF tools, and desktop widgets. Stephen and I will cover them in depth on this weekend’s episode of the Mac Power Users.

But there is a more significant point here. Apple had an extraordinarily successful beta season in 2023. They promised some solid updates in June at WWDC and delivered on them from the beginning. None of the new features required a “boil the ocean” effort by Apple, but all of them provided a lot of bang-for-the-buck, quality-of-life improvements.

Moreover, they got it all out with very little drama. They didn’t have to delay the iPadOS launch until weeks (or months) after the iOS release, and this macOS version is shipping just a week after the iPhone. At the same time, they are also going full tilt at an entirely new operating system with visionOS.

I am impressed with the effort all of that took. I would love to know the story behind all of this. I’ve got four pet theories, none of which have a lick of evidence:

Apple is getting better at this

They’ve been updating multiple major operating systems for over a decade now. I expect they’ve learned quite a bit.

Labor and Management Improvements

That experience also gives management a better idea of what to expect and aim for while giving engineering a better idea of what they can pull off in the time allotted.

A Lot of the Heavy Lifting is Behind Them

Apple has taken on some big tasks over the past decade as they’ve moved to SwiftUI and made significant architectural changes to the programming interface and the chipset. All of that is behind them now, which should make things easier from this point.

Forced Constraints

This theory is probably a stretch, but I think the fact that Apple has looming deadlines related to visionOS gave the company even more focus across the board. That showed up in the decisions and work done on everything else.

I expect all of these factors (and probably a few more) played a role in Apple’s success this year, but I hope their management is paying attention and figuring out how to make this the new norm.

visionOS Experiments and Thoughts

The visionOS SDK has only been out a few days, but developers are already digging in. Initial reports are encouraging: app developers are getting their apps up and running on the new spatial computing platform. MacStories did a great job collecting posts from many experimenting developers.

Two things come to mind seeing how quickly developers are adapting:

This is part of the SwiftUI Master Plan

Developers who have been building their SwiftUI muscles are getting paid back right now. visionOS runs on SwiftUI, so developers who are up to speed on it get an easy on-ramp to the new platform.

This is only Phase One

While it’s great to see existing apps running on the new spatial computing platform, I consider that just phase one. Phase two is where it gets interesting: that’s when developers come up with applications that could only work on the new platform. I don’t know what those killer apps will be yet, but some enterprising developers are cooking them up right now.

The app I’d love to see is a spatial-computing planning and idea-development application, where I could put the headset on and have a 10’ x 10’ (or bigger) board to develop ideas and plans. I would probably make it contextual and place it somewhere in Yosemite or Yellowstone, but it would be the place I go to work on big ideas. Then, when I’m done, I could take the headset off and go make a sandwich.