With Audio Hijack, you can record any audio on your Mac. Save audio from individual applications such as Safari or Zoom, record from hardware audio devices like microphones and mixers, or even capture the audio output of the entire system. Whatever you need to do with audio on your Mac, Audio Hijack can help.
Record conversations from Zoom, FaceTime, and other VoIP apps
Save streaming audio from the web
Create podcasts, both remote and in-studio
And so much more
Audio Hijack 4.2 just shipped with three powerful new effects (Speech Denoise, Parametric EQ, and FIR Filter) to enhance your audio, alongside over two dozen improvements. Check out the latest version of the award-winning Audio Hijack today.
I use Audio Hijack daily. I’ve used many audio apps over the years, and Audio Hijack is both the most innovative and reliable in its class. That’s not an easy trick to pull off, but Rogue Amoeba just continues to do so.
And through the end of June, MacSparky readers can save 20% when you purchase with coupon code SPARKY20. Learn more and download a free trial on the Audio Hijack site.
Several journalists got to strap into the new Apple Vision Pro headset. Three of my favorite reads are from Matthew Panzarino, Jason Snell, and Chance Miller. I’ve also spoken to several others here in Cupertino who got the demo, and every one of them praises the technology. This truly is one of those “only Apple” products. No other company has the hardware and software expertise to pull it off. The 12-millisecond latency particularly stands out.
The question that we’ll be asking ourselves is how this technology can change our lives. Looking back at the Apple Watch, Apple initially leaned into it as a fashion product but eventually came around to the idea that it is primarily a health and fitness device. I think for the Vision Pro, the killer use case will be even more stratified. Some people will want one to consume content. Imagine having an IMAX-equivalent screen you can strap on your head, or watching your favorite sport virtually on the field next to your favorite players. I love the idea of creating virtual workspaces where I could journal in the middle of Yosemite or do some work while enjoying the view from Machu Picchu. I also like the idea of a virtual 40-foot whiteboard that I could use in my 14-foot office.
The answer to how we’d use this is going to be “it depends on the person.” Can Apple continue to nail and improve upon this technology? Likely. Will this technology reach critical mass as it gets more affordable? That depends on whether there are enough good reasons for it.
Here I am at Apple Park, having spent the morning watching the keynote event on campus. It was a lot of fun and, of course, I have thoughts:
The software updates were more robust than I expected. I’d originally heard this year was going to be primarily bug fixes with everyone at Apple focused on Vision Pro, but that wasn’t the case. Even if they hadn’t introduced the Vision Pro, it was a good year in general.
Live widgets on the Mac? Yes please.
I like the changes to watchOS. The widgets feel like they are taking another run at the now-abandoned Siri face. I thought it was a good idea then. Hopefully they nail it this time.
They spoke about using a new voice recognition engine on all platforms. I’ll be testing that … soon.
We didn’t hear much about Apple’s productivity apps during the presentation. It wouldn’t surprise me to find out there are improvements that they didn’t discuss.
Messages has to be the most commonly used iPhone app (in the US at least). It makes sense that they are continuing to improve it.
Again we heard about gaming. I think Apple is ultimately going to have to do a better job of courting AAA game developers, but it’s nice to see they are now actively working on it. For years it felt like they’d just given up on gaming.
The New Macs
The new 15-inch MacBook Air was expected, but still nice to see. It’s basically a 13-inch MacBook Air, but bigger. I spent some time with one after the event. The lack of speaker pinholes on either side of the keyboard makes it look strange to me because I’ve never seen a large Apple laptop look like that. On the plus side, that’s one less place for water to get in.
The M2 Mac Studio looks exactly like the M1 model. However, I’m pleased that it got an annual update to the M2. There were rumors that they would skip years with the Mac Studio and I’m glad that’s not the case. Now that Apple makes its own chips, I’d hope that everything gets updated with each new M-chip iteration. (I’m looking at you, iMac.)
The Mac Pro feels like a bit of a compromise. I spoke to an Apple rep who confirmed the Mac Pro will not run external video cards. It does, however, have a lot of slots if you need more storage and I/O. I’m not sure what else those slots will run, but I’m guessing we’ll hear about it soon.
The new Mac Pro has the nicest looking logic board I’ve ever seen.
They had some of the units on display in the Steve Jobs Theater. The crowd around them had the same vibe as Macworld when Apple announced the iPhone.
Apple Vision Pro
When it comes to introducing a new product category, Apple remains the best company in the world to do so.
Speaking of messaging, Apple really was leaning into the Vision Pro as an augmented reality device, not virtual reality. I don’t recall them using the term “virtual reality” once.
The battery-in-the-pocket thing makes total sense given the early-days nature of this product.
Many times Apple explained how the Vision Pro is only a first step. There is no question that they have a whiteboard somewhere that says “Tony Stark Glasses” on it.
The real trick to this device will be the screen resolution and brightness. If it is going to trick the user into thinking they are looking through the glasses at the actual world, it needs to be convincing. This has been the downfall of every VR device I’ve tried in the past. If Apple can stick this landing, it will be game-changing.
I think the whole way the device is aware of your surroundings and lets you phase in and back out as necessary is clever. I can’t see myself keeping the Vision Pro headset on while talking to someone for more than a few seconds, but I think they are on the right track.
Turning it into a 3D camera is cool, but the idea of watching my daughter blow out birthday candles with that thing strapped to my head repulses me.
$3,499. I was fooled. I thought they’d come in well below $3,000. With that price, this is very much an “early adopter” product.
The Event Vibe
Apple has the “Keynote at Apple Park” thing nailed down at this point. This year they added shade. It’s not just a tarp, but a metal structure with footings, so they’ll clearly be doing this again next year.
As always, it is nice reconnecting with friends from both the media and Apple. This part makes me miss the old days of Macworld, where it was a true gathering of the tribe.
It is amazing how densely Apple packs information into their presentations these days.
And now I have an appointment to light this computer on fire with a beta. I’ll see you on the other side. Below are some pictures from the event.
Today, Apple is going to announce a new product category and that is always exciting. Here are a few things to think about as they announce their headset:
What’s the Technology Jump?
There’s been a lot of speculation and rumors that the screen technology in Apple’s new headset will be a significant jump in resolution over everything before it. If that’s true, will virtual reality cross a tipping point today, where the experience feels more about the “reality” part and less about the “virtual” part? If so, this could really be the jump this technology needed in order to become something more.
What’s the User Interface?
Apple has a history of revolutionizing user interfaces. They did it on the Mac, the iPhone, the iPad, and the Apple Watch. They’ve been spending a lot of time and money researching virtual and augmented reality leading up to today. I can’t wait to see how they tackle the user interface problem. If history is any guide, whatever they announce today will be in all competing products within a few years.
What’s the Story?
Another thing Apple is good at is coming up with a story around their technology. Assuming we do have significantly better technology, and a better user interface, how does that change virtual reality into something that’s more useful than playing VR games? What is their story behind this product? Where do they see us using it? How can it make our lives easier?
Those are the big three questions in my mind as we head towards today’s keynote. If Apple can deliver on those three questions, virtual reality is about to get much more interesting.
While it has been little secret that Apple will announce its headset next week (we’re now hearing that they’ve invited many virtual reality experts to the event), the question of when Apple will release the headset is much thornier. Of note, we’ve not seen leaks from manufacturing, which usually happen once Apple hardware goes into production. That could mean either that Apple is not yet in production or that they are making headsets in small, controlled batches.
Also of note is that nobody outside of Apple has seen the current state of the software. They could have the hardware locked down but are still waiting to finish the software. Historically, new product categories are the only place where Apple announces early. The iPhone, iPad, and Apple Watch were all announced months before product release. In effect, Apple is planting its flag in the product space.
I don’t think anyone outside of Apple knows when you’ll be able to buy a headset, and I wouldn’t be surprised if Apple is a bit vague about that question when they announce, with something like “later this year” or “early next year”. Regardless, if I had to bet a nickel, I’d say it is at least six months before you can buy one.
Lately, I’ve been experimenting with some of these Large Language Model (LLM) artificial intelligence services, particularly ChatGPT. Several readers have taken issue with my categorization of ChatGPT as “artificial intelligence”. The reason, they argue, is that ChatGPT really is not an artificial intelligence system. It is a linguistic model looking at a massive amount of data and smashing words together without any understanding of what they actually mean. Technologically, it has more in common with the grammar checker in Microsoft Word than with HAL from 2001: A Space Odyssey.
You can ask ChatGPT for the difference between apples and bananas, and it will give you a credible response, but under the covers, it has no idea what an apple or a banana actually is.
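To see what these readers mean, here is a deliberately tiny caricature in Python: a bigram model that does nothing but count which word follows which and then predicts the most frequent follower. The corpus and every name in it are hypothetical, and a real LLM is vastly more sophisticated (neural networks over tokens, not word-count tables), but the basic point carries over: prediction without understanding.

```python
from collections import defaultdict, Counter

# Toy training text. The "model" will never know what an apple is;
# it only learns which words tend to follow which.
corpus = (
    "apples are red fruit . bananas are yellow fruit . "
    "apples grow on trees . bananas grow on plants ."
).split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(word):
    """Return the most frequent follower of `word` in the training text."""
    counts = follows[word]
    return counts.most_common(1)[0][0] if counts else None

# "grow" was followed by "on" twice, so "on" is the confident prediction,
# produced purely from counting, with zero comprehension of growing.
print(next_word("grow"))
```

Scaled up by many orders of magnitude, with far better statistics, that is roughly the argument: fluent word prediction can look like understanding without being it.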
One reader wrote in to explain that her mother’s medical professional actually had the nerve to ask ChatGPT about medical dosages. ChatGPT’s understanding of what medicine does is about the same as its understanding of what a banana is: zilch.
While some may argue that ChatGPT is a form of artificial intelligence, I have to agree that there is a more compelling argument that it is not. Moreover, calling it artificial intelligence gives us barely evolved monkeys the impression that it actually is some sort of artificial intelligence that understands and can recommend medical dosages. That is bad.
So going forward, I will be referring to things like ChatGPT as LLMs, not artificial intelligence. I’d encourage you to do the same.
(I want to give particular thanks to reader Lisa, who first made the case to me on this point.)