I heard from several of you over the weekend who are enjoying your new iPhones and Apple Watches. Good on you. I spent the weekend kicking the tires on my new iPhone 15 Pro Max, and I dig it…
Last week Pixelmator released Photomator for the Mac. Photomator is the application that finds the Goldilocks position between Apple Photos and Pixelmator Pro. If you want to make your photos look better, but you don’t want to get in the weeds, this is the application for you.
I’ve always been impressed with the way Pixelmator incorporates artificial intelligence and other cutting-edge Apple technologies into their products. Photomator gives you the benefit of those technologies without the learning curve. Color adjustments. Batch editing features. Magical Repair and Clone tools. Even better, with iCloud support, whether you’re using Photomator on Mac, iPhone, or iPad, your edits will always stay in sync.
Watch below to see just how fast and simple Photomator is.
There’s a weird thing in the Photos app where you can create slideshows, but there is no way to share them. Here’s where the Memories feature comes to the rescue. In this video, I explain the entire process…
There’s an interesting story right now about the latest Galaxy S23 and the moon. A person on Reddit created a deliberately blurry picture of the moon and then photographed that blurry image with his Galaxy S23, which used its highly trained AI to turn it into a clear and beautiful picture of the moon. The resulting picture wasn’t the moon the photographer saw so much as an AI-generated picture of what the S23’s computer brain expected the moon to look like in that particular photo.
I don’t really know how to feel about that. If I took a picture of my wife, would I want the picture of that lady that I love as seen through my lens in the moment, or the idealized version of her the AI generates on the phone? That’s kind of a loaded question because, with all of the computational photography going on in all smartphones (iPhone included), you never really see exactly what the lens saw anymore. To me, the tipping point is where the image capture no longer matters. It appears the S23 is at that point when you shoot the moon.
Tyler Stalman did a recent video comparing the iPhone to big, fancy cameras. The question comes up every few years, and every few years the percentage of people for whom big, fancy cameras still make sense gets smaller. Tyler is a professional filmmaker, so he’ll always need something more, but for the rest of us, big fancy cameras are getting harder and harder to justify.
One of my favorite reviews with each new iPhone is that of Austin Mann. Austin always takes the new iPhone someplace interesting (this time, it’s Scotland) and takes some amazing pictures with the new iPhone Pro while pointing out its strengths and weaknesses. Austin’s iPhone 14 Pro camera review is now up.
This time, he spends a lot of time explaining the advantages of the 48 Megapixel sensor and where its limitations are. He also has thoughts on the three camera sensors. If you are interested in iPhone photography, don’t miss this one.
I’ve spent a lot of time shooting videos with the iPhone lately. I made this video when the iPhone 13 Pro was first released, but I’ve also been using the iPhone a lot for MacSparky Labs videos and nearly exclusively for DLR Field Guide content.
My evolving preference for the iPhone over a more dedicated camera results from competing tradeoffs.
The dedicated camera has a better sensor and can have interchangeable lenses. That produces noticeably better video than video out of the iPhone. But for shooting video on the move, as we do with the DLR Field Guide videos, that regular camera comes at a cost. First, it’s heavy to carry around and awkward to wield. Second, and more importantly to me, is stabilization. My regular camera (a Sony) cannot hold a candle to the iPhone’s video stabilization. Even with in-body stabilization, footage shot on the Sony takes a ton of post-production effort to stabilize, whereas I can use iPhone footage pretty much “as is”.
Potato Jet (one of my favorite camera guys on YouTube) made his own comparison recently and came to a similar conclusion. I’m not saying that they should shoot the next Star Wars movie with an iPhone, but for much of the stuff I do, the iPhone is plenty enough camera. So for now you need to choose your poison: slightly worse video, or dealing with bulk and stability challenges.
The bigger question is where this is heading. If mobile phones continue at their current clip, exactly how long will it be before nobody can tell the difference?
Yesterday morning I spent a few hours testing the new iPhone 13 Pro camera system at Disneyland, particularly in Galaxy’s Edge (of course). A few takeaways were:
The new wide lens is a big improvement, and I’m going to be using that lens a lot more.
3X reach is a lot more useful than 2X reach for zoom.
Cinematic Mode version 1 is a lot better than Portrait Mode version 1.
Here’s a video with all the details.
For years now, Wally Cherwinski has been teaching people how to use Apple technology to shoot great video. I got to attend one of Wally’s sessions this year at the Macstock conference, and in addition to being a great videographer, Wally is also an excellent teacher.
Wally has now released a brand-new, media-rich iBook, Video To Go, to help you get better at taking video with your Apple devices.