Drafts Just Got a Lot Smarter (Sponsor)

This post is sponsored by Drafts. Sponsorship doesn’t influence what I write. Here’s my take.

I’ve been a Drafts user since its release. It’s the first place text goes on every device I own. Grocery lists, blog post ideas, meeting notes, quick reminders. Anything that starts as words starts in Drafts.

What makes it work is speed. You open the app and start typing. No picking a folder. No choosing a notebook. Just a blinking cursor ready to go. You sort it out later, and Drafts gives you the tools to send that text wherever it needs to end up.

Version 50 Is a Big Deal

Greg Pierce just shipped Drafts v50, and this one matters for anyone who cares about automation.

The Shortcuts support got a complete overhaul. There are now over 50 Shortcuts actions.

You can query drafts by date ranges and location, access version histories, control the interface, and run granular commands for appending, prepending, and editing drafts. The kind of stuff that used to take workarounds now just works.

On the Mac side, the AppleScript integration got a serious expansion. You can query your entire draft library, update drafts, run actions, and work with workspaces. If you’ve ever wanted to build Mac workflows that pull from or push to your Drafts library, this is the update you’ve been waiting for.

The MCP Server for Claude

This is the one that caught my attention. Greg built an MCP server that connects Drafts directly to Claude. If you use Claude Desktop or Claude Code on your Mac, you can now talk to your Drafts library through the AI.

That means you can ask Claude things like “summarize the drafts I created this week” or “find all my drafts tagged with project-x.” You can create new drafts, run actions, and manage your library through natural conversation. It connects through AppleScript locally on your Mac, so your data stays on your machine.

I set this up and it took about two minutes. You can install it right from Claude Desktop’s Extensions settings. Search for “drafts” and it’s there. For anyone already using both Drafts and Claude, this is worth trying immediately.

There are plenty of note apps. What keeps me in Drafts is the philosophy behind it. Text first. Decide later. The capture friction is zero, and the automation layer lets you build exactly the workflows you need.

With v50, that automation layer got considerably deeper. Whether you’re building Shortcuts on your iPhone, writing AppleScript on your Mac, or connecting to AI through MCP, Drafts meets you where you work.

Check out Drafts if you haven’t already. And if you’re already a user, make sure you’re running v50.

AI Agents Need Guardrails

We are roaring into useful, agentic AI. I’ve been saying for a while now that we’re heading into it faster than the security models can keep up. So I wasn’t surprised to see Tailscale announce Aperture, a governance layer for AI agents.

The trouble with AI agents is that they run afoul of the overriding security principle of the last 30 years, which is to prevent access whenever possible. In order to be useful, an AI agent needs access. The security model has to adapt.

Aperture sits between your AI tools and the services they connect to. It routes requests through a gateway tied to user identity. Instead of distributing API keys to every agent and user, you keep one key per provider on the gateway. Aperture tracks who initiated each action and what the agent actually did. If something goes wrong, you have a trail.

It also gives security teams the ability to see and stop tool calls before they execute. That’s the piece that matters most. You’re not just logging what happened after the fact. You’re able to intervene.

I recorded a YouTube video recently about my experience with OpenClaw, a fully autonomous AI setup. I turned it on. Then I turned it off. The security exposure was too much. Aperture is exactly the kind of infrastructure that needs to exist before autonomous agents become practical for real work.

Tailscale isn’t alone here. Expect a lot of companies to make announcements like this in the coming months. The AI capabilities are racing ahead. The governance and security layers are playing catch-up.

The AI tools are getting powerful fast. The guardrails need to keep pace.

On Going Wallet-Free

I left my physical wallet at home last Tuesday. Not on purpose. I was halfway to a doctor appointment when I realized it was sitting on my desk.

But I had my iPhone, so I kept driving.

I had my California digital ID, my digital insurance card, and my credit card in Apple Pay. I was fine. (My car also opens with my phone, so I haven’t carried keys in a long time.)

Apple Wallet has been quietly getting better for years. If you’ve only used it for Apple Pay at checkout, you’re missing the best parts.

You can now store physical card numbers behind Face ID. Peer-to-peer payments are as easy as a tap. My state has a digital ID that shows up nicely in my Apple Wallet.

And yet …

There’s a part of me that still feels compelled to carry a small wallet with my driver’s license in it. In my imagination, I end up in some situation where somebody needs my ID and the app doesn’t work, or the authority asking for it doesn’t accept it.

I wonder if I’ll eventually get over that and truly go out knowingly wallet-free. Am I alone in my hesitation?

Mac Power Users 837: Menu Bar Mayhem

Join Stephen and me on this episode of Mac Power Users as we go deep on the Mac menu bar, comparing our contrasting philosophies and walking through our favorites. We also explore how macOS 26’s multiple Control Centers are changing the game.

This episode of Mac Power Users is sponsored by:

  • Insta360: Introducing the Insta360 Wave and the Link 2 Pro.
  • HTTPBot: A powerful API client and debugger for Apple platforms. Get a 7-day trial and 25% off your subscription.
  • Ecamm: Powerful live streaming platform for Mac.
  • 1Password: Never forget a password again.

Memory is King, Again

I’ve been buying Macs forever. For most of that time, the conventional wisdom was simple: Get as much memory as you can afford. Back then, everyone was doing video editing or photo work. Memory was expensive. So you bought as much as your budget allowed. More memory always made your Mac snappier.

Then the world changed. Solid state storage got cheap. Cloud services got fast. Most people stopped doing local heavy lifting. They stored their photos in iCloud. They edited video in the cloud. RAM didn’t matter as much anymore. You could get away with less memory on a Mac and never think about it.

For years, I gave people that advice: “Don’t go crazy when buying memory.” When I bought my souped-up M2 Mac Studio, the one place I scaled back was memory.

I’m done giving that advice.

The age of LLM-based Artificial Intelligence has made memory a premium again. That’s particularly true if you run local AI models, but even Apple Intelligence can get memory hungry. Modern Macs are amazing hardware. But it means nothing if you run out of memory.

The AI era has changed the buying calculus for Macs.

If you’re running local AI models, you need more memory than you think. Significantly more. I’m not talking about academic research. I’m talking about doing actual work on your Mac.

The M5 GPU is better at this stuff than previous chips. It’s built for it. But you need to feed it memory. Without it, you’re bottlenecked. With it, you actually get performance.

In addition, the price of memory is skyrocketing, and it’s only a matter of time before that’s reflected in new Mac pricing.

For most people using traditional Mac software, 16GB is still fine. If you’re using Slack and Chrome and Word, you don’t need more. But if you’re thinking about running local models, or you’re thinking about a future where some of your AI processing happens on your hardware instead of in the cloud (which I expect most of us will be doing soon), you need to spec higher.

I’d say that for someone interested in AI work, 32GB is the new baseline. Not for today necessarily. But as a hedge for tomorrow.
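To put rough numbers on that baseline, here’s a back-of-the-envelope sketch. The function name, the 4 GB overhead figure, and the bytes-per-parameter values are my own illustrative assumptions, not a rigorous sizing tool:

```python
# Rule of thumb for local LLMs: memory for the weights is roughly
# parameter count × bytes per parameter, plus working overhead
# (KV cache, runtime, and headroom for macOS and your other apps).

def model_memory_gb(params_billions: float, bytes_per_param: float,
                    overhead_gb: float = 4.0) -> float:
    """Estimate total RAM needed to run a model, in GB."""
    weights_gb = params_billions * bytes_per_param
    return weights_gb + overhead_gb

# A 7B model quantized to 4 bits (~0.5 bytes/param) fits in 16 GB:
print(model_memory_gb(7, 0.5))   # 7.5
# The same model at FP16 (2 bytes/param) already crowds a 16 GB Mac:
print(model_memory_gb(7, 2.0))   # 18.0
```

By this rough math, a quantized 7B model is comfortable on a 16GB machine, but full-precision weights or a larger model push past it quickly, which is why 32GB reads as the safer floor.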

This is a shift from how Mac people think traditionally. We’ve been storage-focused. How much SSD. How much disk space. We’re moving into a world where RAM is the limiting factor. Where more memory means access to capabilities that weren’t available at lower levels.

So here’s my practical advice. If you’re buying a Mac in 2026, go big on memory. You’ll thank me in three years when you don’t need a new machine because you hit the memory wall.