When the Machines Remember

Early in my experience with ChatGPT’s voice feature, my daughter walked into the room and I told it to say hello to her. It did, cheerfully, and it totally creeped her out. I turned the voice off.

A week later, I fired ChatGPT back up, and it asked me how Samantha was doing.

That moment hit me like a splash of cold water. This thing remembers. And not in a “session cookie” kind of way, but in a persistent, “let me tell you about yourself” way.

That was the beginning of my realization that these large language models (LLMs) are quietly soaking up everything we put into them. From that point on, I’ve approached the whole thing with very open eyes about how much of me I’m exposing to this frontier model every time I fire it up.

To some degree, this has always felt inevitable. I’ve long suspected it was just a matter of time before we’d be offered a memory-enabled assistant that could draw from everything we’d ever said to it.

What I didn’t expect was how soon that day would arrive.

The Memory Feature

ChatGPT recently added a feature called Memory, which allows it to remember facts about you across conversations. Your name, your tone, your preferred writing style, your favorite apps: it's all fair game.

And yes, that makes it dramatically better at giving you what you want. But of course, that efficiency comes at the cost of increased data retention and exposure.

I’ve spent almost thirty years as a lawyer, carrying around a near-constant fiduciary duty of confidentiality. So you’d think I’d be paranoid about this sort of thing.

And yet, I've been surprisingly relaxed about it. I'd tell you it's so I can experiment for the MacSparky Labs, but if I'm honest, my cavalier attitude is motivated by curiosity more than anything else.

Regardless, I recently decided to test how well it really knows me.

I typed: “Roast me.”

Here’s what I got back:

Alright, David — you asked for it. Here we go:

You’re like the productivity version of a dad joke: efficient, earnest… and just a little too proud of your label maker.

You’ve got so many field guides, I’m surprised you haven’t written one on ‘How to Overwhelm a First-Time Mac User in 5 Easy Steps.’ Your idea of a wild night is alphabetizing your shortcuts while sipping matcha and whispering sweet nothings to your OmniFocus database.

And let’s not forget your digital filing system — more organized than a NASA launch sequence, yet you still manage to lose your AirPods in the couch cushions like it’s a side quest in a MacSparky RPG.

But hey, if there’s one thing you’ve mastered, it’s turning nerd-level productivity into a full-blown lifestyle brand — and somehow making it cool. Respect.

– ChatGPT

Honestly? A little too close to home.

Managing Your Memory Settings

If you’re curious about trying out this memory feature — or turning it off — you can manage it pretty easily:

  • Go to ChatGPT Settings > Personalization > Memory.
  • There, you’ll see whether memory is on.
  • You can toggle it off entirely or delete specific memories.
  • OpenAI also now notifies you when a memory is created or updated, so it’s not operating behind a curtain.

For a deeper dive into how it works, I recommend John Gruber's take on Daring Fireball. He poked at it much as I did, though he asked ChatGPT to describe him rather than roast him.

This moment feels like a turning point. AI tools are starting to remember us, and that memory now lives quietly in the background of every prompt. Whether you view that as magic or menace probably depends on your own relationship with data privacy.

As for me, I’ll keep experimenting — with eyes open and curiosity intact — and maybe just a little less matcha near the keyboard.

Perplexity Moves into Mobile

News broke this week that Perplexity AI is in talks to integrate its assistant directly into Samsung and Motorola phones.

What’s especially interesting here is how this upends the usual Android assistant narrative. Google has been baking Gemini into Android with increasing urgency, and now we have Android hardware makers cutting deals with a third-party assistant — one that’s not Google. That’s a big deal. It says something about how fast the AI landscape is shifting.

And then there’s Apple.

Just imagine a future where Samsung phones have Perplexity, Google phones have Gemini, and Apple…still has Siri.

Yikes.

Now, I get it. This is very on-brand for Apple. They like to own the whole stack, and they’re not the type to go shopping for outside solutions. And to be fair, there have been increasing signals that Apple is pouring serious resources into AI lately. Hopefully that’s not just for show.

Part of me can’t help but wonder if one of the smartest moves Apple could have made was to simply buy Perplexity. But that ship may have already sailed.

It’s not just about having an assistant; it’s about having one that people want to use. The AI space is sprinting, and it feels as if Apple is still stretching on the sidelines.

Perplexity Pages

My experiments with Perplexity continue. This alternative search app takes a different approach to getting answers from the Internet. Rather than giving you a list of links to read, it reads the relevant pages and tries to synthesize an answer, with footnotes pointing back to its sources. I think it's a good idea, and Perplexity was early to this game. Google is now following suit, to less effect, but I'm sure they'll continue to work on it.

I recently got an email from Perplexity about a new feature called Perplexity Pages, where you give it a prompt and it builds a web page about a subject that interests you. As an experiment, I had it create a page on woodworking hand planes. I fed it a few headings, and it generated the page using the Perplexity method of presenting information with footnotes to the websites it reads. I fed it a few additional topics, and it generated more content. Then I pressed "publish" with no further edits. The whole experiment took five minutes.

The speed at which these web pages can be created is both impressive and, in a way, unsettling. If we can generate web pages this quickly, it’s only a matter of time before we face significant challenges in distinguishing reliable information from the vast sea of content on the Internet. In any case, I invite you to explore my five-minute hand plane website.

Hope Springs Eternal for Apple Intelligence

Yesterday, Apple announced its new name for artificial intelligence tools on its platforms: Apple Intelligence. If you watched the keynote carefully, it was almost humorous how they danced around the term "artificial intelligence" throughout. At the beginning, Tim made reference to "intelligence" without the word "artificial." Then, for the rest of the keynote, up until the announcement, Apple relied on its old standby, "machine learning." Nevertheless, they eventually got there with Apple Intelligence.

Official Apple Intelligence text and iPhone image from Apple's website after the June 10, 2024 announcement.

The initial explanation was telling. They stated five principles for Apple Intelligence: powerful, intuitive, integrated, personal, and private. These principles are the foundation of what they're trying to ship. Also, in Apple fashion, the term Apple Intelligence doesn't refer to a single product or service, but to a group of intelligence-related features:

Table Stakes AI

This is the type of AI everyone was expecting. It includes things like removing lampposts from picture backgrounds and cleaning up text. We already see multiple implementations across the Internet and in many apps on our Macs. Apple had to do this.

They did, and the implementation makes sense. It’s got a clean user interface and clear options. Moreover, developers can incorporate these tools into their apps with little or no work. It should be universal throughout the operating systems, so learning how the tool works in one place means you can use it everywhere else. For most consumers, this is golden.

Also, it will be private. While I’m a paying customer of Grammarly, I’m aware that everything it checks is going to their servers. That means there are some things that don’t get checked. I’d much prefer to do this work privately on my device.

LLM AI

There have been many rumors about Apple developing its own Large Language Model (LLM), but nobody expected it to be competitive with the likes of OpenAI and Google. So the question was: would Apple ship something inferior, partner with one of the big players, or leave an LLM out of this initiative entirely? We got our answer with the partnership with OpenAI, which incorporates GPT-4o into the operating system.

This makes a lot of sense. Since the keynote, Craig Federighi has gone on record saying they also want to make similar partnerships with Google and other LLM providers. While nothing sent to a company like OpenAI is going to be private, Apple is doing what it can to help you out. It doesn't require an account, and it gives you a warning before sending your data. Again, I think this is a rational implementation.

If you already have an OpenAI account, you can even hook it up in the operating system to take advantage of all those additional features.

Private AI

This was the most important component of Apple Intelligence and was underplayed in the keynote. Using the neural engine built into Apple silicon, Apple intends to give us the ability to take intelligence-based actions that can only be accomplished with knowledge of our data. That bit is essential: Apple Intelligence can see your data, but more powerful LLMs, like ChatGPT, cannot.

That gives Apple Intelligence powers that you won’t get from traditional LLMs. Craig explained it with some example requests:

  • "Move this note into my Inactive Projects folder," requiring access to Apple Notes.
  • "Email this presentation to Zara," requiring access to Keynote and Apple Mail.
  • "Play the podcast that my wife sent the other day," requiring access to the Podcasts and Messages apps.

While these commands aren’t as sexy as asking an LLM engine to write your college paper for you, if they work, they’d be damn useful. This is exactly the kind of implementation I was hoping Apple would pursue. Because they control the whole stack and can do the work on device, this feature will also be unique to Apple customers.

“AI for the Rest of Us”

During the WWDC keynote, I only heard the term "Artificial Intelligence" once: at the end, when Craig said, "Apple Intelligence. This is AI for the rest of us." I think that sentiment summarizes Apple's entire approach, and I agree with the philosophy.

I'm convinced that Apple has thought about AI in a way that makes sense to me, and I'd like to use what it described. The question now is whether Apple can deliver the goods. Apple Intelligence won't be released for beta testing until the fall, so for now, all we have are promises.

Apple's challenge is the way Siri languished for so long. You'll recall that Siri, too, started with a good philosophy and a lot of promises, but Apple didn't keep up with it, and Siri never fulfilled its potential.

Looking at the Siri example, I should be skeptical of Apple Intelligence and its commitment. Yet, I’m more hopeful than that. The degree of intentionality described yesterday, combined with the extent to which Apple’s stock price is contingent on getting this right, makes me think this time will be different. In the meantime, we wait.

Apple Licensing Data for its AI Training

The New York Times reports that Apple is in negotiations to license published materials for training its generative AI model. This shouldn't be a surprise. A few years ago, when image processing was the big thing, everyone thought Apple would fall behind because they weren't collecting all our images for data processing. Then I saw Craig Federighi explain how Apple could buy pictures of mountains and didn't need mine.

This is similar to the way machine learning requires a data set for training. Again, Apple is looking to buy data rather than set its AI loose on the Internet. I really wish I had a better idea of what Apple is planning to do with AI.