Last week Steven Levy published an extended article on Backchannel about what’s going on at Apple with machine learning. A lot of us, myself included, have wondered openly how Apple can compete with companies like Google at making our devices smarter when Apple’s privacy stance prevents them from reading user data. I wrote about this just a few weeks ago, questioning whether Apple’s respect for user privacy and its drive to make the best possible consumer products can coexist.
The Levy article strikes me as Apple’s response to these questions. Apple’s position is that it can deliver excellent services and privacy by processing data on our devices rather than in the cloud. If that’s true, it would be spectacular. It would be like having a private assistant with no memory of actually working for you.
The trouble is that a lot of the artificial intelligence smarts you see from services like Google come from the ability to compare all of its users’ data. I’ve been testing the artificial intelligence in iOS 10’s Photos app and it’s pretty impressive. You can search for faces and objects in photos and the app does the hard work of finding them for you. However, it only learns to the extent Apple allows it to. Apple has pre-populated the application with a series of search terms, including things like “mountains”, “dog”, and “baseball”. This has been working since the early betas, and I’ll write more about it when iOS 10 ships.
The trouble is that it only searches for terms Apple has deemed worthy of being search terms. For example, while I can find mountains by searching my Photos library, I cannot find “stormtroopers” or “Yoda”.
Google Images, on the other hand, can find both terms. My guess is that Google has been keeping track of the searches done on its cloud-based photo storage site and doing some back-end magic to add popular terms and find the appropriate photos. (As an aside, Google thinks the search term “Yoda” should turn up pictures of the Star Wars character and Buddha statues. Maybe they know something.)
In theory, Apple could update its own list of search terms as it perfects its use of Differential Privacy, which anonymizes the data before it reaches Apple’s servers, but I can’t help feeling that Apple will never update as aggressively as Google’s server farms will.
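To give a sense of how data can be useful in aggregate without exposing any individual, here is a toy sketch of randomized response, one of the classic building blocks behind differential privacy. This is purely illustrative and not Apple’s actual implementation (Apple has described using more sophisticated techniques); the function names and numbers are my own.

```python
import random

def randomized_response(truth: bool, p: float = 0.5) -> bool:
    """With probability p, report the true answer; otherwise report a
    fair coin flip. No single report reveals what the user really did."""
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_rate(reports: list, p: float = 0.5) -> float:
    """Recover the population-level rate from the noisy reports.
    observed = p * true_rate + (1 - p) * 0.5, solved for true_rate."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p) * 0.5) / p

# Hypothetical example: 100,000 users, 30% of whom actually searched
# for "Yoda". Each device randomizes its own report before sending it.
random.seed(1)
reports = [randomized_response(random.random() < 0.30) for _ in range(100_000)]
print(round(estimate_rate(reports), 2))  # close to 0.30
```

The point is that the server learns that “Yoda” is a popular search without ever learning whether any particular user searched for it, which is the trade Apple is betting on.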
I’m not sure exactly how this all plays out. Part of me feels like Apple is getting ready to enter the ring with one hand tied behind its back. Nevertheless, I would prefer slightly less robust machine learning if it allowed me to continue to protect my privacy rights. The real question becomes what consumers do if there’s a significant difference between server-based services with little privacy and device-based services with better privacy.
Either way, the Steven Levy article demonstrates that Apple is fully engaged with this question and taking what they believe to be the best course of action. I recommend reading the whole thing if this topic is at all interesting to you. Apple is clearly (and publicly) taking a different path with respect to machine learning and privacy. I’m fascinated to see how this all plays out.