Ep. 155 - Apple’s Privacy Changes

Why (and how) Apple plans to scan your photos and messages in iOS 15 🔎

In this episode, Mattimore discusses the fallout from Apple’s recent - and highly controversial - privacy changes.

Topics Discussed:

  • Apple’s recent announcements

  • Backlash over privacy concerns

  • Backlash over security concerns

  • Apple’s historic focus on privacy

  • How CSAM detection feature works

  • How iMessage detection feature works

  • Arguments for CSAM detection

  • Arguments against CSAM detection

  • Arguments for iMessage nude detection

  • Arguments against iMessage nude detection

  • Future implications of these changes

Future Scenarios:

  • Worst case scenario

  • Best case scenario

  • Most likely scenario

Thanks for tuning in 🔭

 
 
 

Episode 155 Transcript

Mattimore Cronin (00:12):

Welcome to Hence The Future podcast. I'm Mattimore Cronin. And today we're discussing Apple's recent privacy changes. There are two new privacy changes that Apple announced this past week: One focused on detecting known child abuse materials, and another focused on detecting nudes sent in iMessage. And both have sparked a major backlash from privacy and security advocates, including the Electronic Frontier Foundation, Edward Snowden, Benedict Evans, Apple customers, and even Apple's own employees.

Mattimore Cronin (00:40):

In today's episode, we're going to explore what these two new features are, how they work on a technical level, why privacy and security experts are so concerned about them, and what the likely fallout will be for Apple and for Apple customers in the future.

Mattimore Cronin (00:54):

The first feature is a CSAM detection tool. CSAM stands for child sexual abuse materials. And the goal of this feature is to identify and report known child sexual abuse material, as it gets uploaded from your phone to the cloud.

Mattimore Cronin (01:11):

Some important context is that most businesses already scan photos on their cloud servers to make sure that they're not hosting known child sexual abuse materials. However, what's different with Apple's approach is that they are actually scanning every new photo on the device itself, on your actual iPhone or Mac computer, and creating a neural hash of each image - basically a unique ID for each image. Then, as those photos are uploaded to iCloud, it checks that neural hash against a database of neural hashes of known child sexual abuse materials. And if a certain threshold is met - the current threshold is 30 images from your phone matching the CSAM database - it triggers a human review. A human will look to confirm that, yes, indeed, you do have these illegal photos on your device and in the cloud. And after that, they will notify the relevant authorities.
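
To make that flow concrete, here is a minimal Swift sketch of the matching-and-threshold logic described above. It is purely illustrative: Apple's real system uses a perceptual NeuralHash and cryptographic matching techniques, so the type names, the plain SHA-256 stand-in hash, and the hard-coded threshold here are all assumptions for the example.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: a cryptographic SHA-256 digest stands in for
// Apple's perceptual NeuralHash, and the type names are hypothetical.
struct CSAMMatcher {
    let knownHashes: Set<String>   // hashes derived from the known-CSAM database
    let reviewThreshold = 30       // matches required before any human review

    // Compute a stand-in "unique ID" for an image.
    func hash(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // Count how many of the user's photos match the database, and only
    // flag for human review once the threshold is crossed.
    func shouldTriggerHumanReview(for photos: [Data]) -> Bool {
        let matches = photos.filter { knownHashes.contains(hash(of: $0)) }.count
        return matches >= reviewThreshold
    }
}
```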

Mattimore Cronin (02:06):

The second feature is a machine learning algorithm that scans images as they come into your phone. And if it detects that an incoming image is a nude and the user is underage, it will notify their parent. And here's how it works specifically. The first step is that the parent has to opt in to this "on-device intelligence." So it is an opt-in feature.

Mattimore Cronin (02:24):

The next step is that a child under the age of 18 receives a nude image. The machine learning algorithm will then detect that, yes, this is in fact a nude. And then the kid will have the option of viewing the photo or not viewing it. If they decide to view it, there will be a pop-up warning them that they're about to look at some possibly sensitive material. If they tap through that, there will be another pop-up warning that, if they go forward and view the image, their parent will be sent a notification telling the parent that their kid did in fact view this nude image, and the parent can view that image.
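
As a rough illustration of that decision flow, here is a small Swift sketch. Everything in it is hypothetical (the type names, the boolean flags, the way the choice points are modeled), since the real feature lives inside Messages and uses Apple's own on-device classifier; the point is just to show where the opt-in, the two warnings, and the parent notification sit in the sequence.

```swift
import Foundation

// Hypothetical sketch of the flow described above; the real feature is
// built into Messages and uses Apple's on-device classifier.
enum ImageOutcome {
    case deliveredNormally          // feature off, or image not classified as a nude
    case notViewed                  // child backed out at one of the warnings
    case viewedAndParentNotified    // child tapped through both warnings
}

struct SensitiveImageFlow {
    let parentOptedIn: Bool         // step one: the parent must opt in

    func handle(imageClassifiedAsNude: Bool,
                childAcceptsFirstWarning: Bool,
                childAcceptsSecondWarning: Bool) -> ImageOutcome {
        // Feature inactive, or the classifier didn't flag the image.
        guard parentOptedIn, imageClassifiedAsNude else { return .deliveredNormally }
        // First warning: "this may be sensitive material."
        guard childAcceptsFirstWarning else { return .notViewed }
        // Second warning: "if you view this, your parent will be notified."
        guard childAcceptsSecondWarning else { return .notViewed }
        return .viewedAndParentNotified
    }
}
```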

Mattimore Cronin (02:55):

Now, before we get into the arguments for and against these two new features, it's important to understand the context of how Apple has historically positioned itself as a privacy focused company. And for myself, this is one of the main reasons I love Apple. It's the reason why I've used all of their devices and continue to use them. And when you look at their past ad campaigns, they're all very privacy focused.

Mattimore Cronin (03:19):

For instance, Apple ran a massive billboard campaign that said, "What happens on your iPhone stays on your iPhone." They ran another campaign that said, "Privacy. That's iPhone." And on the privacy page of their website, the headline reads, "Privacy is a fundamental human right." So clearly Apple has positioned itself as the most privacy focused of all the big tech giants.

Mattimore Cronin (03:47):

And famously, after the 2015 San Bernardino shooting, the FBI requested that Apple create a backdoor so they could look at the shooter's iPhone to help with their counter-terrorism efforts. And Apple put out a big press release where they said, "We will not do this." And specifically they said, "We believe the contents of your iPhone are none of our business." They also noted the security risk that would follow the creation of a backdoor, saying, "Building a version of iOS that bypasses the security in this way would undeniably create a back door. There is no way to guarantee such control."

Mattimore Cronin (04:18):

Apple also recently kneecapped Facebook's business model when they changed ad tracking from an opt-out system to an opt-in system. Now, every iPhone user gets a pop-up asking, "Hey, do you want to allow PayPal to continue tracking you?" (Or whatever app you might be using.) And that was all done under the banner of greater privacy control for Apple customers.

Mattimore Cronin (04:42):

Tim Cook also said earlier this year, "We've spoken out time and time again for strong encryption, without backdoors, recognizing that security is the foundation of privacy."

Mattimore Cronin (04:52):

So the backdrop to all of these privacy changes is that everyone who loves Apple pretty much loves them because of how privacy focused they are and how good the security and encryption has been historically on Apple devices. Now let's look at some arguments for and against each of these respective new features.

Mattimore Cronin (05:10):

Here are the main arguments for Apple's CSAM detection feature. The first argument is that this change is actually more private than what competitors offer, because the computer doesn't actually see your photos. It only sees the hashes. So the only case where a human would actually look at your photos would be the case where the threshold is met: where there are 30 hashes that match the CSAM database hashes. And therefore, only in that case would someone actually be looking at your photos.

Mattimore Cronin (05:35):

The next argument is that this hash matching doesn't actually occur solely on the device. The way it works is that it creates the neural hash on the device as new images are added to your phone. But then the second part happens in the cloud, where the on-device hashes are matched to those in the CSAM database as your photos are uploaded to iCloud. So if you choose not to use iCloud and you never upload your photos to the cloud, then theoretically the CSAM scanning system would not impact you in any way whatsoever. In other words, it's opt-in. No one's forcing you to use iCloud. And if you use Telegram or Signal or some other app to send messages, those would not be scanned or detected by the CSAM feature.
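
Here is a tiny sketch of that two-stage arrangement, again with hypothetical names: the hash is produced on the device when a photo is added, but the comparison against the known-hash list only happens along the iCloud upload path, so a user who never enables iCloud Photos never hits the matching step at all.

```swift
import Foundation

// Hypothetical names throughout; this only illustrates where each step runs.
struct PhotoRecord {
    let neuralHash: String        // computed on-device when the photo is added
}

struct UploadPipeline {
    let knownHashes: Set<String>  // hashes of known CSAM images
    private(set) var matchCount = 0

    // Matching is tied to the iCloud upload path: if iCloud Photos is off,
    // the photo is never checked at all.
    mutating func upload(_ photo: PhotoRecord, iCloudPhotosEnabled: Bool) {
        guard iCloudPhotosEnabled else { return }
        if knownHashes.contains(photo.neuralHash) {
            matchCount += 1       // human review only once the threshold (30) is met
        }
        // ...the actual photo upload would happen here...
    }
}
```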

Mattimore Cronin (06:28):

And I would say the primary argument is that this is a very important cause that Apple is fighting for. Child sexual abuse is nothing to joke about. It's nothing to brush aside. And this feature, according to Apple, is only being used with regard to CSAM. It would not be used to detect other types of media that are disallowed by, for instance, an oppressive government like China, or by the US, or by any other country around the world. It would only ever be used for CSAM. And Apple has said they would refuse any request to expand the database to include things beyond child sexual abuse materials.

Mattimore Cronin (06:59):

Now let's look at some arguments against this CSAM detection feature, mostly from privacy and security advocates. One is that... It doesn't really matter if a person really *sees* your photos. The fact that you're matching the unique identifying hashes of the images on your device to something in a database is the only thing privacy and security advocates need to know.

Mattimore Cronin (07:23):

They are scanning your phone. They're just doing it in a way that makes more sense for machines rather than for humans who would otherwise have to spend all these hours manually reviewing them.

Mattimore Cronin (07:30):

The other argument against this feature is that it could easily be misused down the road. Once you build this capability, other countries can use it to their own ends. They may use it to suppress information, to surveil their population, and to exert top-down control over what information is and is not allowed to be shared.

Mattimore Cronin (07:51):

The Electronic Frontier Foundation distilled this perfectly when they wrote, "It's impossible to build a client-side scanning system that can only be used for sexually explicit images sent to or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger's encryption itself, and open the door to broader abuses."

Mattimore Cronin (08:13):

Another criticism is that this banner of protecting kids feels kind of suspicious. There's a great political cartoon that distills this, where there's corporate media asking Uncle Sam, "How would you like this wrapped?" And the present that they're wrapping is "control of internet speech." And the two types of wrapping paper are "anti-terrorism" and "protecting kids." And this is a good criticism from my perspective, because we've already given up many of our freedoms through the Patriot Act, under the banner of fighting terrorism. And Edward Snowden revealed many of the freedoms that we gave up because we were focused on fighting terrorism. Now we may be giving up similar freedoms under the banner of protecting kids. So even if the cause itself is just, it can be used as a marketing strategy to make it more palatable to the regular Apple customer.

Mattimore Cronin (09:10):

Another argument is that there are many other ways to protect children online that don't require scanning media on your device, on your actual iPhone.

Mattimore Cronin (09:20):

There could be an in-app reporting tool where Apple makes it really easy to report child sexual abuse materials to the authorities. There could be scanning of what's on Apple's actual servers, without having to scan what's on your device. There are many ways you could fight child sexual abuse materials without having to scan your actual iPhone or your actual MacBook.

Mattimore Cronin (09:40):

And the most compelling argument against this new CSAM detection feature is that if another country demanded that Apple expand the list of blacklisted items, it would be hard for Apple to deny that request, especially if the request came from China. China is the second biggest market for Apple. About a quarter of Apple's revenue comes from China. And if China said, "Hey, the only way we will allow you to continue doing business in our country is if you expand your list to include our own lists of images that are not allowed" - and maybe initially those would just be CSAM images in China - Apple would likely comply with that.

Mattimore Cronin (10:18):

And over time you can see how this could be a slippery slope. China may eventually disallow images of Tiananmen Square via this feature, because that's something that is not allowed in China. You cannot search for that image and get any results if you are within the Great Firewall of China. Similarly, you can't search for Winnie the Pooh, because people have said he looks similar to President Xi. So this could be a slippery slope where, once you have this capability, it's hard to say for sure that you're not going to go any further - that you're not going to allow this to be used in a more expansive way by China, by the United States (if a corrupt government were to take power there in the future), or by any other oppressive government that has a say in what the rules are for Apple devices in its country.

Mattimore Cronin (11:08):

And the biggest argument against on-device scanning is that it goes against everything Apple has stood for. Apple has made a point in its marketing of letting you know that what happens on your iPhone stays on your iPhone. And for many people, your iPhone is like an extension of your mind. And that's a very personal space. So to think about the possibility of Apple, or any government, or anyone top-down scanning all the materials on your external mind to see if they're allowed or not allowed - that's pretty scary to think about.

Mattimore Cronin (11:38):

And finally, there's the argument that by implementing this new feature, Apple is creating a new security vulnerability for iPhone users. Will Cathcart is the head of WhatsApp at Facebook. And he put out a Twitter thread about this. He said, "What will happen when spyware companies find a way to exploit this software? Recent reporting shows the cost of vulnerabilities in iOS software as is. What happens if someone figures out how to exploit this new system?"

Mattimore Cronin (12:09):

And this goes back to Apple's remarks in the San Bernardino case, where they noted that by adding a feature that would function as a de facto backdoor, you are creating a security vulnerability for iPhone and Mac users.

Mattimore Cronin (12:24):

Now let's look at arguments for and against the iMessage nude detection feature. The arguments for are as follows...

Mattimore Cronin (12:32):

One: It's totally opt in. As a parent, you don't have to opt in to this feature. It only becomes active if you opt in. Second, this is meant to help kids stay safe online. As a parent, it's important to know that your kid isn't getting groomed by some predator. And a lot of parents worry that they don't know what's going on on their kids' phones. And so some parents will opt for a "lite" phone, which is made for kids, as it only allows for text messaging and calling, and it doesn't allow for any internet browsing or sending images or videos to friends. So this is a way to take a high quality smartphone, an iPhone, and make it safe for kids. In other words, it gives good parents the opportunity to intervene before something really bad happens with regard to kids being exploited or groomed by predators.

Mattimore Cronin (13:06):

Another argument in favor of this tool is that it does give some power to kids. It would be much worse if this tool simply notified parents without telling the kid anything at all, or giving the kid any choice in the matter. At least with this feature, it gives kids the option of not viewing the photo, in which case their parent would not be notified.

Mattimore Cronin (13:41):

Here are the arguments against the iMessage nude detection feature...

Mattimore Cronin (13:45):

One is that not all parents are good. And so while you might be preventing abuse from unknown third-party predators, you might actually be creating new opportunities for abuse by parents who are abusive to their kids.

Mattimore Cronin (13:59):

One example is the LGBTQ community. Imagine you're an LGBTQ kid and you haven't come out to your parents yet, and you have a boyfriend or girlfriend, and you're sending nude images back and forth to one another. You could unintentionally be outed by this feature. And if your parent is abusive and anti-LGBTQ, they could potentially beat you or do something else bad to you, because they would have found out about something that was private to you. And I think in many cases, if you are a consenting 16-year-old messaging back and forth with your girlfriend or boyfriend, it does seem a little bit weird that they would notify your parents about that, and that your parent could actually see the image. So from that perspective, if I put myself in the shoes of me at age 15 or 16, I wouldn't want this feature enabled on my phone. But if you put me in the shoes of a parent of a kid, I probably would want this feature as a parent, so I would know my kid is more protected from predators.

Mattimore Cronin (14:56):

And the biggest argument against the iMessage nude detection feature is that this could be yet another instance of the slippery slope where, at first, it's only used to detect nude images through iMessage. But over time, you could use that same machine learning algorithm to detect other types of images that aren't allowed.

Mattimore Cronin (15:18):

So for instance, if China demanded that this feature also be used to detect incoming images that are not allowed by the state, that would be a terrible outcome. And other governments around the world could abuse this tool. And it could be used to suppress free speech, suppress the flow of information, to exert top-down control and surveillance.

Mattimore Cronin (15:39):

And I will say, at a high level, the iMessage nude detection feature feels much less problematic to me than the CSAM scanning feature. Many of these tech companies already use machine learning to improve the user experience. But scanning all the images on your phone, whether or not they have actually been uploaded to Apple's servers - that, to me, is the most problematic of all of these changes.

Mattimore Cronin (16:03):

Now let's explore the likely fallout of these privacy changes with our future scenarios...

Mattimore Cronin (16:13):

Let's start with the worst case scenario. The worst case scenario is that Apple is implementing these new features, knowing that they will be expanded down the road.

Mattimore Cronin (16:30):

And this may be a survival mechanism for Apple. Maybe they played out the game theory already, and they know that if they don't have some feature like this, they will not be able to operate in countries like China in the future. And so this feature could essentially be a proof of concept to show that they have the ability to block blacklisted information - and that therefore any government around the world could enforce its own rules about what information is and is not allowed on the device.

Mattimore Cronin (17:00):

And I think this speaks to a larger trend where, in America, we have had this historic stance that if we operate under the principles of freedom and we allow free-market capitalism to spread to other countries, those countries will become more democratic over time.

Mattimore Cronin (17:17):

Well, we've seen that this is not the case. By opening the doors of capitalism and allowing China into the World Trade Organization, if anything, they have swung even further away from democracy toward authoritarianism. So part of my concern is that American tech companies are kowtowing to authoritarian governments because that's what's best for their bottom line, rather than staying true to the values of America: freedom of information, freedom of the press, freedom of thought, and privacy - really, the ability to keep the information on your iPhone, which is an extension of your mind, safe from any government or other entity looking at it.

Mattimore Cronin (18:00):

So my worst case scenario is that Apple may no longer be as privacy-focused a company moving forward. And in that case, I'm going to look to other devices that might be better when it comes to privacy and security.

Mattimore Cronin (18:12):

Now let's get into the best case scenario.

Mattimore Cronin (18:22):

The best case scenario is that, because of all the backlash from Apple customers, Apple employees, and privacy and security experts, Apple decides not to move forward with the CSAM on-device scanning feature.

Mattimore Cronin (18:34):

Ideally we would have the best of both worlds, where we keep the same on-device privacy and security for iPhones and Mac computers, while also improving the way that child sexual abuse materials are detected. So if we can have a better solution that doesn't require scanning on the device, and that still keeps kids safe and secure, that is the ideal outcome.

Mattimore Cronin (18:56):

And Edward Snowden has an optimistic tweet about this, where he says, "Do not shut up about Apple's #spyPhone scandal. We can win this." So I think it's possible that Apple could decide not to move forward in this regard because of all the backlash, but it will take even more momentum and even more people letting their voices be heard for Apple to reverse its decision.

Mattimore Cronin (19:19):

Now let's bring it home with the most likely scenario.

Mattimore Cronin (19:26):

The most likely scenario is that Apple will not walk back these features. Executives have already come out defending them, and Apple seems pretty intent on implementing them with iOS 15. So it seems unlikely they would do a full 180 at this point.

Mattimore Cronin (19:49):

However, they have noted the fact that people are concerned about these changes. So I think they're going to tread much more carefully with any future implementations of privacy changes that they make. And I really do think it's an open question about the future of Apple and Apple devices...

Mattimore Cronin (20:06):

I believe the world is bifurcating into two systems: One that is focused on empowering the individual, maintaining privacy and security, for a bottom-up society of individual nodes that are private and empowered in and of themselves.

Mattimore Cronin (20:23):

On the other hand, we have a system of technology that empowers the state and exerts top-down control. The Great Firewall of China is an example of this. And it's an open question whether Apple goes down one path or the other.

Mattimore Cronin (20:36):

One thing that seems certain to me is that the privacy and security route - encrypted devices that empower the individual - isn't going away. So even if Apple stops being the banner holder of privacy and security, other tech companies will take on that banner as soon as Apple relinquishes it. So at a high level, I'm not as worried about encryption or privacy or security failing entirely in the future, but it may be that Apple is no longer the predominant privacy company in tech.

Mattimore Cronin (21:07):

I will say that over time, I think Apple will realize how important it is to maintain their stance on privacy and security. So I'm not ready to give up on Apple yet. I probably will still buy the iPhone 13 when it comes out in the fall. But for me, Apple is on thin ice.

Mattimore Cronin (21:16):

If they implement any other changes that seem to threaten the security of the device - which, as we mentioned earlier, is like an extension of your mind - then I would stop being an Apple customer for good. I would find a better, more secure ecosystem to take part in.

Mattimore Cronin (21:40):

And lastly, I would say that it's so important that everyone who cares about privacy, everyone who cares about security, speaks out about it. Because these changes don't happen on their own. It really takes a movement to let corporations know that these are important values. And regardless of what's happening with Apple and the Apple ecosystem, I remain incredibly optimistic about the future.

Mattimore Cronin (22:02):

Thank you for tuning in. Hopefully you enjoyed today's episode. And I'll see you next time.
