On Apple’s “Expanded Protections for Children” – A Personal Story

These efforts will evolve and expand over time.

- Apple on their "Expanded Protections for Children"

I am broke, pardon me, “non-liquid,” right now. Normally, that sentence doesn’t have geopolitical ramifications for the creation of a hybrid private-public surveillance state. But in my case it does.

It started rather innocuously. Someone couldn’t send me money. No matter what they did. They couldn’t transfer any money. Curious. We started experimenting with different friends, companies, and acquaintances. None of them could send me money. Wise wouldn’t provide any explanation. My business had been a paying customer of theirs for more than four years. And now, I was locked out. Okay. They’re hardly the only fish in the sea.

Except it happened again. This time with our bank, Mercury†. Okay, that’s strange. They asked for more details. We sent them details. Scans of my documentation. They’d come back a few days later asking for more details. And we’d send it to them again. On and on we went, chasing each other around the maypole. A few weeks passed, and Mercury’s team finally cut to the chase and asked if I was affiliated with Entity X. We said no. But it didn’t matter. They froze me out anyway.

In some ways, we’re lucky that they flagged the transfer, because they gave us the first clue of what was going on. A friend ended up having the insight to search for Entity X (thanks Nick!) and found that they’d been sanctioned. Okay. What did this have to do with me?

On further investigation, we found that I shared my legal name with someone working with Entity X. Great. It’s a common name, like Robert Brown or Mary Johnson. Shouldn’t it be easy to prove that I’m not person X? A little Googling later, I found that the United States Department of the Treasury has an Office of Foreign Assets Control (OFAC), and they have an email address for “reconsideration” if you’ve been mistakenly placed on the sanctions list. As I’m clearly not the person they were looking to sanction, I decided that the most logical course of action would be to write to them and get their assent that I am indeed not the human they’re looking for.

We wrote something up (once again, thanks Nick). And we sent it in.

The OFAC replied, to their credit, quite promptly with a link for false positives, and they told me to send this to the bank. Okay. Done. Except, it didn’t matter. Mercury blocked me anyway. The saga finished with me getting nada.

I wrote back to the OFAC, asking them to write something, anything, on an official-looking letterhead stating the simple fact that I, Person X, born on such and such date with an ID number of Z, am not the person who has been sanctioned. I offered to provide them with additional details if there was any hesitancy on their part.

They haven’t replied. And I still can’t get any cash.

What’s amazing about this saga is that I have a statistically common legal name. There are thousands of people running around with this name. And effectively, they’ve all been barred from the global financial system.

To their credit, the OFAC publishes additional details for each person to prevent this exact scenario from occurring. However, these details seem to vary across individuals and countries. It doesn’t seem to be a neat dataset to model. In fact, the lists are available online as .PDF and .TXT files, and it’s a mess. Also, there isn’t one list, but rather multiple lists, with different details. As far as I can tell, there is no official API. There is no efficient way for programmers designing, say, a “[bank] built for startups” to comply with the Treasury. Consequently, someone decided, in a meeting room somewhere, to punt the problem.

If it’s a pain to code this thing and it seems like a box to tick, then why not just run a simple match against the name – the one element of the dataset guaranteed to be there – and then punt it to our underpaid and understaffed KYC department to deal with? And the KYC department punted it to legal. And legal took one look and went, “hrm, what’s our downside risk if we approve this? Do we know this person is indeed the one being sanctioned? Oh, I see. Well let’s just block them. There’s not much they can do about it.”
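To make that shortcut concrete, here is a minimal sketch of what such a name-only screen might look like. This is my guess at the general approach, not Mercury’s or Wise’s actual code; the file name, the CSV layout, and the bare substring match are all assumptions for illustration.

```python
import csv

# Hypothetical local copy of OFAC's Specially Designated Nationals (SDN)
# list exported to CSV. The real data is spread across multiple lists and
# multiple formats; this single tidy file is an assumption.
SDN_FILE = "sdn_list.csv"

def naive_sanctions_screen(customer_name: str) -> bool:
    """Flag a customer if their name appears anywhere on the list.

    Note everything this ignores: dates of birth, ID numbers, addresses,
    and every other disambiguating detail OFAC publishes. A bare name
    match is enough to punt the case to the KYC department.
    """
    needle = customer_name.strip().lower()
    with open(SDN_FILE, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if any(needle in field.lower() for field in row):
                return True  # flagged: someone else's problem now
    return False

if __name__ == "__main__":
    # Anyone who merely shares a legal name with a sanctioned person
    # comes back True, which is the failure mode described above.
    print(naive_sanctions_screen("Mary Johnson"))
```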

And so, I am currently broke, sorry, non-liquid. Kafka would have been proud.

(for connoisseurs of dark humor, I find it particularly hilarious that more likely than not the person they’re targeting will never be affected by this. This person, with whom I have the misfortune of sharing a legal name, doesn’t seem to perform many international transactions. She lives in a cash-first developing country, where the sanctions have no reach. She’s but a lowly piece in a very big game)


In other news, Apple has decided to scan images on your phone. For child porn, of course. There aren’t enough people (nor PR capital) to look through all of your photos. So, they’ve done the efficient thing instead. They’re simply going to match all of your photos, on your phone, against hashes of known child porn images. What could possibly go wrong?

Other, more eloquent commentators have pointed out the danger this poses. Apple often bends over for authoritarian regimes like China, going so far as to remove apps that help pro-democracy protestors in Hong Kong avoid police violence. Or the company doesn’t allow adults to view adult content on the devices that they own. Or, in more conservative countries, it removes any apps that mention the inconvenient existence of LGBTQ+ people. Wikipedia has an exhaustive list.

However, it is not my purpose to repeat what these commentators have already said. Nor to write a polemic for its own sake. My goal is to use my experience to peer into the future. To predict, through inevitability, the chain of events for poor souls in days yet to come.

Just like the programmers who chose to match names against the sanctions list and let the KYC and legal departments handle the rest, Apple is matching all of the private photos on your phone against a list. This list is generated using something called a hash function, which turns an image into a number that a computer can then use to make the comparison. The process looks a bit like this:

Please note this is an example from a lovely blog that walks through the process. Apple’s process differs in its details, but the idea is the same.
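For a concrete feel of what “turn an image into a number” means, here is a minimal sketch of one simple perceptual hash, an average hash built with Pillow. Apple’s own function (NeuralHash) is a far more sophisticated, learned model; this toy version is only meant to show the shape of the idea.

```python
from PIL import Image  # Pillow

def average_hash(path: str, hash_size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint.

    Shrinking and grayscaling first means small edits (resizing, mild
    compression) tend not to change the resulting number very much.
    """
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    average = sum(pixels) / len(pixels)
    # One bit per pixel: is it brighter than the image's average?
    bits = "".join("1" if p > average else "0" for p in pixels)
    return int(bits, 2)

# "cat.jpg" is a placeholder path; any image file will do.
print(hex(average_hash("cat.jpg")))
```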

In most cases, when we’re dealing with hashes in computer science, we’re looking for exact matches. But this case is different. Criminals aren’t stupid. Or rather, they’re often not clever, but not all of them are stupid. What if they add a filter? Or change the image somehow? Maybe add a blur? What then?

Well, clearly, they can’t make every possible variation of every image that the police are looking for. If it’s a pain to code this thing and it seems like a box to tick, then why not just look for similarity instead and then punt it to law enforcement to deal with? To put it another way, the question Apple is asking isn’t, “is this image exactly like this photo from the child porn database?” No, the question Apple is asking is, “is this image similar to this photo from the child porn database?” Where “similar” is answered by a very clever, but very dumb, algorithm.
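In code, the shift from equality to similarity can be as small as swapping an exact comparison for a distance check against a threshold. A sketch, reusing the toy average hash from above (the real system has its own hash and its own threshold, neither of which I know):

```python
def hamming_distance(h1: int, h2: int) -> int:
    # Count the bits where two 64-bit fingerprints disagree.
    return bin(h1 ^ h2).count("1")

def looks_similar(h1: int, h2: int, threshold: int = 10) -> bool:
    """Decide "similar" as "at most N differing bits".

    The threshold is a made-up number. Set it too strict and trivially
    edited images slip through; too loose and innocent photos get flagged.
    """
    return hamming_distance(h1, h2) <= threshold
```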

In my case, when the programmers punted to the KYC department and legal, a bunch of transactions were cancelled. In this case, the police will raid your house and arrest you for “child porn.”

The image they will raid you for wouldn’t even necessarily have to be an adult one. Computers are strange. They are very literal beings. They do exactly what they are told, which is both their greatest asset and their greatest weakness. For example, this neural network thinks that this cat is guacamole:

Apple’s algorithm is very different, but it is susceptible to the same class of problem. In the cat’s case, noise was deliberately added to the image to make the neural network classify it as guacamole. In your case, it could be sheer happenstance: random noise from lighting conditions that takes an innocent photo (or, more likely, an intimate photo of you or your partner) and makes the algorithm decide that it is similar to a restricted photo.
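To see the mechanics of that failure mode, here is a toy experiment reusing the average-hash sketch from above: jitter a photo’s pixels the way sensor noise might, re-hash it, and note that the only thing separating “innocent” from “flagged” is whether a distance happens to fall under a threshold. This illustrates the decision rule, not how often Apple’s actual system misfires.

```python
import random
from PIL import Image  # Pillow

def average_hash(path: str, hash_size: int = 8) -> int:
    # Same toy perceptual hash as the earlier sketch.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    average = sum(pixels) / len(pixels)
    return int("".join("1" if p > average else "0" for p in pixels), 2)

def add_sensor_noise(path: str, out_path: str, amount: int = 12) -> None:
    # Jitter every pixel by a small random amount, roughly what low
    # light or heavy compression can do to a photo.
    img = Image.open(path).convert("L")
    noisy = [max(0, min(255, p + random.randint(-amount, amount)))
             for p in img.getdata()]
    out = Image.new("L", img.size)
    out.putdata(noisy)
    out.save(out_path)

if __name__ == "__main__":
    # "photo.jpg" is a placeholder path; any image file will do.
    add_sensor_noise("photo.jpg", "photo_noisy.jpg")
    distance = bin(average_hash("photo.jpg") ^
                   average_hash("photo_noisy.jpg")).count("1")
    # The flagging decision is nothing more than a comparison like this
    # one, run against every fingerprint in the database, for every
    # photo on the phone.
    print(f"{distance} of 64 bits changed; a photo is flagged whenever its "
          f"distance to some database entry falls under the threshold")
```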

Just like me, one day you’ll wake up and notice that something weird might be going on. Maybe it’s a car parked outside on your street. Maybe it’s a feeling that people are following you. You won’t be that alarmed at first. You’ll dismiss it. But, eventually, it will come crashing in. Your door will be broken open by a SWAT team. And you will find yourself, to your confusion, facing armored people with guns. They’ll scream at you to raise your hands and put them behind your head. They’ll say, “Ma’am, we have a warrant for your arrest.” And, confused, you’ll be walked out to a waiting squad car.

Your local newspaper will write, “[You] has been arrested for possessing child pornography.” But you won’t read the headline. You’ll be in an interrogation room. Waiting for them to come in. Maybe the officer is letting you sweat it out. Maybe not. But eventually they’ll come in and they’ll start interrogating you. “Ma’am, do you consume pornography that has minors in it?”

And you’ll deny it. They’ll put your phone in front of you. It’ll be in one of those bags. You’ll unlock it for them. And you’ll know that they’re going to go through everything. Including that embarrassing pic you took for K.

They won’t find anything. So they’ll return to question you. Maybe you’ve deleted the photos, who knows? Let’s pass it on to forensics and see what they come up with.

Eventually, you’ll end up being detained in jail. Your bail will be set high. You’ll have to plead and explain to everyone around you that you don’t know what’s happening. You don’t know why this happened. And those near you might believe you, but there will be this doubt that crosses their mind. Just for a second. And that doubt will always be there, for the rest of their lives.

If you’re lucky enough to have money, you’ll get a lawyer. They’ll work on getting you out. Eventually, the forensics will return nothing. Your lawyer will succeed in getting you to walk free. And you’ll walk back to your home, with stares from your neighbors.

In a few months, it’ll come out that Apple’s algorithm screwed up here. Apple will blame law enforcement. Law enforcement will blame Apple. It doesn’t matter. That newspaper will put up a small story on their site about how you’re really innocent. It won’t get many clicks. And the one about you getting arrested for possessing child porn will stay up forever. With a note added to it, if you’re lucky.

Maybe you’ll get your life back. Maybe not. But you’ll spend the rest of your life under this cloud, because some programmer somewhere wrote this code. And everyone else assumed that it just works.


PS – This has already happened many, many times. Except not for something as incendiary as child porn, nor with the exact same technology. However, these examples are included as they are illustrative;

If, perchance, employees of Mercury or Wise see this, feel free to reach out to me on Twitter, @_areoform.
