An Algorithm that Forgets



The verdict of the European Court of Justice, the EU's highest court, against Google in the dispute over the “right to be forgotten” is a scandal, no doubt about it! The judges obviously failed to see the size and scope of the apple cart they have upset here. At least in theory, anyone in Europe can now order Google to delete online information that mentions them, on the grounds that they feel disturbed by it. It doesn’t matter whether the claim is justified, or whether any legal rights have been violated or any ethical boundaries crossed. It suffices to say, “I want out!”

Again, in theory, Google will have to consider each request carefully on its individual merits. To do this, Google will need to hire an army of rather intelligent and therefore rather well-paid employees to research each background and reach millions of complicated decisions on whether or not a deletion request is justified by the provisions or the spirit of existing laws, by the vague rules of “netiquette”, or just by plain common sense. And what if they decline a request? The applicant will presumably be able to take recourse to the courts in his or her country and try to force Google to comply.

So what is Google going to do? At the end of the day, they are a for-profit company, and the costs they are facing here are potentially huge. So of course Google will take the path of least resistance and simply delete anything anybody asks them to delete. Presumably this will be an automated process, which essentially means we can all delete anything we want at any time. And where, pray tell, does that leave the simple, unvarnished truth? Can we simply rewrite history as we see fit?

Last night, after the IT Security Conference of TÜV Rheinland in Fürstenfeldbruck, I sat down with Odad Ilan from Cyber Gym, an Israeli training camp for cyberwar soldiers, and he offered a pretty drastic comparison. “Imagine that Hitler were still alive. He has survived for years undetected somewhere in South America, and now he requests the deletion of all Google entries on the extermination of the Jews in the Third Reich. Why? Because he feels disturbed by the publication of his youthful sins?” If Google deletes without checking, so his argument goes, then the Holocaust could disappear from the public record. Do we want that?

The next day I had lunch with my old friend Phil Zimmermann, the creator of PGP, the most widely used e-mail encryption program in the world. I drank a couple of non-alcoholic wheat beers, he had a mango spritz, so we weren’t exactly inebriated. But the idea we came up with does sound inspired by some kind of stimulant.

Phil started by describing how the human brain forgets. Our biological neural network assigns a particular importance to each piece of information, depending on any number of factors. If something is really important, like where I was on 9/11, the neural network is strongly impacted and embeds that information firmly in my memory. Remembering where I left my car keys, on the other hand, is in the scheme of things a rather trivial fact, so my brain sometimes discards it into the mental dustbin of oblivion. Most of the time, then, we are able to remember events and facts that are really important to us. The unimportant, however, has a shorter half-life and is often simply sorted out.

What would happen, we wondered, if someone developed an intelligent algorithm that weights the data Google stores and decides whether or not to delete an item based on a kind of “relevance score”? Such an algorithm, Phil thinks, would possess, or at least mimic, human common sense: big, important events create a high amplitude on whatever scale we are using, while trivial facts remain a blip on the screen. Each fact could then be assigned a half-life according to its relevance score and would disappear after a certain time, depending on its general relevance. Unimportant information would decay fast, so to speak, while information classified as important would get a longer expiration period or would remain online forever.
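Phil's decay idea can be sketched in a few lines of code. The toy below is purely illustrative: the function names, the 30-day base half-life, and the 0.5 retention threshold are all our own assumptions, not anything Google actually implements.

```python
# Toy sketch of the "relevance half-life" idea. Everything here
# (names, the 30-day base half-life, the 0.5 threshold) is an
# illustrative assumption, not a real deletion system.

def retention_probability(relevance: float, age_days: float,
                          base_half_life_days: float = 30.0) -> float:
    """Exponential decay whose half-life stretches with relevance.

    relevance is a score in [0, 1]: near 0 means trivial (fast decay),
    1.0 means a landmark event that never expires.
    """
    if relevance >= 1.0:
        return 1.0  # important enough to remain online forever
    half_life = base_half_life_days / (1.0 - relevance)
    return 0.5 ** (age_days / half_life)

def should_delete(relevance: float, age_days: float,
                  threshold: float = 0.5) -> bool:
    """Delete once the retention probability decays below the threshold."""
    return retention_probability(relevance, age_days) < threshold

# A highly relevant fact survives five years; a trivial one is
# sorted out after three months.
print(should_delete(0.99, age_days=5 * 365))   # False: still retained
print(should_delete(0.05, age_days=90))        # True: decayed away
```

The one design choice worth noting: dividing the base half-life by `1.0 - relevance` makes the expiration period grow without bound as the score approaches 1, which captures the "remains online forever" end of the scale without a special case for every important event.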

While we ate, we tried to think up a few examples. How about these: information about a mass murderer or a child molester should remain available, while the speeding ticket some foolish young beginner driver once received should not pursue him all his life.

Phil noticed an interesting parallel to the social phenomenon of forgiveness: the ability to forgive and forget, he believes, is directly related to the temporal distance between the original event and our present situation. Old veterans fraternize decades after the war with their former opponents, the selfsame guys they once tried their best to kill, and vice versa.

Of course there are still a few bugs in our idea. For example: Who gets to determine how the scoring system weighs and evaluates individual bits of information? Whose brand of “common sense” applies? Who can lodge an objection against the algorithm’s pronouncement, and what happens then? But those are just details we can probably leave for our people to work out. For most of the deletion requests Google will be facing, all we need is a fairly simple algorithm. Phil’s and my common sense tells us that invoking a “right to be forgotten” should require more than proof of a slight irritation. Just because some piece of information bothers me doesn’t give me the right to deprive others of it. We as a society must set the bar much higher by forcing the applicant to answer the question: Does the publication of this information affect my human dignity?

So how do you code human dignity into an algorithm? I think that to answer that question conclusively, we as a society will still need to empty many bottles of beer or red wine together. But hey, at least it’s a start!

