THE END OF PRIVACY!? A VOTE FOR “YES.” I was threatened with death, and all I did was take the algorithms used in all of our cell phones and warn: you need to think about this, because they can harm you! – says Michał Kosiński, Ph.D., of Stanford University in a conversation with Monika Redzisz

Monika Redzisz: You studied social psychology at SWPS (University of Social Sciences and Humanities) and social sciences at the University of Cambridge. How did you end up at Stanford, working on psychometrics? How did you become a data scientist?

Michał Kosiński*: My thesis supervisor worked on psychometrics. He measured psychological traits such as personality and intelligence. As I was a bit better at math than the average psychologist, I started working on it, too.

I compiled questionnaires and measured personality. I began collaborating with David Stillwell, who had created the myPersonality application, which let Facebook users fill out psychological surveys. One of them was filled out by five million people.

When I analyzed the data, it occurred to me that liking Facebook pages is really similar to answering the questions on a personality test: which books people like, which movies, what they enjoy. I tried using likes as input, and it turned out the results were remarkably precise. We tested our algorithms for many months, because we could not believe it was possible to describe someone’s personality, or other psychological traits, so accurately on the basis of likes alone.
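For readers curious what predicting traits from likes looks like mechanically, here is a minimal sketch: a binary user-by-page like matrix compressed with singular value decomposition and fed to a linear model, broadly the kind of pipeline described in the published work. All names and data below are synthetic placeholders, not the study’s.

```python
# Minimal sketch: predicting a personality trait from Facebook likes.
# All data are synthetic placeholders; real like matrices carry signal,
# random ones do not, so the score printed here will hover near zero.
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users, n_pages = 1000, 5000
likes = sparse_random(n_users, n_pages, density=0.01, format="csr", random_state=0)
likes.data[:] = 1.0                     # binary: 1 = user liked the page
openness = rng.normal(size=n_users)     # stand-in for questionnaire scores

X_tr, X_te, y_tr, y_te = train_test_split(likes, openness, random_state=0)

# Compress the sparse like matrix to a few dense dimensions, then regress:
svd = TruncatedSVD(n_components=100, random_state=0)
model = LinearRegression().fit(svd.fit_transform(X_tr), y_tr)
print("R^2 on held-out users:", model.score(svd.transform(X_te), y_te))
```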

It is said today that data is our most precious resource. Companies, governments, and various institutions are working to get our data, because by analyzing it they can earn a lot of money and gain control over us. Do I have to say a definitive goodbye to my privacy?

Yes, this is the unfortunate reality – whether we like it or not. I have spent most of my career analyzing this phenomenon, in the hope that we could find some way to stop the coming era of post-privacy. But I must admit that the more I know, the more I lean towards the conclusion that we have lost this war.


That you have liked some restaurant or a picture of your nephew can betray your sex, race, sexual orientation, political and religious views, your personality traits

These days we leave a huge digital footprint, recorded by our smartphones, Facebook, LinkedIn, Google, credit cards, and even cars, which are increasingly connected to the web. The same applies to surveillance cameras and other sensors tracking our movements in public space. Facebook likes alone can reveal your intimate traits with reasonable accuracy, not to mention all the additional data sources.

Even though I do not publish any kind of sensitive data about myself on Facebook?

It does not matter. Even innocent digital breadcrumbs tell a lot about us, because they are just the tip of the iceberg. The fact that you liked a restaurant or a picture of your nephew can betray your sex, race, sexual orientation, political and religious views, and your personality traits.

How many likes are needed for the algorithm to accurately describe my personality?

It depends on the accuracy you want. To know your personality as well as a co-worker does – around ten.

And with the accuracy of a friend?

100-150 is enough.

And accuracy comparable to my husband’s?

Based on around 250 likes, the algorithm can describe your personality more accurately than your long-time partner can.

Impossible! Despite so many years of living together, many intimate conversations, shared experiences?!

I understand this is saddening, but it gets even worse. The algorithm can know more about us than we know about ourselves. It is a bit like a physician who, based on the data we disclose, forms a diagnosis – that is, tells us something we did not know about ourselves. Now compare the doctor with the algorithm. The doctor can see a few thousand patients a year and read several dozen scientific articles. The algorithm has access to all the medical articles in the world, and to all the patients. That is why it knows more about us than we do, and more than the doctor does.
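A rough illustration of how numbers like the ten, 150, and 250 above are arrived at: predict a trait from a user’s likes and report how well the predictions correlate with questionnaire scores as the number of observed likes grows. The sketch below uses synthetic data with a planted signal and simplifies “k likes per user” to “k visible pages,” so it shows the shape of the measurement, not the study itself.

```python
# Sketch: accuracy of a likes-based model as a function of how many
# likes it can see. Synthetic placeholder data; the planted signal makes
# the correlation rise with k, mimicking the shape of published curves.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users, n_pages = 2000, 500
trait = rng.normal(size=n_users)             # questionnaire score
loadings = rng.normal(size=n_pages) * 0.5    # how trait-laden each page is
# Probability of liking a page rises with the user's trait level:
prob = 1 / (1 + np.exp(-(np.outer(trait, loadings) - 2)))
likes = (rng.random((n_users, n_pages)) < prob).astype(float)

for k in (10, 50, 150, 250):                 # pages visible to the model
    cols = rng.choice(n_pages, size=k, replace=False)
    X_tr, X_te, y_tr, y_te = train_test_split(likes[:, cols], trait, random_state=0)
    pred = Ridge(alpha=1.0).fit(X_tr, y_tr).predict(X_te)
    print(k, "likes -> r =", round(np.corrcoef(pred, y_te)[0, 1], 2))
```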

I can still fight – delete my Facebook account, throw out the smartphone, pay in cash where possible.

But it is impossible to stop walking down streets under the eye of surveillance cameras. What is more, if you are a young, beautiful, rich woman living in a big city, you can afford not to use e-banking, Uber, or Google Maps, and to carry an old flip phone. But if you are a single mother working three jobs, these technologies are a salvation. It is not about making your life easier; it is about making it possible at all, because you would not be able to function without them.


Not to mention that it is difficult to imagine a society that would voluntarily give up these conveniences. We are all narcissists and we share our data freely in Facebook, Instagram, and Twitter posts. Turning our backs on new technologies is impossible.

Aren’t you terrified by this prospect?

I am concerned about the end of privacy, but I have accepted it. I believe this is the only rational approach. We can all agree that earthquakes ought to be banned, yet they will happen anyway. This is why, instead of outlawing earthquakes, we ought to think about how to organize society and what kind of laws to pass to minimize the damage they cause.

It is the same with privacy. There is no way to stop losing it, so we ought to focus on minimizing the risks and maximizing the potential benefits. Otherwise, we will only make our lives needlessly difficult and hand the baton of progress to other nations.

The available alternative to a surveillance system is a sleeping security guard or a police officer who profiles people on the basis of tattoos, skin color, or sex

The European Union has been at the forefront of restrictive data protection for years. What is the result? Companies have relocated to the United States and developed their technologies there. And now they are moving to China, where you can do anything. Google, Facebook, and Twitter are not legal today in light of European Union law. We have created an illusion of privacy protection in Europe, and we have deprived ourselves of influence over how these technologies develop and of the benefits we could have had if they had been created in our market.

In the States, people are rebelling as well. San Francisco’s authorities refused to allow facial recognition in the streets.

They will pay a certain price for it. It is worth remembering that this is a double-edged sword. With intelligent surveillance, it is much easier to provide people with security while employing far fewer security guards and police officers. Of course, a rich city like San Francisco can afford to employ more police officers, but elsewhere people will pay for that with their security.

But algorithms get it wrong, too. What if they deem an innocent person a criminal?

Of course algorithms make mistakes, and that is a problem. But it is important to compare the accuracy of a given crime prevention system with the accuracy of the other available systems, not with a perfect system, because a perfect system does not exist.

The available alternative to a surveillance system is a sleeping security guard or a police officer who profiles people on the basis of a tattoo, skin color, or sex – and who relies on human intuition when deciding whether or not to detain someone. As we know, human intuition is highly imprecise. Such an officer is likely to stop people who remind them of someone, perhaps because they have darker skin or wear clothes the officer does not like.

We demand the same perfection from self-driving cars.

It is a huge mistake! We pay for it with the lives of drivers and pedestrians. If we decide we need a guarantee that a self-driving car will never kill anyone, then we will never have such cars. But we will still have terrible and drunk drivers behind the wheel, killing many more people than the algorithm would. The perfect is the enemy of the good. A good algorithm is always going to be better than a human being, even the best one. We just need to let scientists create these algorithms.

We know with certainty that medical diagnoses made by computers are better than diagnoses made by humans. Of course, the computer makes mistakes, but a human makes more of them. Despite that, we are reluctant to replace doctors with computers. We know that an autopilot can land a plane better than a human can. But people are still not mentally ready to trust the machines – not to mention the lack of legal regulations that would make implementing such changes possible.

While I can imagine citizen transparency in a democratic state that respects human rights, I certainly cannot imagine it in a totalitarian state. That is an Orwellian vision.

I agree, but I believe that technologies, even in their most dangerous aspects, are among the biggest factors popularizing democracy as we understand it. Before them, an ordinary citizen, a politician, the owner of a huge company, and a police officer are all equally transparent.

Really?

Even more so, because hackers can profit from scrutinizing very important individuals and institutions – politicians accepting bribes or entrepreneurs who do not pay taxes. It can be said that we are far more equal before an algorithm than before human judges. When the world becomes transparent, it is harder for us to keep up appearances. Imagine what would happen if, in Saudi Arabia – a country where homosexuality is punishable by death – everyone woke up with their sexual orientation written on their forehead. Is it possible there are no gays among the police officers, army officers, the royal family, the clergy? Gays make up around seven percent of every society. Sometimes there is strength in numbers…

Let’s hope so, although I am concerned that knowledge is not everything. We know a lot about the dirty deeds of politicians, and they are still in power. All it takes is enough power and enough money.

In the States, many homosexuals decided to come out and give up their privacy. Many paid the highest price for it, but thanks to them – thanks to transparency – it is much easier to be gay in America today. So maybe this inevitable loss of privacy, which concerns me too, can help us reduce the number of problems we face?

Still, your publication from the end of 2017 on recognizing sexual orientation from facial features caused a great deal of ethical controversy. People debated how far a scientist can go in their research without hurting anyone.

Unfortunately, people do not read the articles they criticize. I did not create any algorithm; I merely showed that widely used technologies have this kind of potential. I warned that it is an extremely dangerous tool.

On the basis of a single picture, a computer can determine sexual orientation with eighty percent accuracy. A loss of privacy that would cause gay people a certain level of discomfort in our country could mean grave danger in many others. But the fact that you can determine a person’s sexuality from facial features does not mean much in itself – it can be predicted with much higher accuracy from Facebook likes, search history, or credit card transactions.
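The kind of pipeline such studies describe is simple in outline: a pretrained face-recognition network turns each photo into an embedding vector, and an ordinary classifier is trained on top. The sketch below uses random placeholder embeddings and labels rather than any face model or real data, so the printed score hovers around chance; it shows only the shape of the pipeline.

```python
# Sketch: a classifier on top of (placeholder) face embeddings.
# Random inputs carry no signal, so the AUC printed here is ~0.5.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(2000, 128))  # stand-ins for face embeddings
labels = rng.integers(0, 2, size=2000)     # stand-ins for binary labels

X_tr, X_te, y_tr, y_te = train_test_split(embeddings, labels, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```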

It is difficult to imagine that, suddenly, the homosexual seven percent of society will stop using Google. Interestingly, when I wrote about that, nobody squeaked a bad word about me. Everyone praised me: “Kosiński showed what the danger is.” The face was a different matter, because the face did not suit people ideologically.

What were the reactions like?

Hate. I was threatened with death. I got so many threats that I had a police officer assigned to me for two weeks. And I did not even come up with these algorithms! I took the algorithms we all have on our phones and said: “You have got to think about this, because these algorithms have enormous potential to harm you!” That is all I did. Unfortunately, many people concluded that I was responsible for designing these algorithms, and that I must be a homophobe.

The same internet which lets the manipulators manipulate society gives it access to reliable information. The balance is positive.

I watched how, as the hate intensified, the titles of articles about me changed. The first – “Nowhere to hide. What machines can tell from your face,” a positive cover story in The Economist. They presented me as a scientist trying to protect people’s privacy.

The next piece, in another magazine, was titled: “Professor Kosiński created a gaydar: a gay-detecting radar.” In the third, they wrote that Kosiński is a homophobe from Stanford who ought to be thrown out of the university. Stanford administrators received many letters from enraged donors demanding that I be fired and threatening to withdraw their donations. Fortunately, the university administration unequivocally took my side.

Were you expecting this?

Yes. At first I was not planning to publish it. But one day a friend asked me: “Will you be able to look in the mirror when someone hurts others with these algorithms, and you knew about the threat yet chose to stay silent?”

On the other hand, when you do not talk about something openly, then it is not widely used…

Not true. Then you do not use it, and neither do I, but governments very much do. It is nothing new to the powers that be: they have known for a long time how it can be used; they just do not say so out loud. Governments employ the best specialists to build such systems. Technologies for recognizing intimate traits from facial features were patented by various start-ups almost a decade ago.

How long have you known?

Since 2015. I was studying whether it was possible to describe someone’s personality; sexual orientation was one of the variables. I trained various models, and one of them pertained to sexual orientation. It worked so accurately, so precisely, that at first I thought I had some sort of error in the data!

My student repeated the study – the same outcome. I could not believe it. I gathered new data from a separate source and rewrote the whole code from scratch. And again – the same results. I wrote an article about it, but I only shared it in closed circles; I was simply afraid to publish it.

You were also accused of making the study binary in nature: we are either heterosexual or homosexual. But our sexuality is sometimes more complicated than that.

It does not matter. If I had had more fine-grained data, the classification would have been even more accurate. My aim was not to study sexual orientation, but to warn against dangers to privacy.

And more broadly – personality? We are not simply introverts or extroverts; there is a whole spectrum in between. What is more, people change under the influence of experience; they learn.

I do not think so. An introverted child will be an introverted adult. What we are dealing with here is measurement error: sometimes people have more, sometimes fewer occasions to display particular traits. I believe personality changes only a little, and only in very few cases. We just find that difficult to accept.

Why?

Because we have a powerful illusion of free will. In America especially, there is an omnipresent ideology that I am whoever I want to be; the American dream was built on it. That is why Americans look at a homeless person and say: “I am sorry, but it is his fault he is homeless. If he worked as hard as I do, he would have what I have.” They forget that not everyone can be a financier or a computer programmer, that we differ in character and talent, that there are people who have trouble concentrating, controlling aggression, and so on.

That was after the scandal involving Cambridge Analytica and the presidential campaign in the USA. To this day, your website carries the declaration: “I have nothing to do with Cambridge Analytica, and I was the first to warn against similar practices, in the Guardian.”

Yes, I was attacked then, too. Unfortunately, people always shoot the messenger. They forget that I do not invent these technologies; they were invented, patented, and used by others.

The technology used by Cambridge Analytica during the 2016 presidential campaign had been patented by Facebook years earlier! Companies, governments, and institutions have been using knowledge about us to profile us. Even Mark Zuckerberg had to put on a suit and explain what on earth had been happening at Facebook.

He explained that he had no idea psychological traits could be determined on the basis of likes. Clearly, he had forgotten that this is Facebook’s business model: observing our behavior online and then showing each of us the ads that will earn the company as much money as possible.

I am only issuing a warning. I try to explain how negative the consequences can be. It pains me that I am attacked by journalists, whose very job is to inform and warn.

There is more and more fake news on the web; the number of flat-Earthers is growing, as is the number of people who believe vaccines are harmful. How do you educate society so that, amid the informational chaos, it chooses verified information?

Humanity has been around for thousands of years, and with each passing year there are more educated people. Of course, we usually take two steps forward and one step back. There was the expansion of the European Union – a huge, incredible step towards freedom, democratization, and integration – and now there is Brexit. Everyone focuses on Brexit and says: the world is collapsing! But how many steps forward did we take before that? The anti-vaccine movement? A step back – but how many pseudoscientific theories were debunked in the meantime? We must not concentrate on the single case of a girl who died because she was allergic to some ingredient in a vaccine, but on the thousands of children who were saved thanks to vaccines.

You are an optimist.

I am. The same internet that lets anti-vaxxers talk to each other will be their undoing. The same internet that allows manipulators to manipulate society gives society access to reliable information. The balance is positive. When we look at global indicators, we can see the world is becoming more liberal, progressive, and informed, and less polarized.


*Michał Kosiński, Ph.D., is a social psychologist and data scientist, an expert in psychometrics and online personality analysis. He works at Stanford University’s Graduate School of Business and lectures at the SWPS University of Social Sciences and Humanities, as well as at Concilium Civitas in Warsaw. A graduate of Cambridge and SWPS, he received his Ph.D. in psychology from Cambridge and served as Deputy Director of the University of Cambridge Psychometrics Centre. He did research at Microsoft Research and at Stanford’s Computer Science Department. He coordinated the myPersonality project, a global collaboration of over two hundred scientists analyzing the detailed profiles of eight million Facebook users. When Facebook patented an algorithm that creates users’ psychological profiles based on their activity, Kosiński published research showing that the algorithm can determine our age, sex, sexual preferences, ethnic background, and views. He warned of the serious danger to our privacy. It soon turned out such algorithms had been used in the Trump presidential campaign and in the Brexit campaign. In 2013, IBM and DataIQ ranked him among the fifty most influential people in Big Data. In 2015, he received a Rising Star award from the Association for Psychological Science.


Michał Kosiński and Mirosław Sopek were the guests of the most recent Human Tech Meetings event, “In the steps of Big Data,” held on December 17, 2019. These meetings are held regularly as part of the HumanTech Center for Social and Technological Innovation at SWPS University. We thank the organizers for the invitation and for their help in producing this material.


Read the Polish version of this text HERE
