We want to draw attention to how inhumane the world of algorithms is, to talk about malpractices of digital corporations and to expose the truth about the technology everybody admires, says Wiesław Bartkowski, head of Creative Coding at SWPS University of Social Sciences and Humanities, in conversation with Monika Redzisz
Monika Redzisz: “Today’s corporations regard human experiences as a free resource which can be exploited without limitations. With their wanton exploitation they are degrading human nature, similarly to the corporations in early capitalism which caused environmental degradation”. That sounds like a manifesto.
Wiesław Bartkowski*: This is a passage from my introduction to the exhibition “Hello! I know you – I am you”, which I have been preparing for this year’s edition of the Gdynia Design Days festival in July. The exhibition, which due to the pandemic has become an online experience, starts off innocently by inviting guests to play with artificial neural networks. Before they are allowed to play, each visitor has to agree to the terms of service, as we are going to use their faces in the performance. Once they accept, the organizers are authorized to freely use the guests’ images. I don’t think anyone will bother to read the whole agreement, because we have long since gotten used to giving our consent without a second thought. We are fine with whatever we have to sign as long as we get access to the service right away.
We drafted our agreement on the basis of the terms of service of iRobot, a manufacturer of smart vacuum cleaners, which 95 percent of iRobot owners accepted without hesitation. The new agreement arrived with a software update that turned, “free of charge”, a “dumb” vacuum cleaner into a highly intelligent device with lots of new functions; for example, it could learn where the kitchen is, where the floor should be hoovered more often, or where and when not to clean so as not to disturb the people in the room. But that also turned the vacuum cleaner into a machine for gathering metadata about our habits and the space we live in. Such knowledge makes it possible to infer even more interesting information, e.g. the size of our apartment, its layout and, most importantly, our lifestyle. All of that is covered in the agreement nobody cares to read. It stipulates that the company collaborates with other entities to which the metadata will be made available.
Why would anyone need to know that there are three rooms in my flat and that I work from 9 a.m. to 5 p.m.? In fact, I have nothing to hide from my vacuum cleaner… unless it starts taking pictures and sending them out into the world.
No, that’s not the point! The creators of the models, i.e. digital corporations, do not get access to our private data, so they do not infringe our privacy as the user understands it. However, they do get access to metadata, which most people do not regard as confidential information. This is what the corporations feed on. Most people simply do not understand what metadata are and why they can tell you so much about us. The issue is not clearly regulated by law, although some timid steps have already been taken. Another problem is that metadata cannot be encrypted as easily as the content of our communication. Their power was confirmed by General Michael Hayden, former CIA and NSA director: “We kill people based on metadata”.
We should slow down technology development to get more time to think about where we’re heading
It is crucial for corporations to reconstruct the way we behave, what psychologists would call our “personality”. If they know it, they can predict our actions. Nobody is interested in the nude photos that a camera-equipped vacuum cleaner could easily take. In this whole system a particular person does not matter. He or she is seen as a mere provider of a valuable resource: metadata.
Valuable for whom?
For every company that wants us to buy its products. As a matter of fact, we are not iRobot’s customers; the real customers are the companies that acquire our data from iRobot. We pay for the vacuum cleaner not with money but by allowing the company to glean information about our lives. We want to believe that many technologies are offered for free. But although we don’t spend any cash on them, they cost us dearly.
Marketing has always been about the same thing: providing a target group with information about a product. We have always been manipulated by advertisements, at least to some extent.
Now it’s about something more: making sure that a target group reacts in an expected and desired way, for example by clicking a “Buy now” or “Vote now” button. To achieve that, companies have set out to shape users’ behavior. It’s a mass-scale experiment in which side effects are completely neglected. Professor Shoshana Zuboff claims that a new era has begun. She calls it surveillance capitalism. It’s a system based on absolute predictability. The idea is to reduce uncertainty to a minimum and to make sure that the business generates profits. I would go even further and say that this will lead to uniformity and the elimination of alternative ways of thinking and acting which might destabilize the system. It’s not only a threat to our liberty but also a systematic process of eliminating the most creative actions, which, in consequence, will bring development to a halt.
In 2017 Facebook employees openly boasted about having tools that could establish the emotional state of a specific group of working students, and about being able to foresee the moment when the students would need support or be more susceptible to manipulation. At that point all you have to do is offer them a product that would satisfy, or seemingly satisfy, their needs. Had they been in a different mental state, they wouldn’t have bought it. That runs counter to ethical standards, but apparently it’s not a problem for FB.
Besides, unlike traditional advertising, these technologies are interactive. They can analyze our actions and adapt to us. We are dealing with something we have never experienced in our history: instrumentation that takes control over our behavioral system. Machine learning models are used, for example, to shape what we see on Facebook, to make us stay there as long as possible and to display as many adverts as possible. Our mind is the environment being modified by the algorithm. And we have no control over it, because the algorithms are black boxes and we don’t know exactly what’s happening inside them. If you also consider the strategy of corporations, which want to deploy such models as fast as possible to rack up sales, it looks like we’re in big trouble.
Billboard ads and TV commercials are also manipulative, but they can’t adapt to our current state. In the advertising industry we have managed to draw a line. In the digital world it’s a shambolic free-for-all. It’s a mass-scale experiment.
What can we play with after we have signed the agreement?
You will consent to the use of your face, which will be experimented on with various types of artificial neural networks. Under the pretext of innocent fun and experiment, the system will collect metadata in much the same way corporations do. It will also build a database of faces that makes it possible to train network models. Gathering and processing such data brings corporations huge profits, and we let them have it for a song. Finally, we are going to show our hand, something corporations would never do. They don’t tell us what they know about us. What they know is often hidden in trained machine learning models; we are unable to translate it into human language. It’s inhumane knowledge.
Technology has become a substitute and a prosthesis of a normal life because we didn’t have a choice. Much like compound chocolate in the communist era
The idea of our exhibition is to draw attention to the inhumane world of algorithms, especially in a new machine learning paradigm. We want to show how dangerous it is to create techno-social systems in which non-human actors mediate interpersonal interactions and to what extent such systems change our perception of reality. We would like to talk a bit about malpractices committed by such corporations as Google, Facebook, Amazon and Apple, and to expose the truth about the technology everybody admires.
They admire it because it brings them a multitude of benefits. IBM has just released a free chatbot for teachers. Alexa is a real life-saver for the disabled. Don’t the benefits outweigh the risks?
You have believed their narrative too, haven’t you? Do you really think that IBM offered a chatbot to teachers for free? That chatbot cost them nothing, as they had it ready anyway. NGO employees and teachers had to work exceptionally hard to feed it with high-quality content. They did all the work, after which IBM proudly announced that it had met their expectations. We are starting to believe that we should converse with bots. Wouldn’t it be better to talk to an expert who could share their knowledge?
But we are short of experts.
Technology is not going to solve that problem and yet everyone wants to believe that artificial intelligence is very smart and can answer all our questions, so the issue is out of the way.
We need to change the narrative and stress that agency lies elsewhere. Tools are not going to do a job for us and artificial intelligence is not going to save us. We have to do it on our own.
Artificial intelligence is undeniably extraordinary but it brings many threats. Development of deep learning may be compared to drug research. Since we do not entirely understand how a human body works, we create medicines following a trial and error approach. But before they are commercialized, it takes years to test their efficiency and side effects and to obtain special certificates issued by authorized institutions. In the case of deep learning we use the trial and error approach too as we don’t understand how it works. And yet, we want to implement it on a large scale.
When Noam Chomsky was asked if deep learning was going to make a breakthrough in understanding a language, he replied that it was not possible because it was not science but refined engineering excelling at identifying patterns in a seemingly disordered set of data. Artificial intelligence is not going to tell us anything new about a language because, in itself, it is unexplainable.
This game is about our money. They manipulate us because they want to sell us their products. If someone is not interested, he or she will not buy it. I agree that most of us fall for attractive offers but why would we blame corporations for our weaknesses? Nobody forces us to create a Facebook account or to click any links or buttons.
You don’t have to click. Scrolling is enough.
You can always resign…
No, you can’t! There are almost two billion FB users, which accounts for a quarter of the global population. Even if I quit, my friends and my friends’ friends will still be there, and their presence will affect me. This is how social networking works: we are influenced indirectly. If our friends use a system, the chances are very high that we will become a part of it. Even the friends of our friends influence our actions, with an estimated strength of about 25 percent.
We are the first generation to experience that situation. Do you think we need more time to become resistant to getting manipulated?
Of course. It reminds me a bit of an arms race, but interactive manipulation is hard to resist. Today’s economic inequality is comparable to that in the times of great kings. But it’s not only about the financial imbalance. The most important thing is the imbalance of knowledge. In times of yore, despite being rich, kings and monarchs had no detailed information about their subjects. Today, we are under the surveillance of a system that knows more about us than we do.
We, too, have access to knowledge, or at least to many of its tools. It’s a double-edged sword.
I disagree. Some have access to metadata while others don’t. At some point metadata came to be perceived as a free resource which you can gather, process and sell. As simple as that. There has never been a public debate on that issue. It has never been discussed whether the data resulting from our actions are our property or not.
In major technology companies being a human is seen as a weakness. In consequence, we are trying to escape humanity and what’s tangible and deficient in its physicality
The other mechanism is even more menacing. The systems that analyze our metadata, deep neural networks, are so complex that their operation is impossible to understand even for their creators. They don’t know why the system makes one decision and not another. They have an effective tool and they use it; it’s worth it. This way they create a system whose consequences exceed their expectations but which cannot be stopped. They become too entangled in their machine. They can control it neither on the technical nor on the socio-economic level. They have to pay their debt to the investors to whom they promised revenues. The CEO of a corporation cannot take actions that would bring losses to his company; he has to optimize profit. He is caught up in what is called the “invisible hand of the market”. Recently, a Canadian economist said that during the pandemic we all had to wash our hands, but there was still one hand that wouldn’t come clean.
Today social and financial inequalities are huge, but you also have to admit that the standard of living of those at the bottom of the ladder has never been better. Never have people been so wealthy and well educated, and never have their actions been so focused on civic issues.
Is that because of technological development? Or is it against it?
Let’s take a look at TV. For many it used to be “a window on the world”.
The question is what world it showed. We know what public TV is like. And commercial TV stations will do everything to show as many commercials as they can, outdoing each other in inventing new techniques to beset us even more. They have helped speed up the development of hyperconsumerism. In his book “Empire of Things”, Frank Trentmann wrote that in 2013 Britons owned about 6 billion articles of clothing, a quarter of which had never been worn. In the United States, houses and garages are so overflowing with Americans’ possessions that they are forced to rent special storage spaces.
Maybe that’s our reaction to hundreds of years of being poor? We have never had access to so many goods so it’s not surprising that we can’t resist the temptation. Besides, the times of prosperity are over. People are getting poorer again.
It’s worth noting when the times of prosperity ended. That was in the 1970s. But the vision of everyone having a house in the suburbs and a car to get there created a class of people living on the outskirts of American cities. They soon discovered that living there was not as rosy as they had once imagined. Consumerism generated a certain need which led to social stratification.
Big companies are powerful but they don’t have absolute power. There are also state institutions that can influence their actions.
Good legislation is a solution. Unfortunately, the state is weak. Last year the United States spent about 100 billion dollars on research and development, whereas the 10 biggest global corporations alone earmarked a total of 200 billion dollars for the same purpose. In that sense, those ten corporations are twice as powerful as the most powerful country in the world. Influential lobbies effectively prevent governments from limiting this shambolic free-for-all.
We need to think of a new way to talk about the world. I put my trust in education, art and culture whose task is to develop our sensitivity to the complexity of the world
In his book “Capital and Ideology”, Thomas Piketty goes even further: it’s not only about who has the capital and power but also about who is telling the story. It’s a matter of ideology. In my opinion we have blindly believed in a unilateral vision of the world where technology, and artificial intelligence in particular, is a solution to all our problems, meaning that the faster we develop it, the better. But what does “fast” mean? It means not giving it a second thought and not waiting for appropriate legislation; it means concentrating on short-term profits to the benefit of huge capital and big corporations. I think we should slow down technology development to get more time to think about where we’re heading. At this point it’s worth referring to what Neil Postman said. Technological development will always give us something but it will also take something away.
We allow for surveillance for the sake of security.
Yes, when we are afraid, we are ready to sacrifice a lot, and the narrative that only technology allows us to function normally becomes even more dominant. That’s what happened after the September 11 attacks. All of a sudden everybody believed that they would be saved by technology. Even though it was out of line with the constitution. But who cares? The state did not have sufficient infrastructure to build a system that would allow it to surveil society and prevent further attacks. Nor did it have the human resources to create that infrastructure, as the best employees had been poached by corporations, which now have access to the best staff. However, the decision was made and it is still in force. Although the legal regulations were supposed to remain in effect only for the period when there was a risk of terrorist attacks, they have never been revoked. The state feels strong when its citizens are scared.
That’s the populist narrative which other politicians do not share. That’s what Trump is saying but Obama’s beliefs were different.
That’s true, but he didn’t revoke the regulations that allow the surveillance of citizens. He tried to strengthen the role of his administration and lure people away from the technology industry, but he failed. He wasn’t able to come up with a narrative good enough to win people over to his vision. The narrative of tech companies is much more compelling. Amazon won its best contracts during Obama’s presidency. I recently discussed this issue with one of my best student groups, a group of future business leaders. Even they put more trust in corporations than in the state. This goes to show how much we believe that someone who can develop technology must also be great at everything else, for example at education. It’s also worth mentioning that many employees of global corporations voted for Trump.
You write that technology is tearing us apart and that the social tissue is beginning to rot. But during the pandemic, technology has made it possible for us to communicate.
That’s clichéd reasoning. Modern technologies make it possible for us to connect, but they haven’t solved the problem. Technology has become a substitute and a prosthesis for normal life because we didn’t have a choice. Much like compound chocolate in the communist era; I regretted every bite of that thing. And now it’s the same: I launched Zoom to “go” to a birthday party and it felt weird. Essentially, the most important thing in our life is to be with other people. Every action that separates us from one another is bad. And technology has done us serious harm. We are deluded into believing that we don’t need other people, that we are self-sufficient as long as we have technology to support us.
Even though online meetings are a mere substitute for real ones, they’re still better than nothing, aren’t they? What would our life be like today if we didn’t have such tools? We wouldn’t be able to work, learn or even enjoy our ersatz social life.
Of course, but you need to find a golden mean, and we have put all our eggs in one basket. We design technology without thinking it through. We are short-sighted. The only thing that matters is to keep us dependent on a particular service, and the cost of that addiction is exorbitant. Notifications are the main mechanism for keeping us hooked. I turned off all notifications a long time ago. You could say I am partially offline. I reply to Messenger messages three days after they are sent to me. I am fully aware that if I hadn’t done that, I wouldn’t be able to resist the temptation. We’re not strong enough.
Soon the system will decide who is our best life partner, what we should invest in and what we should learn
Social media give us an opportunity to socialize, plan various events and educate one another.
Such tools are motivating and demotivating at the same time. We hit the like button and we think it’s enough. We tell ourselves we have done our civic duty and don’t have to do anything else. Technology changes the way we perceive reality, especially when we abuse it. Let me illustrate that with the example of GPS. I unwittingly experimented on myself and discovered that I had suddenly lost spatial awareness. Researchers have already published studies showing how our brain reacts to GPS instructions: we no longer activate the areas of the brain we used to activate when looking for characteristic elements of the space around us. The same structures are responsible for long-term memory. Available research suggests that using GPS too often may lead to memory problems and early-onset Alzheimer’s disease.
We have to come to terms with a certain evolutionary process…
That’s the thinking I’m trying to warn people against! We don’t have to come to terms with anything! Technology carries the values of its creators. Nobody bothers to ask us if we want to embrace those values. In major technology companies, being human is seen as a weakness. In consequence, we are trying to escape humanity and what is tangible and deficient in its physicality. We have embarked on a journey beyond our physicality to find a new, better substrate: our body will migrate to VR and our mind will be supported by AI, or by whatever is trendy at a given time. But do we really want to stay connected to and dependent on all those code-driven devices around us? We hear others say that it is unavoidable, that it is just another evolutionary stage. Which is not true. Our goal is not to create perfect technology that would replace us. Some believe it will be possible to transfer their mind to a silicon chip and live forever. If we continue to develop our technology recklessly, without giving it a second thought, we may indeed end up in a silicon chip. At the cost of our humanity. But it doesn’t have to be like this. It’s just a design, and we can redesign it.
And that’s the key issue: basically, we have no ideas for alternative narratives. In her Nobel speech, Olga Tokarczuk said that we felt something was wrong with the world. We need to think of a new way to talk about the world. Tokarczuk refers to a perceptive narrator that is sensitive to a complex, tangled, multidimensional reality. I put my trust in education, art and culture whose task is to develop our sensitivity to the complexity of the world and to the fact that there is no silver bullet like “technology will solve all our problems”.
Luckily, there is an increasing number of social movements that propagate a different ideology and tell an alternative story. We need such new narratives, and they are created by people who think critically, have the courage to daydream and are not afraid of intellectual effort. Unfortunately, they are hard to find, because the environment we live in, including technology, makes us more consumption-oriented and less action-oriented. It’s another set of skills we agree to give away to the technological system. It might seem to give us more free time, but in fact we have more time and less freedom. Soon the system will decide who our best life partner is, what we should invest in and what we should learn. This is why we must develop critical thinking, imagination and a sense of empowerment at all costs.
What’s the role of art here?
Alain Badiou claims that “art is a truth procedure”. Following that logic, artists should explore the complexities of this world, look for alternative ways, prove that you can think about the world in many dimensions, and join the discussion about the world and technology. During the opening ceremony of Ars Electronica 2019, the biggest electronic art festival in the world, Roberto Viola, who is responsible for EU digital policy, said that nowadays the role of artists is key, and that artists can think critically and understand the nuances and subtleties of this world better than engineers and financiers, because they see reality from a broader perspective.
In the era of Big Data, all consumers of culture are beginning to resemble each other. We are witnessing a process of proletarianization, or cultural impoverishment of the middle class
Of course, when art becomes commercial, it stops playing this role too, because it tries to please the customer and is subject to free market mechanisms. Just look at what has happened to our culture. In the era of Big Data, all consumers of culture are beginning to resemble each other. This is perfectly encapsulated in the latest book by Edwin Bendyk, in which he refers to the proletarianization of culture: “Workers underwent the same process of proletarianization; once versatile and unique craftsmen, on Fordist production lines they became replaceable, repetitive cogs in the machine of the system, also known as proletarians”. Netflix, which is consumed on a massive scale, uses just a handful of algorithms to produce movies and series with the best possible chance of success. By doing the same thing over and over again, the company is trying to cast our attitudes and preferences in the same mold. We are witnessing a process of proletarianization, a cultural impoverishment of the middle class, because the system is producing an easy-to-consume cultural mixture. It spares us any intellectual effort by offering us products tailored to our expectations. This is extremely dangerous, because it may eliminate what is most important to us: the ability to conceive an alternative narrative and to change the world.
Art alone is not going to change the system.
That’s true. It’s impossible without politics. A single citizen or even a single country will not make any difference. We cannot fight tech corporations if we’re not a part of a structure that would be at least as big as the European Union. The EU may be the last bastion of independence capable of fighting back, negotiating with big corporations and looking for a golden mean.
We need to make bolder decisions. In my opinion it is impossible to change the course of technology development without changing the socio-economic system, which should be based not only on maximizing profits but also on other values. One of the solutions might be Piketty’s participatory socialism, which could diminish inequalities by partially communizing private property and making it more dynamic. Piketty proposes, for example, that employees should hold 50 percent of the seats on supervisory boards and that the maximum share package controlled by one person should not exceed 10 percent. Technological development cannot be driven by market forces alone. Companies are set up to earn money, and to achieve that goal they develop technology. Apple made a deal with Google, but they won’t save humanity for free. Corporations can’t do things like that. If a venture proved beneficial to humanity but resulted in losses for the company, its managers would be accused of committing an offense.
Is it possible to change the system without a revolution?
That’s the point! We have stopped believing that the system can be changed. Slavoj Žižek often repeats that “it is easier to imagine the end of the world than to imagine the end of capitalism”. With this kind of thinking, we are doomed.
*Wiesław Bartkowski – interaction designer and researcher of complex systems. Co-creator and director of the post-graduate, anti-disciplinary CreativeCoding.pl program, which blurs the boundaries between art, design, science and technology. He coined the term “the matter of code” and wrote its manifesto. He is an academic at SWPS University of Social Sciences and Humanities, where he teaches artists and designers how to use the “matter of code” in their works. His visualizations and interactive installations have been exhibited at the Museum of Modern Art, the Centre for Contemporary Art, Zachęta Gallery, BOZAR Gallery, Arsenal Gallery, Artothek in Munich, Powszechny Theater and Nowy Theater.
Wiesław Bartkowski is the curator and a co-author of the exhibition “Hello! I know you – I am you”, open for viewing from 4 to 12 July at Gdynia Design Days, the biggest summer design festival in Poland, which this year is held online. During the festival Wiesław Bartkowski will deliver a lecture entitled “It’s time to join Team Human”, expanding on the issues discussed in this interview.
Recommended by Wiesław Bartkowski
I would like to recommend the books I have read recently. They provide detailed information on the topics discussed in the interview or inspired my answers.
- Bendyk, Edwin: „W Polsce, czyli wszędzie. Rzecz o upadku i przyszłości świata”;
- Brockman, John: „Possible Minds: Twenty-Five Ways of Looking at AI”;
- Carr, Nicholas: „The Glass Cage: How Our Computers Are Changing Us”;
- Chater, Nick: „The Mind Is Flat”;
- Damasio, Antonio: „The Strange Order of Things”;
- Fisher, Mark: „Capitalist Realism”;
- Ford, Martin: „Architects of Intelligence: The truth about AI from the people building it”;
- Gellman, Barton: „Dark Mirror: Edward Snowden and the American Surveillance State”;
- Giridharadas, Anand: „Winners Take All”;
- Hoffman, Donald: „The Case Against Reality: Why Evolution Hid the Truth from Our Eyes”;
- Illouz, Eva: „The End of Love”;
- Lem, Stanisław: „Golem XIV”;
- McNamee, Roger: „Zucked: Waking Up to the Facebook Catastrophe”;
- Pearl, Judea: „The Book of Why: The New Science of Cause and Effect”;
- Piketty, Thomas: „Capital and Ideology”;
- Rushkoff, Douglas: „Life Inc.: How Corporatism Conquered the World, and How We Can Take It Back”;
- Rushkoff, Douglas: „Team Human”;
- Russell, Stuart: „Human Compatible: Artificial Intelligence and the Problem of Control”;
- Snowden, Edward: „Permanent Record”;
- Trentmann, Frank: „Empire of Things: How We Became a World of Consumers, from the Fifteenth Century to the Twenty-First”;
- Zuboff, Shoshana: „The Age of Surveillance Capitalism”.
Read the Polish version of this text HERE