
Borders of Fear

The 21st Disruption Network Lab conference, “Borders of Fear,” was held on 27, 28, and 29 November 2020 at Berlin’s Kunstquartier Bethanien. Journalists, activists, advocates, and researchers shed light on abuses and human rights violations in the context of migration management policies. Keynote speeches, panel discussions, and several workshops involved a total of 18 speakers and brought together hundreds of online viewers.

Drawing on insights from the humanities and science and technology studies, participants analysed from different perspectives the spread of new forms of persecution and border control targeting people on the move and those seeking refuge. They reflected collectively on forms of social justice and discussed the politics of fear that crystallise the stigmatisation of migrants. By concretely addressing these issues, the conference also investigated the deployment of technology and the role of the media in consolidating a well-defined structure of power, and outlined the reasons behind the rise of borders and walls, factors that lead to cultural and physical violence, persecution, and human rights violations.

In her introductory statement, Dr. Tatiana Bazzichelli, founder and director of the Disruption Network Lab, presented the programme of a conference meant to address borders both in their material function and in their defining role within a strategy of dehumanisation and racialisation of individuals. Across the globe, an unprecedented number of people are on the move due to war and economic and environmental factors. Bazzichelli recalled the urgent need to discuss human rights topics such as segregation and pushbacks, refugee camps and the militarisation of frontiers, considering the new technological arsenals available to states, while investigating how border policing and the datafication of society are affecting the narrative around migrants and refugees in Europe and in the West.

The four key takeaways of the first day were: the determination of states to prevent people from reaching countries where they can apply for refugee status or a visa; the externalisation policies of border control; the illegal practice of pushbacks; and systematic human rights violations by authorities, extensively documented but difficult to prove in court.

Lieke Ploeger, community director, and Dr. Tatiana Bazzichelli, founder and director of the Disruption Network Lab, opening the 21st conference “Borders of Fear”

The conference opened with Josh Begley’s short film “Best of Luck with the Wall” – a voyage across the US-Mexico border stitched together from 200,000 satellite images – and a talk by lawyer Renata Avila, who gave an overview of the physical and socio-economic barriers that people migrating encounter while crossing South and Central America.

Avila took the current crises in South and Central America as her starting point to describe the dramatic migration through perilous regions, the result of an accumulation of factors such as inequality, corruption, organised crime, and violence. She pointed out that oligarchic systems in different countries appear to be interconnected in complicated architectures of international tax evasion, ruthless exploitation of resources, oppression, and the use of force. In those same places, people experience the most brutal inequality, poverty, and social exclusion.

Since the 1990s, regional free trade agreements have meant open borders for products but not for people: an international policy with devastating effects on local economies and agriculture. People on the move in search of a better future elsewhere found closed borders and security forces attempting to block them from heading north towards Mexico and the U.S. border. In recent years, the forceful police response to migrants crossing borders has been widely praised by governments in the region.

Travelling alone is a red flag, especially for women, and the first wall people meet is in their own country. People on their way north experience every kind of injustice. Latin America has often been regarded as a region with deep ethnic and class conflicts. Setting aside stereotypical representations, we see that the bodies of people on the move are largely sexualised and racialised for political reasons. Race, therefore, is another factor to consider, especially when we look at the journey of individuals on the move. Language can also be a barrier for those who travel without translators across a continent with dozens of indigenous languages.

Avila concluded her talk by mentioning the issue of digital colonialism and the relevance of geospatial data. The digital is no longer a separate space, she warned, but a hybrid one, relevant to all individuals and whose rules are dictated by a small minority. People and places can be erased by those very few companies that collect data and can, for example, draw and delete borders.

Renata Avila during her intervention

The first day’s panel, “Migration, Failing Policies & Human Rights Violations,” was moderated by Roberto Perez-Rocha, director of the International Anti-Corruption Conference series at Transparency International, and featured Philipp Schönberger and Franziska Schmidt, coordinators of the Refugee Law Clinic Berlin, together with investigative journalist and photographer Sally Hayden. The panellists drew on their direct experience and work, reflecting on how EU migration policy in practice enforces measures that cause human rights violations, suffering, and desperation.

The Refugee Law Clinic Berlin e.V. is a student association at the Humboldt University of Berlin, providing free and independent legal help for refugees and people on the move in Berlin and on the Greek island of Samos. The organisation also offers training on asylum and residence law in Berlin and runs the website ihaverights.eu, designed to give marginalised communities access to justice.

Through a legal counselling project on the Greek island of Samos, the collective helps people affected by illegal European practices at the Union’s borders, addressing the urgent need for effective ways to guarantee them access to justice and protection. As Schönberger and Schmidt recalled, refugees and people on the move within the EU face several obstacles when it comes to the enforcement of their rights. For this reason, the law guarantees them procedural counselling to secure, among other things, a fair asylum procedure. However, the Clinic confirmed that such guarantees are not being fulfilled in Germany, nor at Europe’s borders on Samos, Lesvos, Leros, Kos, or Chios.

The panellists explained how their presence on the island allows them to document that these camps of human suffering are in fact a structural part of an EU migration policy aimed at deterring people from entering the Union, the result of deliberate political decisions taken in Brussels and Berlin. Human rights violations occur even before people arrive on the island, as they are intercepted by the Greek coastguard or by the European Border and Coast Guard Agency (Frontex) and pushed back to the Turkish border.

The camp on Samos, with a maximum capacity of 648 people, is instead home to 4,300 residents, with no water, no sanitary services, poisonous food, and no medical services. Even very serious cases documented to local health authorities remain unattended. The list of violations is endless, and the complete lack of adequate protection for unaccompanied minors is another major issue in this, as in all other Greek hotspots, together with the precarity of vulnerable groups, whose risks increase with race, gender, and sexual identity.

The legal team from Berlin prepares applications to the European Court of Human Rights and in recent years has filed some 60 requests for urgent procedures due to the risk of irreparable harm. These were granted, ordering adequate accommodation and medical treatment for people in extreme danger.

Roberto Perez-Rocha (left), Sally Hayden, Philipp Schönberger and Franziska Schmidt during the panel on the first day “Migration, Failing Policies & Human Rights Violations”

Many observers argue that the suffering in the Aegean and on the Greek islands is the result of deliberate political decisions. Agreed in March 2016, the EU-Turkey deal is a statement of cooperation that seeks to control the crossing of people from Turkey to the Greek islands. According to the deal, every person arriving at the Greek islands by boat without documents or official permission would be returned to Turkey; in exchange, EU Member States would take in Syrian refugees from Turkey. NGOs and international human rights agencies denounce that Turkey, Greece, and the EU have completely failed on humanitarian grounds, and dispute the false premise that Turkey is a safe country for refugees and asylum seekers.

Journalist and photographer Sally Hayden looked at the EU-Turkey deal, defining it as a prototype for what would later happen in the central Mediterranean. Libya, a country at war with multiple competing governments, is the destination of people on the move and refugees from all over Africa hoping to cross the Mediterranean Sea. As it is illegal to stop and push people back, for years the EU has financed the Libyan coastguard to intercept and pull them back. What follows is a period of arbitrary and endless detention.

Hayden writes about facts; she is not an activist. When she talks about Libya, she refers to objective events she can fact-check and individual stories she has personally collected. Her reports depict a country at war, unsafe not just for refugees and people on the move but for Libyans too. With her work, the journalist has extensively documented how refugees and migrants smuggled into Libya are subject to human trafficking, forced labour, sexual exploitation, and torture, trapped in an infernal cycle of violence and death. She recalled her experience with the detainees in Abu Salim, where 500 people suffer illegal and brutal incarceration inside a centre affiliated with the government in Tripoli. In July 2019, during the conflict, one of these many prisons not far from the city was bombed; at least 53 illegally detained people were killed.

Hayden’s work paints a picture of the results of Europe’s management of the migration crisis in the Mediterranean and Northern Africa. EU funds are employed for the militarisation of borders and the externalisation of frontier control. This occurs in a political context shaped by years, sometimes decades, of underinvestment in reception and asylum systems in line with EU states’ obligations to respect, protect, and fulfil human rights.

All panellists called for immediate intervention to evacuate the camps and prisons that are the result of EU migration policies, and for migrant victims of abuse and refugees to be allowed to seek justice and safety elsewhere.

Sally Hayden during the panel “Migration, Failing Policies & Human Rights Violations”

The evening closed with the panel discussion “Illegal Pushbacks and Border Violence,” moderated by Likhita Banerji, a human rights and technology researcher at Amnesty International. Banerji reminded the audience that in the first nine months of 2020, some 40 pushbacks, illegal rejections of entry and expulsions without individual assessment of protection needs, had already been documented within Europe or at its external borders. Since these illegitimate practices are widespread, and in some countries systematic, pushbacks cannot be dismissed as isolated incidents. They appear, instead, to be institutionalised violations, well embedded within national policies.

People who should receive asylum or be rescued are instead pushed back by police forces, who ensure that the actual crossing of the border remains undocumented. EU Member States want to keep undocumented migrants, asylum seekers, and refugees outside of their jurisdiction to avoid moral responsibilities and legal obligations. During the second panel of the day, Hanaa Hakiki, legal advisor at the ECCHR Migration Program, filmmaker and reporter Nicole Vögele, and Dimitra Andritsou, researcher at Forensic Architecture, examined these violations in depth and from different angles.

In her intervention “Bringing pushbacks to justice,” Hanaa Hakiki presented the difficulties litigators face in materially documenting pushbacks in court, practices which are, by design, not meant to be provable. She defined pushbacks as a set of state measures by which refugees and migrants are forced back over a border, generally shortly after having crossed it, without consideration of their individual circumstances and without any possibility to apply for asylum or to put forward arguments against the measures taken.

National and international laws must be considered in these cases, constituting binding legal obligations for all EU Member States. As a general principle, governments cannot use disproportionate force, humiliating or degrading treatment, or torture, and must facilitate access to asylum, guarantee protection to people, and provide them access to individualised procedures.

Member States have known that pushbacks are illegal since a 2012 ECtHR judgment, known as the Hirsi Jamaa case, which found that Italy had violated the law by forcing people back to Libya. However, the effective ban on direct returns led European countries to find other ways to avoid responsibility for those at sea or crossing their borders without documents: they concluded agreements with neighbouring countries, which are paid to prevent migrants from leaving their territories, by any means and without any human rights safeguards in place. By outsourcing rescue to the Libyan authorities, for example, pushbacks by EU countries turned into pullbacks by the Libyan coastguard.

Land pushbacks remain common practice. Hakiki explained that the European Centre for Constitutional and Human Rights (ECCHR) has worked with communities of undocumented migrants since 2014, considering potential legal interventions against the practice of pushbacks at EU borders and assisting affected persons with individual legal proceedings. She presented three cases the ECCHR litigated in court (N.D. and N.T. v. Spain; A.A. v. North Macedonia; S.B. v. Croatia), proving that European countries illegally push people back in violation of human rights law. Although this remains a common practice, it is very difficult to document these violations and have the authorities condemned.

Likhita Banerji (left), Hanaa Hakiki, Nicole Vögele and Dimitra Andritsou during the panel “Illegal Pushbacks and Border Violence”

At the height of the Syrian refugee crisis in 2015, refugees were able to travel via Serbia and Hungary into Central and Northern Europe. A couple of years later the EU decided to close this so-called Balkan Route, with the result that more and more people found themselves stuck in Bosnia-Herzegovina, prevented from continuing onward into EU territory. From there a person may try to enter the European Union dozens of times, and each time be stopped by Croatian security forces, beaten, and dragged back across the border to Bosnia-Herzegovina.

After seeing the effects of these illegal practices and meeting victims of dozens of violent pushbacks in Sarajevo, in 2019 the reporter Nicole Vögele and her crew succeeded in filming a series of these cross-border expulsions from Croatia to Bosnia-Herzegovina near the village of Gradina, in the municipality of Velika Kladuša. The reporter, one of the few to succeed in documenting this practice, also interviewed people who had just been pushed back by Interventna Policija officers. The Croatian authorities responded to her reportage with a complete denial of all accusations.

Vögele then presented footage taken at the EU external border in Croatia in March 2020, showing masked men beating up refugees and illegally pushing them back to Bosnia. The journalist and her team found the original video, analysed its metadata, and interviewed the man captured in it. Once again, their work proved that these practices are not isolated incidents.

The panel closed with an investigation by Forensic Architecture, part of a broader project on pushbacks across the Evros/Meriç river. Team member Dimitra Andritsou presented the organisation, founded to investigate human rights violations using a range of techniques that complement classical investigative methods, including open-source video analysis, spatial and architectural practice, and digital modelling.

Forensic Architecture works with and on behalf of communities affected by state and military violence, producing evidence for legal forums, human rights organisations, investigative reporters, and media. A multidisciplinary research group of architects, scholars, artists, software developers, investigative journalists, archaeologists, lawyers, and scientists based at Goldsmiths, University of London, it works in partnership with international prosecutors, human rights organisations, and political and environmental justice groups, and develops new evidentiary techniques to investigate human rights violations, armed conflicts, and environmental destruction around the world.

The Evros/Meriç river is the only stretch of the border between Greece and Turkey that is not sea. For years, migrants and refugees trying to cross it to enter Europe have reported that unidentified and usually masked men catch, detain, beat, and push people back to Turkey. Mobile phones, documents, and the few personal belongings they travel with are confiscated or thrown into the river, so as to leave no evidence of these violations behind. As Andritsou described, both Greek and EU authorities systematically deny any wrongdoing, refusing to investigate these reports. The river is part of a wider ecosystem of border defence and has been weaponised to deter, and to let die, those who attempt to cross it.

Nicole Vögele during the panel “Illegal Pushbacks and Border Violence”

In December 2019, the German magazine Der Spiegel obtained rare videos filmed on a Turkish border guard’s mobile phone and on a surveillance camera installed on the Turkish bank of the river, which apparently documented one of these many pushback operations. Forensic Architecture was commissioned to analyse the footage. A team of experts was able to geolocate and timestamp the material and confirm that the images were taken a few hundred metres from a Greek military watchtower, inside Greece.

Andritsou then presented the case of a group of three Turkish political asylum seekers who entered Greek territory on 4 May 2019, likewise by crossing the Evros/Meriç river. In this case, Forensic Architecture was able to cross-reference different evidence sources, such as new media, remote sensing, material analysis, and witness testimony, to verify the group’s entry and their illegal detention in Greece. A pushback to Turkey on 5 May 2019 led to their arrest and imprisonment by the Turkish authorities.

Ayşe Erdoğan, Kamil Yildirim, and Talip Niksar had been persecuted by the Turkish government over allegations of involvement in Fethullah Gülen’s movement. On the run, the group had shared a video appealing for international protection against a possible forced return to Turkey and digitally recorded the journey via WhatsApp. All their text messages, with location pins, photographs, videos, and metadata, prove their presence on Greek soil prior to their arrest by the Turkish authorities. The investigation also verified that the three had been in a Greek police station, a fact that matches their statement about having repeatedly attempted to apply for asylum there. Their imprisonment is a direct result of the Greek authorities contravening the principle of non-refoulement.
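Verification of this kind rests on small, reproducible steps. As a purely illustrative sketch (not Forensic Architecture’s actual tooling, and with invented coordinates), one such step is converting the GPS position embedded in a photograph’s EXIF metadata from degrees/minutes/seconds into the decimal form used to plot it on a map:

```python
# Hypothetical sketch of one step in metadata-based geolocation.
# EXIF GPS tags store latitude and longitude as degrees, minutes,
# seconds plus a hemisphere reference; investigators convert these
# to decimal coordinates. All values below are invented.

def dms_to_decimal(degrees: float, minutes: float, seconds: float, ref: str) -> float:
    """Convert a D/M/S coordinate and hemisphere ('N','S','E','W') to decimal."""
    value = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative by convention.
    return -value if ref in ("S", "W") else value

# An invented point in the general Evros border region:
lat = dms_to_decimal(41, 19, 48.0, "N")
lon = dms_to_decimal(26, 31, 12.0, "E")
print(f"{lat:.4f}, {lon:.4f}")  # 41.3300, 26.5200
```

Cross-checking such a point against satellite imagery, witness testimony, and timestamps is what allows a single photo or message pin to corroborate a person’s presence at a given place and time.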

Certain keywords resonated throughout the first day of the conference, a common thread connecting the speakers, the panel debates, and the public’s commentaries. Violence, arbitrariness, and lawlessness are wilfully ignored, if not backed, by EU Member States, with authorities constantly trying to hide the truth. Thousands of people live under segregation, with no record or trace of their being in the custody of authorities who are free to do with them whatever they want.

Likhita Banerji (left), Hanaa Hakiki, Nicole Vögele and Dimitra Andritsou during the panel “Illegal Pushbacks and Border Violence”

Technology has always been part of border and immigration enforcement. However, over the last few years, in response to increased migration into the European Union, governments and international organisations involved in migration management have been deploying controversial new tools based on artificial intelligence and algorithms, conducting de facto technological experiments and research involving human subjects: refugees and people on the move. The second day of the conference opened with a video contribution by Petra Molnar, lawyer and researcher at European Digital Rights and author of the recent report “Technological Testing Grounds” (2020), based on over 40 conversations with refugees and people on the move.

In AI development, the questions asked, the answers given, and the predictions made reflect, consciously or unconsciously, the political and socioeconomic point of view of the technology’s creators. As discussed in the Disruption Network Lab conference “AI Traps: Automating Discrimination” (2019), risk analyses and predictive policing data are often corrupted by racist prejudice, leading to biased data collection that reinforces the privileges of groups that already enjoy greater political protection. As a result, new technologies merely replicate old divisions and conflicts. By deploying technologies like facial recognition, for instance, we replicate deeply ingrained behaviours based on race and gender stereotypes, mediated by algorithms. Bias in AI is a systemic issue, compounded by devices with obscure inner workings and the black box of deep learning algorithms.
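The feedback loop the speakers describe can be made concrete with a deliberately tiny, fabricated example: a “model” that scores new applicants by their group’s historical approval rate simply reproduces whatever disparity the historical decisions contained (the groups and numbers below are invented for illustration).

```python
# Toy illustration (fabricated data): a naive scoring rule trained on
# biased historical decisions inherits and perpetuates the bias.

historical_decisions = [
    # (group, was_approved) -- invented records with a built-in disparity
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", False), ("B", False), ("B", True), ("B", False),
]

def approval_score(group: str) -> float:
    """Score a new applicant by their group's past approval rate."""
    outcomes = [ok for g, ok in historical_decisions if g == group]
    return sum(outcomes) / len(outcomes)

# The "model" hands group A three times the approval odds of group B,
# not because of anything about the applicants, but because the past did.
print(approval_score("A"))  # 0.75
print(approval_score("B"))  # 0.25
```

Real systems are far more elaborate, but the mechanism is the same: when the training data encodes discrimination, the output dresses that discrimination up as a neutral prediction.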

The list of harmful tech employed at EU borders is long, ranging from Big Data predictions about population movements and mobile phone tracking, to automated decision-making in immigration applications, AI lie detectors and risk scoring at European borders, and now bio-surveillance and thermal cameras to contain the spread of COVID-19. Molnar focused on the risks and violations stemming from such experimentation on fragile individuals who lack legal guarantees and protection. She criticised the absence of adequate governance mechanisms, with no account taken of the very real impacts on people’s rights and lives. The researcher highlighted the need to recognise how uses of migration management technology perpetuate harm, exacerbate systemic discrimination, and render certain communities technological testing grounds.

Once again, human bodies are commodified to extract data; thousands of individuals become part of tech experiments without consideration of the complexity of the human rights ramifications, trivialising their material impact on human lives. This use of technology to manage and control migration is subject to almost no public scrutiny, since the experimentation occurs in militarised spaces that are impossible to access, with weak oversight, and is often driven by the private sector. Secrecy is often justified by emergency legislation, but the lack of clear and transparent regulation of the technologies deployed in migration management appears to be deliberate, allowing for the creation of opaque zones of tech experimentation.

Molnar underlined how such a high level of uncertainty concerning fundamental rights and constitutional guarantees would be unacceptable for EU citizens, who would have ways to oppose such liberticidal measures. Migrants and refugees, however, have notoriously little access to mechanisms of redress and oversight, particularly during their migration journeys. It may seem a secondary point, but emergency legislation is also used to justify setting aside laws that protect privacy and data, such as the GDPR.

Petra Molnar, lawyer, researcher and author of the report “Technological Testing Grounds” (2020)

The following part of the conference focused on the journey through Sub-Saharan and Northern Africa, and on the difficulties and risks that migrants face while trying to reach Europe. In the conversation “The Journey of Refugees from Africa to Europe,” Yoseph Zemichael Afeworki, an Eritrean student based in Luxembourg, talked about his experience with Ambre Schulz, project manager at Passerell, and reporter Sally Hayden. Afeworki recalled his dramatic journey and explained what happens to those like him who cross militarised borders and the desert. Migrants, he described, represent a very lucrative business, not just because they pay to cross the desert and the sea, but also because they are used as cheap labour, when not captured outright for ransom.

Once on the Libyan coast, people hoping to reach Europe find themselves trapped in a cycle of waiting, attempted Mediterranean crossings, pullbacks, and detention. Libya is a country at war, with two rival governments. The lack of official records and the general instability make it difficult to establish the number of people on the move and refugees detained without trial for an indefinite period. Libyan law punishes illegal migration to and from its territory with prison, without any account of an individual’s potential protection needs. Once someone is imprisoned in a Libyan detention centre for undocumented migrants, even common diseases can quickly lead to death. Detainees are used as forced labour for rich families, tortured, and sexually exploited. Recordings of inhuman violence are sent to the victims’ families, who are asked to pay a ransom.

As Hayden and Afeworki described, the conditions in the buildings where migrants are held are atrocious. In some, hundreds of people live in darkness, unable to move or eat properly for months. It is impossible to estimate how many do not survive. An estimated 3,000 people are currently detained in such centres. Their only hope is immediate evacuation and guaranteed humanitarian corridors to Europe from Libya, whose authorities are responsible for illegal and arbitrary detention, torture, and other crimes.

Yoseph Zemichael Afeworki and Ambre Schulz (video), Dr. Tatiana Bazzichelli (left) and Sally Hayden

The second day closed with the panel “Politics & Technologies of Fear,” moderated by Walid El-Houri, researcher, journalist, and filmmaker, and featuring Gaia Giuliani of the University of Coimbra, Claudia Aradau, professor of International Politics at King’s College London, and Joana Varon, founder of Coding Rights and Tech and Human Rights Fellow at the Harvard Carr Center.

Gaia Giuliani is a scholar, anti-racist, and feminist whose intersectional work deconstructs public discourses on the iconographies of whiteness and race, questioning in particular the white narrative imaginary behind security and border militarisation. In her latest book, “Monsters, Catastrophes and the Anthropocene: A Postcolonial Critique” (2020), Giuliani investigates Western visual culture and imaginaries, deconstructing the concepts of “risk” and “otherness” within the hegemonic mediascape.

Giuliani began her analysis by focusing on the Mediterranean as a border turned into a biopolitical dispositive that defines people’s mobility, and particularly people’s mobility towards Europe, as a risky activity. This definition draws on the panoply of images of gendered monstrosity proper to the European racist imaginary in order to reinforce the European and Western “we”, a form of othering that produces fear through mediatised chronicles of monstrosity and catastrophe.

Giuliani sees the distorted narrative of racialised and gendered bodies on the move to Europe as essential to identifying today’s migrations with the image of a catastrophic horde of monsters coming to plunder the wealthy and peaceful North. It is a mechanism of “othering” through language and images, which dehumanises migrants and refugees in a process of mystification and monstrification, sustaining the picture of Europe as an innocent community under siege. Countries of origin are still described, even in postcolonial times, as the lands of barbarians, and people on the move are portrayed as capable of unleashing chaos in Europe, as if Europe were an imaginary, self-reflexive space of whiteness, as it was conceived in colonial times: the bastion of righteousness and progress.

As Giuliani explained, in this imaginary, migrants and refugees are represented as an ultimate threat of monsters and apocalypse, meant to undermine the identity of a whole continent. Millions of lives from the South become an indistinct mass of people. Figures of race sedimented across centuries, stemming from colonial cultural archives, motivate the need to preserve a position of superiority and defend the political, social, economic, and cultural privileges of white bodies, whilst inflicting ferocity on all others.

This mediatised narrative of monsters and apocalypse generates white anxiety, because that mass of racialised people is reclaiming the right to escape, to search for a better life, and to make autonomous choices to flee the objective causes of unhappiness, suffering, and despair; because that mass of individuals strives to become part of the “we”. Mainstream media treat as illegitimate both their right to escape and their free initiative in crossing borders, not just material ones but also the semiotic border that segregates them in the dimension of “the barbarians”: an unchained, autonomous initiative, deemed unacceptable, that erases the barrier between the colonial then and the postcolonial now, unveiling the coloniality of a present that represents migration flows as a crisis, although the only crisis under way is Europe’s own.

On the other side, this same narrative often reduces people on the move and refugees to vulnerable, fragile individuals living in misery, preparing the terrain for their further exploitation as a labour force and reproducing, once again, racialized power relations. Here the process of “othering” revives the colonial picture of the poor dead child, functional to engendering an idea of pity that has nothing to do with individual dignity. Either you exist as a poor individual in misery, whom white society mercifully rescues, or as part of a mass of criminals and rapists. Yet these distinct visual representations belong to the same distorted narrative, as epitomized in the cartoons published by Charlie Hebdo after the sexual assaults against women in Cologne on New Year's Eve 2015.

Gaia Giuliani (video), Walid El-Houri and Joana Varon during the panel “Politics & Technologies of Fear”

Borders have been rendered a testing ground for high-risk experimental technologies, and refugees themselves have become the test subjects of these experiments. Governments and non-state actors are developing and deploying emerging digital technologies in ways that are uniquely experimental, dangerous and discriminatory in the context of border and immigration enforcement. Drawing on the history of scientific experiments on racialized and gendered bodies, Claudia Aradau invited the audience to reconsider the metaphorical language of experiments, which orients us towards picturing high-tech, high-risk technological development. She instead also includes obsolete tools deployed to penalise individuals and recreate the asymmetries of a digital divide that mirrors the injustice of the neoliberal system.

Aradau studies technologies and the power asymmetries in their deployment and development. She explained that borders have been used as highly profitable laboratories for the surveillance industry and for techniques that are later deployed widely in the Global North. From borders and prison systems, in which they first appeared, these technologies are becoming common in urban spaces modelled around the traps of surveillance capitalism. The fact that they slowly enter our vocabularies and daily lives makes their impact difficult to define. When we consider, for example, that inmates' and migrants' DNA is collected by a government, we soon realise that we are entering a more complex level of surveillance of our bodies, one that shows tangibly how privacy is a collective matter: a single DNA sequence can be used to track a multitude of individuals from the same family line.

Whilst we see hyper-advanced tech on one side, on the other people on the move cross borders with nothing, sometimes not even shoes, their personal belongings in plastic bags and just a smartphone to orient themselves, communicate and ask for help. This asymmetry is, once again, deployed to maintain what Aradau defined as a matrix of domination: no surveillance of CO2 emissions and the environmental toll of industrial activity, no surveillance of the exploitation of resources and human lives, no surveillance of weapons production, but a massive deployment of high-tech to target people on the move, crossing borders to reach and enter a fortress that is not meant for them.

Aradau recalled that, in theory, protocols, ethics and demands for objectivity are necessary when it comes to scientific experiments. However, the introduction into official procedures of digital devices and software such as Skype, WhatsApp or MasterCard, or of apps developed by state and non-state actors, required neither laboratories nor the randomized controlled trials that we usually associate with scientific experimentation. These heterogeneous techniques, intended to work everywhere and enforced without protocols, need to be understood under neoliberalism: they rely on pilot projects, trials and cycles of funds and donors, whose goal is always to move to the next step, to finance more experiments. Human-rights-centred tech is far away.

Thus we see ever more experiments carried out without protocols, from floating walls designed to stop migrants reaching the Greek shores to debit cards used as surveillance devices. Creative experiments also come with so-called refugee integration, conceived as small-scale injections of devices into refugees' reality for limited periods, with the purpose of speculatively recomposing rotten asymmetries of power and injustice. In Greece, as Aradau mentioned, the introduction of Skype into the asylum application process became an obstacle, with applicants continually experiencing debilitation through obsolete technology that does not work, or devices with limited access, and disorientation through contradictory and outdated information.

There is also a factual aspect: old and slow computers, documents that have not been updated or have been updated at different times, and a lack of personnel are justified by saying that resources are limited. A complete lack or shortage of funds, as in Greece, is another typical condition of neoliberalism. In this way, tech recomposes relations of precarity in a different guise.

Aradau concluded her contribution by focusing on the technologies deployed by NGOs, completely or partially produced elsewhere, often by corporate actors who remain entangled in the experiments through their expertise and ownership. Digital platforms such as Facebook, Microsoft, Amazon, or Google not only shape relations between online users, she warned, but also those involving people on the move and refugees. Google and Facebook, for example, dominate the relations that underpin the production of refugee apps by humanitarian actors.

Google sits at the centre of a kind of digital humanitarian ecosystem, not only because it can host searches or provide maps for the apps, but also because it simultaneously intercepts data flows, acting as a surplus data extractor. Social networks, in addition, reshape digital humanitarianism through data-extractive relations and provide a large part of its infrastructure. Online humanitarianism thus becomes a particularly vulnerable site of data gathering, characterised by an overall lack of resources, much like the Greek state. As a result, humanitarian actors cannot tackle the depreciation, messiness and obsolescence of their tech and apps.

Joana Varon and Walid El-Houri during the panel “Politics & Technologies of Fear”

The last day of the conference concentrated on the urgent need to create safe passages for migration, and pictured the efforts of those who try to ensure safer migration options and rescue migrants in distress during their journey. Lieke Ploeger, community director of the Disruption Network Lab, presented the panel discussion “Creating Safe Passages”, moderated by Michael Ruf, writer and director of documentaries and theatre plays. Ruf's productions include the “Asylum Dialogues” (2011) and the “Mediterranean Migration Monologues” (2019), which have been performed more than 800 times in numerous countries by a network of several hundred actors and musicians. This final session brought together speakers from the Migrant Media Network (MMN), the Migrant Offshore Aid Station (MOAS), and SeaWatch e.V. to discuss their efforts to ensure safer migration options, share reliable information and create awareness around migration issues.

The talk was opened by Thomas Kalunge, project director of the Migrant Media Network, one of r0g_agency's projects together with #defyhatenow. Since 2017 the organisation has run information campaigns aimed at people in rural areas of Africa, explaining the alternatives available for safer, informed decisions when they choose to reach other countries, and what they may encounter if they decide to migrate.

The MMN team organises roundtable discussions and talks on various topics affecting the communities they meet. They build a space in which young people take time to understand what migration is today and to listen to those who have already personally experienced the worst and often least discussed consequences of the journey. To reach prospective migrants, the MMN developed an information campaign on the ethical use of social media, which also helps people learn how to evaluate and consume information shared online and to recognise reliable sources.

The MMN works for open culture and critical social transformation, and provides young Africans with reliable information and training on migration issues, including digital rights. The organisation also promotes youth entrepreneurship “at home” as a way to build economic and social resilience, encouraging young people to create their own opportunities and work within their communities of origin. They engage in conversations on the dangers of irregular migration, discussing rumours and lies together, so that individuals can make informed choices. One very relevant thing people tend to underestimate, warned Kalunge, is that misinformation is sometimes spread directly by human smugglers.

The MMN also equips people in remote regions with offline tools that work without an internet connection, and trains advisors and facilitators who are then connected in a network. The HyracBox, for example, is a mobile, portable, Raspberry Pi-powered offline mini-server for these facilitators to use in remote or offline environments where access to both power and the Internet is challenging. With it, multiple users can access key MMN info materials.
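The idea behind such a device can be sketched in a few lines: a small computer serves a local directory of materials over a hotspot, so nearby phones can browse them without any internet connection. The code below is a minimal, hypothetical illustration of that pattern using Python's standard library; the actual HyracBox software, file names, and setup are not documented here and will differ.

```python
# Minimal sketch of an offline content server in the spirit of the
# HyracBox: a single-board computer serving info materials over a
# local network, no internet required. All paths are hypothetical.
import functools
import http.server
import os
import socketserver
import tempfile
import threading
import urllib.request

# A stand-in directory of materials the facilitator carries along.
content_dir = tempfile.mkdtemp()
with open(os.path.join(content_dir, "index.html"), "w") as f:
    f.write("<h1>Offline materials - example landing page</h1>")

# Serve that directory; port 0 lets the OS pick a free port.
handler = functools.partial(
    http.server.SimpleHTTPRequestHandler, directory=content_dir)
with socketserver.TCPServer(("127.0.0.1", 0), handler) as httpd:
    port = httpd.server_address[1]
    threading.Thread(target=httpd.serve_forever, daemon=True).start()
    # Any device on the local network could now fetch the page.
    url = f"http://127.0.0.1:{port}/index.html"
    page = urllib.request.urlopen(url).read().decode()
    httpd.shutdown()
```

On a real device the server would bind to the hotspot interface instead of the loopback address, and the content directory would hold the curated MMN materials.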

An important aspect to mention is that the MMN does not try to tell people not to migrate. European governments have outsourced borders and migration management, supporting measures to limit people's mobility in North and Sub-Saharan Africa, and it is important to let people know that there are real dangers, visible and invisible barriers, they will meet on their way.

Visa application processes are very strict for some countries, often without any information being shared, even with people who are legitimately moving for education, work or medical treatment. The ruling class that makes up the administrative bureaucracy and outlines its structures knows that whoever controls time has power. Whoever can steal time from others, obliging them to waste it on legal quibbles and matters of protocol, can suffocate their existence in a mountain of paperwork.

Human smugglers then become the last resort. Kalunge also explained that the increased outsourcing of European border security to the Sahel and other northern African countries is currently diverting routes, making journeys more dangerous and fuelling people trafficking and human rights violations.

An image from the panel discussion “Creating Safe Passages” with Thomas Kalunge

Closing the conference, Regina Catrambone presented the work that MOAS does around the world, along with its campaign for safe and legal routes, which urges governments and international organisations to implement the regular pathways of migration that already exist. Mattea Weihe presented the work of SeaWatch e.V., an organisation that also advocates for civil sea rescue, legal escape routes and safe passage, and that is at sea to rescue migrants in distress.

The two panellists described the situation in the Central Mediterranean. Since the beginning of the year, over 500 migrants have drowned in the Mediterranean Sea (November 2020). While European countries continue to delegate their migration policy to the Libyan Coast Guard, rescue vessels from several civilian organisations have been blocked for weeks, witnessing the continuous massacre taking place just a few miles from European shores. With no common European rescue intervention at sea, the presence of NGO vessels is essential to save humans and rescue hundreds of people who undertake this dangerous journey to flee from war and poverty.

However, several EU governments and conservative and far-right political parties criminalise search and rescue activities, claiming that helping migrants at sea amounts to encouraging illegal immigration. This distorted representation, legitimised, fuelled and weaponised in politics and across European society, has led to a terrible humanitarian crisis on Europe's shores. Organisations dedicated to rescuing people on the move in the Mediterranean Sea thus see all safe havens systematically shut off to them. Despite having hundreds of rescued individuals on board, rescue ships wait days and weeks to be assigned a harbour. Uncertainty and fear of being taken back to Libya torment many of the people on board even after they have been rescued. Once they enter port, the vessels are confiscated and cannot get back out to sea.

In doing so, Europe and EU member states violate human rights, maritime law, and their own democratic constitutions. The panel reopened the crucial questions of humanitarian corridors, human-rights-based policy, and relocations. In recent years, the transfer of border controls to foreign countries has become the main instrument through which the EU seeks to stop migratory flows towards Europe. This externalisation deploys modern tech, money and the training of police authorities in third countries, moving the EU border far beyond the Union's shores, despite the abuses, suffering and human-rights violations it entails, and willingly ignoring that the majority of the 35 countries the EU prioritises for border externalisation are authoritarian, known for human rights abuses and with poor human development indicators (Expanding the Fortress, 2018).

It cannot be the task of private organisations and volunteers to make up for the failures of the state. But without them, no one would do it.

States are seeking to place people on the move, refugees, and undocumented migrants beyond the duties and responsibilities enshrined in law. Most of the violations, and the harmful technological experimentation targeting migrants and refugees described throughout the conference, occur outside their sovereign responsibility. Given that much of this technological development happens inside so-called “black boxes,” these state actors also exclude the public from fully understanding how the technology operates.

The fact that people on the move on the Greek islands, on the Balkan route, in Libya, and those rescued in the Mediterranean have been sorely tested by their journeys, their living conditions and, in many cases, imprisonment, seems to be irrelevant. The EU deploys policies that make people who have already endured violence, abuse, and incredibly long journeys in search of a better life wait endlessly for a safe port, a visa, or medical treatment.

Dr. Tatiana Bazzichelli, founder and director of the Disruption Network Lab, and Lieke Ploeger, community director of the Disruption Network Lab, closing the 21st conference “Borders of Fear”

All participants who joined the conference expressed the urgent need for action: Europe cannot continue to turn its gaze away from a humanitarian emergency that never ends and that needs formalised rescue operations at sea, open corridors, and designated authorities enacting an approach based on defending human rights. Sea rescue organisations and civil society collectives work to save lives, raise awareness and demand a new human rights-based migration and refugee policy; they shall not be impeded but supported.

The conference “Borders of Fear” brought together experts, investigative journalists, scholars, and activists to discuss wrongdoings in the context of migration and to share effective strategies and community-based approaches for raising awareness of human-rights violations by governments. Here, bottom-up approaches and methods that include local communities in the development of solutions appear fundamental. Projects that enable migrants, collectives, and groups marginalised by asymmetries of power to share knowledge and to develop and use tools against systematic inequalities, injustices, and exploitation must be strengthened. It is imperative to defeat the distorted narrative that criminalises people on the move, international solidarity and sea rescue operations.

Racism, bigotry and panic are reflected in media coverage of undocumented migrants and refugees all over the world, and play an important role in the success of contemporary far-right parties in a number of countries. It is therefore necessary to build effective, fact-based counter-narratives. For example, the “Africa Migration Report” (2020) shows that 94 per cent of African migration takes a regular form and that just 6 per cent of Africans on the move opt for unsafe and dangerous journeys. These people, like those from other troubled regions, leave their homes in search of a safer, better life in a different country, fleeing armed conflict and poverty. It is their right to do so. Instead of criminalising migration, it is necessary to confront the real causes of this suffering, war and social injustice, and to dismantle the systems of power behind them.

Videos of the conference are also available on YouTube.

For details of speakers and topics, please visit the event page here: https://www.disruptionlab.org/borders-of-fear 

Upcoming programme:

The 23rd conference of the Disruption Network Lab, curated by Tatiana Bazzichelli, is titled “Behind the Mask: Whistleblowing During the Pandemic.” It will take place on March 18-20, 2021. More info: https://www.disruptionlab.org/behind-the-mask 

To follow the Disruption Network Lab, sign up for the newsletter and stay informed about its conferences, meetups, ongoing research and projects.

The Disruption Network Lab is also on Twitter, Facebook and Instagram.

DATA CITIES: Smart Technologies, Tracking & Human Rights

On September 25, 2020, the Disruption Network Lab opened its 20th conference “Data Cities: Smart Technologies, Tracking & Human Rights”, curated by Tatiana Bazzichelli, founder and program director of the organisation, and Mauro Mondello, investigative journalist and filmmaker. The two-day event was a journey inside smart-city visions of the future, reflecting on technologies that significantly impact billions of citizens' lives and enshrine unprecedented new concentrations of power, characteristic of the era of surveillance capitalism. A digital future which is already here.

Smart urbanism relies on algorithms, data mining, analytics, machine learning and infrastructures that give scientists and engineers the capability of extracting value from the city and its people, whose lives and bodies are commodified. The adjective ‘smart’ is a marketing gimmick used to boost brands and commercial products. When employed to designate metropolitan areas, it describes cities rendered liveable, sustainable and efficient by technology and the Internet.

The conference was held at Berlin’s Kunstquartier Bethanien and brought together researchers, activists and artists to discuss what kind of technologies are transforming metropolises and how. The Disruption Network Lab aimed at stimulating a concrete debate, devoid of the rhetoric of solutionism, in which participants could focus on the socio-political implications of algorithmic sovereignty and the negative consequences on fundamental rights of tracking, surveillance and AI. They shared the results of their latest work and proposed a critical approach, based on the motivation of transforming mere opposition into a concrete path for inspirational change.

Lieke Ploeger, Community Director of the Disruption Network Lab (left), and Tatiana Bazzichelli, Founder and Programme Director of the Disruption Network Lab

The first part of the opening keynote “Reclaiming Data Cities: Fighting for the Future We Really Want” was delivered by Denis “Jaromil” Roio, ethical hacker, artist and activist. In his talk, moderated by Daniel Irrgang, research fellow at the Weizenbaum Institute for the Networked Society, Jaromil focused on algorithmic sovereignty and the incapacity to govern technological transformation which characterises our societies today. 

Jaromil looked at increasing investments in AI, robots and machine learning, acknowledging that automated decision-making informed by algorithms has become a dominant reality extending to almost all aspects of life. From the code running on thousands of devices to the titanic ICT infrastructures connecting us, when we think about the technology surrounding us we realise that we have no proper control over it. Even at home, we cannot fully know what the algorithms animating our own devices are used for, whether they help us understand the world better or whether they are instead designed to let machines study and extract data from us for the benefit of their creators. The same critical issues and doubts emerge with the large-scale implementation of tech within so-called “smart cities”, a maximization of the “Internet of Things” born in the 1980s.

Personal data is a lucrative commodity, and big data means profit, power, and insights, all essential to government agencies and tech firms alike. Jaromil announced a call to action for hackers and programmers to get involved without compromise and play a key role in building urban projects that safeguard the rights of those living in them, given that by 2050 an estimated 70 per cent of the world's population may well live in cities.

Jaromil observed that there is too often a tremendous difference between what we see when we look at a machine and what really happens inside it. Since the dawn of the first hacking communities, hackers preferred writing their own software and constructing their own machines. They were free to disassemble and reassemble them, having control over all the functions and direct access to the source code. This was also a way to be independent from the corporate world and authorities, which they mistrusted. 

Today, users are mostly unaware of the potential of their own tech devices, which are no longer oriented strictly towards serving them. They have no exposure to programming and think computer science and informatics are far too difficult to learn, and so entrust themselves entirely to governments and tech firms. Jaromil works to simplify interfaces and programming languages, so that people can learn how to program and regain control over their tech. He supports minimalism in software design and a democratisation of programming languages that works against technocratic monopolies. His think & do tank, Dyne.org, is a non-profit software house with expertise in social and technical innovation, gathering developers from all over the world. It integrates art, science and technology in community-oriented projects (D-CENT, DECODE, Commonfare, Devuan), promoting decentralisation and digital sovereignty to encourage people's empowerment.

Julia Kloiber (left), Denis “Jaromil” Roio and Daniel Irrgang during the keynote “Reclaiming Data Cities: Fighting for the Future We Really Want”

The second keynote speaker, Julia Kloiber, managing director at Superrr Lab, addressed technology for the common good, open data and transparency, and, like the previous speaker, reflected on uncontrolled technological transformation. Kloiber noted that the more people mobilise to be decision-makers, rather than passive data providers, the more they see how difficult it is to ensure that publicly relevant data remains subject to transparent control and public ownership. In the EU, several voices are pushing for solutions, including classifying anonymised user data as a ‘common good’ and therefore freeing it from the control of tech companies.

Recalling the recent Canadian experience of Sidewalk Labs (Alphabet Inc.'s company for urban tech development), Kloiber explained that re-imagining the future of neighbourhoods and cities requires involving local communities. The Alphabet subsidiary had proposed rebuilding an area of eastern Toronto, turning it into its first smart city: an eco-friendly, digitised urban planning project, constantly collecting data to achieve perfect city living, and a prototype for similar developments worldwide. In pushing back against the plan and its top-down approach, the municipality of Toronto made clear that it was not ready to consider the project unless it was developed firmly under public control. The smart city development, which never really started, died out with the onset of the COVID-19 crisis. Its detractors argue that city dwellers were meant to be human sensors, collecting data to test new tech solutions and increase corporate profit. Data collected during the provision of public services and administrative data should be public; it belongs to the people, not to a black-box company.

As Jaromil and Kloiber discussed, the debate on algorithmic sovereignty is open in the main capitals of the world, and initiatives such as the “Manifesto in favour of technological sovereignty and digital rights for cities,” written in Barcelona, reflect the belief that it will be crucial for cities to achieve full control and autonomy over their ICTs: service infrastructures, websites, applications and data owned by the cities themselves and regulated by laws protecting the interests and fundamental rights of their citizens. Their implementation should come within people-centric projects and a transparent, participatory process.

Julia Kloiber (left), Denis “Jaromil” Roio and Daniel Irrgang during the keynote “Reclaiming Data Cities: Fighting for the Future We Really Want”

The work of the conference continued with the panel “Making Cities Smart for Us: Subverting Tracking & Surveillance,” a cross-section of projects by activists, researchers and artists digging into the false myth of safe and neutral technologies, proposing both counterstrategies and solutions to tackle issues introduced in the opening keynote.

Eva Blum-Dumontet, an English researcher on privacy and socio-economic rights, has dedicated her work to the impact of tech on people, particularly those in vulnerable situations. She opened the talk by observing that the term ‘smart city’ lacks an official definition; it was coined by IBM's marketing team in 2005 without a scientific basis. Since then, tech firms from all over the world have developed projects to win governments' favour and to build urban areas integrating boundless tech solutions: security and surveillance, energy and mobility, construction and housing, water supply systems and so on.

Today, thanks to smart cities, companies such as IBM, Cisco, Huawei, Microsoft and Siemens have found a way to satisfy both governments and suppliers, but they do not seem to act in the public's best interest. In their vision of smart urbanism, people are only resources: like water, buildings and administrative services, they are something to extract value from.

Blum-Dumontet explained that when we refer to urban tech development, we need to remember that cities are political spaces and that technology is not objective. Cities are a concentration of countless socio-economic obstacles that prevent many individuals from living a dignified life. Privilege, bias, racism and sexism are already integrated into our cities' (tech) infrastructures. The researcher acknowledged that it is very important to implement people-centric solutions, while keeping in mind that our cities are currently neither inclusive nor built for all, typically excluding, for instance, differently abled individuals, low-income residents and genderqueer people.

Panel discussion “Making Cities Smart for Us: Subverting Tracking & Surveillance” with Eva Blum-Dumontet, Andreas Zingerle, Linda Kronman and Tatiana Bazzichelli

A sharp critique of the socio-economic systems causing injustice, exploitation and criminalisation also lies at the core of River Honer's work. Honer is a web developer at Expedition Grundeinkommen and an anti-capitalist tech activist who wants to support citizens and activists in their struggle for radical transformation towards more just cities and societies, without relying on solutions provided by governments and corporations.

Her methodology includes critical mapping and geospatial analysis, used to visualise and find solutions to the structurally unjust distribution of services, access and opportunities in a given geographic area. Honer works with multidisciplinary teams on community-based data gathering, turning information into geo-visualisations that address social issues and disrupt discriminatory practices targeting minorities and individuals. Examples of her work include LightPath, an app providing the safest well-lit walking route between two locations in various cities; Refuge Restroom, which displays safe restroom access for transgender, intersex, and gender-nonconforming individuals who suffer violence and criminalisation in the city; and the recent COVID-19 tenant protection map.
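The core idea behind a routing tool like LightPath can be modelled as a shortest-path problem in which poorly lit street segments are penalised, so the algorithm prefers a longer but well-lit route over a shorter dark one. The sketch below is purely illustrative: the toy street graph, the penalty factor, and the use of Dijkstra's algorithm are assumptions made here, not a description of the actual app's data or implementation.

```python
# Hypothetical sketch of "safest well-lit route" as weighted
# shortest-path search: unlit segments cost more per metre.
import heapq

def safest_path(graph, start, goal, dark_penalty=5.0):
    """Dijkstra over a graph of edges (neighbour, length_m, is_lit).

    Unlit edges are weighted by dark_penalty, steering the route
    towards lit streets. Returns (total_cost, list_of_nodes).
    """
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, length, is_lit in graph.get(node, []):
            weight = length if is_lit else length * dark_penalty
            heapq.heappush(queue, (cost + weight, neighbour, path + [neighbour]))
    return float("inf"), []

# Toy street network: the direct segment A-C is unlit, so the lit
# detour via B wins despite being longer in metres.
streets = {
    "A": [("C", 100, False), ("B", 80, True)],
    "B": [("C", 90, True)],
}
cost, route = safest_path(streets, "A", "C")
```

Here the 100 m unlit segment costs 500 weighted units, so the 170 m lit detour via B is chosen. A real app would of course build its graph from street and lighting data rather than a hand-written dictionary.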

Honer’s projects are developed to find practical solutions to the systemic problems that underpin a ruthless political-economic structure. She works on tech that ignores or undermines the interests of capitalism and facilitates organising for the public ownership of housing, utilities, transport, and the means of production.

The Disruption Network Lab dedicated a workshop to her Avoid Control Project, a subversive tracking and alert system that Honer developed to collect the location of ticket controllers for the Berlin public transportation company BVG, whose methods are widely considered aggressive and discriminatory.

There are many cities in the world in which activist groups, non-governmental organisations and political parties advocate for a complete revocation of fares on public transport systems. The topic has been debated for many years in Berlin too: the BVG is a public for-profit company earning millions of euros annually from advertising alone, while also charging expensive flat fares for all travellers.

The panel discussion concluded with Norway-based speakers Linda Kronman and Andreas Zingerle of the KairUs collective. The two artists explore topics such as vulnerabilities in Internet of Things devices and the corporatisation of city governance in smart cities, and create citizen-sensitive projects in which technology is used to reclaim control of our living environments. As Bazzichelli explained when presenting Kronman's project “Suspicious Behaviours”, KairUs's production is an example of digital art eroding the assumption of objective or neutral artificial intelligence, showing that hidden biases and hidden human decisions influence the detection of suspicious behaviour within systems of surveillance, and thereby determine the social impact of technology.

The KairUs collective also presented a few of its other projects: “The Internet of Other People’s Things” addresses the technological transformation of cities and seeks new critical perspectives on technology and its impact on people’s lifestyles. The video installation “Panopticities” and the artistic project “Insecure by Design” (2018) visualise the harmful nature of surveillance capitalism from the unusual perspective of odd vulnerabilities that put both the controlled and the controllers at risk, such as models of CCTV and IP cameras shipped with default login credentials, and security systems that are easy to hack or have no password protection at all by default.

Focusing on the reality of smart city projects, the collective presented “Summer Research Lab: U-City Songdo IBD” (2017), which looked at Asian smart urbanism and reminded the panellists that many cities, such as Singapore, Jakarta, Bangkok, Hanoi and Kuala Lumpur, already rely heavily on tech. In South Korea, the Songdo International Business District (Songdo IBD) is a new “ubiquitous city” built from scratch, where AI can monitor the entire population’s needs and movements. At any moment, through chip-implant bracelets, it is possible to pinpoint someone’s location, or to observe people undetected using cameras covering the whole city. Sensors constantly gather information and all services are automated. There are no discernible waste bins in the park or on street corners; everything seems under tech control and in order. As the artists explained, this 10-year development project is estimated to cost in excess of 40 billion USD, making it one of the most expensive development projects ever undertaken.

Panel discussion “Making Cities Smart for Us: Subverting Tracking & Surveillance” with Eva Blum-Dumontet, Andreas Zingerle, Linda Kronman and Tatiana Bazzichelli

The task of speculative architecture is to create narratives about how new technologies and networks influence and shape spaces and cultures, foreseeing possible futures and imagining how and where new forms of human activity could exist within cities changed by these processes. Liam Young, film director, designer and speculative architect, opened the second conference day with his keynote “Worlds Less Travelled: Mega-Cities, AI & Critical Sci-Fi“. Through glimpses, fragments and snapshots taken from a series of his films, he portrayed an alternative future of technology and automation in which everything is controlled by tech, where complexities and subcultures are flattened, and where people have been relegated to the status of mere customers instead of citizens.

Young employs the techniques of popular media, animation, games and documentary-making to explore the architectural, urban and cultural implications of new technologies. His work is a means of visualising imaginary future worlds in order to help us understand the one we are in now. Critical science fiction provides a counter-narrative to the ordinary ways we have of representing time and society. Young speaks of aesthetics, individuals and relationships mediated by objects that listen and talk back, but which mostly communicate with other machines. He shows us alternative futures of urban architecture, where algorithms define what the future looks like, and where human scale is no longer the parameter used to measure space and relations.

Young’s production also focused on the Post-Anthropocene, an era in which technology and artificial intelligence order, shape and animate the world, marking the end of human-centred design and the appearance of new typologies of post-human architectures. Ours is a future of data centres, ICT networks, buildings and infrastructures which are not for people; architectural spaces entirely empty of human lives, with fields managed by industrialised agriculture techniques and self-driving vehicles. Humans are few and isolated, living surrounded by an expanse of server stacks, mobile shelving systems, robotic cranes and vacuum cleaners. The Anthropocene, in which humans are the dominant force shaping the planet, is over.

Anna Ramskogler-Witt and Lucia Conti during the Keynote “Worlds Less Travelled: Mega-Cities, AI & Critical Sci-Fi”

The keynote, moderated by the journalist Lucia Conti, editor at “Il Mitte” and communication expert at UNIDO, moved from Young’s corporate dystopia, in which tech companies own cities and social network interactions are the only way people interrelate with reality, to the work of filmmaker Tonje Hessen Schei, director of the documentary film “iHuman” (2020). The documentary touches on how things are evolving from biometric surveillance to diversity in data, providing a closer look at how AI and algorithms are employed to influence elections, structure online opinion manipulation, and build systems of social control. In doing so, Hessen Schei depicts an unprecedented concentration of power in the hands of a few individuals.

The movie also presents the latest developments in Artificial Intelligence and Artificial General Intelligence, the hypothetical intelligence of machines that can understand or learn any task that a human being can. 

When considering AI, the questions, answers and predictions in its technological development will always reflect, consciously or unconsciously, the political and socioeconomic point of view of its creators. For instance, as described in the Disruption Network Lab’s conference “AI Traps” (2019), credit scores are historically correlated with racially segregated neighbourhoods. Risk analyses and predictive policing data are also corrupted by racist prejudice, leading to biased data collection that reinforces privilege. As a result, new technologies merely replicate old divisions and conflicts. By deploying technologies like facial recognition, for instance, we replicate deeply ingrained behaviours based on race and gender stereotypes, now mediated by algorithms.

Automated systems are mostly trying to predict and identify a risk, which is defined according to cultural parameters reflecting the historical, social and political milieu, in order to give answers and make decisions which fit a certain point of view. What we are and where we are as a collective —as well as what we have achieved and what we still lack culturally— gets coded directly into software, and determines how those same decisions will be made in the future. Critical problems become obvious in the case of neural networks and supervised learning.

Simply put, these are machines which learn: networks trained to reproduce a given task by processing examples, making errors and forming probability-weighted associations. The machine learns from its mistakes, adjusting its weighted associations according to a learning rule and using error values. Repeated adjustments eventually allow the neural network to produce output increasingly similar to the target, until it reproduces the task accurately. Yet algorithmic operations are often unpredictable and difficult to discern, with results that sometimes surprise even their creators. iHuman shows that this new kind of AI can be used to develop dangerous, uncontrollable autonomous weapons that ruthlessly accomplish their tasks with surgical efficiency.
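The training loop described above can be sketched in a few lines of code. What follows is a purely illustrative example, unrelated to any system discussed at the conference: a single artificial neuron trained by gradient descent to reproduce the logical OR task, adjusting its probability-weighted associations from error values exactly as outlined.

```python
import math

def sigmoid(x):
    """Squash a raw score into a probability between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-x))

# Toy supervised-learning task: reproduce logical OR from labelled examples.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

# The "probability-weighted associations" start at zero.
w = [0.0, 0.0]
bias = 0.0
rate = 0.5  # step size of the learning rule

for epoch in range(5000):
    for (x1, x2), target in data:
        out = sigmoid(w[0] * x1 + w[1] * x2 + bias)
        error = target - out  # the error value the network learns from
        # Learning rule: adjust each weight in proportion to its error.
        w[0] += rate * error * x1
        w[1] += rate * error * x2
        bias += rate * error

# After repeated adjustments the network reproduces the task.
preds = [round(sigmoid(w[0] * x1 + w[1] * x2 + bias)) for (x1, x2), _ in data]
print(preds)  # prints [0, 1, 1, 1], matching the targets
```

Even in this tiny sketch, the point made at the conference is visible: what the network ends up "knowing" is entirely determined by the examples and targets its designers chose to feed it.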

Lucia Conti, Editor in Chief “Il Mitte” (left), and Tatiana Bazzichelli, Founder and Programme Director of the Disruption Network Lab

Conti moderated the dialogue between Hessen Schei, Young, and Anna Ramskogler-Witt, artistic director of the Human Rights Film Festival Berlin, digging deeper into aspects such as censorship, social control and surveillance. The panellists reflected on the fact that—far from being an objective construct and the result of logic and math—algorithms are the product of their developers’ socio-economic backgrounds and individual beliefs; they decide what type of data the algorithm will process and to what purpose. 

All speakers expressed concern that research and development in Artificial Intelligence is ruled by a few highly wealthy individuals and spoiled megalomaniacs from Silicon Valley: capitalists using their billions to develop machines which are supposed to be ‘smarter’ than human beings. But smart in this context can be a synonym for brutal opportunism: some of the personalities and scientists immortalised in Hessen Schei’s work seem caught on the thin line between playing the visionary leader and becoming a figure whose vision has begun to deteriorate and distort reality. Their visions, which encapsulate the technology for smart cities, appear far from people-centric or grounded in human rights.

Not only big corporations but a whole new generation of start-ups are enabling authoritarian practices by commercialising AI technologies, automating biases based on skin colour and ethnicity, sexual orientation and identity. They are developing censored search engines and platforms for authoritarian governments and dictators, refining high-tech military weapons, and guaranteeing order and control.

The participants on stage made clear that, looking at surveillance technology and facial recognition software, existing ethical and legal criteria appear ineffective, and the lack of standards around their use and sharing only benefits their intrusive and discriminatory nature. Current ethical debates about the consequences of automation focus on the rights of individuals and marginalised groups. Algorithmic processes, however, also generate a collective impact that can only partially be addressed at the level of individual rights: they are the result of a collective cultural legacy.

Nowadays, we see technologies of control executing their tasks in aggressive and violent ways. They monitor, track and process data with analytics against those who transgress or attempt to escape control, according to the idea of control that was taught to them. This suggests, for example, that when start-ups and corporations embed goals and values in software regulating public services, they do not apply the principles developed over century-long battles for civil rights, but rely on technocratic motivations of total efficiency, control and productivity. The normalisation of such corporatisation of governance allows Cisco, IBM and many other major vendors of analytics and smart technologies to shape very delicate public sectors, such as policing, defence, fire protection, or medical services, which are customarily provided by governmental entities, including the (infra)structures usually required to deliver them. In this way their corporate tactics and goals become a structural part of public functions.

Film director Tonje Hessen Schei during the keynote “Worlds Less Travelled: Mega-Cities, AI & Critical Sci-Fi”

In the closing panel “Citizens for Digital Sovereignty: Shaping Inclusive & Resilient Cities”, moderated by Lieke Ploeger, community director of the Disruption Network Lab, political scientist Elizabeth Calderón Lüning reflected on the central role that municipal governments have in actively protecting and fostering societies of digital self-determination. In Berlin, networks of collectives, individuals and organisations work to find bottom-up solutions and shape urban policies that protect residents, tenants and community spaces from waves of speculation and aggressive economic interests. Political and cultural engagement make the German capital a centre of flourishing debate, where new solutions and alternative innovative perspectives find fertile ground, from urban gardening to inclusion and solidarity. But when it comes to technological transformation and digital policy, responsibility cannot be left at the individual level alone, and the city government, with its passive reactions to external trends and developments, does not appear to be leading the way.

Calderón Lüning is currently researching in what spaces and under what premises civic participation and digital policy have been configured in Berlin, and how the municipal government is defining its role. In her work she found policy incoherence among several administrations, alongside a need for channels enabling citizens to participate and organise as a collective. The lack of resources in the last decade for hiring and training public employees and for coordinating departmental policies is slowing down the process of digitalisation and centralisation of the different administrations.

The municipality’s smart city strategy, launched in 2015, has recently been updated and refinanced with 17 million euros. In 2019 the city Senate released the Berlin Digital Strategy for the coming years. To avoid the harmful consequences of a vertical approach by the administration towards its residents, activists, academics, hackers, people from civil society and many highly qualified scientists in the digital field came together to rethink and redesign an ecological, participatory and democratic city for the 21st century. The Berlin Digital City Alliance has been working since then towards people- and rights-centred digital policies, structuring institutional round tables on these issues, coordinated by civic actors.

Digital sovereignty is the power of a society to control technological progress, determining its own way through digital transformation. It is also the geopolitical ownership and control of critical IT infrastructures, software and websites. When it comes to tech in public services, particularly essential ones, who owns the infrastructure and what is inside the black box are questions that administrations and policy makers should be able to answer, considering that the apps and services in use often contain some form of artificial intelligence or machine learning automation, with the potential to significantly affect citizens’ lives and to set standards relevant to their rights. Without open scrutiny, the start-ups and corporations owning infrastructures and code exert excessive influence over delicate aspects regulating our society.

Panel discussion “Citizens for Digital Sovereignty: Shaping Inclusive & Resilient Cities” with Elizabeth Calderón Lüning (left), Rafael Heiber, Alexandre Monnin (screen), and Lieke Ploeger.

Rafael Heiber, geologist, researcher and co-founder of the Common Action Forum, focused on the urgent need to understand ways of living and moving in the new space of hybridisation that cities of the future will create. Taking a critical look at the role of technologies, he described how habitability and mobility will be fundamental in addressing the challenges posed by urban planning that rests on a technological substratum. As he explained, bodies are relevant inside smart environments because of their interactions, which are captured by sensors. Neoliberal capitalism has turned us into relentless energy consumers in our everyday lives, not because we move too much, but because we use technology to move and tech needs our movements.

Heiber considered the way automobiles have shaped a whole economic and financial system for more than a century. In his view they symbolise the way technology changes the world around itself, and not only for the better. Cars have transformed mobility, the urban environment, social interactions and the way we define spaces. After a hundred years, with pollution levels increasing, cities are still limited, enslaved, and dominated by cars. The geologist suggested that the implementation of smart cities and new technologies might end up the same way.

Alexandre Monnin, head of Strategy and Design for the Anthropocene, closed the panel discussion by questioning the feasibility of smart cities, focusing on the urgency of avoiding unsustainable technologies that have proved to be a waste of resources. Monnin argued that futuristic ideas of smart cities and solutionism will not tackle climate change and other urgent problems. Our society is profit-oriented: the more efficient it is, the more the system produces and the more people consume. Moreover, tech does not always mean simplification. Take the idea of dematerialisation, which is actually just a displacement of materiality: video rental shops have disappeared almost worldwide, replaced in part by the online platform Netflix, which accounts for some 15 percent of internet traffic.

Monnin warned about the environmental impact of tech: not just the enormous amount of energy consumed and CO2 produced on a daily basis, but also the growing amount of e-waste due to planned obsolescence and consumerism. Plastics are now a growing environmental pollutant and constitute a geological indicator of the Anthropocene, a distinctive stratal component that future generations will see. Monnin calls ‘negative commons’ the obsolete tech infrastructures and facilities that will exist forever, like nuclear power plants, which he describes as “zombie technology”.

The French researcher concluded by pointing out that humanity faces unprecedented risks due to global warming, and that, as far as it is possible to know, in the future we might not even live in cities. Monnin emphasised that people should come together to oppose zombie tech before it takes root, as happened in Toronto, and he hopes to see more examples of civil opposition and resistance to tech that is unfit for our times. Smart cities are not revolutionising anything; they constitute business as usual and belong to the past, he argued, concluding with an appeal for more consideration of the risks of institutionalising what he calls “corporate cosmology”, which turns cities into profit-oriented firms with corporate goals and competitors, relying on the same infrastructures as corporations do.

Panel discussion “Citizens for Digital Sovereignty: Shaping Inclusive & Resilient Cities” with Elizabeth Calderón Lüning (left), Rafael Heiber, Alexandre Monnin (screen), and Lieke Ploeger.

In its previous conference, “Evicted by Greed,” the Disruption Network Lab focused on the financialisation of housing. Questions arose about how urban areas are designed and governed now, and how they will look in the future if the speculation on people’s lives and the privatisation of common spaces are not reversed. Billions of people live in cities that are the products of privilege, private corporate interests and financial greed. This 20th conference focused on what happens when these same cities turn into highly digitised environments, moulded by governments and billionaire elites, tech engineers and programmers, who wish to have them functioning as platforms for surveillance and corporate intelligence, in which data is constantly collected, stored and used for profiling and control.

According to the UN, the future of the world’s population is urban. Today more than half the world’s people (55 percent) live in urban areas. By mid-century, 68 percent of the world’s population will be living in cities, compared with 30 percent in 1950. By 2050, the global urban population is projected to grow by 2.5 billion, with nearly 90 percent of the increase in Asia and Africa, and dozens of megacities of at least 10 million inhabitants appearing on the international scene.

This conference presented the issue of algorithmic sovereignty and illustrated how powerful tech firms work with governments, including authoritarian regimes and dictatorships, to build urban conglomerates based on technological control, optimisation and order. These corporations strive to appear as progressive think tanks offering sustainable green solutions, but in fact legitimise and empower authoritarian surveillance, stealing data and blurring commercial and public interests.

Algorithms can be employed to label people based on political beliefs, sexual identity or ethnicity, and authoritarian governments and elites are already exploiting this tech to repress political opponents and target specific genders and ethnicities. In such a scenario, no mass-surveillance or facial recognition tech is safe, and attempts at building “good tech for common goods” might just continue to fail.

To defeat such an unprecedented concentration of power, we need to pressure governments at all levels to put horizontal dialogue, participation, transparency and a human-rights based approach at the centre of technological transformation. To this end, cities should open round tables for citizens and tech-developers, forums and public committees on algorithmic sovereignty in order to find strategies and local solutions. These will become matters of, quite literally, life and death. 

Smart cities have already been built and more are at the planning and development stages, in countries such as China, Singapore, India, Saudi Arabia, Kazakhstan, Jordan, and Egypt. As Bazzichelli pointed out, the onset of the dramatic COVID-19 crisis has pushed social control one step further. We are witnessing increasing forms of monitoring via tracking devices, drone technologies and security infrastructures. Moreover, governments, banks and corporations think that this pandemic can be used to accelerate the introduction of technologies in cities, such as 5G and the Internet of Things.

There is nothing wrong with the old idea that we can use technology to build liveable, sustainable, and efficient cities. But it is hard to imagine this happening with technology provided by companies that exhibit an overall lack of concern for human rights violations.

Tatiana Bazzichelli (left), Founder and Programme Director of the Disruption Network Lab and Lieke Ploeger, Community Director of the Disruption Network Lab

Alongside the main conference sessions, several workshops enriched the programme. Videos of the conference are also available on YouTube.

For details on speakers and topics, please visit the event page here: https://www.disruptionlab.org/data-cities

The 21st conference of the Disruption Network Lab, curated by Tatiana Bazzichelli,
“BORDERS OF FEAR” will take place on November 27-29, live from Studio 1, Kunstquartier Bethanien, Mariannenplatz 2, 10997 Berlin.
More info here

To follow the Disruption Network Lab sign up for its Newsletter to receive information about its conferences, projects and ongoing research.

The Disruption Network Lab is also on Twitter and Facebook.

Evicted by Greed: Global Finance, Housing & Resistance

On May 29, 2020, the Disruption Network Lab opened its 19th conference, “Evicted by Greed: Global Finance, Housing & Resistance”. The three-day-event was supposed to take place in Berlin in March, on the days of the global call for the Housing Action Days. Instead, it took place online due to ongoing safety concerns relating to the coronavirus pandemic.

Chaired by Tatiana Bazzichelli and Lieke Ploeger, programme and community director of the Disruption Network Lab, the interactive digital event brought together speakers and audience members from their homes all over the world to investigate how speculative finance drives global and local housing crises. The question of how aggressive speculative real-estate purchases by shell companies, anonymous entities, and corporations negatively impact people’s lives formed the core conversation for the presentations, panel discussions, and interactive question and answer sessions. The conference served as a platform for sharing experiences and finding counter-strategies.

In her introductory statement, Bazzichelli took stock of the situation. As the pandemic appeared, it became clear worldwide that the “stay-at-home” orders and campaigns did not consider people who cannot comply because they have no place to stay. Tenants whose work and lives have been impacted struggle to pay rent, bills, or other essentials, and in many cases have had to leave their homes or have been threatened with forced eviction. People called on lawmakers at national and local level to freeze rents as part of their response to the pandemic, but very few protective measures have been put in place. Scarce and unaffordable housing, however, is neither a new problem nor a local one found in just a few places.

Lieke Ploeger, Community Director of the Disruption Network Lab (left), and Tatiana Bazzichelli, Founder and Programme Director of the Disruption Network Lab

Christoph Trautvetter, public policy expert and German activist of Netzwerk Steuergerechtigkeit (Network for Tax Justice) and Wem gehört die Stadt (Who Owns the City) of the Rosa-Luxemburg Foundation, and Manuel Gabarre de Sus, Spanish lawyer and activist from the Observatory Against Economical Crime, delivered the opening keynote “Anonymous & Aggressive Investors: Who owns Berlin & Barcelona?”, moderated by Eka Rostomashvili, advocacy and campaigns coordinator at Transparency International.

In the last decade, waves of private equity real estate investment have reshaped the rental housing markets of cities like Berlin and Barcelona. Housing and real estate have been deformed by global capital markets and financial excess, treated as a commodity and a vehicle for predatory investment and wealth rather than as a social good reflecting a human right. This has led to evictions, discrimination in the housing sphere, and lack of access to basic housing-related services, all driven by aggressive real estate investors.

Trautvetter is co-author of a recent study tracing the ownership of 400 companies owning real estate in Berlin. He explained that in the city, where about 85% of the population are renters, exploding house prices and rents have been guaranteeing investors returns far beyond 10% per year since the financial crisis of 2009. The emergence of corporate landlords changed the city: entities that own and operate rental housing on a massive scale, replacing the traditional “gentle old lady” landlord. At 17.5%, Berlin has a low proportion of direct investors renting out their properties.

Activists, politicians, and organisations of tenants are trying to fight unlawful evictions and speculative investments reshaping the German capital, but often face anonymity. Almost half of the city is in the hands of listed companies, professional investors, or indirect investors shielded by property management firms and lawyers that operate on their behalf. International private equity companies are one of the most obscure and greedy embodiments of policy failure in this context.

Gabarre de Sus focused on the problem of the opportunistic investment funds that appeared in Spain due to strong deregulation. After the global financial crisis of 2009, the rescue of the Spanish financial system brought hundreds of thousands of households indirectly under public control. But the European Union and the Spanish Government decided in 2012 to sell these properties to opportunistic investors. Many say that if public ownership of this real estate had been maintained for social renting, the rent bubble of recent years would not have occurred. Instead, many vulture funds, particularly from the United States, like Blackstone, Hayfin and TPG, and millionaires like the Mexican Carlos Slim, made huge profits. Since then, rents have increased by more than 50% in the main Spanish cities, more than 30 times faster than wages.

Whilst describing this process, Gabarre de Sus focused on the political and legal ties of the big investment funds that invest in real estate. Structures of political and economic interest allow companies like Blackstone Group Inc. — one of the largest real estate private equity and investment management firms in the world, declaring $140 billion of real estate assets under management, 25% of its total assets — to scale business models in which properties are bought, renovated, and then put back on the market at rents that tenants cannot afford. These actors are influential, with economic partners at the international level, including banks from the world’s largest economies.

In many cases, real estate registers contain no information on beneficial owners, or there is no way to link legal and beneficial owners, so both authorities and citizens know very little about who owns their cities. EU legislation obliges information on real estate holders to be available to authorities, and specifies that the general public shall have access to beneficial ownership information for EU-based companies. The problem is that such registers are usually maintained on self-disclosure principles, based on data identified internally by the reporting entity. Access to the data is often difficult and expensive, and once obtained, the information can take time to verify and may turn out to be contradictory. Moreover, an articulated system of international shell companies, secrecy legislation, and strategic financial loopholes provides immunity and contributes to global inequality, consolidating the incessant shift of wealth from the poor to the rich.

Manuel Gabarre de Sus (top), Christoph Trautvetter and Eka Rostomashvili during the Keynote “Anonymous & Aggressive Investors: Who owns Berlin & Barcelona?”

In Berlin nearly half of the real estate investors remain anonymous, and there is no certainty about how much dirty money hides behind their investments, something common to many places around the world. The current situation, revealed also by the Panama Papers investigation, shows that governments profit from illegal wealth and transnational money-laundering, hosting international criminal enterprises within their territories and capital cities, thus providing a grey area for illegal practices in which false or inappropriate identification is the other face of fraudulent records and corruption.

The panel “Foggy Properties & Golden Sands: Money Laundering in London & Dubai” moderated by Rima Sghaier, outreach and research fellow at the Hermes Centre for Transparency and Digital Human Rights, made clear how easy and common it is for global elites and organised criminality to open offshore companies, move assets, and buy real estate in big capital cities, with investments that integrate illegal funds into the financial system and legitimate economy.

Sam Leon, data investigations lead at Global Witness, described the relations between satellite fiscal havens such as the Virgin Islands, the Cayman Islands, and the Channel Islands, and the City of London. These jurisdictions are linked through commercial and legal ties, with high probabilities for dark money to flow undetected through the UK’s Overseas Territories and Crown Dependencies.

The UK has a public land registry, but it is difficult to scrutinise the data effectively. Companies are obliged to file good quality information, but many do not, and authorities are unable to check it accurately. Britain is described by detractors as the world’s greatest enabler of corporate tax avoidance. On real estate, Leon explained that tens of thousands of tenants in England and Wales are in the hands of unscrupulous owners who hide behind anonymous companies and trusts.

One loophole real estate investors use is acquiring shares in a company that owns real estate rather than the real estate itself; the property can then be sold by selling the shares in the company, with no UK corporate tax due. If the company is registered in a country that guarantees secrecy and a free hand, no name appears. According to Global Witness, 87,000 properties in England and Wales with an estimated value of more than 1 billion pounds are owned by companies incorporated in secrecy jurisdictions, which keep information about the real owners hidden. Scotland suffers from the same problem, and in this context Scottish Limited Partnerships are a major concern too.

Sam Leon and Rima Sghaier during the panel “Foggy Properties & Golden Sands: Money Laundering in London & Dubai”

Companies avoid inheritance tax and capital gains tax by exploiting fiscal loopholes. The use of firms based in known tax havens to purchase property is being observed all around the world, raising concerns about how much property is owned by unaccountable offshore entities.

Leon’s analysis introduced topics covered by the second panellist, Karina Shedrofsky, head of OCCRP’s research team, who presented the investigation “Dubai’s Golden Sands.” Recently leaked datasets of property and residency details were obtained by the non-profit group C4ADS and provided to the international investigative journalists of the OCCRP as part of the Global Anticorruption Consortium, in collaboration with Transparency International.

Governments and independent organisations have criticised Dubai for becoming an open market for money laundering and a global safe haven for the corrupt, owing to a lack of controls combined with very profitable conditions. The United Arab Emirates are accused of weakly regulating the financial sector, guaranteeing secrecy, and offering the world’s criminals a range of services. The country’s land registry is not open to the public, and the lack of enforcement and oversight in the property sector makes it ideal for stashing vast amounts of dirty money.

Shedrofsky pointed out that Dubai is an absolute monarchy ruled as a business. Several transnational investigations show that its laws seem to be a facilitator for international money laundering, corruption, and other financial crimes.

The emirate has been attracting secretive real estate purchases by foreign companies and individuals for years. The construction and real estate sector represented 20% of the country’s gross domestic product in 2016. Money can be moved there with very little regulatory scrutiny, cash-based transactions are incentivised, and the volume of gold trafficked through the country accounts for around 25% of global trade, with almost no questions asked about its origins. Wealthy investors are offered a property investment visa in exchange for a real estate investment of at least $272,000, and enjoy the benefits of light financial regulation, anonymity, and banking secrecy.

Shedrofsky explained how researchers from 8 countries worked on thousands of spreadsheets maintained by real estate professionals, in a meticulous cross-border investigation that led to the publication of a hundred names of wealthy people who have invested millions in Dubai. The non-official records from the years 2014-2016 contained data on more than 129,000 owners, which the team organised by country and verified, revealing only information that could be proven beyond doubt. A website hosting an interactive map of the detected properties is online, and anyone can consult it (occrp.org/en/goldensands/).

Karina Shedrofsky during her presentation of the Dubai’s Golden Sands investigation

The first day of the conference closed with filmmaker and journalist Fredrik Gertten and Leilani Farha, former UN Special Rapporteur on the right to housing, in a live conversation moderated by Tatiana Bazzichelli. Gertten’s latest documentary investigates the factors that push people out of their own cities, turning them into unaffordable places that are more and more difficult to live in, due to the extreme gap between housing prices and wage development.

From New York to Barcelona, “Push: The Film” narrates how corporations and financial elites are speculating on people’s lives. Renters worldwide are drowning in a sea of self-doubt, inadequacy and fear, because they think they are unable to keep up with life. The documentary shows that this condition is instead the consequence of a system designed to harm, marginalise, and discriminate against them. Residents who ought to be able to afford to live in their own cities are inexorably condemned to move away.

The work of the speakers on the first day of the conference reinforced the idea that crowd-based and data-driven research projects, together with independent and cross-border investigations, can allow a glimpse behind the curtain of the real estate market: anonymity and secrecy set against the openness and transparency obtained through collective mobilisation and the collecting, sharing, and analysing of data.

A depressingly similar pattern emerges in countries all over the world. Housing has been financialised and turned into an investment vehicle, causing an oversupply of luxury estates and empty buildings in many cities, and a chronic shortage of adequate housing for the least advantaged, for the working class, and often for the middle class too; a process frequently encouraged by governments.

In this context, “financialisation” refers to tendencies within the economic system characterised by the expansion of financial markets into a range of economic and social sectors, affecting human-rights-related goods such as housing, pensions and healthcare, and extracting huge profits from basic needs and human suffering.

With regard to the financialisation of housing, it is not just banks, corporations, and big investment funds that play this ruthless game. Fraudsters, money launderers, and organised crime are very active internationally, looking for weak financial systems and moments of crisis in which to speculate on the property market.

Live conversation with Fredrik Gertten (Film Director, SE) and Leilani Farha (Global Director, The Shift and Former UN Special Rapporteur on the Right to Housing)

Ela Kagel, digital strategist and founder of Supermarkt Berlin, discussed collective solutions to housing, social, and economic injustices with the sociologist Volkan Sayman, promoter of the campaign “Expropriate Deutsche Wohnen & Co!”. The movement is an example of how residents can organise to define and achieve their own objectives, acting on their rights to create space for their perspectives and needs within an urban context.

After a majority of citizens were found to be in favour of the initiative in early 2019, a city-wide referendum can now be called on the expropriation of private housing companies with more than 3,000 housing units. Local political parties have not yet managed to reach agreement and, as a result, the effects of the referendum in Berlin are likely to be minor unless people keep supporting it. The expropriation would put 240,000 flats under public control.

As outlined, investors from the international capital market have made huge purchases in Berlin’s residential and commercial real estate: the company Deutsche Wohnen alone owns 111,500 apartments in the city. Together with Vonovia, BlackRock, Akelius, Blackstone, Carlyle, Optimum Evolution, and others, these companies own almost a quarter of the city. In the early 2000s Berlin’s government sold many public housing units and areas to these companies, instead of offering them to residents as development projects focused on local communities and their needs. The Expropriate Deutsche Wohnen & Co! community has forced large real estate companies and politicians from all parties to address the issue, and has successfully raised awareness among the Berliners who engaged in it.

In Berlin, exasperated renters have successfully come together and organised themselves in several ways. They are also appealing to the local council to stop the sale of their homes, and the “Rent Price Cap” (“Mietendeckel”), a new policy in force since 2020, has frozen rents on around 1.4 million homes in the German capital. It is supposed to last for 5 years; twelve constitutional complaints have already been filed against it.

Volkan Sayman and Ela Kagel during the talk “Expropriate Deutsche Wohnen & Co!”

The keynote on the second day, “The Human Rights Solution: Tackling the housing crisis,” focused on the work of Leilani Farha, former UN Special Rapporteur on adequate housing, in conversation with Justus von Daniels, Editor-in-Chief of CORRECTIV, a non-profit newsroom for investigative journalism. Opening the keynote, von Daniels presented the German crowdsourced project he runs, “Who owns the city?”, which is based on community-powered investigations collecting data to gain a better understanding of the German housing market.

Farha recalled that international human rights law recognises everyone’s right to adequate housing and living conditions. Global real estate today represents nearly 60% of the value of all global assets, with housing comprising nearly 75% of that; more than twice the world’s total gross domestic product. The crucial point is that such a vast amount of wealth seems to have left governments accountable to real estate investors rather than to their international human rights obligations.

Farha criticised Blackstone Group Inc. and its subsidiaries for a practice that, she confirmed, has become common throughout the industry in many countries around the world. These companies target multi-family residences in neighbourhoods deemed “undervalued,” that is, buildings in areas of poor and low-income tenants. The former UN rapporteur described how Blackstone purchases a building, undertakes repairs or renovations, and then increases the rent, driving existing tenants out and replacing them with higher-income ones.

As the speakers pointed out, little attention has been given to the impact of financialisation on housing, which has caused displacement and evictions, changing urban areas forever. Until the massive financial deregulation of the 1980s, housing was built and paid for locally: governments and local savings and loan institutions provided the bulk of housing finance. Following an ideological shift, driven by the growing dominance of big investment funds, banks and corporations over financial markets, housing has become increasingly intertwined with flows of global capital. Housing markets are now more responsive to these flows than to local conditions, and housing has become a global industry.

With roots in the 2008 financial crisis, the recent massive wave of investments by international corporations, banks, and big investment funds completed the shift from housing as a place to build a home to housing as an investment, with devastating consequences for millions of people. The current real-estate cycle started in 2009 and has led to significant price increases for residential property in many cities all over the world. Among the several contributing factors is the proliferation of predatory equity funds sifting through the world in search of undervalued investment opportunities, and finding them in housing.

The global goal is to guarantee everybody legal security and protection against unlawful forced evictions, harassment and other threats, and to ensure that personal or household financial costs associated with housing do not threaten or compromise the satisfaction of other basic needs. Instead, the needs of disadvantaged and marginalised groups are not taken into account at all. In urban areas, public spaces and social facilities disappear, together with the cultural identities and ways of life of the original residents.

Justus von Daniels and Leilani Farha during the keynote “The Human Rights Solution: Tackling the housing crisis”

Statistics show that many of the less advantaged are renters, not owners, and rents have increased even faster than housing prices in many metropolitan areas. Some call for more expansion at the urban peripheries, with sustainable, modern public housing projects and better infrastructure. Others call for empowering neighbourhoods and local communities to reverse the financialisation process and improve conditions in the areas most affected, building more housing for themselves and redistributing the empty units.

The closing panel of the second day was moderated by Iva Čukić, co-founder of the urban development organisation Ministry of Space, set up to occupy abandoned and neglected urban spaces and fill them with projects, workplaces, housing, or alternative art galleries, enhancing everyone’s right to the city. The panel brought into dialogue different ways of fighting property speculation, shared tactics of resistance in the political and media landscape, and presented concrete alternatives for urban territory.

The first panellist to speak, Marco Clausen, is the co-founder of the Prinzessinnengarten in Berlin, an island of collective gardening in Moritzplatz. The garden is an open space in which to share and develop new forms of urban life and practise ideas of positive social-ecological transformation, against the backdrop of the privatisation and financialisation of real estate in the city.

In the 2000s Berlin was still a city of vast empty areas, decommissioned military facilities and many old empty buildings. In the last 20 years over 3,000 sites in Berlin owned by public housing societies have been privatised. The garden started as a temporary project in 2009 and has been struggling against private investors and speculation since 2012. Back then, activists mobilised 30,000 people to stop the sale of the site to an investor and obtained a new contract until 2018. The area around the garden was first in the hands of a Goldman Sachs fund, and later of Deutsche Wohnen. A small group fought for two years to keep the garden a collective project, managing to prolong the contract for another six years and receiving public funding to rebuild the garden as an open learning and cultural centre.

Also in Berlin, another group of activists has been mobilising against the Amazon tower, due to be completed in 2023 in Berlin-Friedrichshain. Yonatan Miller, tech worker and activist from the coalition “Berlin vs. Amazon,” talked about the movement opposing the big tech company’s project, which will reshape the area and impact many people’s lives. On the one hand, over the last five years Berlin has seen the fastest increase in housing prices globally; on the other, big tech corporations are known for entering the real estate market and making things worse for local residents, gentrifying the area. Miller discussed the challenges the activists face, presenting their strategy for the struggle ahead to replicate the success of New York’s ousting of Amazon in 2019.

The panel proceeded with the StealThisPoster collective and their online archive stealthisposter.org, maintained by artists and activists who are part of a network formed around the right-to-housing movements of London and Rome. The group presented the practice of subvertising (a portmanteau of “subvert” and “advertise”): the artistic hacking of corporate and political advertisements to make counter-statements, disrupting and parodying the lucrative communication of induced desires and needs. Inside urban areas, subvertising is an act of reappropriation of public spaces that have been turned into vehicles for intrusive and harmful commercial communication.

StealThisPoster recently supported, with various guerrilla actions, a community fighting the eviction of “Lucha y Siesta,” a social housing space and the first all-female squat in Rome. Their evocative pictures of Roman monuments lit at night by the words “on sale” went viral and helped the cause. However, the existence of this legendary independent social space is still at risk: Lucha y Siesta was put up for auction by the city council of Rome on April 7 this year. The short film “StealThisPoster: Artivism & the Struggle of Lucha Y Siesta,” created by the collective for Evicted by Greed and premiered at the conference, focuses on this experience and introduces the practice of subvertising.

A video contribution by Penny Travlou from the University of Edinburgh concluded the panel. Travlou talked about the housing crisis in Athens and the local activists of the AARG collective (Action Against Regeneration & Gentrification), formed to fight evictions and financial speculation and to support the rights of refugees.

Alongside the main conference sessions, a workshop on the third day enriched the programme.

The virtual tour “Visiting the Invisible” by Christoph Trautvetter uncovers the anonymous and aggressive real estate investors of Berlin, drawing on the findings of the Rosa-Luxemburg Stiftung project “Wem gehört die Stadt” and on further recent studies by other collectives.

Iva Čukić, Marco Clausen (top) and Yonatan Miller during the panel “Resisting Speculation: Ecological Commons, Subvertising & Fighting Tech Domination”

The conference “Evicted by Greed” brought together experts in anti-corruption, investigative journalists, artists and activists, who met to share effective strategies and community-based approaches for raising awareness of the financialisation of housing and its negative effects. Bottom-up approaches and methods that include local communities in the development of solutions appear fundamental here. Projects that equip collectives, minorities, and marginalised groups with tools to combat systemic inequalities, injustices, and speculation deserve support.

Ghostly shell companies and real estate speculators evict real people from their homes. It cannot be said that all of these companies are acting illegally, or avoiding taxes simply by being based in tax havens, but it is proven that opaque offshore firms are routinely used by criminals for systematic tax evasion, to buy property as a means of laundering or stashing dirty money, and to dodge taxes.

Open registers and open debate about these issues are very important, and not just for possible judicial outcomes. It matters to find out who the owners of real estate are and to give the landlords a name. Sometimes they might not be speculation-oriented individuals, and might not even be aware of the consequences of their investments, having delegated them to ruthless intermediaries, lawyers, and investment consultants. There could be hundreds of workers who invested in a pension fund without knowing that their profit is based on aggressive speculation.

Equal and non-discriminatory access to public spaces and adequate housing is not possible without appropriate and effective regulation. The research, projects and investigations presented at this conference are all worthwhile experiences with proven benefits, but ultimately they may not be enough to alter the structural forces in play. The pandemic has shown that speculators all over the world wait for moments of crisis to purchase new properties at lower prices, taking advantage of the financial difficulties many people are experiencing. A growing number of property investors are preparing for what they believe could be a once-in-a-generation opportunity to buy distressed real-estate assets at bargain prices. The system facilitates the concentration of real estate in the hands of big international landlords, and governments remain inert.

The solution cannot be found in one simple formula, nor by asking people to buy real estate and become direct investors and new owners within a deregulated system based on speculation, in which most individuals struggle to make a living. The global economic system rests on banks holding massive amounts of loans to companies based in tax havens, on speculative real estate investments, and on a small economic elite that makes and escapes the rules, defending financial deregulation and feeding social injustice.

Tatiana Bazzichelli, Founder and Programme Director of the Disruption Network Lab.

Videos of the conference are also available on YouTube.

In-depth video contributions by the speakers, recorded before the conference, are available here: https://www.disruptionlab.org/evicted-videos

For details of speakers and topics, please visit the event page here: https://www.disruptionlab.org/evicted-by-greed

The 20th conference of the Disruption Network Lab, curated by Tatiana Bazzichelli & Mauro Mondello, is DATA CITIES: SMART TECHNOLOGIES, TRACKING & HUMAN RIGHTS. It will take place on September 25-27 at Studio 1, Kunstquartier Bethanien, Mariannenplatz 2, 10997 Berlin. More info: https://www.disruptionlab.org/data-cities

To follow the Disruption Network Lab, sign up for its newsletter to stay informed about its conferences, ongoing research and projects.

The Disruption Network Lab is also on Twitter and Facebook.

AI TRAPS: Automating Discrimination

On June 16th Tatiana Bazzichelli and Lieke Ploeger presented a new Disruption Network Lab conference entitled “AI TRAPS,” scrutinising Artificial Intelligence and automated discrimination. The conference touched on several topics, from biometric surveillance to diversity in data, taking a closer look at how AI and algorithms reinforce the prejudices and biases of their human creators and societies, in order to find solutions and countermeasures.

A focus on facial recognition technologies opened the first panel, “THE TRACKED & THE INVISIBLE: From Biometric Surveillance to Diversity in Data Science,” discussing how massive sets of images have been used by academic, commercial, defence and intelligence agencies around the world for research and development. The artist and researcher Adam Harvey addressed this technology as the focal point of an emerging authoritarian logic, based on probabilistic determinations and on the assumption that identities are static and reality is made of absolute norms. Harvey considered two recent reports about the UK and China showing how unreliable and dangerous this technology still is. According to data released under the UK’s Freedom of Information law, 98% of “matches” made by London’s Metropolitan Police using facial recognition were mistakes. Meanwhile, over 200 million cameras are active in China and, although only 15% are thought to be technically capable of effective face recognition, Chinese authorities are deploying a new system of this kind to racially profile, track and control the Uighur Muslim minority.
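A short base-rate calculation helps explain how a deployed system can end up with 98% mistaken matches. The numbers below are hypothetical illustrations, not the Met’s actual parameters: even a scanner that wrongly flags only 1 in 1,000 innocent passers-by produces mostly false matches when the people it is looking for are rare in the crowd.

```python
# Hypothetical illustration of the base-rate problem in crowd scanning.
# All four parameters are assumptions chosen for the sketch.
crowd_size = 100_000          # people scanned
watchlist_present = 10        # people in the crowd actually on the watchlist
sensitivity = 0.9             # chance a wanted person triggers a match
false_positive_rate = 0.001   # chance an innocent passer-by triggers a match

true_matches = watchlist_present * sensitivity                       # 9
false_matches = (crowd_size - watchlist_present) * false_positive_rate  # ~100

share_wrong = false_matches / (true_matches + false_matches)
print(f"{share_wrong:.0%} of matches are mistakes")
```

With these assumed numbers, roughly nine out of ten alerts point at the wrong person, despite the scanner being "99.9% accurate" on innocent faces.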

Big companies like Google and Facebook hold collections of billions of images, most of which are available through search engines (63%), on Flickr (25%) and on IMDb (11%). Biometric companies around the world are training facial recognition algorithms on pictures of ordinary people, collected in unexpected places like dating apps and social media, to be used for private profit and governmental mass surveillance. According to Harvey, these datasets end up mostly in China (37%), the US (34%), the UK (21%) and Australia (4%).

Metis Senior Data Scientist Sophie Searcy, a technical expert who has also researched extensively on diversity in tech, contributed to the discussion of this crucial issue underlying the design and implementation of AI, reinforcing the picture of a technology that tends to be defective, unable to contextualise and consider the complexity of the reality it interacts with. This generates many false predictions and mistakes. To maximise results and reduce training time, tech companies and research institutions developing AI algorithms use Stochastic Gradient Descent (SGD). The technique estimates each update from a few samples selected randomly from a dataset, instead of analysing the whole of it at every iteration, saving a considerable amount of time. As Searcy explained in conversation with the panel moderator, Adriana Groh, the technique needs huge amounts of data, and tech companies are therefore becoming increasingly hungry for it.
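The sampling idea Searcy described can be sketched in a few lines. In this toy example (a minimal sketch; the line-fitting task, learning rate and batch size are illustrative assumptions, not from the talk), each gradient step is estimated from a random mini-batch of 32 points rather than from all 10,000:

```python
import random

random.seed(0)

# Synthetic dataset: 10,000 noisy points on the line y = 3x + 2
data = []
for _ in range(10_000):
    x = random.uniform(-1, 1)
    data.append((x, 3 * x + 2 + random.gauss(0, 0.1)))

w, b = 0.0, 0.0    # parameters to learn
lr = 0.1           # learning rate
batch_size = 32    # SGD looks at only 32 of the 10,000 points per update

for step in range(3_000):
    batch = random.sample(data, batch_size)   # random mini-batch, not the full set
    grad_w = grad_b = 0.0
    for x, y in batch:
        err = (w * x + b) - y                 # prediction error on this point
        grad_w += 2 * err * x / batch_size    # d(mean squared error)/dw
        grad_b += 2 * err / batch_size        # d(mean squared error)/db
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 1), round(b, 1))  # should land close to the true 3 and 2
```

Each update touches 32 points instead of 10,000, which is where the time saving Searcy mentioned comes from; the price is that each step uses a noisy estimate of the true gradient, so the method needs many iterations and, in real systems, very large datasets.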

Adam Harvey, Sophie Searcy and Adriana Groh during the panel discussion “THE TRACKED & THE INVISIBLE: From Biometric Surveillance to Diversity in Data Science”

To take a closer look at the relationship between governments and AI tech, the researcher and writer Crofton Black presented the study he conducted with Cansu Safak at The Bureau of Investigative Journalism on the UK government’s use of big data. They used publicly available data to build a picture of companies, services and projects in the area of AI and machine learning, mapping the IT systems the British government has been buying. To do so they interviewed experts and academics, analysed official transparency data and scraped government websites. Transparency and accountability over the way public money is spent are a requirement for public administrations, and relying on this principle they filed dozens of Freedom of Information Act requests with public authorities to obtain audit trails. They thus mapped an ecosystem of connections between the UK public sector and corporate entities: more than 1,800 IT companies, from big players like BAE Systems and IBM to a constellation of small start-ups.

As Black explained in conversation with the keynote moderator Daniel Eriksson, Transparency International’s Head of Technology, the investigation faced systemic problems with disclosure from authorities, which do not keep transparent and accessible records. Indeed, just 25% of UK government departments provided some form of information. Details of the contracts are therefore still unknown, but it is at least possible to list the services that companies deploying AI and machine learning can offer governments: connecting data and identifying links between people, objects and locations; setting up automated alerts in the context of border and immigration control, flagging changes in data and events of interest; working on passport application programmes, implementing risk-based approaches to application assessments; and providing identity verification services using smartphones, gathering real-time biometric authentication. These are just a few examples.

Crofton Black and Daniel Eriksson during the panel discussion “HOW IS GOVERNMENT USING BIG DATA?”

Maya Indira Ganesh opened the panel “AI FOR THE PEOPLE: AI Bias, Ethics & the Common Good” by questioning how tech and research have historically almost always been developed and conducted on prejudiced parameters, falsifying results and distorting reality. For instance, data about women’s heart attacks was not taken into consideration for decades, until doctors and scientists determined that ECG machines calibrated on data collected since the early ’60s could neither predict heart attacks in women nor give reliable data for therapeutic purposes, because they had been trained only on male patients. Only from 2007 were ECG machines recalibrated with parameters based on data collected from female patients. It is impossible to calculate the impact this gender inequality has had on the development of modern cardiovascular medicine and on the lives of millions of women.

As the issue of algorithmic bias in tech, and specifically in AI, grows, all the big tech firms and research institutions are writing ethics charters, establishing ethics boards and sponsoring research on these topics. Detractors often call this ethics-washing, and Ganesh sees it as a trick that presents ethics and morality as something definable in universal terms or on a universal scale: though ethics cannot be computed by machines, corporations need us to believe it is measurable. The researcher suggested that in this way the abstraction and complexity of the machine become easy to process, as ethics becomes the interface used to obfuscate what is going on inside the black box and to represent its abstractions. “But these abstractions are us and our way of building relations,” she objected.

Ganesh consequently asked by what principle it should be acceptable to train a facial recognition system on videos of transgender people, as happened in the alarming “Robust transgender face recognition” research, based on data from people undergoing hormone replacement therapy: YouTube videos, diaries and time-lapse documentation of the transition process. The HRT Transgender Dataset used to train AI to recognise transgender people worsens the harassment and targeting that trans people already experience daily, harming them as a group. It was nevertheless partly financed by the FBI and the US Army, confirming that law enforcement and national security agencies appear to be very interested in these kinds of datasets and look for private companies and researchers able to provide them.

In the same panel, professor of Data Science and Public Policy Slava Jankin reflected on how machine learning can be used for the common good in the public sector. As was noted during the discussion moderated by Nicole Shephard, researcher on gender, technology and the politics of data, the “common good” is not easy to define and, like ethics, is not universally given. It could be identified with those goods that are relevant to guaranteeing the respect and practice of human rights. The project Jankin presented was developed at the Essex Centre for Data Analytics in a joint effort of developers, researchers, universities and local authorities. Together, they tried to build an AI able to reliably predict where children lacking school readiness are most likely to be found geographically, taking social, economic and environmental conditions into account, in order to support those children in their transition and in gaining competencies.

Maya Indira Ganesh during her lecture, part of “AI FOR THE PEOPLE: AI Bias, Ethics & the Common Good”

The first keynote of the conference came from the researcher and activist Charlotte Webb, who presented her project Feminist Internet in the talk “WHAT IS A FEMINIST AI?”

“There is not just one possible internet and there is not just one possible feminism, but only possible feminisms and possible internets.” Starting from this assumption, Webb talked about Feminist Human Computer Interaction, a discipline born to improve understanding of how gender identities and relations shape the design and use of interactive technologies. Her Feminist Internet is a non-profit organisation founded to make the internet a more equal space for women and other marginalised groups. Its approach combines art, design, critical thinking, creative technology development and feminism, seeking to build more responsible and bias-free AI able to empower people by addressing the causes of marginalisation and discrimination. In her words, a feminist AI is not an algorithm, and is not a system built to evangelise for a certain political or ideological cause. It is a tool that aims at recognising differences without minimising them for the sake of universality, meeting human needs with an awareness of the entire ecosystem in which it sits.

Tech adapts plastically to pre-existing discriminations and gender stereotypes. A recent UN report defines the ‘female’ obsequiousness and servility expressed by digital assistants like Alexa and the Google Assistant as examples of gender biases coded into tech products, since the assistants are often projected as young women, programmed to be submissive and to accept abuse. As Feldman (2016) put it, by encouraging consumers to understand the objects that serve them as women, technologists abet the prejudice by which women are considered objects. With her projects, Webb pushes to create alternatives that educate people to shift this systemic problem rather than complying with market demands, starting from the fact that there is a diversity crisis in the AI sector and in Silicon Valley. Between 2.5 and 4% of Google, Facebook and Microsoft employees are black, while there are no public data on transgender workers within these companies. Moreover, as Webb pointed out, just 22% of the people building AI right now are female, only 18% of authors at major AI conferences are women, and over 80% of AI professors are men. In companies with a decisive impact on society, women comprise only 15% of AI research staff at Facebook and 10% at Google.

Women, people of colour, minorities, LGBTQ people and other marginalised groups have substantially no say in the design and implementation of AI and algorithms; they are excluded from the processes of coding and programming. The work of engineers and designers is not inherently neutral, and the automated systems they build reflect their perspectives, preferences, priorities and, eventually, their biases.

Charlotte Webb during her keynote “WHAT IS A FEMINIST AI?”

Washington tech policy advisor Mutale Nkonde focused on this issue in her keynote “RACIAL DISCRIMINATION IN THE AGE OF AI.” She opened by reporting that Google's facial intelligence team is composed of 893 people, only one of whom is a black woman, an intern. The questions, answers and predictions produced by such technological work will always reflect a political and socioeconomic point of view, consciously or unconsciously. Many tech workers confronted with this wide-ranging problem seem to downplay it, showing colour-blind tendencies about the impact their technologies have on minorities and specifically on black people. Historically, credit scores are correlated with racially segregated neighbourhoods, and risk analyses and predictive policing data are corrupted by racist prejudice, leading to biased data collection that reinforces privilege. Without a conscious effort to address racism in technology, new technologies will replicate old divisions and conflicts. By deploying technologies like facial recognition, we simply replicate entrenched behaviours based on racial lines and gender stereotypes, now mediated by algorithms. Nkonde warns that civil liberties need an update for the era of AI, advancing racial literacy in tech.

In conversation with the moderator, writer Rhianna Ilube, Nkonde recalled that in Brownsville, a poor, predominantly black New York neighbourhood with historically high rates of crime and violence, a private landlord of social housing wanted to replace keys with facial recognition software, so that people either accepted surveillance or lost their homes. The case echoes wider concerns about the lack of awareness of racism. Nkonde thinks that white people must be able to cope with the discomfort of talking about race, with the countervailing pressures, their lack of cultural preparation, or simply the risk of getting it wrong. Acting ethically isn't easy if you do not work on it, and many big tech companies just like to crow about their diversity and inclusion efforts, disclosing diversity goals and offering courses to reduce bias. However, there is a high level of racial discrimination in the tech sector and specifically in Silicon Valley: at best colour-blindness, said Nkonde, since many believe that racial classification does not limit a person's opportunities within society, ignoring the economic and social obstacles that prevent full individual development and participation, limit freedom and equality, and exclude marginalised and disadvantaged groups from political, economic and social organisation. Nkonde concluded her keynote by stressing that we need to empower minorities, providing tools that allow them to overcome socio-economic obstacles autonomously and to participate fully in society. It is about sharing power, taking into consideration people's unconscious biases, starting for example with those designing the technology.

Mutale Nkonde during her keynote “RACIAL DISCRIMINATION IN THE AGE OF AI”

The closing panel “ON THE POLITICS OF AI: Fighting Injustice & Automatic Supremacism” discussed the effects of a tool shown to be not neutral but rather the product of the prevailing socio-economic model.

Dia Kayyali, leader of the Tech and Advocacy program at WITNESS, described how AI is facilitating white supremacy, nationalism, racism and transphobia, recalling the dramatic case of the Rohingya persecution in Myanmar and China's oppressive social scoring and surveillance systems. Pointing out further critical aspects, the researcher reported the case of YouTube's anti-extremism algorithm, which removed thousands of videos documenting atrocities in Syria in an effort to purge hate speech and propaganda from the platform. The algorithm was trained to automatically flag and eliminate content that potentially breached the guidelines, and ended up deleting documentation relevant to prosecuting war crimes. Once again, the inability to contextualise leads to severe risks in the way machines operate and make decisions. Likewise, applying general parameters without considering specificities and the complex concept of identity, Facebook's 2015 real-name policies arbitrarily put at risk drag queens, trans people and other users who were not using their legal names for safety and privacy reasons, including protection from domestic violence and stalking.

Os Keyes, a researcher on gender, tech and (counter)power, argued that AI is not the problem but the symptom: the problem is the structures creating AI. We live in an environment where a few extremely wealthy people and companies rule everything, and we have bias in AI and tech because their development is driven by exactly those same individuals. To fix AI we have to change the requirements and expectations around it; we can fight for AI based on explainability and transparency, but if we strive to fix AI without looking at the wider picture, in ten years the same debate will arise over another technology. Keyes considered that AI technology has been discriminatory, racialised and gendered from its very beginning, because society is capitalist, racist, homophobic, transphobic and misogynistic. The question to pose is how we start building spaces that are prefigurative and constructed on the values that we want a wider society to embrace.

As Tatiana Bazzichelli, founder and curator of the Disruption Network Lab, pointed out while moderating this panel, the problem of bias in algorithms is related to several major “bias traps” that algorithm-based prediction systems fail to escape. The fact that AI is political, not just because of the question of what is to be done with it but because of the political tendencies of the technology itself, is the real aspect to discuss.

In his analysis of the political effects of AI, Dan McQuillan, Lecturer in Creative and Social Computing at the University of London, underlined that while the reform of AI is endlessly discussed, there seems to be no serious attempt to question whether we should be using it at all. We need to think collectively about ways out, learning from and with each other rather than relying on machine learning. He suggests countering the thoughtlessness of AI with practices of solidarity, self-management and collective care, because by bringing the perspective of marginalised groups to the core of AI practice it is possible to build a new society within the old, based on social autonomy.

What McQuillan calls AI realism appears close to the far-right perspective, as it trivialises complexity and naturalises inequalities. Learning through AI indeed entails reductive simplifications, and reducing social problems to matters of exclusion is the politics of the populist and fascist right. McQuillan suggests taking guidance from the feminist and decolonial technology studies that have cast doubt on our ideas about objectivity and neutrality. An antifascist AI, he explains, would involve some kind of people's councils, putting the perspective of marginalised groups at the core of AI practice and transforming machine learning into a form of critical pedagogy.

Dia Kayyali, Os Keyes, Dan McQuillan and Tatiana Bazzichelli during the panel “ON THE POLITICS OF AI: Fighting Injustice & Automatic Supremacism”

We see increasing investment in AI, machine learning and robotics. Automated decision-making informed by algorithms is already a predominant reality, and its range of applications has broadened to almost all aspects of life. Current ethical debates about the consequences of automation focus on the rights of individuals and marginalised groups. However, algorithmic processes generate a collective impact too, one that can only partially be addressed at the level of individual rights, since it is the result of a collective cultural legacy. A society steeped in racial and sexual discrimination will replicate that discrimination in its technology.

Moreover, when it comes to surveillance technology and facial recognition software, existing ethical and legal criteria appear ineffective, and the lack of standards around their use and sharing only benefits their intrusive and discriminatory nature.

Whilst building alternatives we need to consider inclusion and diversity: if more brown and black people were involved in building these systems, there would be less bias. But this is not enough. Automated systems mostly try to identify and predict risk, and risk is defined according to cultural parameters that reflect the historical, social and political milieu, producing answers that fit a certain point of view. What we are and where we are as a collective, what we have achieved and what we still lack culturally, is what gets put into software to make those same decisions in the future. In such a context, a diverse team within a discriminatory, conflictual society might find ways to chase the problem of bias away in one place, only for it to resurface somewhere else.

The truth is that automated discrimination, racism and sexism are built into tech infrastructures. A new generation of start-ups is fulfilling authoritarian needs, commercialising AI technologies and automating biases based on skin colour, ethnicity, sexual orientation and identity. They develop censored search engines and platforms for authoritarian governments and dictators, and refine high-tech military weapons by training them with facial recognition on millions of people without their knowledge. Governments and corporations are developing technology in ways that threaten civil liberties and human rights. It is not hard to imagine the impact of tools for robotic gender recognition in countries where non-white, non-male and non-binary individuals are discriminated against. Bathrooms and changing rooms that open only upon AI gender detection, or cars whose engines start only if a man is driving, are to be expected. Those who do not conform to traditional gender structures will end up being systematically blocked and discriminated against.

Open source, transparency and diversity alone will not defeat colour-blind attitudes, reactionary backlashes, monopolies, imposed conformity and cultural oppression by design. As was discussed at the conference, using algorithms to label people by sexual identity or ethnicity has become easy and common. If you build a technology able to catalogue people by ethnicity or sexual identity, someone will exploit it to repress those ethnicities and identities, as China shows.
In this sense, no better facial recognition is possible, no mass-surveillance tech is safe, and attempts at building good tech will continue to fail. To tackle bias, discrimination and harm in AI we have to integrate research on and development of technology with the humanities and social sciences, consciously choosing to create a society in which everybody can participate in the organisation of our common future.


Curated by Tatiana Bazzichelli and developed in cooperation with Transparency International, this Disruption Network Lab conference was the second in the 2019 series The Art of Exposing Injustice.
More information about the conference, its speakers and themes can be found here: https://www.disruptionlab.org/ai-traps
The videos of the conference are on YouTube, and the Disruption Network Lab is also on Twitter and Facebook.

To follow the Disruption Network Lab, sign up for its newsletter and stay informed about its conferences, ongoing research and projects. The next Disruption Network Lab event, “Citizens of Evidence”, is planned for September 20-21 at Kunstquartier Bethanien in Berlin. Make sure you don't miss it!

Photo credits: Maria Silvano for the Disruption Network Lab