14 November 2018, 9:00am
- Despite the fact that over 65 million migrants and refugees should have their human rights protected, reports from the ground show that basic rights have been withdrawn in certain parts of the world, including in Europe, and that we are witnessing the rise of negative narratives that dehumanise this community.
- Driven by the push for innovation, refugee camps are becoming grounds for testing and trialling new technologies (biometrics, blockchain, AI) that collect personal data. The lack of transparency and accountability, and the absence of clear and concerted policies to protect data and privacy, may be putting very vulnerable populations at risk.
- In order to find solutions and to ensure that the human rights of refugees and displaced people are protected online and offline, the voices of this community need to be brought to the table, ensuring that refugees and displaced people are involved in the design process and set the agenda for the discussion, together with the advocacy groups that represent them. Governments, the private sector, technical communities and civil society need to work collaboratively to ensure a human rights-based approach to the development and use of emerging technologies.
IRPC Workshop at IGF 2018
Organizer(s): Minda Moreira and Marianne Franklin
Chair/Moderator: Marianne Franklin
Rapporteur: Minda Moreira
List of speakers and their institutional affiliations:
- Astri Kimball (Google), private sector
- [absent] Andrew Toft (Department for International Development, UK), government
- Eimear Farrell (Amnesty International), civil society
- Jean Guo (Konexio), civil society
- Valentina Pellizzer (APC Women), civil society
Theme: Human Rights, Gender & Youth
Subtheme: Refugees
Session Report
Introduction
The session was moderated by Marianne Franklin, Internet Rights and Principles Coalition (IRPC) co-chair, who introduced the session by referring to the UNHCR report stating that there are over 65 million refugees and displaced people. She noted the importance of a mobile phone and connectivity in improving refugees’ well-being.
Marianne Franklin pointed out how the “Refugee Crisis” came to highlight connectivity and accountability issues, and how refugee camps have come to rely on all sorts of tailor-made digital tools that collect data in order to respond to immediate needs, which often turn into lifelong needs as children are born in these temporary dwellings.
Taking into account the many reports from the field showing how refugees and displaced persons are being stripped of their basic rights in parts of the EU, the UK and elsewhere in the world (for example, the possibility of keeping a personal phone to access information or contact family and friends), the session moderator asked whether human rights online, as recognised by the UN, work or do not work for refugees and displaced people.
Finally, she introduced into the discussion the two sides of the same coin of emerging technologies: digital tools that can provide empowerment, enrichment, hope and contact with loved ones are also being used to track and label people who do not fall easily under the citizenship rubric.
Speakers’ input
Valentina Pellizzer, APC Women, drew on her experience of working with refugees in the Balkans to highlight the dehumanisation of refugees. She pointed out that in order to tackle this problem, the issue of whiteness, which is at the core of the existence of refugees, needs to be addressed. Only by looking at the past and acknowledging the responsibility of colonialism, imperialism, capitalism and the greed of white people, she added, can we find ways to counter these negative narratives.
Valentina also referred to her work at Tacit Futures to point out the existence of an infrastructure of data surveillance, paid for with public money, that tracks mass migration and refugees. She stressed the importance of reflecting on the polarity between the fragile infrastructure of care, based on civil society organisations taking steps to connect people on the move with their friends and family, and the public-private partnerships in refugee camps. She urged that a third party, composed of civil society and the refugees themselves, monitor these partnerships.
Jean Guo, Konexio, explained the work with refugees and newcomers developed at her organisation to ensure that the most vulnerable, including refugees, have the opportunity to access digital skills and rights.
To highlight the importance of connectivity, Jean referred to internal and external sources showing that one third of refugees’ disposable income is spent on connectivity and that the mobile phone is the most important device a refugee would take in extreme circumstances, because it is the connection to the world, resources and information. At Konexio, she added, three quarters of the students have smartphones.
Jean stressed that it is important to distinguish between different levels of usage and to understand how technologies and smart devices are used, as older and younger generations may use these devices in different ways.
On the protection of digital data, she highlighted the importance of creating solutions that take into account the possible dangers and risks associated with connectivity. She added that developers need to be made aware, through education and information, that the devices and apps they are developing can provide solutions but can also collect information that can put lives in danger.
She concluded her contribution with the three Digital Rights that Konexio promotes:
- the right to information,
- the right to education and
- the right to work and to open opportunities, so that those granted asylum have the necessary skills for full integration, overcoming professional, economic and social barriers.
Astri Kimball, Google, highlighted the work her organisation has done to provide solutions and help support 800,000 refugees and displaced people, in terms of access (Maps, People Finder, Google Translate), educational resources through online education, and positive narratives about immigration (YouTube Creators for Change, lifting up voices to counter the negative ones).
Referring to Artificial Intelligence, Astri mentioned that technology can provide solutions, adding that while the protection of rights and privacy is a global challenge, the power of data to find patterns and provide solutions is a great opportunity for society. The right balance therefore needs to be found so as not to put privacy and rights at risk.
Eimear Farrell, Amnesty Tech, Amnesty International, started her intervention by pointing out that while examples of tech for good, used to improve the lives of refugees and migrants, are all around, it is also very important to ensure that these technologies cause no harm.
She highlighted the fact that refugee camps have been the ground for testing and trialling technologies on very vulnerable populations, and that this can bring serious and sometimes unpredictable consequences.
Talking about the work that Amnesty International has been doing to address these issues, she mentioned the UN’s Global Compact on Migration and Global Compact on Refugees, which included language around data and biometrics, and explained how Amnesty International, together with other organisations and NGOs, advocated for improved language on privacy and data protection in these areas, especially in the case of biometrics.
Amnesty International advocates a human rights-based approach to the use of emerging technologies by looking at questions such as participation of the people affected, equality, non-discrimination, transparency, the right to remedy and accountability. The organisation is also starting to convene a coalition of groups interested in acting on these issues, to look at gaps in knowledge, provide practical information and documentation, and propose alternative solutions.
Eimear addressed the political securitisation of immigration and the drive to use technology for innovation in humanitarian contexts. She stressed that refugee camps are becoming testing grounds for new technologies and that this needs to be highlighted, in everyone’s interest. Digital identity, she added, can be empowering, but it needs to be under the control of the people who own that identity.
As for the partnerships between governments and private sector organisations in refugee camps, she stressed the need to tackle the lack of transparency around the contracts and data-sharing agreements that are in place. She mentioned the crucial role of private companies in ensuring that human rights are respected, and noted that any surveillance of very vulnerable populations, often fleeing on the basis of their identity, is an issue that governments need to address, as identity is critical to their protection.
After the initial contributions, Marianne Franklin opened the session to all participants in the room and online.
Questions from the floor
Questions from the floor included:
- Are companies (such as Google) applying human rights standards at the design level of new digital tools? (Emi, Danish Institute for Human Rights)
- What measures are going to be implemented in collaboration with governments to ensure that the digital rights of refugees are protected? (Faith, Hong Kong Youth IGF)
Other comments from the audience addressed the negative aspects of some of the tools created to develop positive narratives (such as Google’s Creators for Change), as these are sometimes used to lure people to countries where they are then trafficked. They also referred to the limitations of other tools (e.g. Google Translate with Kurdish Sorani) in addressing the needs of refugees and displaced people.
Final Contributions
In the final round of contributions from the panel, Marianne Franklin asked for concrete solutions and recommendations:
Astri Kimball mentioned how Google applies human rights standards to all the products they launch and pointed to their seven principles around AI. She suggested the introduction of group privacy as a best practice.
Jean Guo focused on inclusivity and highlighted the importance of developing solutions that genuinely address the needs of the community they aim to serve.
She suggested that tech communities need to include refugees at the table and develop new tools from a user-experience perspective, and she stressed the need to test new tools before they are launched at full capacity to ensure that they are actually useful and safe.
Addressing government collaboration, she urged an end to stereotyping and asked for more positive perspectives and more positive examples of the economic productivity of newcomers.
Eimear Farrell asked for reflective and responsive tech, bringing in the voices of the affected stakeholders and involving them in the design process.
She stressed the importance of looking at how people use tech, considering low-tech solutions, and working with donors and funders to make sure that they are funding the right type of solutions.
She also called for disrupting technology so that it bends towards justice, using technologies in human rights work or in solutions that help further the rights of refugees and migrants, and working collaboratively.
Her final suggestions included the need to ensure that refugees and migrants themselves, and the advocacy groups that represent them, set the agenda in discussions around global standards on these issues, and that data protection and innovation be mutually reinforcing.
Valentina Pellizzer called for the responsible use of data, reflection on how data is collected, and work towards a standard policy. She suggested that auditing has to be done by refugees themselves and by civil society. On positive narratives, she pointed out that we have to look at how news is indexed and at what is pushing the negative narratives up.
Final Open Question
The session finished with open questions from the floor:
Do the companies that develop these tools provide human rights training?
Are we bringing those team members to the field to see what they are trying to solve and what the outcome of what they are developing is?
Do they get an emotional connection to what they are developing? Are we integrating any of the refugees, training them to become coders and bringing them into the solutions? Are we running hackathons with them? Or is it always the same people from the tech companies trying, yet again, to figure out solutions from the outside?