The human rights implications checklist

1. General reflections on the technical process/es being explored

Note: This could be a single technical process or a set of processes together if they are pursuing the same aim

  1. Aim and Nature of the technical process: Describe here what the process/algorithm will do (once developed) and why it is being developed.
  2. Aim: What outcome are you hoping to achieve, and what further steps will be taken to achieve it?
  3. Obstacles: What are the main obstacles or barriers to achieving this aim?

2. What data is being collected

  1. What data is involved?  Is some of the data personal data?
  2. Who will carry out the process: Who controls the actions of this process? Who is responsible for carrying it out? Here, think ahead: if MIRROR is taken up in practice by LEAs, who will be responsible for this process?
  3. People affected: Who will/may be affected? How many people would certainly be affected?  How many people could potentially be affected?  Say a bit about the people affected (e.g. is a community identified, are individuals identifiable).
  4. Is data really needed?  Can the aim be achieved in another manner?  With less data?

3. Design choices

Design choices: in designing a process or action we make design choices. Have we followed a:

  • ‘user-centred design’ choice? And is the user here an LEA, an NGO, or a researcher?
  • ‘research subject centred design’ choice? And who is the research subject here, e.g. a migrant or an NGO worker?
  • ‘privacy by design’ choice? What are the immediate privacy concerns that would need to be addressed if the design is, say, either user-centred or research subject centred?
  • ‘privacy by default’ choice? Is any personal data being collected absolutely necessary for the objective of the project, or is it merely ‘potentially useful’ or collected ‘just in case’ it is needed later on? Can we do without it, or is it absolutely necessary? (A minimal illustration of this check follows this list.)
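
To make the ‘privacy by default’ question concrete for technologists, the following minimal sketch may help. It assumes a hypothetical social media record and a hypothetical whitelist of fields judged strictly necessary for the analysis objective; the field names do not reflect MIRROR’s actual data model. The point is simply that anything not needed for the stated aim, including personal data kept ‘just in case’, is dropped by default.

    # A minimal sketch of a 'privacy by default' check: only the fields judged
    # strictly necessary for the stated objective are retained; everything else,
    # including personal data collected 'just in case', is dropped by default.
    # The record structure and field names are illustrative assumptions only.

    from typing import Any, Dict

    # Hypothetical whitelist of fields strictly necessary for the analysis aim.
    NECESSARY_FIELDS = {"post_id", "language", "publication_date", "topic_keywords"}

    def minimise(record: Dict[str, Any]) -> Dict[str, Any]:
        """Return a copy of the record containing only the necessary fields."""
        return {key: value for key, value in record.items() if key in NECESSARY_FIELDS}

    raw_post = {
        "post_id": "12345",
        "language": "en",
        "publication_date": "2022-05-23",
        "topic_keywords": ["migration", "border"],
        "author_name": "Jane Doe",      # personal data, not needed for the aim
        "author_location": "Valletta",  # personal data, merely 'potentially useful'
    }

    print(minimise(raw_post))
    # {'post_id': '12345', 'language': 'en', 'publication_date': '2022-05-23',
    #  'topic_keywords': ['migration', 'border']}

The same idea can equally be expressed as an ingestion filter, a database view or an anonymisation step; what matters for privacy by default is that the default is exclusion rather than inclusion of personal data.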

4. Analysing impacts on human rights

An adverse human rights impact occurs when an action or omission removes or reduces the ability of an individual to enjoy her or his human rights.

Three background considerations:

  • human dignity
  • autonomy
  • responsibility.

Human dignity, as article 1 of the European Union Charter of Fundamental Rights states, ‘is inviolable. It must be respected and protected’. Respect for the dignity of the human person is not only a fundamental right in itself but constitutes the real basis of fundamental rights. Indeed, the 1948 Universal Declaration of Human Rights enshrined human dignity in its preamble: ‘Whereas recognition of the inherent dignity and of the equal and inalienable rights of all members of the human family is the foundation of freedom, justice and peace in the world.’ In essence, the concept of human dignity is the belief that all people hold a special value tied solely to their humanity; it has nothing to do with class, race, gender, religion, abilities, or any other factor other than being human. Dignity can be thought of as the ability to pursue one’s rights, claims or interests in daily life so that one can attain full realisation of one’s talents, ambitions or abilities, as one would like.

Central to the discussion of all human rights is respect for human dignity. In the development of any technological tool, we need to ensure that human dignity is respected. What this means in practice is that, during the development of the technology, we can never lose sight of the effects the technology or the process may have on human beings, and that these effects should not be in any way disrespectful or harmful to the person. The discussion of other fundamental rights below supports reflection on this fundamental approach: that no technological development should in any way harm the dignity of a person.

Autonomy is a second important consideration in all our discussions on human rights. Autonomy refers to the ability of a person to act and to make independent choices that are significant for him or her. The enjoyment of fundamental rights presupposes that one is able to make choices and to act on those choices in an autonomous manner. Any action that reduces this autonomy, e.g. by not allowing a person to make a choice, needs to be further explored. There may be times when a reduction in autonomy is justified by law, but this should be considered an exception rather than the rule. This means that, in the development and design choices of the MIRROR project, we need to ensure that the autonomy of any person affected by the technology or the technological processes is, as far as possible, not impacted. For example, we may need to consider how to support the autonomy of persons whose social media posts we use as the basis of our further analysis. At times, respecting the autonomy of people also requires additional steps by the project or process to ensure this fundamental consideration is met.

At this point, it is important to realise that going through a human rights implications checklist requires an open mind and the realisation that further reflection on human rights and their implications may require fundamental changes to the way we had planned to design or further develop our technology and technological processes. This checklist is not meant to be just a tick-box exercise indicating that we have considered human rights, but one that involves deep reflection on, and possibly extensive consequences for, the design choices that are made.

The third consideration is responsibility: do we have a responsibility to respect fundamental rights in the design of our technologies? We have both an ethical and a legal responsibility to respect fundamental rights as enshrined in the Charter of Fundamental Rights of the European Union and the European Convention on Human Rights. While these documents place direct responsibility on states, as the ones ultimately responsible for ensuring respect for fundamental rights, states have articulated many of these responsibilities in subsidiary laws and other legal principles addressed to different actors within the state. One example can be seen in the right to data protection: the right is addressed to states and is found in the Charter of Fundamental Rights of the European Union, while its elaboration, and how it needs to be protected in an everyday context, is found in the EU General Data Protection Regulation, which in turn requires developers of technology to reflect data protection rights in the design and development of technical processes when processing the personal data of individuals.

Alongside European and national laws giving effect to fundamental rights, we are at times also guided by judgements of the European Court of Human Rights, the Court of Justice of the European Union and national (constitutional) courts. The checklist below at times refers to these judgements to clarify certain concepts or obligations. This document, however, does not comprehensively reference all sources of responsibility. While reflecting on the questions in the checklist below, we ask you to assume that technologists do have a responsibility to consider the human rights ramifications and impacts of what they are developing. Technologists additionally share responsibility with others, for example the legal team of WP3, for considering how these ramifications and impacts can be mitigated.

Structure of the human rights table

The table consists of four columns. In the first column, we list the relevant human right that we consider needs further reflection during the development of technological processes in the project.

In the second column, we give a brief explanation of the right and what it consists of; we give (as far as possible) examples, sometimes fictitious, at other times related to developments in MIRROR; and lastly, we introduce a set of questions to aid reflection on the ramifications and impacts that our technologies may have on this particular right.

In the third column, we note the degree of severity of impacts on this right. In this column, we have tried to follow the guidance on establishing impact severity provided by the Danish Institute for Human Rights. The purpose of establishing impact severity is not to decide which impacts need to be addressed, but to determine the order in which the identified impacts should be addressed (a minimal sketch of such an ordering follows the list below). Following the UN Guiding Principles:

  • All human rights impacts need to be addressed;
  • Where it is not possible to address all impacts simultaneously, the impacts should be addressed in order of their ‘severity’;
  • Severity is determined by the scope (number of people affected), scale (seriousness of the impact) and irremediability (any limits on the ability to restore the affected individual to a situation at least the same as, or equivalent to, her or his situation before the adverse impact occurred);
  • While it is not necessary for an impact to have more than one of these characteristics to be considered ‘severe’, it is often the case that the greater the scale or the scope of an impact, the less it is ‘remediable’.
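
As an aid to prioritisation only, the following minimal sketch illustrates how identified impacts could be ordered by the three characteristics above. The example impacts, the 1-to-5 ratings and the simple additive score are illustrative assumptions; the Danish Institute’s guidance treats severity qualitatively, and ordering never removes the obligation to address every impact.

    # A minimal sketch of ordering identified impacts by severity, as an aid to
    # prioritisation only; every impact must still be addressed. The example
    # impacts, the 1-to-5 ratings and the simple additive score are illustrative
    # assumptions: the underlying guidance treats severity qualitatively.

    from dataclasses import dataclass

    @dataclass
    class Impact:
        description: str
        scope: int            # number of people affected, 1 (few) to 5 (very many)
        scale: int            # seriousness of the impact, 1 (minor) to 5 (grave)
        irremediability: int  # difficulty of remedy, 1 (easily remediable) to 5 (irreversible)

        @property
        def severity(self) -> int:
            return self.scope + self.scale + self.irremediability

    impacts = [
        Impact("Chilling effect on online expression", scope=4, scale=2, irremediability=2),
        Impact("Wrongful profiling of an identified community", scope=3, scale=4, irremediability=4),
        Impact("Retention of personal data beyond the stated purpose", scope=5, scale=2, irremediability=3),
    ]

    # Address the most severe impacts first.
    for impact in sorted(impacts, key=lambda i: i.severity, reverse=True):
        print(f"{impact.severity:2d}  {impact.description}")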

In the fourth column, we add some examples of possible mitigating strategies that can be applied by the project to reduce the impact on human rights. When suggesting mitigating strategies, it is important to consider that any alternative measure must also be compatible with human rights standards. We take this opportunity to note that human rights impacts cannot be subject to ‘offsetting’ in the same way that, for example, environmental impacts can be. A carbon offset, for instance, is a reduction in emissions of carbon dioxide made in order to compensate for, or to offset, an emission made elsewhere. With human rights impacts, on the other hand, because human rights are indivisible and interrelated, it is not considered appropriate to offset one human rights impact with a ‘positive contribution’ elsewhere. Hence, we cannot, for example, offset the wide collection of social media posts filtered on political opinion by not offering automated facial recognition capabilities in other parts of the MIRROR portal.

[Download] MIRROR Toolkit: Human Rights Table

Resources

Bundesverfassungsgericht (2006) Rasterfahndung nur bei konkreter Gefahr für hochrangige Rechtsgüter zulässig [Dragnet searches permissible only where there is a concrete danger to high-ranking legal interests]. Press release no. 40/2006. Available at: https://www.bundesverfassungsgericht.de/SharedDocs/Pressemitteilungen/DE/2006/bvg06-040.html [accessed on 23 May 2022].

Carozza, P.G. (2013). Human Dignity, in The Oxford Handbook of International Human Rights Law. Oxford University Press.

Commissioner for Human Rights of the Council of Europe (2019). Unboxing Artificial Intelligence: 10 steps to protect Human Rights. Council of Europe. Available at: https://rm.coe.int/unboxing-artificial-intelligence-10-steps-to-protect-human-rights-reco/1680946e64 [accessed on 26 July 2020].

Committee of Ministers (2012) Recommendation CM/Rec (2012)4 of the Committee of Ministers to member States on the protection of human rights with regard to social networking services. Council of Europe.

Committee of Ministers (2014) Recommendation CM/Rec (2014)6 of the Committee of Ministers to member States on a Guide to human rights for Internet users. Council of Europe.

Committee of Ministers (2016) Recommendation CM/Rec (2016)5 of the Committee of Ministers to member States on Internet freedom. Council of Europe.

Council of Europe (2018) ALGORITHMS AND HUMAN RIGHTS: Study on the human rights dimensions of automated data processing techniques and possible regulatory implications. Council of Europe, p.11. Available at: https://rm.coe.int/algorithms-and-human-rights-en-rev/16807956b5 [accessed on 23 May 2022].

Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA, OJ L 119 (Police Directive), Art. 3(4).

Eberle, E.J. (2012). Observations on the Development of Human Dignity and Personality in German Constitutional Law: An Overview. Liverpool Law Review, V33, issue 3, pp. 201-233. Available at: https://link.springer.com/article/10.1007/s10991-012-9120-x [accessed on 23 May 2022].

ECtHR, Goodwin v. the United Kingdom. App. no. 28957/95, § 90.

ECtHR, Malone v. the United Kingdom. App. no. 8691/79, § 81.

ECtHR, Pretty v. the United Kingdom. App. no. 2346/02, § 65, § 41.

ECtHR, S.W. v. the United Kingdom. App. no. 87/18.

ECtHR, V.C. v. Slovakia. App. no. 18968/07, § 105.

ECtHR, Yildirim v. Turkey. App. no. 3111/10.

European Digital Rights (2020). The human rights impact of migration control technologies. Edri.org. Available at: https://edri.org/the-human-rights-impacts-of-migration-control-technologies/ [accessed on 23 May 2022].

Fundamental Rights Agency (2018). Preventing unlawful profiling today and in the future: a guide. Fundamental Rights Agency. Available at: https://fra.europa.eu/sites/default/files/fra_uploads/fra-2018-preventing-unlawful-profiling-guide_en.pdf [accessed on 23 May 2022].

Fundamental Rights Agency (2020). Border controls and fundamental rights at external land borders. Fundamental Rights Agency. Available at: https://fra.europa.eu/en/publication/2020/border-controls-and-fundamental-rights-external-land-borders [accessed on 23 May 2022].

Fundamental Rights Agency, Council of Europe & EDPS (2018). Handbook on European data protection law. Publications Office, Luxembourg, p. 19, 83, 90.

Fundamental Rights Agency. EU Charter of Fundamental Rights, Article 1 – Human Dignity. Fundamental Rights agency.  Available at: https://fra.europa.eu/en/eu-charter/article/1-human-dignity [accessed on 23 May 2022].

High Commissioner for Human Rights (1997). Professional Training Session No.5: Human Rights and Law Enforcement. United Nations. Available at: https://www.ohchr.org/Documents/Publications/training5en.pdf [accessed on 23 May 2022].

Hildebrandt, M. (2019). Privacy and Data Protection, in Law for Computer Scientists. Lawforcomputerscientists.pubpub.org. Available at: https://lawforcomputerscientists.pubpub.org/pub/doreuiyy.

Hodgson, D.W. (2019). Human Rights and Autonomy. In: Social Justice Theory and Practice for Social Work. Springer, Singapore.

Human Rights Careers. What is Human Dignity? Common Definitions. Humanrightscareers.com. Available at: https://www.humanrightscareers.com/issues/definitions-what-is-human-dignity/ [accessed on 23 May 2022].

Kant, M. (2006). Nothing doing? Taking stock of data trawling operations in Germany after 11 September 2001. Statewatch.org. Available at: https://www.statewatch.org/media/documents/news/2006/aug/profil.pdf [accessed on 23 May 2022].

Kett-Straub, G. Data Screening of Muslim Sleepers Unconstitutional. pp. 967-971.

Koops, B., Hoepman, J. & Leenes, R. (2013) Open-source intelligence and privacy by design. Computer Law & Security Review, Volume 29, Issue 6, pp. 676-688. DOI: 10.1016/j.clsr.2013.09.005.

MIRROR Deliverable 3.1.

Office of the United Nations High Commissioner for Human Rights (2012). The Corporate Responsibility to Respect Human Rights: An Interpretive Guide. United Nations.

Open Society Justice Initiative (2009). Ethnic Profiling in the European Union: Pervasive, Ineffective, and Discriminatory. Open Society Institute. Available at: https://www.justiceinitiative.org/uploads/8cef0d30-2833-40fd-b80b-9efb17c6de41/profiling_20090526.pdf [accessed on 23 May 2022].

Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), Art. 25, 22, 14, 13.

Stanley, J. (2016). Eight Problems with Police ‘Threat Scores’. ACLU.org. Available at: https://www.aclu.org/blog/privacy-technology/surveillance-technologies/eight-problems-police-threat-scores [accessed on 23 May 2022].

The Conversation (2018) We should teach human rights law to software engineers. Thenextweb.com. Available at: https://thenextweb.com/syndication/2018/12/25/we-should-teach-human-rights-law-to-software-engineers/ [accessed on 26 July 2020].

The Danish Institute for Human Rights (2016). Human Rights Impact Assessment: Guidance and Toolbox. The Danish Institute for Human Rights, Denmark’s National Human Rights Institution. Available at: https://www.humanrights.dk/sites/humanrights.dk/files/media/dokumenter/business/hria_toolbox/hria_guidance_and_toolbox_final_jan2016.pdf [accessed on 23 May 2022].

United Nations (1968). The Vienna Convention on Road Traffic. Treaties.un.org. Available at: https://treaties.un.org/Pages/ViewDetailsIII.aspx?src=TREATY&mtdsg_no=XI-B-19&chapter=11 [accessed on 26 July 2020].

Vallor, S. & Narayanan, A. An Introduction to Software Engineering Ethics. Available at: https://www.scu.edu/media/ethics-center/technology-ethics/Students.pdf [accessed on 26 July 2020].

Migration-Related Risks Caused by Misconceptions of Opportunities and Requirement

MIRROR has received funding from the European Union’s Horizon 2020 research and innovation action program under grant agreement No 832921.

© All rights reserved