CONTACT
Feedback on this document is welcome:
Tamilla Abdul-Aliyeva, t.abdul-aliyeva@amnesty.nl
Gwen van Eijk, g.vaneijk@amnesty.nl

DISCRIMINATORY RISK PROFILES

Since Amnesty International published a report on ethnic profiling by the police in 2013, the practice has been widely discussed in the political debate. More recently, partly as a result of the childcare benefits scandal, concerns have also been raised about ethnic profiling through the use of (semi-automated) risk profiles by governments and implementing organizations. Amnesty recently published two reports on this. The government, however, does not want to prohibit the use of ethnicity and nationality in risk profiles, and practice shows that governments and implementing organizations are far from clear about what is and is not allowed. Amnesty International's position is that the use of ethnicity/nationality in risk profiles violates the right not to be discriminated against, and it argues for a ban.

This paper is about the use of ethnicity/nationality in risk profiles that governments use in the search for potential norm violators. Consider, for example, the identification of welfare fraudsters or potential shoplifters on the basis of risk profiles, with the aim of subjecting them to (extra) monitoring.

READ MORE
Amnesty International report "We Sense Trouble. Automated Discrimination and Mass Surveillance in Predictive Policing in the Netherlands". Click here to download the report.
Amnesty International report "Xenophobic Machines. Discrimination through Unregulated Use of Algorithms in the Dutch Childcare Benefits Scandal". Click here to download the report.
Amnesty on ethnic profiling. Click here.

CONTENTS
1 Risk profiling and ethnic profiling
2 Interpretation of the prohibition of discrimination
3 The prohibition of discrimination applied to risk profiling
4 Frequently asked questions
Decision tree

Amnesty International, version 2, January 2023

1 RISK PROFILING AND ETHNIC PROFILING

RISK PROFILING
Risk profiles are used by governments to search for or identify potential norm violators for (additional) control. Risk profiles identify potential norm violators on the basis of risk indicators that the creator of the risk profile believes are associated with a higher risk of violating a norm, rule or law. The illustration later in this paper shows an example of a risk model with multiple indicators.

Risk profiling is used proactively. This means that no norm violation has yet been identified: risk profiling is not deployed in response to a concrete suspicion against a specific person.

Sometimes ethnicity or nationality is used as an indicator in risk profiles (see Box 1 for an explanation of these terms). Different values are then assigned to different (groups of) people on the basis of their (perceived) origin. This can be done crudely, for example Dutch citizenship versus non-Dutch citizenship, or in a more fine-grained way, for example "second-generation Anderlanders". Using ethnicity/nationality as an indicator leads to ethnic profiling.

ETHNIC PROFILING
Ethnic profiling is a form of discrimination.
Ethnic profiling is defined by the UN Committee on the Elimination of Racial Discrimination (CERD) as the practice whereby law enforcement relies to some extent on race, color, descent, or national or ethnic origin as the basis for subjecting people to investigative activities or for determining whether a person is involved in criminal activities.

Various investigations in the Netherlands show that people with a migration background are selected for controls more often than "white" Dutch people.* Both the police and other authorities do this. People are not selected because they have done something wrong, but (partly) because of their (perceived) nationality, skin color or ethnic origin. Ethnic profiling is highly problematic because people with a migration background are thereby associated, in a generalizing way, with crime and insecurity in the political and social debate. Ethnic profiling is not only humiliating for the person undergoing it; it also has consequences for Dutch society. People who are ethnically profiled have less trust in the authorities. They feel excluded, stereotyped and alienated - not only from the authorities, but from society as a whole.

* Amnesty International published several reports on this subject: "Proactive Policing Poses Risk to Human Rights" (2013); "We Sense Trouble. Automated Discrimination and Mass Surveillance in Predictive Policing in the Netherlands" (2020); "Xenophobic Machines. Discrimination through Unregulated Use of Algorithms in the Dutch Childcare Benefits Scandal" (2021).
** T. Abdul-Aliyeva & G. van Eijk, "Discriminatory risk profiles: why there should be a ban on the use of origin as a selection criterion in (automated) risk profiling", Dutch Law Journal (2023).

BOX 1: THE LEGAL CONCEPTS OF NATIONALITY AND 'RACE'
Discrimination law distinguishes between 'nationality' and 'race' as grounds of discrimination. This paper uses 'ethnicity', the word more common in the Netherlands, rather than the legal term 'race'. The legal concept of 'race' is not strictly defined; it includes such things as skin color, descent and ethnic or national origin. In a legal sense, nationality does have a strict meaning: it is about citizenship. In the context of risk profiling, however, the use of nationality amounts in practice to the use of 'race'. After all, nationality as such is never relevant for detecting potential norm violations. Data on someone's (alleged) nationality, country of birth, or dual or second nationality are used in risk profiling to assign a higher risk of norm violation to individuals with a particular (alleged) group identity. The use of nationality as a risk indicator therefore amounts to unequal treatment on the basis of 'race'.**

THE USE OF ETHNICITY/NATIONALITY IN RISK PROFILING
The use of ethnicity/nationality in risk profiles can lead to ethnic profiling in various ways, for example when people are selected at border controls based on appearance, or when ethnicity/nationality is an indicator in an algorithmic risk model. As a government, you then treat people of a certain origin less favorably because you think they resemble persons who were associated with norm violations in the past. The selection then takes place (in part) on the basis of generalizations about a person's national or ethnic origin, rather than on that person's actual behavior or on objective evidence against that person.
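To make concrete how an ethnicity/nationality indicator works inside an (automated) risk profile, the sketch below shows a simplified weighted-sum risk model of the kind depicted in the illustration later in this paper. It is purely illustrative: the indicator names, weights and threshold are invented and do not come from any real system.

```python
# Hypothetical sketch: a simplified weighted-sum risk model.
# All indicator names, weights and the threshold are invented for
# illustration; they are not taken from any real government system.

WEIGHTS = {
    "previous_sanction": 2.0,        # behavior-related indicator
    "income_below_threshold": 1.0,   # seemingly neutral indicator
    "anderland_nationality": 1.5,    # ethnicity/nationality used as an indicator
}
SELECTION_THRESHOLD = 2.5  # scores at or above this lead to (extra) control


def risk_score(person: dict) -> float:
    """Sum the weights of all indicators that apply to this person."""
    return sum(w for name, w in WEIGHTS.items() if person.get(name))


# Two people who are identical except for their registered nationality.
person_a = {"previous_sanction": False, "income_below_threshold": True,
            "anderland_nationality": False}
person_b = {"previous_sanction": False, "income_below_threshold": True,
            "anderland_nationality": True}

for label, person in [("A (non-Anderland)", person_a), ("B (Anderland)", person_b)]:
    score = risk_score(person)
    selected = score >= SELECTION_THRESHOLD
    print(f"Person {label}: score {score:.1f}, selected for extra control: {selected}")

# Person A scores 1.0 and is not selected; person B scores 2.5 and is selected.
# The only difference between the two is the nationality indicator, so the
# difference in treatment is based decisively on nationality.
```

However many other indicators such a model contains, any two people who differ only in the ethnicity/nationality indicator will be scored differently, which is why this paper argues that the indicator is always decisive (see also section 3 and Box 2).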
However, assumptions about the behavior of a particular national or ethnic group do not justify unequal treatment, even if those assumptions are based on statistics. Interpretations of historical data and statistics often reflect existing inequality in a society and can reinforce damaging generalizations and stereotyping, in which an image of a group of people takes hold that does not correspond to reality. This not only affects the human dignity of the people concerned, but also contributes to the spread of xenophobia and stands in the way of effectively combating discrimination.

AN EXAMPLE
An often-heard argument is the following: 'Historical data in our registration systems and the experience of officials show that second-generation Anderlanders violate the norm more often than people from other backgrounds. For this reason, we set up the risk profile so that people with Anderland roots get a higher risk score. In this way we can efficiently subject Anderlanders to (extra) controls and detect more norm violations.' This is not an objective and reasonable justification for using ethnicity or nationality in the risk profile. This risk profile leads to direct discrimination.

2 INTERPRETATION OF THE PROHIBITION OF DISCRIMINATION

Everyone has the right not to be discriminated against. The government itself may not discriminate, but it also has an obligation to actively counter discrimination. This means that the government must ensure that the (automated) risk profiles it wants to use do not lead to discrimination.

Discrimination law recognizes several protected personal characteristics, including ethnicity. These are also called grounds of discrimination. In addition, discrimination law distinguishes between direct and indirect discrimination. In direct discrimination, unequal treatment is based directly on a protected personal characteristic.

DIRECT DISCRIMINATION
To determine whether a difference in treatment is based on ethnicity, the following question must be answered: would the person have been assessed or treated differently if they had not had the protected personal characteristic (in this case, the ethnicity in question)? It follows from the case law of the European Court of Human Rights (ECtHR) that such a difference in treatment constitutes prohibited discrimination: the ECtHR has ruled that a difference in treatment that is based solely or decisively on ethnicity cannot be objectively and reasonably justified.* In the context of social protection, the EU legislator has stipulated in the Racial Equality Directive that direct discrimination on the basis of ethnicity is allowed only in a few specific cases, including affirmative action to counter disadvantage.** This EU directive has been transposed in the Netherlands into the General Equal Treatment Act (Awgb).

AN EXAMPLE
In a crime-detection risk profile, national origin is used as an indicator. People with the nationality of Daarland, Anderland or Verweggistan are given a higher risk score for potential involvement in crime; people with other nationalities are given a lower risk score. People with the highest risk scores are subjected to additional scrutiny. This is direct discrimination based on ethnicity.

INDIRECT DISCRIMINATION
Even a risk profile with seemingly neutral criteria, one that does not use protected personal characteristics, can lead to unequal treatment. This is called indirect discrimination. According to the EU Racial Equality Directive, indirect discrimination may be allowed in some cases if there is an 'objective and reasonable' justification. This can only be the case if there is a legitimate aim and the distinctions used to achieve that aim are appropriate, necessary and proportionate. The government must provide good reasons to justify an indirect distinction, and this test is very strict. Without an objective and reasonable justification, there is prohibited discrimination.***

* ECtHR 13 December 2005, nos. 55762/00 and 55974/00 (Timishev v. Russia).
** Council Directive 2000/43/EC (Racial Equality Directive), 2000.
*** Racial Equality Directive (2000/43/EC), Article 2(2)(b); see also ECtHR 29 January 2013, no. 11146/11 (Horváth and Kiss v. Hungary).
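The sketch below illustrates how indirect discrimination can arise from an "ostensibly neutral" indicator. It is a hypothetical simulation: the population, the group sizes and the assumed correlation between Dutch language proficiency and migration background are invented for illustration only.

```python
# Hypothetical sketch: how an ostensibly neutral indicator can lead to
# indirect discrimination. All numbers and the correlation assumed between
# language proficiency and migration background are invented.

import random

random.seed(0)


def make_person(migration_background: bool) -> dict:
    # Assumption for illustration only: low Dutch proficiency is registered
    # more often for people with a migration background (40% vs 5%).
    p_low_proficiency = 0.40 if migration_background else 0.05
    return {
        "migration_background": migration_background,
        "low_dutch_proficiency": random.random() < p_low_proficiency,
    }


population = [make_person(True) for _ in range(1000)] + \
             [make_person(False) for _ in range(1000)]

# The risk profile contains no ethnicity/nationality indicator at all:
# it flags people for extra control purely on the "neutral" criterion.
flagged = [p for p in population if p["low_dutch_proficiency"]]

for group in (True, False):
    group_size = sum(p["migration_background"] == group for p in population)
    group_flagged = sum(p["migration_background"] == group for p in flagged)
    label = "with" if group else "without"
    print(f"Selected for extra control, {label} migration background: "
          f"{group_flagged / group_size:.0%}")

# Expected output (roughly 40% vs 5%): people with a migration background are
# flagged far more often, even though ethnicity is not an indicator. Whether
# such an indirect distinction is permissible depends on the strict objective
# and reasonable justification test described above.
```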
This illustration is a simplified representation of a risk profile in which nationality is a risk indicator. The inclusion of nationality as an indicator leads to a difference in assessment between two individuals who are otherwise similar: the indicator nationality is thus decisive in the distinction. (Illustration: Toscabanana)

3 THE PROHIBITION OF DISCRIMINATION APPLIED TO RISK PROFILING

We explained above that direct discrimination can never be allowed. In the context of risk profiling, this means that ethnicity/nationality must not be an indicator in the risk model. It does not matter how heavily the indicators are weighted or how many other indicators there are.

DECISIVENESS
In the context of risk profiling, the use of ethnicity is always decisive (see also Box 2). The use of ethnicity as a risk factor means that individuals receive a higher risk score because of their ethnicity, which may result in them being more likely to be considered for (extra) control. That (extra) scrutiny does not occur for people who are similar but do not have the ethnicity in question. When origin is a risk indicator, a situation arises in which everyone who has that origin receives a higher risk score: they are thus judged less favorably than others who, apart from their origin, are exactly the same as them (see the illustration above). This is not to say that in practice they will all be selected for (additional) scrutiny. But the fact that ethnicity/nationality is an indicator always increases their chances of being checked. Even without an actual check, this amounts to unfavorable treatment of one group compared to another, and that is discrimination.

INTENTION AND EFFECT
Discrimination also occurs if there was no intent to discriminate, or if the result of the risk profiling turned out not to be discriminatory in practice. Likewise, even if the government did not act on the high risk score, did not act in a discriminatory manner when carrying out the (additional) check, the check did not reveal any norm violation, the additional check went unnoticed by the person concerned, and/or the algorithm did not prove accurate enough to be deployed efficiently, there is still discrimination and ethnic profiling if ethnicity/nationality has been taken into account by the public authority in its monitoring and detection system.

* T. Abdul-Aliyeva & G. van Eijk, "Discriminatory risk profiles: why there should be a ban on the use of origin as a selection criterion in (automated) risk profiling", Dutch Law Journal (2023).

AN EXAMPLE
The government uses a risk model for detecting potential fraud that distinguishes between Anderland nationality and non-Anderland nationality. Whether or not someone has the Anderland nationality counts towards the final risk score.
Combined with scores on other risk indicators, this leads to a higher risk score for people with the Anderland nationality and thus to an additional check. In this risk model, the individual applicant is therefore assessed on the basis of whether or not he or she has the Anderland nationality. This example is taken from the Dutch Data Protection Authority's 2020 report on the childcare benefits scandal.

BOX 2: MISINTERPRETATION OF 'DECISIVENESS'
A commonly heard argument in favor of using ethnicity/nationality in risk profiles is that this indicator would only play a role "to some extent" in the difference in assessment or treatment, because the risk profile consists of multiple and/or more heavily weighted indicators. This is an incorrect interpretation of discrimination law. The reasoning is an unjustified inversion of the premise that there is no objective and reasonable justification for unequal treatment based 'solely or decisively' on ethnicity/nationality. That premise does not mean that "some degree" of discrimination is permissible. The argument that ethnicity/nationality would only count "to some extent" towards the risk score also often leads to the incorrect reasoning that the discrimination test requires looking at the role of this indicator relative to the other indicators in the risk model. This is not correct. To determine whether discrimination has occurred, one must look at the difference in treatment that results from the use of ethnicity/nationality as an indicator.*

4 FREQUENTLY ASKED QUESTIONS

1 Does the use of ethnicity/nationality in risk profiles used to search for norm violators always lead to discrimination in the form of ethnic profiling?
Yes. The use of ethnicity/nationality means that enforcement authorities rely to some extent on ethnicity/nationality as a basis for subjecting people to scrutiny or investigative activities, or for determining whether someone is involved in violating a law, norm or rule. In this paper we have explained that this kind of risk profiling always leads to a difference in treatment that is decisively based on ethnicity/nationality. In this regard it is sometimes said that the standard for justifying such a difference in treatment is 'strict in theory, but fatal in fact': the conclusion is that there can be no objective and reasonable justification for it. In practice, we see that such a justification is sought in, for example, generalizations based on historical or statistical data and/or on gut feelings, which can never be an objective and reasonable justification for a difference in treatment.

2 If I set up the system so that nationality and ethnicity data are not explicitly entered as risk indicators, there can never be direct discrimination, can there?
Yes, there can: direct discrimination can also take place via a detour. Ethnicity/nationality is then not included in the risk profile one-to-one, but is brought in through an apparently neutral indicator that can be clearly linked to ethnicity/nationality, in order to make a difference in treatment after all.

AN EXAMPLE
The police assume that certain crimes are committed more often in certain neighborhoods and assume that this is related to the presence of many Anderland residents in those neighborhoods. The police therefore add specific zip codes where many Anderlanders live to the risk profile, giving these zip code areas a higher risk score.
Zip code was chosen as an indicator precisely in order to be able to monitor people of Anderland origin (more intensively).

3 Does using nationality in a risk profile lead to ethnic profiling?
Yes. In practice, the use of nationality as a risk indicator amounts to the use of ethnicity (see Box 1). In the granting process (that is, when assessing whether the person in question is entitled to a statutory scheme or benefit), the citizenship of applicants may be a relevant indicator, but this can never be the case in risk profiling aimed at detecting norm violations (see also question 7). The use of, for example, country of birth, place of birth, or second or dual nationality should, in the context of risk profiling, be understood as discrimination based on ethnicity.

AN EXAMPLE
Following a fraud report to the Tax Administration concerning 100 to 150 people with the Anderland nationality, officials initiated an investigation into all 6,047 applicants with the Anderland nationality. Citing previous experience, the Tax Administration used a variety of data - including nationality, family ties and/or living situation - to identify larger groups that were then treated as one homogeneous population. The entire group was, for example, subjected to additional scrutiny because of the Anderland nationality. This indicates that the Tax Administration presumed a connection between potential fraud and certain groups of people identified by their nationality, and thus their ethnicity.

4 Is a risk profile different from a suspect description?
Yes. In risk profiling there is not (yet) a concrete suspicion of any norm violation. This is important to note, because where there is a concrete suspicion, investigative services may use a suspect description in which characteristics such as ethnicity and nationality may be included, alongside other characteristics such as build, clothing, information about the vehicle, location and time.

5 Is risk profiling guaranteed to be free of discrimination if protected personal characteristics, or detours around them, are not included in the risk profile?
No, there is always the danger of indirect discrimination. Indirect discrimination is also prohibited and must be avoided. Whether it occurs should be investigated by examining the outcomes of the risk profile for different population groups, both before and during its use.

AN EXAMPLE
A municipality is developing a risk model to detect potential welfare fraudsters, using all the data the municipality has collected on welfare recipients. In the risk model, the level of fluency in the Dutch language is included as a risk factor for potential fraud. The municipality examines the effects of the risk profile and finds that the level of fluency is significantly related to migration background: welfare recipients with a migration background are more likely to have a lower level of fluency, and therefore receive a higher risk score. Using this apparently neutral criterion as a risk indicator then leads to a difference in treatment between groups of people of different nationality or ethnicity.

6 May ethnicity or nationality be processed in risk profiles used for purposes other than law enforcement?
Perhaps. Governments should not include ethnicity/nationality as an indicator in risk profiles when looking for potential norm violators, because it leads to discrimination. Other uses should also be treated with great caution.
Where laws and regulations allow, you can use data on ethnicity/nationality to check that the profile you intend to use does not lead to indirect discrimination. There may also be cases in which there is an objective and reasonable justification for the use of ethnicity/nationality in risk profiles, for example when ethnicity/nationality is used to search for people other than potential norm violators, such as in the context of care or assistance. But here too, it must be established beyond doubt that the use of ethnicity/nationality is necessary and proportionate, and there must be sufficient safeguards.

7 How can you use an automated system for both the award process and the process of detecting norm violations without discriminating?
There may be an objective and reasonable justification for processing nationality data during the award phase, but this is prohibited during the investigation phase. Sometimes these processes overlap. It is important, however, that there is a hard cut between the award process and the process of risk profiling in search of potential norm violators. Only then can discrimination be prevented.

AN EXAMPLE
In a single computerized system, a government department handles both the awarding of a subsidy and the selection of cases for fraud investigation. There is one profile that serves as both the award profile and the risk profile. For the subsidy to be awarded, there must be a link with the Netherlands. When someone lives abroad, having Dutch nationality is relevant to demonstrating that link. The indicator 'Dutch nationality yes/no' is therefore included in the award profile. This is allowed. But the indicator thereby automatically ends up in the risk profile as well. People without Dutch nationality receive a higher risk score, even when they do live in the Netherlands and are just as entitled to the subsidy as people with Dutch nationality. Because of the higher risk score, non-Dutch nationals are more likely to be investigated for fraud than Dutch nationals in exactly the same situation. As a result, the government department directly discriminates on the basis of ethnicity (see Box 1). This must not happen. Strict separation of these stages is therefore necessary to prevent discrimination.

8 If the government uses a self-learning algorithm, can there still be discrimination?
Yes. Discrimination occurs when ethnicity/nationality is an indicator, or when direct or indirect discrimination otherwise occurs. As a public authority you may not discriminate, and the question of whether there is discrimination does not depend on how the risk profile was created, how the indicators ended up in the risk profile, or the intent of the developer or user of the risk profile. The public authority is always responsible for the effects of the risk profile. To ensure that a discriminatory risk profile is not applied, Amnesty International calls for a ban on the "live" use of self-learning algorithms. All algorithms should be tested for direct and indirect discrimination before they are used.
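To illustrate the kind of pre-deployment testing referred to in question 8, the sketch below checks (1) whether protected characteristics, or indicators treated here as known detours for them, appear among the model's inputs, and (2) how strongly selection rates differ between groups on historical data. The indicator names, the blocklist and the numbers are invented, and such a check is no substitute for a full legal and human rights assessment.

```python
# Hypothetical sketch of a pre-deployment check, loosely following question 8:
# (1) no protected characteristics, and no indicators used as a detour for
#     them, among the model's inputs (direct discrimination / detour);
# (2) no large disparity in selection rates between groups on historical
#     data (a signal of possible indirect discrimination).
# The blocklist, indicator names and tolerance are invented for illustration
# and are not a legal standard.

PROTECTED_OR_DETOUR = {"nationality", "country_of_birth", "second_nationality",
                       "ethnicity", "zip_code", "language_proficiency"}


def check_indicators(indicator_names: set[str]) -> list[str]:
    """Return indicators that are protected characteristics or, in this
    hypothetical setting, treated as detours for them."""
    return sorted(indicator_names & PROTECTED_OR_DETOUR)


def selection_rate_disparity(selected_by_group: dict[str, tuple[int, int]]) -> float:
    """Ratio between the highest and lowest selection rate across groups.

    selected_by_group maps a group label to (number selected, group size).
    """
    rates = [sel / size for sel, size in selected_by_group.values()]
    return max(rates) / min(rates)


# Example use with invented numbers:
flagged = check_indicators({"income", "zip_code", "household_size"})
print("Indicators flagged for review before deployment:", flagged)

disparity = selection_rate_disparity({
    "migration background": (120, 1000),
    "no migration background": (40, 1000),
})
print(f"Selection-rate disparity between groups: {disparity:.1f}x")
# A disparity well above 1 does not by itself prove indirect discrimination,
# but it does trigger the strict objective and reasonable justification test
# described in section 2.
```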
9 But didn't the Central Appeals Tribunal (CRvB) allow the use of ethnicity/nationality in risk profiles for asset investigations?
The prohibition of discrimination in the area of social protection is regulated in the Netherlands in the General Equal Treatment Act (Awgb). In a number of cases, however, the CRvB did not test against the Awgb but against the European Convention on Human Rights (ECHR). According to Amnesty International, it thereby misapplied the ECtHR's case law by leaving room for unequal treatment based "to some extent" on ethnicity (see Box 2). It also appears that in some cases the CRvB failed to recognize that nationality was used as a detour for discrimination based on ethnicity.

10 Is a test for discrimination enough to satisfy human rights?
No, there are also other human rights that risk profiles must respect, such as the right to privacy and the protection of personal data. For these, see the Impact Assessment Human Rights and Algorithms (IAMA). The IAMA allows governments to have a considered discussion about the safeguards needed for the development and deployment of an algorithmic system.*

11 What is meant by "risk profiles"?
In this document, "risk profiles" also refers to risk models, algorithmic systems, risk assessment tools and comparable systems or tools, whether (semi-)automated or not, that are used to evaluate, score or calculate the risk or likelihood of norm violation by people or groups, and on the basis of which decisions are made about, among other things (but not exclusively), selection for (additional) monitoring or enforcement, the deployment of an approach or intervention, or the determination of the outcome of a norm violation. A characteristic feature of risk profiles is that they are deployed without a concrete, individualized suspicion of norm violation.

* Ministry of the Interior and Kingdom Relations, Impact Assessment Human Rights and Algorithms. Click here to download a pdf.

DECISION TREE*

* Suspect descriptions fall outside this decision tree. Unlike in risk profiling, there is then a concrete suspicion focused on a concrete suspect. Ethnicity/nationality may be used in suspect descriptions, if in line with all relevant laws and regulations.

Question 1. Is the risk profile used to look for potential norm violations and/or to identify potential norm violators for (additional) monitoring?
- NO: Ethnicity/nationality may be used to identify individuals or groups for affirmative action (for example, care or assistance) or to detect discrimination in risk profiles, if in line with all relevant laws and regulations. This document is not concerned with such uses.
- YES: go to question 2.

Question 2 (test for direct discrimination). Is ethnicity/nationality included as an indicator in the risk profile?
- YES: The use of ethnicity/nationality as an indicator leads to a difference in treatment based decisively on ethnicity/nationality. There is then direct discrimination (ethnic profiling), which is prohibited.
- NO: go to question 3.

Question 3. Are other indicators included in the risk profile that were chosen in order to select by ethnicity/nationality in a roundabout way? (For example: zip code, language proficiency, license plates.)
- YES: This leads to (direct) discrimination: this risk profile should not be used in this way.
- NO: go to question 4.

Question 4 (test for indirect discrimination). Are other indicators included in the risk profile that are "ostensibly neutral" but that, because of a (possible) connection with ethnicity/nationality, could lead to disadvantageous treatment of certain groups? (For example: zip code, language proficiency, income.)
- YES: This may lead to indirect discrimination. Analyze the effect of the risk profile for different groups: is there a statistical relationship between a high risk score and ethnicity? Go to question 5.
- NO: There may still be indirect discrimination, even if at first glance it does not appear so. Analyze the outcomes of the risk profile for different groups: is there a statistical relationship between a high risk score and ethnicity? Go to question 5.

Question 5. Does the foregoing analysis show disparity between certain population groups?
- NO: The risk profile may be used, provided it is in line with all other relevant laws and regulations.
- YES: Does the distinction pass the objective and reasonable justification test?
  - YES: The risk profile may be used, provided it is in line with all other relevant laws and regulations.
  - NO: The risk profile leads to (indirect) discrimination and should not be used in this form.
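For organizations that want to make this assessment repeatable, the sketch below expresses the order of the questions in the decision tree above as a single function. It is only an illustration of the sequence: the answer to each question still requires legal and factual judgment, and the wording of the outcomes is paraphrased from the tree.

```python
# Hypothetical sketch: the order of questions in the decision tree above,
# expressed as a function. The answers still require legal and factual
# judgment; the code only makes the sequence of questions explicit.

def assess_risk_profile(
    used_to_find_norm_violators: bool,
    ethnicity_or_nationality_is_indicator: bool,
    indicators_chosen_as_detour: bool,
    disparity_found_in_outcomes: bool,
    passes_justification_test: bool,
) -> str:
    if not used_to_find_norm_violators:
        return ("Outside the scope of this paper: ethnicity/nationality may be "
                "used, for example, for affirmative action or to detect "
                "discrimination, if in line with all relevant laws and regulations.")
    if ethnicity_or_nationality_is_indicator:
        return "Direct discrimination (ethnic profiling): prohibited."
    if indicators_chosen_as_detour:
        return "Direct discrimination via a detour: prohibited."
    # Test for indirect discrimination: whether or not the remaining indicators
    # look linked to origin, the outcomes for different groups must be analyzed.
    if disparity_found_in_outcomes and not passes_justification_test:
        return "Indirect discrimination: do not use the risk profile in this form."
    return ("May be used, provided it is in line with all other relevant laws "
            "and regulations.")


print(assess_risk_profile(
    used_to_find_norm_violators=True,
    ethnicity_or_nationality_is_indicator=True,
    indicators_chosen_as_detour=False,
    disparity_found_in_outcomes=False,
    passes_justification_test=False,
))
# -> Direct discrimination (ethnic profiling): prohibited.
```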