Scoping

Project phase

Identify where you are in your digital project (e.g. design or implementation phase) or where you are in relation to the use of digital solutions and tools (e.g. planned use, or currently using a digital solution).

Scoping

Digital technologies and solutions used

Identify where you are in your digital project (e.g. design or implementation phase) or where you are in relation to the use of digital solutions and tools (e.g. planned use, or currently using a digital solution).

Scoping

Digital activities

Identify where you are in your digital project (e.g. design or implementation phase) or where you are in relation to the use of digital solutions and tools (e.g. planned use, or currently using a digital solution).

Data-related activities

Data Responsibility

This section concerns the human rights risks around the collection, storage and processing of personal data.

Data-related activities

Responsible data handling

This section concerns the human rights risks around the collection, storage and processing of personal data.

Data-related activities

Privacy impact assessment

This section concerns the human rights risks around the collection, storage and processing of personal data.

Data-related activities

Sensitive data

This section concerns the human rights risks around the collection, storage and processing of personal data.

Data-related activities

Informed consent

This section concerns the human rights risks around the collection, storage and processing of personal data.

Data-related activities

Informed consent for data processing

This section concerns the human rights risks around the collection, storage and processing of personal data.

Data-related activities

Data minimization

This section concerns the human rights risks around the collection, storage and processing of personal data.

Data-related activities

Anonymized data

This section concerns the human rights risks around the collection, storage and processing of personal data.

Data-related activities

Cybersecurity

This section concerns the human rights risks around the collection, storage and processing of personal data.

Artificial Intelligence, automated decision-making and machine learning

AI application

The section includes questions around human rights risks related to Artificial Intelligence, automated decision-making and machine learning (AI for short).

Artificial Intelligence, automated decision-making and machine learning

Bias in datasets

The section includes questions around human rights risks related to Artificial Intelligence, automated decision-making and machine learning (AI for short).

Artificial Intelligence, automated decision-making and machine learning

Biased outcomes

The section includes questions around human rights risks related to Artificial Intelligence, automated decision-making and machine learning (AI for short).

Artificial Intelligence, automated decision-making and machine learning

Gender-based analysis

The section includes questions around human rights risks related to Artificial Intelligence, automated decision-making and machine learning (AI for short).

Artificial Intelligence, automated decision-making and machine learning

Post-deployment testing

The section includes questions around human rights risks related to Artificial Intelligence, automated decision-making and machine learning (AI for short).

Risk analysis

Risk profile of project

This section will help identify the risk context of the digital project or solution.

Context related topics

Data protection legislation in project country

This section concerns questions about the broader context and environment where the digital project or solution will be implemented or applied.

Context related topics

Lessons learned from similar projects

This section concerns questions about the broader context and environment where the digital project or solution will be implemented or applied.

Context related topics

Human rights impacts of previous projects

This section concerns questions about the broader context and environment where the digital project or solution will be implemented or applied.

Context related topics

Human rights impacts of previous projects

This section concerns questions about the broader context and environment where the digital project or solution will be implemented or applied.

Unintended consequences

Unintended consequences

Marginalized groups

Marginalized groups

Consider whether the project or solution impacts marginalized groups, who are particularly susceptible to human rights risks.

Marginalized groups

Potentially impacted rightsholders

Consider whether the project or solution impacts marginalized groups, who are particularly susceptible to human rights risks.

Accessibility

Users as individuals

Consider how accessible the project or solution is to its users.

Accessibility

Accessibility

Consider how accessible the project or solution is to its users.

Accessibility

Monitoring

Consider how accessible the project or solution is to its users.

Stakeholder engagement

Design with people

This section concerns the engagement of stakeholders on the potential impacts of the project or solution and whether their concerns are addressed.

Stakeholder engagement

Engagement on potential impacts

This section concerns the engagement of stakeholders on the potential impacts of the project or solution and whether their concerns are addressed.

Stakeholder engagement

Engagement with rights holders

This section concerns the engagement of stakeholders on the potential impacts of the project or solution and whether their concerns are addressed.

Stakeholder engagement

Engagement with rights holders

This section concerns the engagement of stakeholders on the potential impacts of the project or solution and whether their concerns are addressed.

Stakeholder engagement

Engagement with rights holders

This section concerns the engagement of stakeholders on the potential impacts of the project or solution and whether their concerns are addressed.

Transparency and reporting

Reporting

Transparency and reporting

Reporting

Access to remedy

Remedy

Where are you in the project phase?

Select 'design' or 'implementation' if this concerns a digital project. Select 'planned use' or 'currently using' if this concerns a digital tool or solution. Select an option to see commentary on how the tool can be used for that particular option.

Comment - Project design

If used during the project design phase, you can use the tool as an early guide on how to avoid human rights risks that might otherwise materialize at later stages of the project.

Comment - Project implementation

If you are currently implementing your digital project, you can use the tool to inform decision-making around specific digital project components and to ensure a human rights-based approach to those decisions.

Comment - Planned use of digital tool

If you are currently planning to use a digital solution in your project, you can use the tool as an early guide on how to avoid human rights risks related to that solution.

Comment - Currently using a digital tool

If you are currently using a digital solution in your project, you can use the tool to identify potential human rights impacts and inform decision-making around next steps for addressing those impacts.

Case example – project design phase

As an example, if you are currently designing a project about digital public participation in political processes in a region or country, you are most likely already considering data protection and privacy risks. Since the project would very clearly involve individual persons, it is important to consider, already at this early stage, what the broader human rights risks of the project might be, so that preventive measures can be taken.

Case example – project implementation phase

If, instead, you are implementing the project about digital public participation in political processes in a region or country, you have most likely already considered data protection and privacy risks during the design phase. Since the project risks impacting individual persons (rights holders) as they use and engage with the various digital platforms, dashboards and other digital solutions, it is important to apply a broader human rights lens to the project in order to take preventive measures against potential future impacts, or to mitigate any impacts that may have already occurred.

Case example – ‘Planned use of digital tool’ or ‘Currently using a digital tool’

Though a renewable energy project as such might not be considered a digital project, it may well be that digital solutions are planned to be used within that project for specific purposes. For example, there might be a requirement that the project developer use a digital communication or social media platform to engage with local stakeholders. In this case, the correct option would be to select either ‘Planned use of digital tool’ or ‘Currently using a digital tool’, and to use this Digital Rights Check to assess what the human rights risks involved in using that tool might be.

Which of the following digital technologies or solutions are used in your project?

Select all applicable options. For more information about the response options, select the relevant option and see the comment.

Comment - Smartphone app

Smartphone applications can be used for many different purposes, including for e-health programmes, fintech platforms, Covid-19 track-and-trace programmes, e-governance solutions, and many more. Depending on the purposes of the app, it may include significant data processing activities as well as artificial intelligence models applied to the data.

Comment - e-learning tool

E-learning or other forms of virtual and digital learning, though rarely a high-risk tool, will often include some form of data collection. This can include collection of personal information if there is gated access to the platform, and can also include collection of other data linked to specific responses, response times etc. Data from participants and users may also be analyzed and processed in different ways.

Comment - IoT device

Internet-of-Things (IoT) devices facilitate real-time collection, sharing and analysis of data. If multiple IoT devices are connected to each other, this can include complex analysis of data from various sources. IoT devices can be used for remote monitoring, predicting maintenance needs, resource management and much more.

Comment - Digital social or communications platform (incl. social media)

Digital communication platforms generally involve extensive processing of user data. Platform user data can be collected, stored and shared, and the use of a platform can also facilitate large-scale analysis of the data by applying artificial intelligence models, machine learning methodologies and so forth. Finally, a digital communication platform also makes use of user-generated content. Examples include the use of social media for civil society engagement and e-medicine platforms, among others.

Comment - Cloud services

The use of cloud services will naturally include different forms of data processing, including data collection and data storage. This means that particular attention must be paid to the data protection regulations and standards in the relevant context.

Comment - Artificial intelligence solutions

Artificial intelligence tools include anything from simple automated prioritization systems to complex data-driven automated analysis of public utilities (e.g. water use). The tools themselves require data that can then be further processed. As such, there can be issues around collecting, sharing and otherwise processing data, as well as specific issues concerning the application of the artificial intelligence itself.

Digital technologies and solutions used by GIZ

GIZ implements several projects in which information and communication technologies play an important role. These range from e-learning programmes in Honduras, Albania and Pakistan to a networked indigenous university in Bolivia, from digital start-up support and app development in Nigeria, Kenya, Tunisia and Yemen to digital financial services in Ghana, Jordan and Mozambique.

See more here.

GIZ’s Fair Forward project

The ‘FAIR Forward – Artificial Intelligence for All’ initiative aims to close the gap in available language data and reduce social inequality by providing fair and open access to such data. The project has already seen some initial successes in Rwanda. Millions of people there will soon be able to use a chatbot to receive coronavirus advice in their local language.

See more here.

Does the project or solution include any of the following digital activities?

By 'digital activities' we mean the underlying activity of the technology or solution in question. For more information about the response options, select the relevant option and see the related comment.

Comment - Data collection

This option implies that the project component is designed to collect data digitally in some way. This may include projects focused on public participation where information about individuals’ opinions on government practices is linked to information identifying the individual.

 

Collecting data may cause a series of impacts, the most obvious being on the right to privacy if, for example, there was no consent from the data subjects. However, the ‘threat’ of data being collected without consent may also have impacts on, for example, the freedom of expression, since individuals may not want to share their opinions if that data is being collected.

Comment - Data storage

Data storage is naturally linked to data collection, since it relates to what happens to the data after it has been collected. This includes e.g. having a database of all learners that have participated in an e-learning project. In some cases, collected data may simply not be stored, or it may be stored elsewhere and by someone else.

 

Data storage in itself may primarily impact the right to privacy, particularly if there are risks of data breaches. Even when there are legitimate reasons to store data, storing that data for an excessive period of time can raise significant right to privacy concerns.

Comment - Data alteration, treatment or use

This activity relates to how the data is used. While it is possible that nothing is done with the data except storage, digital data collection often has the purpose of using the data to e.g. analyze patterns, make predictions, and so forth. One example is the analysis of large amounts of data to forecast disease outbreaks in a country.

 

Data alteration, treatment or use can have far-reaching impacts on human rights depending on the context. It may include using big data analytics to make predictions that end up being discriminatory. Another example: If health centers use data to improve efficiency, small errors may have severe impacts on the right to health.

Comment - Data sharing

Data sharing simply means sharing data that has been collected with others. This often occurs in research, where data is made available to other researchers.

 

First and foremost, data sharing may have impacts on the right to privacy, if the individual did not provide her/his informed consent. Further, the data shared might be used for purposes such as targeted advertising, which may among other things have discriminatory impacts.

Comment - Hosting and/or sharing user-generated content

This relates to having a platform that is made available to certain users and where these users generate the content. This can include e-learning platforms where learners develop certain digital content, and could also concern digital communication platforms for e-participation.

 

Simply hosting or sharing content can also have impacts. If hate speech is hosted on a platform, this may have negative impacts on the right to mental health. If certain discriminatory content is shared, that naturally can have impacts on the freedom from discrimination.

Comment - Artificial Intelligence (AI) / machine learning

While there is no shared definition of AI, it generally concerns the science and engineering of making intelligent machines, especially intelligent computer programs. It includes the likes of voice and face recognition, self-driving cars as well as machine learning algorithms that can help predict weather patterns, droughts or even criminal activity.

 

Artificial intelligence has many use cases and can therefore have a large variety of impacts. These range from impacts on the right to a fair trial when AI is used in judicial systems; impacts on the right to equality and freedom if outcomes are biased to the detriment of marginalized groups; impacts on the right to health if AI-supported e-medicine platforms make sub-par decisions; impacts on the right to an adequate standard of living if AI-supported fast-track approvals for unemployment benefits limit access to such benefits for certain groups; to impacts on the right to privacy and many other rights in relation to COVID-19 track-and-trace apps.

Comment - I don't know

It is important for project staff to understand the underlying technology of digital components, in order to assess their impacts. The tool will still let you proceed, but without specific questions related to the digital activities of your digital tool.

In a project about improved agriculture processes in a country in South Asia, a development cooperation agency is engaging with local small-holder farmers to improve their yield while also using less water. Due to the Covid-19 pandemic, some of the engagement with community leaders has taken place over digital platforms. The digital platform used is open to anyone who would like to join these engagement sessions, and many community leaders join. The use of the digital platform included the following digital activities:

  • Data collection – users must fill in their name; data is collected on when a user logs on, logs off and what they write on the platform
  • Data sharing – the data is shared between all partners of the project, in order to track the success of the engagement
  • Data storage – the above data is stored, in order to track success over time
  • Hosting and sharing user-generated content – the comments by the participants are user-generated content

The following activities were not relevant:

  • Data alteration, treatment or use – the data has only been logged and shared, but no further treatment of the data occurs on the platform
  • Artificial intelligence (AI) / machine learning – not relevant in this case, since no automated processing of the data took place.

Caution - No

Given that your answer indicates that none of the above activities are taking place, the guidance ends here. This may for example be the case if your project component is about digital technologies, but it does not itself use such technologies (e.g. face-to-face education on digital literacy). Feel free to check the “other” box in case you would like to continue through the other steps of the tool.

Who is the responsible party for personal data processing and controlling in your digital project?

The responsibilities concerning personal data processing and handling should be determined at the beginning of your digital project due to its relevance for GDPR compliance, the operationalisation of data protection requirements, as well as for the rights of the data subjects. The responsible party determines the means and purpose of processing personal data on behalf of GIZ.

Recommendation - GIZ

If GIZ is the responsible party and personal data are processed either by GIZ, a partner or an external service provider, submitting a VVT-notice via the GIZ Data Privacy Portal must be considered. For the different roles GIZ may take in processing personal data, consult the decision tree. If you are unsure about your role, contact the GIZ Data Helpdesk.

Recommendation - External service provider

If an external service provider processes personal data independently, contact the GIZ Data Helpdesk.

Recommendation - Partner

If a partner processes personal data independently, contact the GIZ Data Helpdesk.

Recommendation - I don't know

In case you don’t know who bears the responsibilities within your project or how they should be divided, please consult the decision tree. The Data Helpdesk will help you to clarify questions via the GIZ Data Privacy Portal.

Has the project or solution been assessed using the GIZ Responsible Data Guidelines checklist?

See resources for a link to the checklist. If you are unsure of the answer consult with colleagues who might have been involved in such a process.

Recommendation - Yes

Review and update the initial assessment as needed, and move on, paying particular attention to stakeholder engagement and marginalized groups.

Recommendation - No

Review the Responsible Data Guidelines checklist as soon as possible.

Recommendation - I don't know

Consult with project partners and others involved with the digital component and see whether the Responsible Data Guidelines checklist has been used.

GIZ resources:

Further resources:

Has a privacy impact assessment (e.g. data protection impact assessment) been completed in relation to the digital project or solution?

Depending on the specific digital project or solution, you may already have been required to conduct some form of impact assessment, such as a privacy impact assessment or data protection impact assessment. See resources for further info.

Comment - Yes

Review the previously conducted assessments and assess whether broader human rights risks are sufficiently covered. Update the assessment as relevant.

Potentially impacted human rights and principles - Not yet. It is underway.

Right to privacy

Recommendation - Not yet. It is underway.

Review planned privacy impact assessment and consider how human rights can be better integrated into the assessment.

Potentially impacted human rights and principles - No

Right to privacy

Recommendation - No

Assess whether you need to conduct any form of privacy impact assessment, and proceed to conduct an assessment as soon as possible, where necessary. Remember to consider human rights more broadly in the assessment.

Comment - No

Even if the specific project or component does not require such assessments, it can be useful to use the structure of those assessments to inform assessments of potential human rights impacts.

Recommendation - I don't know

Consult with project partners and others involved with the digital project or solution and see whether any form of privacy impact assessment has been conducted.

Does the project or solution collect, store or otherwise process sensitive personal data?

Sensitive data includes (according to the EU General Data Protection Regulation):
- personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade-union membership
- genetic data, biometric data processed solely to identify a human being
- health-related data
- data concerning a person’s sex life or sexual orientation.

Important note: Algorithms can be applied to generate sensitive personal data about people from non-sensitive data.

Potentially impacted human rights and principles - Yes

Right to privacy

Recommendation - Yes

The processing of sensitive data means that there is a strong need for a human rights-based approach to the project, taking the potentially impacted individuals as its point of departure. Ensure that such a process is in place and revisit existing impact analyses.

Comment - No

The fact that no sensitive data is collected, stored or otherwise processed implies a lower risk in terms of potential human rights impacts.

Comment - I don't know

Consult the further resources (especially Responsible Data Guidelines and related documents) and engage with project partners and others involved in data processing, and ask whether sensitive data is processed.

A company that is developing an innovative e-health platform which includes ‘telehealth’ services has received funding from a development finance bank due to the promise of the platform to deliver access to health services to previously underserved populations. The e-health platform naturally collects, stores and shares personal health-related data, i.e. sensitive data. This will require elevated attention to human rights risks.

Have all individuals been informed about the processing of their data, and have they provided their informed consent to the processing?

Data processing refers to operations performed on personal data such as its collection, storage, use or dissemination (according to the EU General Data Protection Regulation).

People often do not fully understand, or feel in control of, what data about them is processed and how. A simple digital checkbox might not mean that the individual has been informed and has fully understood what they are consenting to.

As such, in relation to medium- to high-risk applications of digital components, it is particularly relevant to ensure that individuals have fully understood what data is collected.

Comment - Yes

While having obtained consent is essential, it is also important to assess whether the consent provided has been free and informed. In some cases it is doubtful whether consent has been given freely, particularly if declining consent was not a realistic option in practice.

Potentially impacted human rights and principles - No

Right to privacy, right to freedom of information and right to participation

Recommendation - No

Review your efforts to inform individuals of how their data is used, as well as the processes in place for them to provide consent. Engage with individuals to assess how these efforts can be improved. If informed consent cannot be secured, consider whether to cease the processing of personal data.

If you are not collecting the data yourself, discuss with the partners that are collecting the data how the data is collected and how consent is given by the individuals.

Recommendation - I don't know

Engage with project partners and others involved in data processing, and ask whether informed consent has been achieved for data collection and processing, and how that has been achieved (if relevant).

As many organizations have begun using digital collaboration tools to a larger extent during the Covid-19 pandemic, automatic data processing is increasing as well. An organization that increasingly uses e-learning platforms must be aware of the data collected when users are on the platform, must make sure that it has properly informed users of the type of information that will be processed, and must obtain informed consent from those individuals.

Please reflect on how informed consent has been obtained.

Consider how informed consent from individual persons has been obtained and write it down. It can be an important exercise to reflect on how exactly that has happened.

  • For more on what informed consent is, see UK Information Commissioner’s Office, “What is valid consent?”
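
As an illustration only, the sketch below (in Python, with hypothetical field names) shows one way a project could record consent so that it can later demonstrate what a person was told, for which purpose, and whether consent has since been withdrawn. It is an assumption about one possible record structure, not a prescribed GIZ or GDPR format.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical consent record: which purpose was explained, when consent was
# given, and whether it has since been withdrawn. Field names are illustrative.
@dataclass
class ConsentRecord:
    subject_id: str          # pseudonymous identifier, not a real name
    purpose: str             # plain-language purpose shown to the person
    information_text: str    # the exact notice the person saw
    given_at: datetime
    withdrawn_at: Optional[datetime] = None

    def is_valid(self) -> bool:
        """Consent counts only if it was given and not later withdrawn."""
        return self.withdrawn_at is None

    def withdraw(self) -> None:
        self.withdrawn_at = datetime.now(timezone.utc)


def may_process(record: ConsentRecord, purpose: str) -> bool:
    # Processing is allowed only for the purpose the person actually consented to.
    return record.is_valid() and record.purpose == purpose


if __name__ == "__main__":
    record = ConsentRecord(
        subject_id="user-0042",
        purpose="course progress tracking",
        information_text="We store your name and quiz results to track course progress.",
        given_at=datetime.now(timezone.utc),
    )
    print(may_process(record, "course progress tracking"))  # True
    print(may_process(record, "marketing"))                 # False: different purpose
    record.withdraw()
    print(may_process(record, "course progress tracking"))  # False after withdrawal
```

Keeping the purpose and the exact notice text alongside the consent makes it easier to answer the reflection question above about how informed consent was actually obtained.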

Does the project or solution strictly follow the principles of data minimization?

Data minimization means that data collected, used or stored should be adequate, relevant and limited to what is necessary for the intended purposes. Where data is not minimized, risks to the right to privacy increase.

Comment - Yes

Risks related to the right to privacy have been significantly reduced. Continuously assess whether the amount and type of data collected and used should be changed, and whether data can be deleted.

Potentially impacted human rights and principles - No

Right to privacy

Recommendation - No

Review your data processing practice and ensure that data is minimized, since this can significantly lower human rights risks. Minimizing data can include e.g. initially collecting limited amounts of data as well as developing sunset clauses that ensure that data is only kept for a limited period of time after which it is deleted.

Comment - I don't know

Reach out to project partners and others involved in the project or solution, to assess whether data has been or should be minimized.

Consider the following examples:

  1. A ‘smart’ public employment service automatically sends job-seekers a general questionnaire, which includes specific questions about health conditions that are only relevant to particular manual occupations. It would be irrelevant and excessive to obtain such information from an individual who applied for an office job. Data must thus be minimized from the outset or subsequently deleted when it is found not to be relevant.
  2. An online classroom is set up for remote education during Covid-19 school closures. Parents are sent a general questionnaire, which includes questions about health conditions in order to assess accessibility needs. However, only certain disabilities are relevant and obtaining further health information would be excessive. Data collection, use and storage must thus be minimized.
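
As a hedged sketch of how the examples above could be operationalised in code, the snippet below collects only the fields needed for the stated purpose and applies a retention ('sunset') period after which records are deleted. The field names and the 180-day period are illustrative assumptions, not recommended values.

```python
from datetime import datetime, timedelta, timezone

# Illustrative only: the fields actually needed for the stated purpose.
ALLOWED_FIELDS = {"name", "email", "preferred_language"}
RETENTION = timedelta(days=180)  # example sunset period, not a prescribed value


def minimize(raw_record: dict) -> dict:
    """Keep only fields that are adequate, relevant and necessary for the purpose."""
    return {k: v for k, v in raw_record.items() if k in ALLOWED_FIELDS}


def purge_expired(records: list[dict], now: datetime) -> list[dict]:
    """Drop records older than the retention period (a simple 'sunset clause')."""
    return [r for r in records if now - r["collected_at"] <= RETENTION]


if __name__ == "__main__":
    raw = {
        "name": "A. Example",
        "email": "a@example.org",
        "preferred_language": "sw",
        "health_conditions": "...",   # excessive for an office-job application
        "collected_at": datetime.now(timezone.utc),
    }
    stored = minimize(raw) | {"collected_at": raw["collected_at"]}
    print(stored.keys())  # health_conditions is never stored
    print(len(purge_expired([stored], datetime.now(timezone.utc))))  # 1: still within retention
```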

Do you believe there is a risk that anonymized personal data can be made into identifiable personal data again?

Artificial intelligence can be applied to re-identify the individuals linked to anonymized 'individual data' by combining various data sources and data points. Therefore, risks to the right to privacy can remain even after data has been anonymized.
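
The re-identification risk can be illustrated with a toy example: even after direct identifiers are removed, combinations of the remaining attributes ('quasi-identifiers') may be unique, and therefore re-identifiable once the dataset is joined with another source. The sketch below, using made-up data, performs a rough k-anonymity-style check; it illustrates the risk and is not a complete anonymisation test.

```python
from collections import Counter

# Toy "anonymised" records: names removed, but quasi-identifiers remain.
records = [
    {"postcode": "0101", "birth_year": 1984, "gender": "f"},
    {"postcode": "0101", "birth_year": 1984, "gender": "f"},
    {"postcode": "0102", "birth_year": 1990, "gender": "m"},  # unique combination
]

QUASI_IDENTIFIERS = ("postcode", "birth_year", "gender")


def smallest_group_size(rows: list[dict]) -> int:
    """k in 'k-anonymity': the size of the rarest quasi-identifier combination.
    A value of 1 means at least one person is uniquely identifiable if the
    dataset is linked with another source holding the same attributes."""
    counts = Counter(tuple(r[q] for q in QUASI_IDENTIFIERS) for r in rows)
    return min(counts.values())


print(smallest_group_size(records))  # 1 -> high re-identification risk
```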

Potentially impacted human rights and principles - Yes

Right to privacy

Recommendation - Yes

First, review whether the data collected and used could be further minimized. Revisit this as the project progresses, since it may be possible to identify unnecessary data at later stages of a project. Second, review how collected data might be or is being shared with third parties, since that may increase the risk of de-anonymization.

Comment - No

There is often some risk of this occurring (see further references), and as such it is important to review this risk continuously whenever sharing data with third parties is being considered.

Potentially impacted human rights and principles - I don't know

Right to privacy

Comment - I don't know

See further resources for more information on re-identification of personal data.

GIZ resources:

Further reading:

“AI applications can be used to identify and thereby track individuals across different devices, in their homes, at work, and in public spaces. For example, while personal data is routinely (pseudo-)anonymised within datasets, AI can be employed to de-anonymise this data. Facial recognition is another means by which individuals can be tracked and identified, which has the potential to transform expectations of anonymity in public space.”

Have you considered the security of the data that you collect and share in your project or via your solution?

Cybersecurity can be defined as “the preservation – through policy, technology, and education – of the availability, confidentiality and integrity of information and its underlying infrastructure so as to enhance the security of persons both online and offline."
- Internet Free and Secure working group of the Freedom Online Coalition (FOC)

Comment - Yes

You’ve considered an important aspect of protecting the right to privacy and other human rights that could be at risk if personal (and potentially sensitive) data were leaked.

Recommendation - No

Assess the security of the data that you collect and share in your project or via your solution. Ensure that sufficient safeguards are in place to keep the data secure.
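
One small, concrete safeguard among many is encrypting personal data at rest. The sketch below assumes the widely used Python 'cryptography' package is available and shows symmetric encryption with Fernet. Real projects also need key management, access control, transport security and organisational measures, which this example does not cover.

```python
# Requires: pip install cryptography  (assumed available)
from cryptography.fernet import Fernet

# In practice the key must live in a secrets manager, never in the code or repo.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b'{"name": "A. Example", "diagnosis": "..."}'

ciphertext = fernet.encrypt(record)      # what gets written to disk / the database
plaintext = fernet.decrypt(ciphertext)   # only possible with access to the key

assert plaintext == record
print(len(ciphertext), "bytes stored instead of readable personal data")
```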

Which kind of AI application is used in the project?

Select an option to see a further description of the application. Choose the 'other' option if your project or solution uses a different form of AI application than those listed.

Comment - Image and object recognition

This relates to the analysis of large data sets to automate the recognition, classification, and context associated with an image or object. It includes the likes of face recognition but also analysis of vast amounts of satellite photos in order to predict migration patterns.

 

Image and object recognition AI risks negatively impacting the right to privacy. Particular attention should be paid to the datasets used, how the data is collected, and how the recognition technology functions. Facial recognition technology, for example, may have far-reaching consequences for the right to privacy, freedom of assembly, the right to a fair trial, and many other rights.

Comment - Risk assessment

Risk assessments concern the analysis of large datasets to identify patterns and recommend courses of action and in some cases trigger specific actions. This may include e.g. automated credit risk scoring by banks, recidivism risks in a justice system, or risks of flooding or extreme weather events.

 

AI risk assessments often make predictions of risks based on historical data. If previous risk assessments have been biased, the predictions made by the AI application will also be biased unless the bias is corrected. Pay particular attention if the risk assessments inform decisions that may have significant impacts on rights holders, since the outcomes might have severe impacts on human rights. This is the case, for example, if risk assessments are used within judicial systems or in relation to the distribution of public benefits.

Comment - Process optimization and workflow automation

Process optimization & workflow automation concerns the analysis of large data sets to identify anomalies, cluster patterns, predict outcomes or ways to optimize and automate specific workflows. This may include e.g. ‘smart chat bots’ that can help to assess which users are in need of more extensive assistance and those who can be redirected to FAQ parts of a website, and to identify healthcare patients that should be prioritized for certain procedures.

 

Process optimization and workflow automation rarely pose significant risks to human rights if they concern strictly administrative tasks. However, the same processes can also be used to speed up decisions concerning social assistance, issuance of permits, or other forms of licensing, all of which may impact negatively on human rights. There may also be issues related to the right to access information and access to remedy, as the underlying AI model can be difficult to explain and understand, which in turn makes decisions difficult to appeal.

Comment - Text and speech analysis

Text and speech analysis concerns the analysis of datasets to recognize, process, and tag text, speech, voice, and make recommendations based on the tagging. This may include text-to-speech technologies that can help blind individuals with accessibility of written content.

 

Potential human rights impacts related to text and speech analysis include public service ‘chat bots’ that are not able to assist individuals from minority groups who do not speak the main language in a country. Language assistants that are perceived as human may also cause distress for individuals using the service.

Comment - Content generation

Content generation concerns the analysis of large data sets to categorize, process, triage, personalize, and serve specific content for specific contexts. This may include e.g. automatically generated news media pieces or weather forecasts based on AI review of other news sources or weather data, as well as generating and disseminating content based on official government statements and answers from ‘smart chat bots’.

 

The impacts of content generation depend on the content generated. Examples of human rights impacts include content that feeds into other AI systems, making the entire process highly opaque and difficult to understand, potentially rendering decisions that no one can explain.

For more information about different AI solutions and their potential human rights impacts, see:

Regulations:

Are there processes in place to assess biases in datasets?

Bias is a form of injustice against an individual (or group). While there are different kinds of bias in AI applications, this question concerns bias in datasets such as data that is not representative or data that reflect prejudices (e.g. an AI recruiting tool fed with historical hiring decisions which favored men over women). It may be necessary to discuss the question of bias with the developer of the AI application in order to respond. See resources for more information.
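
As a very simple example of what such a process might start with, the sketch below compares how groups are represented in a made-up historical hiring dataset and how often each group received a positive label. Real bias assessments involve larger datasets, domain expertise and several fairness metrics; this only illustrates the first descriptive step.

```python
from collections import defaultdict

# Made-up historical hiring records: (applicant_gender, was_hired)
history = [
    ("m", True), ("m", True), ("m", False), ("m", True),
    ("f", False), ("f", False), ("f", True), ("f", False),
]

totals = defaultdict(int)
positives = defaultdict(int)
for gender, hired in history:
    totals[gender] += 1
    positives[gender] += hired

for group in totals:
    rate = positives[group] / totals[group]
    print(f"{group}: {totals[group]} records, positive-label rate {rate:.0%}")
# Output here: m 75%, f 25% -> training on this data would reproduce the skew
# unless it is investigated and corrected.
```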

Comment - Yes

That means you have considered one significant aspect of potential negative human rights impacts. However, AI tools are only as good as the data they rely on, and it is therefore important to continuously assess whether the data is valid, reliable, current, sufficient, etc. Existing data that is biased or discriminatory is going to produce results that are biased or discriminatory.

Potentially impacted human rights and principles - No

Right to equality and freedom from discrimination

Recommendation - No

Plan and conduct an assessment of potential biases in the datasets.

Potentially impacted human rights and principles - I don't know

Right to equality and freedom from discrimination

Recommendation - I don't know

Consult project partners and other third-parties involved in the development of the AI model. If they are not aware of any processes to assess biases in datasets, it is likely that bias has not been assessed.

Resources:

Further reading:

1. FAIR Forward – Artificial Intelligence for All initiative

  • Natural language processing AI can be used to generate and share information in a targeted and individual way. The same type of system can also be used to reach people who cannot read. Training such systems requires large amounts of training data, and for a variety of reasons such language data has been very limited for African and Asian nations. The ‘FAIR Forward – Artificial Intelligence for All’ initiative was developed in order to close this gap concerning language data and provide fair and open access to it. One outcome of this work, and of the increasing availability of national language data, has been the development of a chatbot in Rwanda that will help millions of people there receive coronavirus advice in their local language.
  • Source: GIZ, “Artificial Intelligence For All

2. Use of AI in corona response activities

  • “Data analytics supported by AI is having a predominant role in monitoring of the infections, deaths and recovery rates all over the world. Institutions have managed to draw the trends that Covid-19 has in different countries and conditions. Data collection, monitoring and analysis technologies are also needed in developing countries in order to track the virus progress and adapting the response and mitigating measures.
  • Artificial Intelligence is being used to detect coronavirus through computerized tomography scans analysis. Imaging departments in healthcare facilities are being overwhelmed with the increased workload created by the virus. This solution improves CT diagnosis speed. Some big tech companies claim to have built AI powered analysis systems able to read scans in seconds giving a 96% accurate diagnosing of the virus. In African countries, lacking of specialised doctors and labs, these technology can be disruptive.
  • Intelligent tracing and mobile technologies are being used by governments all over the world to track people movements and prevent further expansion of the virus. The EU needs to support African governments in using the large amount of mobile data available considering the privacy-conscientious use of mobile phone data. Technology solutions are available allowing tracing people movement taking into account the risks of misuse of personal data.”
  • Source: Toolkit Digitalisierung, “Practice: Big Data and AI

Have biases in (unexpected) outputs and outcomes related to the AI application been assessed?

This question concerns the bias in the outputs and outcomes of the AI application, as opposed to the underlying data. Consider whether anyone has assessed that specifically.
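
A hedged sketch of one way to check outputs rather than training data: compare the rate of positive model decisions across groups and flag large gaps (sometimes called a demographic-parity or disparate-impact check). The 0.8 threshold mirrors a commonly cited 'four-fifths' rule of thumb and is an assumption, not a legal standard for any particular context.

```python
def selection_rates(decisions: list[tuple[str, bool]]) -> dict[str, float]:
    """decisions: (group, model_said_yes) pairs observed from the deployed model."""
    totals: dict[str, int] = {}
    yes: dict[str, int] = {}
    for group, said_yes in decisions:
        totals[group] = totals.get(group, 0) + 1
        yes[group] = yes.get(group, 0) + int(said_yes)
    return {g: yes[g] / totals[g] for g in totals}


def disparate_impact_flag(decisions: list[tuple[str, bool]], threshold: float = 0.8) -> bool:
    """True if the least-favoured group's rate falls below `threshold` times
    the most-favoured group's rate."""
    rates = selection_rates(decisions)
    return min(rates.values()) < threshold * max(rates.values())


sample = [("a", True)] * 8 + [("a", False)] * 2 + [("b", True)] * 4 + [("b", False)] * 6
print(selection_rates(sample))        # {'a': 0.8, 'b': 0.4}
print(disparate_impact_flag(sample))  # True -> outcomes warrant closer review
```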

Comment - Yes

That means you have considered one significant aspect of potential negative human rights impacts of automated decision-making systems.

Potentially impacted human rights and principles - No

Right to equality and freedom from discrimination

Recommendation - No

Plan and conduct an assessment of biases in outputs and outcomes related to the AI.

Potentially impacted human rights and principles - I don't know

Right to equality and freedom from discrimination

Recommendation - I don't know

Consult project partners and other third-parties involved in the development of the AI model. If they are not aware of any processes to assess biases in outcomes, it is likely that bias has not been assessed.

As an example, an AI model aimed at analyzing facial expressions of individuals and detecting their emotional state might have perfectly unbiased data to learn from, and is therefore not discriminatory in its development and design. However, since the science behind emotion recognition technology is questionable, there is a risk that the outputs of this AI model are discriminatory. This is particularly true when context is not taken into account, and the outputs may therefore have discriminatory impacts on, for example, individuals from ethnic minority groups or persons with disabilities.

Source: Vox Recode, “Artificial intelligence will help determine if you get your next job“, 12 Dec 2019

Has a specific gender-based analysis of the development and use of the AI solution been conducted?

This may include assessing whether predictions made by an AI model are discriminatory on the basis of gender. See resources for further information.

Comment - Yes

That means you have considered one significant aspect of potential negative human rights impacts of AI systems.

Potentially impacted human rights and principles - No

Right to equality, freedom from discrimination and women’s rights

Recommendation - No

Plan and conduct a gender-based analysis of the implementation of the project or solution.

Potentially impacted human rights and principles - I don't know

Right to equality, freedom from discrimination and women’s rights

Recommendation - I don't know

Consult project partners and other third-parties involved in the development of the AI. If they are not aware of any specific gender-based analysis of the system, it is likely that this has not been assessed.

GIZ resources:

Further reading:

Many virtual personal assistants (such as Siri, Alexa and Cortana) and chatbots have female names and come with a default female voice. The organization that develops or uses these virtual assistants with ‘female features’ may reinforce stereotypes as well as the social reality in which a majority of personal assistants or secretaries in both public and private sectors are women. It will therefore always be important to assess any unexpected negative impacts related to gender.

For more, see Surya Deva, “Addressing the gender bias in artificial intelligence and automation“, OpenGlobalRights, 10 April 2020

Has the AI system been tested for human rights risks after its deployment, or is there a clear plan for such testing?

Consider whether for example a chatbot has been assessed after its deployment to understand whether any particular groups are at risk of discrimination.

Comment - Yes

Constant monitoring, evaluating and retraining are essential practices to identify and correct embedded bias and disparate outcomes.
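
As an illustration of what ongoing post-deployment monitoring could involve, the sketch below groups logged decisions by month and by group and reports the gap in positive-decision rates over time, so that drift towards disparate outcomes becomes visible. The log format is a made-up assumption; monitoring also requires qualitative review, user feedback and retraining processes that code alone cannot provide.

```python
from collections import defaultdict

# Hypothetical decision log: (month, group, positive_decision)
log = [
    ("2024-01", "a", True), ("2024-01", "b", True),
    ("2024-02", "a", True), ("2024-02", "b", False),
    ("2024-02", "a", True), ("2024-02", "b", False),
]

# month -> group -> [positive_count, total_count]
per_month = defaultdict(lambda: defaultdict(lambda: [0, 0]))
for month, group, positive in log:
    per_month[month][group][0] += int(positive)
    per_month[month][group][1] += 1

for month, groups in sorted(per_month.items()):
    rates = {g: yes / total for g, (yes, total) in groups.items()}
    gap = max(rates.values()) - min(rates.values())
    print(month, rates, f"gap={gap:.2f}")  # a widening gap is a signal to investigate
```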

Potentially impacted human rights and principles - No

Right to equality and freedom from discrimination

Recommendation - No

Develop a plan and test the AI system for accuracy and human rights risks after its deployment.

Comment - No

Even the most well-intended algorithms can have unintended consequences. Constant monitoring, evaluating and retraining are essential practices to identify and correct embedded bias and disparate outcomes.

Potentially impacted human rights and principles - I don't know

Right to equality and freedom from discrimination

Recommendation - I don't know

Consult project partners and other third-parties involved in the development and/or deployment of the AI system. If they are not aware of any such processes, it is likely that the system has not been tested for biased or discriminatory outcomes after its use and deployment.

Some health systems rely on AI risk-prediction tools to identify and select patients who are in need of high-risk care. If selected for these programs, patients are provided with additional resources and receive greater attention from trained providers, to help ensure that care is well coordinated. Many health systems make such AI tools a key part of their population health management efforts, since they are considered effective at improving outcomes and satisfaction while reducing costs, which is often a pursued goal in improving healthcare in emerging economies. These high-risk care programs are very expensive to run, and health systems therefore often rely on AI tools to identify the patients who will benefit the most from them.

In attempting to identify which patients will benefit the most from entering such programs, the AI tools must make some inference based on the data that is available. One way of identifying the greatest care need is to look at how much patients have historically paid for their care. That, however, can turn out to be discriminatory against vulnerable populations who have had a lower ability to allocate significant resources to their care, even though the need might have existed.

In this scenario, the problem should ideally have been identified before deployment. If that was not the case, however, post-deployment analysis could have illustrated the racial disparities in enrollment in the program and identified the reason for them.

For more, see: Obermeyer et al., “Dissecting racial bias in an algorithm used to manage the health of populations“, Science 366, 447–453 (2019).

Is the project or component related to a high-risk sector or public service delivery?

High-risk sectors are sectors where decisions taken can have severe impacts on individuals. These include:
- Surveillance and remote identification
- Law enforcement
- Migration, asylum and border control
- Judiciary
- Education and vocational training
- Employment services and workers management
- Critical digital infrastructure
- Essential private service delivery, including banking and insurance.
- Essential public service delivery, including social security, and other public benefits and services.

Comment - Yes

If the project or solution is focused on a high risk sector or essential public service delivery, extra care should be taken in relation to any application or development of digital tools or services, since the potential human rights risks may be severe.

Comment - No

If not a high-risk sector or essential public service delivery, the project can in principle be considered low-to-medium risk, unless the specific application is highly sensitive.

Comment - I don't know

Consult project partners and the resources section and consider whether the sector in question is high-risk or not. In case of doubt, consider answering yes to the question since that still implies that risks may be high.

For more on high-risk sectors see: EU Commission, AI regulation proposal, Annex III

 

A project aimed at the development of an e-registration platform for migrant workers could be considered part of a high-risk sector, since a flawed implementation of the project could have an impact both on employment and on migration status.

Is the digital solution used or applied in a way in which significant human rights impacts may occur?

Significant human rights risks in implementation or use may for example exist where the user or affected individuals belong to marginalised groups. See case study for example.

Comment - Yes

High-risk applications should receive adequate resources to conduct an in-depth assessment of related human rights risks and potential preventive or mitigation measures. This should include extensive stakeholder and rights holder engagement.

Comment - No

The project or solution can be considered low-medium risk, unless it concerns a high-risk sector in a high-risk country context. In those scenarios, there is still need for extra caution.

Comment - I don't know

Consult project partners and others who are further informed about the implementation of the project or use of the solution. In case of doubt, consider answering yes to the question since it is possible that it is a high risk scenario.

Employment services and unemployment benefits can be considered part of a sensitive sector that warrants extra attention. However, if a project simply includes AI models that are used to improve the scheduling of meetings with those seeking unemployment benefits, this is unlikely to imply significant risks to human rights. If, on the other hand, AI models were used to predict the likelihood of job-seekers being successful and thereby determine resource allocation to ‘the best candidates’, potentially significant risks to human rights are at stake.

Is the project or solution developed in a high-risk country context with regard to human rights?

A high-risk context is likely to increase the potential for negative human rights impacts. Consider whether any of the following apply to the relevant country:
- no, or weak, data protection legislation
- poor rule of law, including inefficient court system
- conflict affected country
- significant persecution of human rights defenders
- significant limits to civil and political rights
- unstable security situation
- high likelihood or history of censorship, intimidation, violence, etc. against marginalised groups
- cybercrime is prevalent
- online harassment is common.

Comment - Yes

Heightened attention will need to be paid to all kinds of data processing and AI applications, particularly with regard to groups that are vulnerable in the specific context. Review and update country context analysis performed during the project design phase with the specific digital product or service in mind.

Comment - I don't know

Consult country context analysis developed during the project design phase (if available), and consider whether the country context is high-risk or not. In case of doubt, consider answering yes to the question since the country may be high-risk.

Comment - No

A low-to-medium risk country context often means that rule of law is stronger and that there are multiple safeguards in place on a national level.

GIZ resources:

Resources for country context:

Risk profile of project
Recommendation

Based on your answers to the risk questions, the digital component seems to involve low-to-medium risk. While that is positive, there is still a need to be vigilant in case the context or application changes. The further sections around stakeholder engagement remain highly relevant even when potential risks are low.

Risk profile of project
Recommendation

Based on your answers to the risk questions, there is no high-risk application involved. However, both the country context and the sector are high-risk, which means that you will need to pay extra attention to the application and use of the digital component. Slight changes to the initial plan may pose significant risks.

Risk profile of project
Recommendation

Heightened attention will need to be paid to all kinds of data processing, AI applications etc., particularly with regard to groups that are vulnerable in the specific context. Any country context analysis performed during the project design phase should be updated with the specific digital product or service in mind.

Does the country where the project or solution is being applied have adequate data protection legislation?

Consider whether 1) data protection legislation exists, and 2) whether it seems adequate from a human rights perspective. In assessing the adequacy of data protection legislation, you can refer to the resources section as well as use the EU General Data Protection Regulation (GDPR) as a benchmark. Under the GDPR, the European Commission has also recognized a number of countries as providing adequate protection.

Comment - Yes

This lowers the risk level of data processing. However, if monitoring and enforcement are lacking, it may not mean much in practice and it will remain important to consider the human rights impacts of data-related activities.

Potentially impacted human rights and principles - No

Right to privacy

Recommendation - No

The project or solution should limit data processing activities, particularly as it pertains to partners who may not have adequate data collection and processing standards and processes in place. Pay particular attention to consent for data collection and data minimization, since there will likely be no efficient external oversight of the activities.

Potentially impacted human rights and principles - I don't know

Right to privacy

Recommendation - I don't know

Identify the current status on data protection legislation in the country relevant to the project or solution.

All digital projects and solutions depend on the possibility of collecting, storing, treating, altering and sharing data, and many may also increase the possibilities to do just that. If an innovative project on the use of Internet of Things (IoT) devices for public good is implemented in a country without data protection, little can be expected in terms of access to remedy if data is not handled properly. It is therefore important to understand the regulatory context in which the digital project or solution will be implemented or applied, in order to identify possible risks. If data protection regulation is in place and enforcement is efficient, that lowers the risk of data-related activities.

Are you aware of similar digital projects or solutions that have been implemented or used in the same location previously?

This could, for example, be the case if a similar e-learning effort has been made in the same country, by your own institution or by a different one.

Comment - Yes

Consider the outcomes of past projects and engage with the project owners as well as external stakeholders, as relevant. This may help you assess which preventive and mitigation measures should be taken in your project during its implementation.

Comment - No, there have not been similar projects or solutions in the same location previously.

That means your project can potentially help others in the future. Consider being as transparent as possible to assist others implementing similar projects or using similar tools.

Recommendation - I don't know whether there have been similar projects or solutions in the same location previously.

Assess whether similar projects have taken place before. That will help inform the identification of potential human rights impacts as well as preferred actions to avoid, prevent and address impacts.

If a project aimed at, for example, developing language databases for written languages has been implemented in the same country or region in the past, and you are planning a similar project, it could be very useful to engage with the owners of the previous project. This could help you learn what issues they came across, how they avoided identified pitfalls and how human rights risks were handled more broadly.

Are you aware if the previous projects were involved in negative impacts on human rights?

To answer this you may need to review reports published by the relevant project itself, as well as news media or civil society reports.

Comment - Yes, the previous projects were involved in negative impacts on human rights.

Consider which kind of impacts in the next question.

Recommendation - No, the previous projects were not involved in negative impacts on human rights.

Assess similar past projects and consider what was put in place in terms of safeguards to avoid and prevent negative impacts on human rights. See if they can be replicated in your project.

Comment - I don't know whether the previous projects were involved in negative impacts on human rights.

Reach out to those involved in the project or others that have knowledge of it, and see whether they are able to share more information about potential impacts (and their prevention or mitigation).

Which negative human rights impacts were the past projects involved with?

If you have the information, please respond which human rights were impacted in the past.

Potentially impacted human rights and principles - Right to privacy

Right to privacy

Recommendation - Right to privacy

Take preventive measures to ensure that the digital project or component is not having the same negative impacts as past projects. Thoroughly assess the potential impacts on the right to privacy.

Potentially impacted human rights and principles - Freedom from discrimination

Freedom from discrimination

Potentially impacted human rights and principles - Freedom of expression

Freedom of expression

Recommendation - Freedom of expression

Take preventive measures to ensure that the digital project or component is not having the same negative impacts as past projects. Thoroughly assess the potential impacts on the right to freedom of expression.

Recommendation - I don't know

Reach out to those involved in the past project or others that have knowledge of it, and see whether they are able to share more information about potential impacts (and their prevention or mitigation). Take preventive measures to ensure that the digital project or component is not having the same negative impacts.

Recommendation

Take preventive measures to ensure that the digital project or component is not having the same negative impacts as past projects.

Much has been written by civil society and news media about the potential human rights impacts of digital ID projects, for example. In the case of the national digital ID project in Uganda, one study identified “over-collection of personal data” when personal data was collected for registration purposes. According to the study, the human rights concerns were compounded by inadequate data protection safeguards. The project also changed in scope during implementation, and data has reportedly been shared with law enforcement and telecommunications companies, posing further risks to the right to privacy and other human rights. Finally, groups such as the elderly, the economically disadvantaged and women have been subject to exclusionary practices.

For more, see: Research ICT Africa, “Digital Identity in Uganda”, 2021

Have unintended negative consequences of the project or solution been assessed?

This includes considering unintended consequences related to the intended project objective or use of the digital solution, but also the unintended use of the digital solution.

Comment - Yes

Discuss the potential unintended negative consequences with external stakeholders, including potentially impacted rights holders in order to verify the assessment and to identify preventive measures. Consider also the risk of those scenarios materializing.

Recommendation - No

Assess potential unintended negative consequences with the project team and other partners involved.

This could, for example, be in the form of a workshop where future scenarios and worst-case scenarios are discussed. Discuss your findings with external stakeholders, including potentially impacted rights holders, while also leaving space for them to raise any other concerns about potential negative consequences of the project.

Further reading:

  • See an example of assessment of potential future impacts in Business for Social Responsibility’s Google Facial Recognition assessment and concept of “Human Rights by Design”

If a project includes the development of an e-registration system for migrant workers, it is likely that those using the system will be consulted on their experiences, and any potential negative human rights impacts can swiftly be addressed. However, there is also a risk that the migrant workers who choose not to use the system will be negatively impacted; an unintended consequence of the project might be that they cannot find employment.

Do the intended users and/or other rights holders belong to marginalized groups?

Particular attention must be paid to marginalized individuals and groups, as they are at a higher risk of being unable to anticipate, cope with, resist and recover from project-related risks and adverse impacts. Which individuals are marginalized will depend on the context, but they may include children, women, religious or ethnic minorities, persons with disabilities and migrants. You should always make an individual assessment of vulnerabilities based on the local realities. As an example, while a digitalization project to improve purely internal administrative processes within a government agency may not seem to involve marginalized groups, it might do so if the processes affect decision-making of importance to individuals (e.g. decisions about social security payments).

Comment - Yes

Since marginalized groups may be impacted by the project, extra care must be taken to ensure that they are not adversely impacted. Ensure that those groups and/or their legitimate representatives are consulted and engaged with, in order to be able to assess any negative impacts particularly concerning the groups in question.

Comment - No

In most scenarios this might imply that human rights risks are lower. However, make sure to consider potential discrimination risks arising from marginalized groups not being among the intended users, and specifically whether those who purposely do not engage in the project belong to marginalized groups.

Recommendation - I don't know

Consult the further resources and consider whether or not marginalized groups may be impacted by the project. Consider both direct users and those who might otherwise be impacted.

Comment - I don't know

Consider whether marginalized groups may in fact be impacted.

 

Consider the following example: a country introduces an e-registration system for job-seekers. However, migrant workers tend to not register. While the migrant workers are not as such users of the service, they are still potentially impacted by the move to an e-registration system.

Which of the following groups may be impacted?

The list is not exhaustive and there may, depending on the context, be other marginalized groups that have been identified as potentially impacted by the project. If so, please select the "other" option and write down the marginalized group you have identified.

Recommendation - Children and young persons

Conduct rights holder engagement, and consider ‘children and young persons’ specifically, including:

  • Conduct consultation with children in coordination with child participation experts to facilitate participation respecting ethical standards
  • Design the process so it is accessible, inclusive and meaningful for children
  • Ensure voluntary participation in child-friendly environment
  • Ensure that engagement
  • Conduct consultations both with and about children and young people
  • Consider engagement with parents and caregivers, teachers, community leaders, youth organisations and others with children’s best interests in mind
Recommendation - Women and girls

Conduct rights holder engagement, and consider ‘women and girls’ specifically, including:

  • Consult women separately in a gender-sensitive manner
  • Include women human rights impact assessment team members
  • Include human rights impact assessment team members with knowledge of the particular rights and experiences of women and girls, particularly in relation to digital projects, products and services
  • Exclude male team members from certain interviews
  • Provide safe and comfortable space for interviews
  • Include particularly vulnerable sub-groups (e.g. female human rights defenders, young girls, etc.)
  • Consider proactive and innovative approaches to lower the barrier for women to engage (e.g. providing childcare during meetings)
Recommendation - Indigenous peoples

Conduct rights holder engagement, and consider ‘indigenous peoples’ specifically, including:

  • Include human rights impact assessment team members with knowledge of indigenous peoples’ rights and local context
  • Respect indigenous representative institutions, be sure to understand the cultural and organisational characteristics of indigenous peoples and hierarchy of authorities in order to engage with the right people in the right order and manner
  • Use appropriate language for the context
  • For projects targeting or otherwise impacting indigenous peoples, ensure that para and per-indigenous methodologies are the basis for their development, when possible
  • Be aware of the risk of imposing unwanted processes or structures upon indigenous recipients
Recommendation - Workers and trade unions

Conduct rights holder engagement, and consider ‘workers and trade unions’ specifically, including:

  • Make sure to meet different categories of workers and trade union leaders (e.g. by gender, position, unionised vs. non-unionised, etc.)
  • Include ‘informal workers’ in human rights impact assessment
  • Fix a time that suits their work schedule
  • Consider interviewing workers outside of company premises and outside working hours
Recommendation - Minorities (national, racial, ethnic, religious or political)

Conduct rights holder engagement, and consider ‘minorities (national, racial, ethnic, religious or political)’ specifically, including:

  • Minorities may speak another language than the national language; engagement with minority groups should be conducted in a language they understand and feel most comfortable communicating in
  • Engagement should be culturally appropriate
  • Given the different characteristics of specific minority groups, it can be useful to include an anthropologist in the team who has expertise in engaging with the minority group in question
  • Ensure wide participation from within the minority community during engagement rather than only dealing with select community leaders who may not represent the community as a whole
Recommendation - Persons with disabilities

Conduct rights holder engagement, and consider ‘persons with disabilities’ specifically, including:

  • When engaging with persons with particular physical or psychological disabilities, ensure that the location for meetings and/or the way of engaging is accessible and measures are taken to make engagement meaningful (e.g. ensuring sign language interpretation, information available in braille)
Recommendation - Older persons

Conduct rights holder engagement, and consider ‘older persons’ specifically, including:

  • When engaging with older persons, ensure that the location for the meetings and mode of engaging is accessible, bearing in mind the greater likelihood of particular needs (e.g. wheelchair-friendly access and simple and user-friendly digital solutions)
Recommendation - Migrants, refugees, stateless and displaced persons

Conduct rights holder engagement, and consider ‘Migrants, refugees, stateless and displaced persons’ specifically, including:

  • Due to their insecure legal status, individuals belonging to this rightsholder group, especially those without a residence permit, may be hesitant to speak openly, fearing that they may face repercussions; it is important to provide a safe space when engaging with migrants, refugees, stateless and/or displaced persons
  • While it is in general imperative to keep the identities of interviewees confidential when engaging with rights holders, for this group confidentiality requires extra attention
  • Consider remote or virtual engagement via encrypted communication channels to protect their safety
Recommendation - Lesbian, gay, bisexual, trans, intersex and queer (LGBTIQ+) persons

Conduct rights holder engagement, and consider lesbian, gay, bisexual, trans, intersex and queer (LGBTIQ+) persons specifically, including:

  • Assessors should be appropriately trained on LGBTIQ+ issues before engaging with LGBTIQ+ persons
  • Ensure that LGBTIQ+ persons feel comfortable to provide information by ensuring that the collected data remains confidential
  • Consider the possibility of anonymised forms of engagement
  • When designing engagement plans ensure that the communities concerned are represented in their full diversity.

Are the intended users or beneficiaries of the digital project or solution individual persons?

The digital solution can be meant for individual persons (e.g. a digital education platform for individual users) or it can be targeted at an organization (e.g. an AI model supporting administrative processes with no interaction with external individuals). If the digital solution is meant for individuals, there is an increased need to focus on their ability to access and use it.

Comment - Yes

If the digital solution is meant for individual persons, it will be important to make sure it is accessible.

Comment - No

In most scenarios this might imply that human rights risks are lower. However, make sure to consider potential discrimination risks related to accessibility nonetheless, since institutions also have individual users.

Comment - I don't know

If it concerns, for example, online classrooms, track-and-trace apps, chatbots in public services, telemedicine platforms or digital communications platforms, there are likely to be individual persons as users or beneficiaries.

GIZ resource:

  • Digital Principles for Inclusive Design: A Tool for Accessibility and Equity (Design with people)


Is the project or solution accessible, particularly to marginalized groups?

Examples of factors affecting accessibility are:

  • Costs: is it affordable to most people (taking into account also the cost of hardware as well as data tariffs)?
  • Language barriers
  • Digital literacy of users
  • Digital infrastructure: is the necessary physical infrastructure (e.g. broadband access, stable internet connection) in place?
  • Discrimination: do certain societal groups face additional social or cultural barriers?
  • Awareness: is everyone in the intended user group aware of the product or service?

See further resources for additional considerations.

Comment - Yes, and it has particularly been assessed.

Consider adopting a monitoring process to ensure its accessibility over time.

Potentially impacted human rights and principles - No

Right to equality and freedom from discrimination

Recommendation - No

Work together with marginalized groups and/or their representatives to ensure that the tool, product or service is accessible, in particular to marginalized groups.

Potentially impacted human rights and principles - I don't know

Right to equality and freedom from discrimination

Recommendation - I don't know

Engage with potentially marginalized groups that might not find the digital tool, product or service accessible and hear their views, to better understand the overall accessibility.

GIZ resource:

  • Digital Principles for Inclusive Design: A Tool for Accessibility and Equity (Design for Inclusion)


Is the accessibility of the digital solution monitored, or will there be a plan for monitoring?

This concerns, for example, whether there is a process in place to monitor who is and who is not using the solution, whether there is a simple way to provide feedback on accessibility concerns, and so forth.

Comment - Yes

Accessibility of digital products and services should be maintained throughout the life of the product or service. Engage with intended users at regular intervals to ensure that it remains accessible.

Potentially impacted human rights and principles - No

Right to equality and freedom from discrimination

Recommendation - No

Work together with users, particularly from vulnerable groups, and/or their representatives to develop a monitoring plan in order to ensure that the tool, product or service remains accessible and that any issues with accessibility are addressed as quickly as possible.

Potentially impacted human rights and principles - I don't know

Right to equality and freedom from discrimination

Recommendation - I don't know

Consult with project partners and others responsible for the maintenance of the digital product or service in order to find out what the plan is for ensuring accessibility throughout the life of the product or service.

Were intended users and/or other rights holders involved in the design of the technology policy, solution, or system?

To design with people means involving those who will use or be affected by a technology policy, solution, or system in its design process. Successful digital initiatives are rooted in a deep understanding of user characteristics, needs, and challenges. This involves engaging multiple relevant stakeholders, such as beneficiaries and administrators, in both the initial design and subsequent iterations. Embracing human-centered design means actively engaging target groups through conversations, observations, and co-creation.

Recommendation - Yes

Make sure to request rights holders’ feedback also for updates, expansions, and quality checks. Establish inclusive avenues for feedback and redressal that are regularly monitored.

Potentially impacted human rights and principles - No

Right to meaningful participation

Recommendation - No

Involve rights holders in updates of the technology policy, solution, or system. Establish inclusive avenues for feedback and redressal that are regularly monitored.

Comment - No

Many existing digital technologies pose significant challenges for marginalized groups, hindering their usability and exacerbating inequalities. Designing with people ensures that these groups are actively involved in the development process, making sure their needs and preferences shape the final product.

Potentially impacted human rights and principles - I don't know

Right to meaningful participation

Recommendation - I don't know

Consult with project partners and others involved in the development of the digital component and see whether design with people has been implemented. If not, involve rights holders in updates of the technology policy, solution, or system. Establish inclusive avenues for feedback and redressal that are regularly monitored.

Comment - I don't know

Many existing digital technologies pose significant challenges for marginalized groups, hindering their usability and exacerbating inequalities. Designing with people ensures that these groups are actively involved in the development process, making sure their needs and preferences shape the final product.

An illustration of a gender-sensitive enrollment process is evident in India’s Aadhaar program, a nationwide biometric ID system. The enrollment centers diligently follow guidelines to foster favorable conditions for women, such as deploying female operators and volunteers to assist female enrollees. Special provisions, including a separate space for women who prefer to keep their faces covered in the presence of men due to social norms and enrollment stations tailored for physically challenged women, contribute to the inclusivity of the process.

Simultaneously, the program has showcased a commitment to designing for marginalized users. While initial studies lacked systematic sampling of persons with disabilities, recommendations emphasized integrating usability and ergonomics into iris sensor specifications. The current UIDAI specifications prioritize ease of use, incorporating features like physical, video, and audio aids, along with feedback mechanisms. To ensure inclusivity in both aspects, the Unique Identification Authority of India (UIDAI) collaborated with a visually impaired consultant advocating for universal access. Practical measures, such as piloting enrollment camps for around 870 persons with disabilities, exposed technical and procedural challenges. UIDAI responded by organizing disability sensitization workshops and incorporating pertinent questions, demonstrating their dedication to an identity project that is both inclusive and accessible.

Have you or your partners specifically engaged stakeholders on the potential impacts of the digital solution?

Stakeholders can be either internal (various project staff and functions) or external (including government actors and project partners involved in the development or use of the digital solution, as well as civil society organisations, academic institutions and rightsholder groups). Engagement can take many forms, such as focus groups, in-person or virtual interviews, and public hearings.

Potentially impacted human rights and principles - Yes, internal stakeholders.

Right to an effective remedy and right to meaningful participation

Recommendation - Yes, internal stakeholders.

Engage with external stakeholders as soon as the initial internal analysis of potential negative human rights impacts has been made. Focus specifically on marginalized groups previously identified.

Comment - Yes, internal stakeholders.

Engaging with internal stakeholders is important in order for everyone to have a common understanding of the issues.

Potentially impacted human rights and principles - No

Right to an effective remedy and right to meaningful participation

Recommendation - No

Engage with both internal and external stakeholders on the topic of potential negative human rights impacts. Focus specifically on marginalized groups previously identified.

Comment - No

Engaging with internal stakeholders on the topic of potential unintended negative human rights impacts is important in order to build internal capacity and ownership of the management of risks. Engaging with external stakeholders is important to validate your findings and analysis with the stakeholders that might have further insights to potential impacts, including the rights holders that might be impacted.

Potentially impacted human rights and principles - I don't know

Right to an effective remedy and right to meaningful participation

Recommendation - I don't know

Consult with project partners and others involved in the development of the digital component and see whether there has been any engagement with internal and external stakeholders on potential negative human rights impacts.

Have rights holders, including non-users, and/or their legitimate representatives been engaged?

Rights holders include any individual whose rights might be impacted; this includes both intended users and non-users.

Comment - Yes, intended users and other rights holders

That means you have considered one significant aspect of stakeholder engagement, namely getting to hear the perspectives and thoughts directly from potentially impacted rights holders. The important aspect is to ensure that engagement has been meaningful.

Potentially impacted human rights and principles - Yes, intended users

Right to an effective remedy and right to meaningful participation

Recommendation - Yes, intended users

Develop a plan to engage with potentially impacted ‘non-users’ of the product or service.

Comment - Yes, intended users

It is important that potentially impacted ‘non-users’ are also engaged during stakeholder engagement. Otherwise there is a risk that you remain unaware of impacts on those who are not using the digital service or product (e.g. due to issues with accessibility).

Potentially impacted human rights and principles - No

Right to an effective remedy and right to meaningful participation

Recommendation - No

Develop a plan to engage with rights holders, including non-users and/or their legitimate representatives.

Comment - No

It is important that rights holders (users and others) are specifically engaged during stakeholder engagement. Otherwise there is a risk that you miss significant impacts that other external stakeholders (who are not rights holders) are not aware of.

Potentially impacted human rights and principles - I don't know

Right to an effective remedy and right to meaningful participation

Recommendation - I don't know

Consult with project partners and others involved in the development of the digital component and see whether engagement with rights holders has taken place.

Have they provided input on the potential impacts of the project?

This will be the case when the engagement consists not only of information sharing, but also of obtaining input from external stakeholders on the digital component and its potential impacts.

Comment - Yes

This is an important aspect of ensuring meaningful stakeholder engagement with regard to respecting human rights.

Potentially impacted human rights and principles - No

Right to an effective remedy and right to meaningful participation

Recommendation - No

Develop a plan for engaging with rights holders, focusing on obtaining their input on human rights risks and impacts.

Comment - No

It is important that any engagement with stakeholders, in general, and rights holders, in particular, is not one-way communication. Rather, it should be possible for rights holders or their legitimate representatives to provide input on the internal analysis as well as add their own perspectives on topics that might not have been covered in that analysis.

Potentially impacted human rights and principles - I don't know

Right to an effective remedy and right to meaningful participation

Recommendation - I don't know

Consult with project partners and others involved in the development of the digital component and see whether rights holders have had a chance to provide their own input.

Have the issues raised been addressed?

Consider what concerns were raised and assess whether any measures have been taken to prevent or mitigate those impacts.

Comment - Yes

That means you have considered one significant aspect of stakeholder engagement, namely to ensure that the engagement is meaningful and that it impacts the project implementation process as necessary. It is also important that you report back to the engaged stakeholders on what measures have been taken.

Recommendation - No, but the process of addressing such issues is underway.

Report back to rights holders previously engaged on the progress of addressing identified potential human rights issues, and provide a preliminary timeline.

Comment - No, but the process of addressing such issues is underway.

It is important that stakeholders, particularly marginalized rights holders, receive updates on the process so that they are able to assess whether adequate adjustments have been made. If actions are delayed, you should still report back to stakeholders with preliminary timelines, even if you are not yet able to report on what exact actions will be taken.

Potentially impacted human rights and principles - No, there are no such plans.

Right to an effective remedy and right to meaningful participation

Recommendation - No, there are no such plans.

Review the consultation material to see whether there are in fact adjustments and accommodations to be made to ensure that human rights impacts are adequately prevented, mitigated or otherwise addressed.

Comment - No, there are no such plans.

In order to ensure that stakeholder engagement, particularly with marginalized groups of rights holders, is meaningful, it is essential that potential and actual negative human rights impacts are addressed.

Potentially impacted human rights and principles - I don't know

Right to an effective remedy and right to meaningful participation

Recommendation - I don't know

Consult with project partners and others involved in the development and implementation of the digital component and see whether issues raised by rights holders have been addressed.

Have you or the project partner reported publicly on the potential impacts, mitigation measures, stakeholder engagement and other processes related to the questions in this tool?

This can include specific reports on the digital component in question, as well as larger reports that include information about the digital component, the potential impacts identified and other related activities. It is important to provide information that is relevant to external stakeholders, which is why the reporting should cover all of the topics mentioned.

Comment - Yes, on all of the topics.

It remains important to continue to update public communication as the project implementation and roll-out of the digital component progresses. A plan for further reporting and transparency should be developed, including how the information should reach the intended audience, which should include potentially impacted rights holders. Plan communication to rights holders so that the information is accessible to the various groups impacted by the project or component.

Comment - Yes, on some of the mentioned topics.

It is positive that some reporting has taken place in order to ensure greater transparency, which improves accountability in relation to the project in general, and the digital component, in particular. However, it is important that everything from identified impacts to the effectiveness of mitigation measures is reported in order to increase the accountability of projects.

Potentially impacted human rights and principles - Not yet. It is underway.

Right to meaningful participation and right to access to information

Comment - Not yet. It is underway.

It is important that stakeholders, particularly marginalized rights holders, receive information about the project implementation so that they are able to assess whether they agree with the impact analysis and whether appropriate actions have been taken.

Potentially impacted human rights and principles - No, there are no such plans.

Right to meaningful participation and right to access to information

Recommendation - No, there are no such plans.

Work with your partners and develop a transparency and communication plan around the impacts identified, engagement with stakeholders and planned preventive and/or mitigating actions.

Comment - No, there are no such plans.

Transparency is an important aspect of a rights-based approach to human rights impact assessments.

Potentially impacted human rights and principles - I don't know

Right to meaningful participation and right to access to information

Recommendation - I don't know

Consult with project partners and others involved in the development and implementation of the digital component and see whether any public reporting has occurred.


Other resources:

Human rights principles should guide development cooperation and programming. Among other things, this means that it is important to consider the principle of participation and inclusion in your project or in relation to the use of a digital solution. Transparency and reporting are essential in order to fulfill the principle, as they can also strengthen rights holders’ capacity to claim their rights. For more on a human rights-based approach, see:

  • Right to Participation

What has not been reported on?

Consider which of the topics listed have not been reported on in any shape or form. Reporting can also take the form of simple, direct communication to all impacted individuals, which can be a modest effort if it concerns only a small group of people.

Recommendation - Potential impacts

Draw up a plan on how to increase the transparency efforts in relation to the identified potential impacts.

Recommendation - Stakeholder engagement

Draw up a plan on how to increase the transparency efforts in relation to stakeholder engagement.

Recommendation - Mitigation measures

Draw up a plan on how to increase the transparency efforts in relation to mitigation measures.

Is there a mechanism in place to capture feedback, complaints or grievances by users and non-users of the digital project or solution?

It is important to have a mechanism for individuals (both users and non-users) to submit complaints or concerns about the digital project or solution. This can take the form of telephone hotlines, basic email accounts, chat services or physical mailboxes, among other things. In addition, GIZ projects need to appoint a data protection officer, which may be the corporate data protection officer. Such mechanisms allow potential human rights risks and impacts to be detected early and help identify those whose rights have been adversely affected so that remedy can be provided. See resources for further information on grievance mechanisms.

Comment - Yes, for users and non-users.

It is an essential aspect of respecting human rights to have a mechanism in place that can address grievances. Review the effectiveness criteria from the UN Guiding Principles on Business and Human Rights to see whether the existing grievance mechanism can be improved (see further resources).

Potentially impacted human rights and principles - Yes, for users only.

Right to an effective remedy and right to meaningful participation

Recommendation - Yes, for users only.

Review the grievance mechanism and the communication around it, ensuring that potentially impacted non-users can also access it.

Comment - Yes, for users only.

It is important that non-users are also able to raise their complaints. This can, for example, be the case when those who have not registered in an e-registration project are not heard, even though their concerns may be the most relevant ones for ensuring that potential human rights impacts are avoided or addressed.

Potentially impacted human rights and principles - Not yet. It is underway.

Right to an effective remedy and right to meaningful participation

Recommendation - Not yet. It is underway.

Structure a plan for when the grievance mechanism will be in place and ensure that all relevant rights holders are able to access the mechanism and that they are made aware of its existence once it is in place.

Potentially impacted human rights and principles - No, there are no such plans.

Right to an effective remedy and right to meaningful participation

Recommendation - No, there are no such plans.

Work with partners to develop a mechanism that is aligned with the effectiveness criteria outlined in the UN Guiding Principles on Business and Human Rights (see further resources).

Comment - No, there are no such plans.

It is an essential aspect of respecting human rights to have a mechanism in place that can address grievances.

Potentially impacted human rights and principles - I don't know

Right to an effective remedy and right to meaningful participation

Recommendation - I don't know

Consult with project partners and others involved in the development and implementation of the digital component and see whether any grievance mechanism exists, whether it is used, and by whom.

Thanks for participating

This is your results page.

Below you can find a summary of your project based on the answers you provided, as well as a compilation of the recommendations that have been given throughout the assessment. You will also be able to download a template human rights action plan to help you devise next steps to address the potential human rights impacts.

Please bear in mind that the assessment of the risk profile of your project and the recommendations given are based on your answers, and might be rendered inaccurate if there are specific features of your project affecting human rights risks that have not been taken into account in the assessment. You are encouraged to take these results as guidance and further consider the potential human rights risks and the appropriate action to take in light of the specific circumstances of your project.