Denmark

Denmark’s AI Welfare System Faces Criticism for Discrimination

Amnesty International has raised alarms about Denmark’s use of artificial intelligence in its welfare system, warning that it could discriminate against marginalized groups, including people with disabilities, low-income individuals, and migrants.

The report, titled “Coded Injustice,” reveals how fraud detection algorithms and mass surveillance practices undermine privacy and create fear among beneficiaries.

Hellen Mukiri-Smith, an Amnesty researcher, emphasized that the system targets rather than supports the very people it is meant to serve. The algorithms, which analyze sensitive personal data, can trigger arbitrary investigations based on non-traditional living arrangements or ties to foreign countries, exacerbating existing societal inequalities.

Amnesty calls for a ban on using sensitive data in fraud risk assessments and urges the Danish authorities to ensure transparency and oversight in algorithm development. The organization stresses that Denmark must uphold its legal obligations to protect human rights, including privacy and non-discrimination.