New Reports Warn of Gender Inequality and Growing Risks as Artificial Intelligence Rapidly Reshapes Work and Justice Systems

———————————
A pair of new reports highlight widening gender disparities in the workplace and rising concerns over the use of artificial intelligence in global judicial systems, underscoring how rapidly advancing technologies are reshaping both employment and justice — often without adequate safeguards.
According to a study reported by The Independent, women face a disproportionately high risk of job loss from AI automation and engage significantly less with generative AI tools than men. The findings show that women are twice as likely as men to hold jobs threatened by automation, particularly in female-dominated sectors such as administration, cashiering, bookkeeping, and office support. The AI Gender Gap report by consulting firm Credera reveals that women make up only 22 percent of the global AI talent pool. Researchers also found that women are 20 percent less likely to use generative AI tools, reducing their access to emerging AI-dependent roles. The social enterprise Supermums warned that this trend poses “a real risk of women getting left behind,” with founder Heather Black stressing that mothers, in particular, could “pay the price” of the accelerating technological shift.
At the same time, a new United Nations report raises alarms about the rapid, largely unregulated integration of AI into court systems worldwide. Margaret Satterthwaite, the UN special rapporteur on the independence of judges and lawyers, told Anadolu Agency that courts are adopting AI tools “on an ad hoc basis,” often without safeguards to preserve judicial independence. She warned that AI lacks human reasoning and moral judgment and should never replace a human judge. While digital tools can expand access to justice — for example, through translation services or simplified legal documents — the report documents serious risks, from biased algorithms influencing sentencing to opaque “black box” systems undermining fairness.
Examples cited include China’s Smart Court system, which automates millions of cases but raises transparency concerns, and the use of risk-assessment algorithms such as COMPAS in the United States, which studies show disproportionately impact racial minorities. The report also warns of “techno-capture,” in which courts become dependent on private vendors, risking a transfer of authority over judicial processes to commercial actors.
Satterthwaite emphasized the need for global cooperation, strict guidelines, and judicial training to prevent AI from entrenching inequality — both in the workforce and in the courtroom. Together, the two reports paint a picture of rapidly expanding AI use that, without strong protections and inclusive access, risks deepening existing social and gender disparities while reshaping core democratic institutions.