Allow workers, their representatives and public interest groups to test how the algorithms work

This could be done by providing API access to a sandboxed version of the system, open-sourcing the key algorithms used on the platform, or providing access to anonymised or synthetic data that accurately reflects the behaviour of the system. Companies should also consider sharing their source code and training datasets directly to further improve transparency and accountability.
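To illustrate what testing via a sandboxed API could look like in practice, the sketch below probes a scoring system with synthetic worker profiles, holding everything fixed except one attribute to check whether it influences the output. The `sandbox_score` function, the profile fields, and the scoring formula are all hypothetical stand-ins; a real sandbox would expose scoring over an API rather than as a local function.

```python
import random

def sandbox_score(profile: dict) -> float:
    # Hypothetical stand-in for a platform's sandboxed scoring endpoint.
    # Toy model: the score depends only on job history and acceptance rate.
    return 0.6 * min(profile["jobs_completed"] / 100, 1.0) + 0.4 * profile["acceptance_rate"]

def synthetic_profiles(n: int, seed: int = 0) -> list:
    # Synthetic records standing in for anonymised worker data.
    rng = random.Random(seed)
    return [
        {"jobs_completed": rng.randint(0, 200),
         "acceptance_rate": rng.random(),
         "postcode": rng.choice(["A", "B", "C"])}
        for _ in range(n)
    ]

def probe(attribute: str, values: list) -> dict:
    # Vary a single attribute while holding the rest of the profile fixed:
    # if the scores differ, that attribute influences the algorithm.
    base = {"jobs_completed": 50, "acceptance_rate": 0.8, "postcode": "A"}
    return {v: sandbox_score({**base, attribute: v}) for v in values}

scores = probe("postcode", ["A", "B", "C"])
# Identical scores across postcodes would suggest the attribute is unused.
print(scores)
```

Repeating such probes across many attributes and synthetic profiles is the kind of iterative testing that one-off access would not permit.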

While public access would be the gold standard, a more limited approach may be appropriate, in which only worker representative bodies or recognised academic or civil society organisations can access the testbeds, potentially via collective agreements or licensed access. Workers' ability to test how the algorithms work cannot be confined to a single occasion; they should be able to trial the system repeatedly.

Adequate documentation should be provided to enable use of these resources. Different algorithms may require different processes: no specific auditing regime should be mandated, as the focus should be on providing a pragmatic, flexible and effective way of engaging with how the algorithms work in practice.

About us

This campaign was launched by Privacy International (PI) on 21 January 2025 with the support of 12 organisations.

PI is a London-based non-profit, non-governmental organisation that researches and advocates globally against government and corporate abuses of data and technology. It exposes harm and abuses, mobilises allies globally, campaigns with the public for solutions, and pressures companies and governments to change. PI challenges overreaching state and corporate surveillance so that people everywhere can have greater security and freedom through greater personal privacy.
