Job loss gets the headlines. A factory closes. A white-collar role disappears. Automation takes another category of work and the story writes itself.
The quieter transformation attracts less attention: the millions of workers who still have jobs, but whose working conditions now run through software they cannot see, cannot question, and cannot appeal to the way they could appeal to a human supervisor.
This transformation has a name — algorithmic management — and a treaty framework designed to address it. The International Covenant on Economic, Social and Cultural Rights includes specific obligations about working conditions. The U.S. has signed but never ratified that treaty, which means American workers in algorithmically managed roles operate without the accountability layer that workers in more than 170 other states parties can invoke.
What Algorithmic Management Looks Like on the Ground
The term covers a range of practices. At one end: software that tracks productivity metrics and surfaces them to human supervisors. At the other: systems that make consequential decisions — discipline, termination, task assignment, pay — with no human review in the loop.
Documented practices in U.S. workplaces include:
Pace tracking in warehouse and fulfillment work. Productivity software monitors units processed per hour and flags workers who fall below targets. Research suggests that workers at some large fulfillment operations face pace requirements that limit bathroom breaks and recovery time (Strategic Organizing Center, The Injury Machine, 2021). Workers report that the system generates disciplinary action automatically when targets go unmet.
Deactivation in platform work. App-based drivers and delivery workers describe receiving automated account suspensions — effectively termination — based on rating drops or algorithmic anomalies, with limited recourse and no human review in the initial decision (see Congressional testimony, House Oversight, 2021).
Automated quality scoring in content moderation. Workers who review flagged content for major platforms describe accuracy targets enforced by scoring software that can affect employment status. The work itself carries documented psychological risk; the pace and oversight add additional stress.
Surveillance and location tracking. GPS monitoring, keyloggers, screen capture software, and biometric time clocks now appear in a broad range of workplaces from trucking to remote office work.
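The structural pattern these accounts share — a consequential decision issued by software with no review step — can be reduced to a deliberately simplified sketch. Everything here is hypothetical: the threshold, the strike count, and the function names are illustrative inventions, not the logic of any actual employer's system, which is proprietary and undisclosed.

```python
from dataclasses import dataclass

# Hypothetical values for illustration; real systems use
# undisclosed, employer-set targets.
UNITS_PER_HOUR_TARGET = 100
STRIKES_BEFORE_TERMINATION = 3

@dataclass
class WorkerRecord:
    worker_id: str
    units_per_hour: float
    strikes: int = 0
    active: bool = True

def automated_review(record: WorkerRecord) -> str:
    """One review cycle, with no human in the loop.

    Note what is absent: no appeal step, no context (illness,
    equipment failure, a hard shift), no discretion.
    """
    if record.units_per_hour >= UNITS_PER_HOUR_TARGET:
        return "ok"
    record.strikes += 1
    if record.strikes >= STRIKES_BEFORE_TERMINATION:
        record.active = False          # automated deactivation
        return "terminated"
    return f"warning {record.strikes}"
```

Three consecutive below-target cycles deactivate the account with no point at which a person intervenes — which is precisely the design choice, not a bug, that the worker accounts above describe.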
No U.S. law categorically prohibits these practices. Some attract regulatory scrutiny under OSHA, the NLRA, or state law. But no binding floor governs the specific conditions algorithmic management creates.
What Article 7 Actually Requires
ICESCR Article 7 commits ratifying states to recognize “the right of everyone to the enjoyment of just and favourable conditions of work,” specifying:
- Fair wages and a decent living for workers and their families
- Safe and healthy working conditions
- Equal opportunity for advancement based on seniority and competence
- Rest, leisure, and reasonable limitation of working hours, including periodic holidays with pay
The CESCR’s General Comment 23 (2016) extended Article 7 analysis to platform and non-standard work, clarifying that states must address working conditions regardless of employment classification. A worker deactivated by an algorithm and a worker dismissed by a supervisor both fall within the treaty’s scope.
Article 7 does not prohibit automation. It requires that when states permit working conditions to exist — algorithmic or otherwise — those conditions meet the just and favorable standard, and that states can demonstrate they have taken steps to ensure this.
That demonstration happens through CESCR review cycles: reporting obligations every five years, civil society shadow reports, and public recommendations from the treaty body. It creates a paper trail.
The Accountability Gap
The United States signed the ICESCR in 1977 under President Carter. The Senate has never ratified it.
Non-ratification means no CESCR review cycle. The U.S. never files the periodic report that would describe, for international scrutiny, how working conditions in AI-managed sectors compare to Article 7 standards. No UN body issues recommendations. No shadow report process exists. No human rights mechanism can cite treaty obligations in U.S. courts.
The gap does not mean working conditions are worse in the U.S. than in ratified states. It means the accountability infrastructure does not exist. Whether algorithmic management meets the "just and favourable" standard in the ICESCR sense remains a question the U.S. has no formal obligation to answer.
For workers, this matters practically. Civil society organizations in ratified states use the CESCR reporting cycle to document conditions, surface findings internationally, and create domestic political pressure through the reporting process itself. That lever does not exist for U.S. workers in AI-managed roles.
What Ratification Would Change
Ratification would not immediately change a single workplace practice. The ICESCR creates obligations for states, not employers directly. What it would change:
A reporting obligation. The U.S. would file periodic reports on working conditions, including in algorithmically managed sectors. Preparing those reports requires looking.
A review process. CESCR would issue recommendations — public, indexed, cited in subsequent reports. A recommendation on algorithmic management working conditions would have political weight domestically.
Shadow report access. Labor organizations, worker advocacy groups, and researchers could file shadow reports documenting conditions that the official report omits. The treaty creates a formal channel for those voices.
An international standard to cite. Advocates and legislators could point to CESCR findings the way environmental advocates point to IPCC reports — not binding on Congress, but changing the terms of the argument.
The Question Worth Asking
Algorithmic management is growing. The workers it affects span warehouse fulfillment, platform delivery, content moderation, retail, insurance claims processing, and a growing share of remote office work. The systems automating their conditions will not become less sophisticated.
The U.S. has a choice about whether those workers have access to an international accountability mechanism. Forty-nine years after President Carter signed the treaty, that choice remains open.
If you want to engage on this issue, the action guide describes how to contact your senators and includes template letters you can personalize.
Part of the ICESCR Article Series — examining each of the treaty’s substantive articles through the lens of AI economic displacement.
EPISTEMIC FLAGS
- SOC report and Congressional testimony references drawn from knowledge base; specific document titles, dates, and testimony details have not undergone verification against primary sources
- “Millions of workers” subject to algorithmic management represents a scale claim without a specific cited source — verify current figures before citing in research contexts
- The characterization of algorithmic management practices (dynamic scheduling, automated termination, performance scoring) draws from documented industry reports but specific prevalence rates vary by sector and employer
- CESCR General Comment 23 (2016) on Article 7 interpretations cited from knowledge base; specific paragraph numbers have not undergone verification against official OHCHR text
Published by unratified.org · CC BY-SA 4.0