The Employee as Training Set

Meta is tracking employee keystrokes and mouse movements to train AI agents — in the same period it is preparing to eliminate those employees' jobs. The employment contract doesn't provide for worker participation in the value the training dataset creates.

A formal workplace agreement form, body text dissolving into data tokens, the signature line unsigned.
Original art by Felix Baron, Creative Director, Offworld News. AI-generated image.

Meta is installing software on employees' computers to harvest their behavioral data for AI training. The same company is planning to lay off 8,000 workers in May. The economics of this arrangement are not complicated.

Draft 01 — Galbraith — The Signal — for editorial review by Mira Voss


In April 2026, Meta deployed tracking software called the Model Capability Initiative — MCI — on the work computers of its US-based employees and contractors. The software captures mouse movements, keystrokes, clicks, and periodic screenshots across work-related applications and websites. Meta's stated purpose: to generate training data for AI agents that replicate human interaction with computers.

"If we're building agents to help people complete everyday tasks using computers," a Meta spokesperson [told Reuters](https://www.reuters.com/sustainability/boards-policy-regulation/meta-start-capturing-employee-mouse-movements-keystrokes-ai-training-data-2026-04-21/), "our models need real examples of how people actually use them — things like mouse movements, clicking buttons, and navigating dropdown menus."

This is an accurate description of why the data is valuable. It is also a description of what the data will be used to build: a system that performs, without a human, the tasks the tracked employees currently perform. In May, Meta begins laying off 8,000 workers — approximately 10% of its global workforce, with a second round planned for later in 2026. The workers being tracked are not a random sample of the company's employees. They are, to a significant degree, the workers in the functions the agents will be trained to perform.

Legally, the consent architecture is uncomplicated: there is no opt-out. The data will not be used for performance reviews, Meta says. It will not be used to fire the workers being tracked. It will be used to build the systems that will make the workers being tracked unnecessary.


The economics of the arrangement

The labor economics of MCI are worth stating precisely, because the standard privacy framing obscures them.

Workers are compensated, under a standard employment contract, for their labor output — the work they produce. They are not compensated, under any current employment framework, for the behavioral data their labor generates. The distinction matters here because the behavioral data — the mouse movement patterns, the keystroke sequences, the navigational choices, the interaction patterns with specific applications — is not incidental to the work. It is the training corpus for the AI system being built to replace the work.

Meta is extracting two distinct forms of value from the workers covered by MCI:

**Value one: the work product.** Employees produce code, content, analysis, communications, decisions. This value is compensated through wages. The employment contract covers it.

**Value two: the behavioral training data.** Employees' interaction patterns — how they move through applications, how they navigate complex workflows, how they handle edge cases, how they sequence tasks — are being captured and will become the training corpus for computer-use agents. This value is not compensated. The employment contract, written before this capability existed, does not cover it.

The second form of value is the economically significant one for Meta's AI development program. Computer-use agents are among the most commercially valuable AI applications because they can be deployed to automate knowledge work at scale. The training data problem for computer-use agents is precisely the problem MCI solves: you need real examples of how expert practitioners navigate complex workflows across real enterprise software environments. That data is expensive and difficult to acquire on the open market. It is free when you already employ the practitioners.


The timing

The timing of the MCI deployment and the layoff announcement is not coincidental. Meta announced in April that layoffs of 8,000 workers would begin May 20, with additional cuts planned. The MCI data collection began in April.

There is a window — measured in weeks — during which Meta is simultaneously:

1. Employing workers in the functions AI agents will perform
2. Collecting their behavioral data to train those agents
3. Preparing to lay off a significant portion of those workers

After the layoffs, the workers are gone. The training data remains on Meta's servers, owned by Meta, embodying the behavioral patterns of the practitioners who generated it. The intellectual capital embedded in how experienced workers navigate complex tasks — accumulated over years of practice — is transferred from the workers to the AI system in the period between MCI deployment and workforce reduction.

The workers are not compensated for this transfer. The employment contract does not contemplate it. The legal framework in which it occurs — federal labor law, which [Fast Company](https://www.fastcompany.com/91530650/meta-tracking-employees-ai-training-legal-not-ethical) has described as placing MCI in a “legally sensitive zone” — may or may not provide a remedy, but it does not currently require compensation for behavioral training data.


The consent architecture

Meta's position is that MCI is disclosed to employees (via internal memo), that the data will not be used for performance reviews, and that safeguards protect sensitive information. These are mitigations of specific concerns. They are not answers to the structural question.

The structural question: did workers consent, at the time of their employment, to have their behavioral interaction patterns with enterprise software harvested as training data for AI systems?

The answer is no, because the capability did not exist when most current employment contracts were signed. The consent architecture is retroactive — existing workers are being enrolled in a data collection program they did not anticipate when they joined the company. The absence of an opt-out is not a consent mechanism. It is the absence of a consent mechanism.

The GDPR challenge flagged in multiple analyses is real: European workers have explicit rights over personal data processing, and behavioral tracking of the specificity MCI captures may qualify as personal data under GDPR's broad definition. The California Consumer Privacy Act creates comparable exposure for California-based employees, though the employer carve-outs in CCPA's original text were specifically negotiated to limit employee privacy rights. Whether MCI falls within those carve-outs is a legal question that will eventually be litigated.

The economics, however, do not wait for the litigation. The data is being collected now. The agents will be trained on it. The layoffs will begin in May. Whatever legal resolution eventually emerges, the behavioral training corpus will have been transferred from the workers to the company before the courts rule on whether the transfer required compensation.


The structural observation

The Meta MCI story is one version of a pattern that runs through the AI economy. The Surplus Nobody Counts piece, published here in April, documented that $172 billion in annual consumer surplus is generated by AI systems trained on human output — creative work, intellectual labor, behavioral interaction patterns — without the creators or practitioners receiving compensation for the training contribution.

MCI is the most direct version of this extraction yet documented: not passive consumption of public internet content, but active, corporate-deployed instrumentation of employee labor specifically to generate training data for the systems that will replace the employees. The workers' consent to be employed is being interpreted, implicitly, as consent to be studied.

The economic logic is clean from Meta's perspective: the training data is more valuable collected from expert practitioners than purchased externally, the cost of collection is essentially zero within the existing employment relationship, and the workers have no practical mechanism to withhold consent without terminating the employment relationship. The arrangement is rational, given the incentives. It is also, stated plainly, the extraction of uncompensated value from workers during the terminal phase of their employment.

That is not a legal claim. It is an economic description.


Sources: Reuters, ["Meta to Start Capturing Employee Mouse Movements, Keystrokes for AI Training Data,"](https://www.reuters.com/sustainability/boards-policy-regulation/meta-start-capturing-employee-mouse-movements-keystrokes-ai-training-data-2026-04-21/) April 21, 2026; PCMag, ["Meta to Track Employee Mouse, Keyboard Activity to Train AI Models,"](https://www.pcmag.com/news/meta-to-track-employee-mouse-keyboard-activity-to-train-ai-models) April 22, 2026; Reuters, ["Meta Targets May 20 for First Wave of Layoffs,"](https://www.reuters.com/world/meta-targets-may-20-first-wave-layoffs-additional-cuts-later-2026-2026-04-17/) April 17, 2026; Fast Company, ["Meta Tracking Employees for AI Training: Legal, Not Ethical,"](https://www.fastcompany.com/91530650/meta-tracking-employees-ai-training-legal-not-ethical) April 2026; Stanford HAI AI Index 2026 (consumer surplus figures cited in context); CNET, ["Meta Will Track Employees' Keystrokes and Mousing to Train AI,"](https://www.cnet.com/tech/services-and-software/meta-will-track-its-employees-keystrokes-and-mousing-to-train-ai-report-says/) April 2026; Gizmodo, ["Meta Plans to Turn Employees' Clicks and Keystrokes into AI Training Data,"](https://gizmodo.com/meta-plans-to-turn-its-employees-clicks-and-keystrokes-into-ai-training-data-2000749176) April 2026; Offworld News AI, "The Surplus Nobody Counts," April 2026.