Meet the new gig work behind AI, same as the old gig work
The machine magic of AI relies on one of the oldest technologies of all: human labor. This week we look at Scale AI, one of the companies behind the booming data annotation industry.
A few weeks ago, details came out about Meta’s plan to acquire a 49 percent stake in the company Scale AI for a sizeable $14 billion.
The move was hailed as a sign that the company was serious about competing in the AI wars, serving as a shot across the bow to competitors like Google and OpenAI, which have had business arrangements with Scale AI in the past.
For this week’s newsletter, we wanted to look at Scale — what it does, how it operates, and how it is a key node of the unseen web of human laborers that power artificial intelligence systems.
Scale is one of the best-known brands of a certain niche in the AI industry: companies that help train and build AI models through the use of human annotators, who label and define pictures, videos, images and text to help AI systems learn and grow more advanced.
There is big money to be made here, as vast sums of capital flow to AI: Scale is reportedly forecasting $2 billion in revenue this year, with an expected valuation of $25 billion. Some estimates project the data collection and annotation industry will reach $17.1 billion by 2030, and studies from 2021 and 2022 estimate that millions of people have done data annotation work, at least temporarily.
The work appears to be another example of the tech industry creating the illusion of seamlessness and futuristic magic — in this case, artificial intelligence — while relying on a complicated and potentially illegal business model powered by a very old technology: humans.

It’s worth noting that Scale initially described itself as an “Uber for AI.” The comparison is apt: the work appears to be raising some of the same labor concerns that trailed Uber and other rideshare and delivery companies.
Like app-based rideshare or delivery work, annotation jobs are also arranged as gigs online: paid by the hour or sometimes by the task. In the U.S., the work can pay about $20-$30 an hour for basic tasks, and even more for subjects that require a higher knowledge threshold, like chemistry and coding. But jobs are increasingly outsourced to countries like Kenya and the Philippines, where pay standards and labor regulations are weaker, and where the operations have earned the nickname “digital sweatshops.”
Equal Times, a European-based publication that is funded in part by international labor groups, described a visit to an office run by a subsidiary of Scale AI in Cagayan de Oro in the Southern Philippines, where workers get paid just a few cents per task. One worker said they made about $7 a day: “It’s less than the legal minimum, and I have no social protection, but I don’t have a choice,” the worker said, citing the lack of jobs in the area.
In Cagayan de Oro, the Californian firm has set up an enormous headquarters in a soulless building in the city centre. Inside, once past security, a veritable hive is revealed to the visitor. Dozens of windowless rooms, each protected by digital access locks, have been filled with as many computers as possible. Day and night, several hundred data workers are trained under the stern eyes of supervisors…Inside the work room, around 20 other young people are crammed into a space measuring about 15 square metres. Within a few weeks, they’ll all be sent back home to work online.
According to a detailed report in The Washington Post about Scale AI in the Philippines, workers also routinely face payment problems: wages delayed, withheld or smaller than what they believe they are owed, with little oversight or recourse from regulators, and, in some cases, unexplained punishments from the company when workers choose to speak up.
Here in the U.S., annotation work can also be precarious, underpaid for the amount of work required, and subject to the unseen and opaque whims of a big and powerful company. Like the gig work models that came before it, the industry is bumping head-first into labor laws.
A group of workers filed a lawsuit against Scale AI late last year alleging that they have been misclassified as contractors when they should be employees. Employees enjoy a wider range of rights and benefits than contractors, who are typically not entitled to minimum-wage protections, mandatory overtime, unemployment insurance, workers’ compensation and other benefits. But those protections raise labor costs for companies. In response, Scale has said that it is “committed to ensuring we are in full compliance with all applicable laws and regulations.”
This week, I spoke with Glenn Danas, a partner at Clarkson Law Firm, which is representing workers in the ongoing misclassification lawsuit against Scale AI. Our interview has been condensed and lightly edited.
Hard Reset: Tell us a bit about where this lawsuit originated from.
Glenn Danas: Some independent contractors of Scale started reaching out to us in 2024. And because I'm well familiar with AB 5 and the [Dynamex ruling] and the different ways in which the ABC test had become the focal point of employment in California, we obviously realized there was an issue here.
HR: The ABC test in California presumes all workers are employees unless a company can prove three conditions: that workers A) are free from the control and direction of the company; B) perform work that is outside the company’s main business; and C) are customarily engaged in an independently established trade or business of the same kind as the work they perform for the company.
Can you lay out the basics of your allegations that these workers are full-time employees and not gig workers?
GD: When we started talking to folks, we learned that [Scale AI] was essentially micromanaging the job and on top of that, really putting a great deal of pressure and coercion onto workers to do the work that they were being given. We were often hearing that folks were getting six hours of work to do in four hours and that sort of thing. That's a litmus test really for employment. If you're a traditional independent contractor, if someone comes into a business to fix the toilet, for example — they're a plumber and they're in a totally different line of work. I would never tell them how to do their job, I would just sort of leave it to them to do in as much or as little time as they needed. And that is not how the job of data annotation goes at Scale…
The one that they generally fail is prong B, which is the one that says that they're in a different line of work than the company. Scale would have to show that these data annotators are doing work of a different variety than is at the core of what Scale AI does, and there's simply no way that they're going to be able to do that. The rideshare companies used to try and say that, well, we're not actually providing rides, we're really just a computer-based sort of matchmaking service. Or they had these different kinds of semantic games that were all rejected by the courts…
Even if we're under what's called the Borello test, the common law test that preceded AB 5, we're seeing just a very high degree of control. These folks have to put onto their home laptop or whatever software that Scale AI controls, in terms of the work that they're doing…Scale can look in on the work they're doing at any time, throw them off the platform at any time for no reason at all. They run a Slack channel that basically is a highly controlled version of communications. So even under the tougher standard, I think we would have a pretty easy case of proving that they're employees.
HR: What did you hear from the Scale workers you spoke with?
GD: The stuff that people generally call about would be like, “I am being mistreated. They promised me a certain wage and they're paying me a different wage.” Or, “They are paying me no wage.” “I got thrown off of, or closed out of the system, or such and such person that I work with asked some questions about work and then all of a sudden they disappeared from the Slack channel.” Or, “I'm being asked to review terrible content about child pornography or rape fantasies or all these terrible things and it's really taking a toll on my mental health and I really was not expecting this.” … Through that thicket of different problems, the overarching stuff … is that you all are being treated as independent contractors when you're clearly employees … We also have a separate suit based on the sort of outrageous content that they're being asked to review.
HR: How many people are part of the misclassification suit?
GD: We don't know exactly yet, but based on what we do know and California's share of the national workforce and all these different things, I would take a guess that there are, at any given time, let's say 12,000 to 15,000 annotators in California, which of course would not include ones who are no longer annotators. It's a PAGA (Private Attorneys General Act in California) action which is like a class action, but [different in some significant ways,] one of which being that it creates a statutory penalty for misclassification, which is very, very large.
HR: We saw a furious lobbying and public relations effort to change public opinion — and policy — when rideshare and delivery companies like Uber, Lyft, and DoorDash began facing regulatory heat for their labor model a few years ago. Those companies finally succeeded in passing a proposition that exempted them from these labor laws, spending hundreds of millions of dollars in the process.
Given how much capital there is behind AI, would you anticipate a similar effort from AI companies to exempt themselves from labor laws around classification?
GD: I would absolutely expect it. I haven't heard about it yet. I mean, I certainly hope that they don't, and if they try, I hope it's not successful…But yeah, there's a ton of money and I can see them trying to do the same thing.
HR: What does it look like if you guys win a lawsuit, and how would it change things for workers at Scale?
GD: It would be twofold. It would be one to be made whole financially for everything that's happened already. So paying them money that they were owed, whether it be in the form of wages, or in the form of breaks, or wage statements or other stuff that they were entitled to that they didn't get. And going forward, to make changes such that they treat them like employees and say, you're going to be entitled to different protections that the labor code provides.
HR: Would you guys agree to an outcome that involved the workers staying as independent contractors?
GD: I mean, that's hard to say. Probably not and I doubt that Scale would want that, because frankly, if they continue to do that, they can just be hit with another suit. California law is what it is, and they're a California company.
HR: Does the Meta investment change anything for you and these workers?
GD: Not legally. I think it may provide more of an impetus for them to want to clear the decks of litigation liabilities like this one. So I think it's possible that that provides more of a motivation or a reason to want to settle, but we don't know.
HR: Thanks for chatting with us, Glenn.