Algorithms Decide How Gig Workers Work In India, Study Finds


Algorithms now control nearly every aspect of work for India’s gig and platform workers, from how many tasks they must complete each hour to whether they can take a break, earn incentives, or even keep their jobs, according to a new field study by the National Law School of India University’s Centre for Labour Studies and IT for Change.

The study, titled “There’s no ghost in the machine”, is based on interviews with warehouse workers, delivery partners, ride-hailing drivers, urban service workers, and union organisers. It finds that algorithms are not acting independently; instead, companies use them alongside human managers to intensify work, enforce discipline, monitor workers continuously, and limit collective action, often without clear accountability.

Further, the study argues that current laws regulating gig and platform work in India largely ignore how algorithmic management actually operates, leaving workers exposed to opaque, unfair, and punitive systems of control.

Targets, Ratings and Surveillance Shape the Workday

The study documents how algorithmic systems set strict productivity targets that workers say are often impossible to meet. At Amazon’s Manesar warehouse, the company’s Associate Development and Performance Tracker (ADAPT) continuously measures productivity, and workers face penalties if they miss targets or make mistakes detected on CCTV.

One Amazon warehouse worker told researchers: “People doing scanning of products have targets, 1200 products to be scanned per hour, and the person scanning also has to load, meaning they have to keep throwing products into the truck. They have to do both things, but the pressure does not account for that, 1200 products are a must.”

A 2024 survey by UNI Global Union and the Amazon India Workers Association supports these findings: four out of five Amazon warehouse workers reported that targets are difficult or very difficult to achieve.

Platforms Use Algorithms to Remove Worker Choice

On location-based platforms, the report finds that algorithms increasingly decide when and how workers must work. Urban Company’s “auto-acceptance” feature automatically assigns jobs to workers without allowing them to accept or reject bookings. While opting into this feature may increase daily earnings, workers lose control over breaks, working hours, and cancellations.

A union organiser quoted in the study said: “If opting for auto acceptance, it is really difficult for a worker to take breaks, even though earnings might be more in a day.” Meanwhile, cancelling jobs after opting into auto-acceptance can result in penalties or account suspension, even in cases of illness.

Incentives and Ratings Work One Way

On platforms such as Zomato and Swiggy, incentive systems tie earnings to targets and peak-hour availability, but they often work against food delivery workers. Bengaluru-based Zomato workers told researchers that the company sometimes reduces orders for workers who are close to meeting incentive targets, making it harder for them to qualify for bonus payments. One worker said: “A worker who is close to clinching the incentive is not allotted an order even if the pickup location is close by.”

Ratings also play a central role in determining access to work. Platforms categorise workers into tiers such as bronze, silver, and gold, which affect slot booking, job priority, and income stability. However, gig workers told researchers that high ratings do not reliably lead to better earnings, while low ratings almost always result in penalties.

An Urban Company worker remarked: “If we get good ratings, we have not gotten more benefits or incentives. But if we get bad ratings, there is always backlash and punishment, like our ID being shut down.”

Constant Surveillance Stretches the Limits of Productivity

The study finds that platforms use data and surveillance tools not only to track performance but also to control worker behaviour and organisation.

Food delivery workers in Bengaluru claimed that platforms use GPS dashboards to identify where workers gather and then disperse them by assigning orders in different directions. One worker said: “Swiggy officials are able to identify where and how many workers are clustered or congregated in a particular area.”

Meanwhile at Amazon warehouses, workers described an environment of near-total monitoring, with thousands of cameras and software tracking “idle time” down to milliseconds.

A warehouse worker said: “Idle time is the period when you’re not working actively, for example, the millisecond recorded when shifting your mobile phone from one place to another is counted as idle time.”

Gig Workers Understand the Algorithm, But Cannot Challenge It

A key finding of the study is that workers are not unaware of how algorithmic systems function: many can identify patterns in task allocation, ratings, and penalties. What they lack, the study finds, are effective ways to challenge these decisions.

The same companies that deploy the algorithms often control the app-based grievance redressal mechanisms, and workers said that these companies ignore, hide, or dismiss complaints by referring back to the system itself.

An Amazon worker said: “We are told to raise our grievances through the ‘My Voice’ app… but in reality, it rarely works in our favour.” Even when human managers are involved, workers say the managers often defer to algorithmic decisions or provide misleading explanations.

Laws Offer Little Protection To Gig Workers

The study notes that recent state laws in Karnataka, Telangana, Jharkhand, and Rajasthan recognise automated decision-making systems but stop short of regulating them meaningfully.

For context, these laws allow workers to request limited information about algorithmic decisions but do not:

  • Allow workers or unions to challenge the fairness of those decisions,
  • Provide compensation or remedies,
  • Or create independent oversight bodies.

The study also highlights that these laws focus only on individual rights, leaving trade unions and worker collectives without access to algorithmic information or decision-making processes.

Study Calls for Stronger Regulation

The study’s authors argue that regulating platform work without addressing algorithmic control leaves a major gap in labour protection.

As such, they recommend:

  • Mandatory worker participation in algorithm design and deployment,
  • Institutional oversight mechanisms such as public registries of workplace algorithms,
  • Limits on data collection and surveillance,
  • And clear bans on certain practices, including automated deactivations and wage suppression through algorithms.

The report concludes that algorithmic systems have intensified existing power imbalances rather than modernising work, and that transparency alone is not enough to protect workers.

Unions Have Already Flagged Algorithmic Opacity

Worker unions have long warned that opaque algorithms leave gig workers vulnerable to arbitrary pay cuts, task denial, and sudden deactivations.

In January 2025, the Indian Federation of App-Based Transport Workers (IFAT) told the Ministry of Labour that workers lack clarity on how platforms determine earnings, ratings, task allocation, and suspensions. Despite these concerns, existing laws offer limited transparency and no meaningful remedies against unfair algorithmic decisions.

Elsewhere, Karnataka’s 2025 gig worker law requires platforms to explain how algorithms affect pay, task allocation, and account deactivations. While this marks a key step toward transparency, the law leaves enforcement vague, omits collective rights for unions, and does not guarantee meaningful appeals or protections against retaliation.

Consequently, workers and experts warn that without clear standards, these disclosures risk remaining largely symbolic.
