In a competitive marketplace, protecting India’s gig workers
Outdated, static mechanisms such as grievance redressal officers or onerous labour laws cannot keep pace with the gig economy and its workers' complaints. Instead, we can harness the power of technology to improve trust between platforms and gig workers.
In the past few weeks, anonymous Twitter accounts such as Swiggy DE and DeliveryBhoy have made allegations regarding issues faced by delivery partners of food delivery apps. These include low payouts, opaque payout calculations and alleged cheating, unexplained differences in surge rates, order clubbing and assignments to avoid incentive pay, and zone extensions to avoid return bonuses.

Swiggy and Zomato, which offer delivery work to more than 360,000 gig workers, have responded to these allegations by insisting that earnings per order are much higher than alleged, and that full-time delivery personnel earn over ₹20,000 per month.
India’s gig economy is among the few sectors offering flexible work to unemployed millions. In her 2021 Union Budget speech, finance minister Nirmala Sitharaman mentioned the creation of a database of gig workers and extending social security to them. It is important, therefore, to examine these grievances and design policy mechanisms that protect worker rights.
Many of the grievances arise because of a trust deficit between the gig workers and the platforms. India has protected workers through heavy-handed industrial regulation and archaic labour laws, which suit the factory floor. They are irrelevant, insufficient, and ineffective in addressing disputes that originate on these platforms.
With the apparent oversupply of gig workers, the platform’s incentive is to deliver orders at the lowest marginal cost (a large component of which is gig worker fees) while keeping the customer happy. This task is assigned to algorithms. An analysis of the grievances suggests that many are linked to the way gig work is assigned (denial of high-profit surge or incentive-linked orders), performed (clubbing orders, zone boundaries), and rewarded (complex, multifactor payment calculations).
There are several factors in each of these algorithmic decisions. Work allocation can be based on weather, restaurant and customer locations, traffic, prevailing wages, and the available worker pool. The algorithms that make these decisions are flexible, learning algorithms that can account for the constantly changing input. Machine Learning (ML) and multi-factor optimisation techniques support millions of orders every day.
Crucially, most of these techniques are black-box — their inner workings are unknowable, even to the engineers who design them. Such algorithms are known to include biases: research has shown that ML algorithms pick up pre-existing biases from their training data.
For example, a profit-maximising ML algorithm may deny orders to gig workers that are eligible for incentives, even without being programmed to do so. As a result, trust between the gig worker and the platform suffers.
However, outdated, static mechanisms such as grievance redressal officers or onerous labour laws cannot keep pace with the gig economy. Instead, we can look to harness the power of technology towards improving trust between platforms and gig workers.
Algorithm audits are one such technique, where an auditor has access to the algorithms and examines the results they produce. Suitably qualified auditors could uncover implicit or explicit biases, or other shortcomings of such algorithms, using computational and statistical techniques.
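As a rough illustration of the statistical side of such an audit, consider comparing the rate at which two groups of workers receive surge orders. The sketch below uses a standard two-proportion z-test on entirely hypothetical counts; the field names and numbers are illustrative assumptions, not data from any real platform.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z-statistic comparing two assignment rates.

    x = count of surge orders received, n = total orders offered
    to that group of workers.
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical audit-log summary: workers close to an incentive
# threshold received 10 surge orders out of 100 offers; other
# workers received 30 out of 100.
z = two_proportion_z(10, 100, 30, 100)

# |z| > 1.96 flags a gap that is statistically significant at the
# 5% level -- a signal for the auditor to investigate further,
# not proof of bias by itself.
if abs(z) > 1.96:
    print(f"possible assignment bias: z = {z:.2f}")
```

A real audit would control for confounders such as location, time of day, and worker availability before drawing any conclusion; this sketch only shows the shape of the test.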
Another technique is the use of “sock puppets” where researchers use computer programmes to impersonate user accounts. Auditors can use these accounts to identify instances where the platform algorithms produce undesirable results. Other auditing techniques can also be used.
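A sock-puppet audit can be sketched in miniature: send identical orders through accounts that differ in exactly one attribute and compare the results. Everything below is hypothetical — the `platform_payout` function is a stand-in with an injected quirk, used only so the audit loop is runnable; no real platform API is being modelled.

```python
# A stand-in for a platform's (unknown) payout algorithm -- purely
# hypothetical, with a deliberately injected quirk so the audit
# loop below has something to find.
def platform_payout(distance_km, worker_profile):
    base = 15 + 6 * distance_km
    # Injected quirk for the demo: the simulated algorithm underpays
    # workers flagged as incentive-eligible.
    if worker_profile.get("incentive_eligible"):
        base *= 0.9
    return round(base, 2)

# Sock-puppet accounts: identical in every respect except one attribute.
puppets = [
    {"name": "puppet_A", "incentive_eligible": False},
    {"name": "puppet_B", "incentive_eligible": True},
]

# Send the same batch of orders through each puppet and compare quotes.
orders = [2.0, 3.5, 5.0, 7.5]  # delivery distances in km
quotes = {p["name"]: [platform_payout(d, p) for d in orders] for p in puppets}

for d, qa, qb in zip(orders, quotes["puppet_A"], quotes["puppet_B"]):
    if qa != qb:
        print(f"{d} km: payouts differ ({qa} vs {qb}) despite identical orders")
```

Because the orders are identical, any systematic difference in quotes can only come from the attribute the auditor varied — which is what makes paired sock-puppet comparisons a clean detection technique.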
In a competitive marketplace, informed consumers can prioritise ordering from platforms that subject themselves to such audits. Workers may also choose to work for more transparent platforms. Regulators can examine work conditions as a function of work allocation, performance, and pay related to each gig, and mandate transparency related to each of these.
If successful, this approach can be replicated in other industries. The divide between algorithm makers, platform creators, investors that support them, and gig workers is real. Policymaking that mandates transparency can improve trust and ensure the welfare of gig workers while not impeding the growth of the gig economy.
Mihir Mahajan and Anupam Manur are researchers at the Takshashila Institution
The views expressed are personal