Why You Might Soon Be Paid Like an Uber Driver—Even If You’re Not One


Benjamin Valdez, a rideshare driver with Uber and Lyft in the Los Angeles area, used to drive seven days a week when the gig was more lucrative; these days, he says, he makes far less per ride. Valdez told me that when he started driving, around nine years ago, he could earn anywhere from $60 to $85 for a trip from West Hollywood to downtown Los Angeles at peak surge, a roughly 6-to-10-mile drive depending on the route. Now, if “the stars align,” he can earn between $25 and $35 for the same trip. “It’s gotten harder and harder to make money,” he said.

In recent years, rideshare drivers like Valdez have experienced shrinking incomes as the companies continue to increase their cut of each ride. Drivers and their advocates see this as part of a broader backlash against a wave of pro-gig-worker legislation across the country, legislation the platforms have shaped in their favor by lobbying to keep drivers classified as independent contractors. That classification helps the companies avoid paying out benefits such as health insurance, paid time off, and workers’ compensation.

Valdez, a former board member of the worker-led labor rights organization Rideshare Drivers United, says he used to receive roughly 80 percent of each ride fare, a cut he considered “relatively fair.” But since 2022, Uber and Lyft have wielded opaque algorithms that seem to incorporate data collected on workers to set pay for each ride, offering increasingly meager amounts with little explanation.

This practice may be to blame for varying payouts for drivers completing the same ride in the same area at the same time, as reported by More Perfect Union. As a result, these companies appear to have created a system that surveils workers in order to exploit desperation and depress pay. Such algorithms appear to learn the lowest amounts that drivers will accept and turn them into the norm, which could explain the dive in pay experienced by Valdez and other drivers.

“There is always going to be a driver that says, ‘OK, $25 for an hour ride, that’s more than I’m making at McDonald’s,’ ” Valdez explained. Platform companies seem to be engaging in what’s known as algorithmic wage discrimination, where regularly modified formulas determine hourly pay based on an intricate web of data such as location, individual behavior, and market supply and demand.
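To make the mechanics concrete, here is a deliberately simplified sketch of how such a formula could combine those inputs. Nothing in it comes from Uber or Lyft, whose actual models are proprietary; the variables, weights, and numbers are invented purely to show how two drivers offered the same trip at the same moment could be quoted different pay.

```python
# Hypothetical illustration only: invented weights and inputs, not any
# platform's real pricing model.

def estimate_payout(miles: float, minutes: float,
                    demand_supply_ratio: float,
                    driver_accept_rate: float) -> float:
    base = 1.10 * miles + 0.20 * minutes        # distance and time component
    surge = min(demand_supply_ratio, 2.0)       # market supply and demand
    # Individual behavior: a driver whose history shows they accept almost
    # anything gets a smaller personal multiplier.
    personal = 1.0 - 0.25 * driver_accept_rate
    return round(base * surge * personal, 2)

# Same trip, same moment, different drivers:
print(estimate_payout(8, 25, 1.4, 0.95))  # near-total acceptance -> lower offer
print(estimate_payout(8, 25, 1.4, 0.40))  # choosier driver -> higher offer
```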

But the scheme doesn’t end with gig workers. Experts say algorithmic wage discrimination and A.I.-influenced pay more broadly are creeping into a growing number of fields, such as health care, logistics, and tech, and could upend work as we know it.

A.I.’s growing influence on pay stems from Silicon Valley experiments. Tech giants like Uber and Lyft, which depend on massive pools of independent contractors, have both the resources and incentives to reshape the labor market, Veena Dubal, a professor of law at the University of California, Irvine, told me.

Last year, Dubal published an oft-cited paper in the Columbia Law Review on algorithmic wage discrimination. She notes that because these titans have relatively limited labor overhead, they have the funds to develop specialized algorithmic systems. Entire teams of economists and engineers even use “psychological tricks” to manipulate drivers without ever having to classify them as employees with corresponding rights.

“Platform companies have been at the cutting edge of trying to experiment with ways to control workers without it being obvious,” Dubal said. And when these experiments work, they leach into other industries and can affect people in formal employment, too.

Algorithms can be employed to sniff out desperation for income based on the lengths people are willing to go to on the job, such as maintaining high trip acceptance rates among Uber drivers. With this hoard of granular information, A.I. can calculate the lowest possible pay that workers across sectors will tolerate and suggest incentives like bonuses to control their behavior. While bosses have always offered so-called variable pay—for instance, paying more for night shifts or offering performance-based salary boosts—high-tech surveillance coupled with A.I. is taking real-time tailored wages to new extremes.
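A rough sketch of how that learning could work, assuming nothing more than a record of past offers and whether they were accepted (again, an invented illustration, not any company’s actual system):

```python
# Invented example: estimating the lowest offer a worker will tolerate from
# their accept/decline history, then probing just below it.
from dataclasses import dataclass

@dataclass
class Offer:
    amount: float
    accepted: bool

def next_offer(history: list[Offer], step: float = 0.50) -> float:
    accepted = [o.amount for o in history if o.accepted]
    if not accepted:
        return max(o.amount for o in history)   # no signal yet, stay high
    return min(accepted) - step                 # probe below the observed floor

history = [Offer(32.0, True), Offer(27.5, True), Offer(24.0, False), Offer(26.0, True)]
print(next_offer(history))  # 25.5: the system tests whether $25.50 is still acceptable
```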

“Now you have machine learning trained on identifying the desperation index of workers,” Zephyr Teachout, a professor of law at Fordham University, told me. “When you move to the formal employment context, there is every reason to think that employers who can would be interested in tailoring their wages and using behavioral data.”

The clearest parallels can be drawn to other independent contractor roles; such workers make up around 15 percent of the U.S. workforce. Dubal has found that independent contractors working with Instacart and Amazon are similarly surveilled and receive personalized pay based on information including the times of day and length of time they work, along with the types of tasks they’re willing to accept.

On the flip side, some companies also use our data to determine the most we’re willing to pay for products, and it’s possible that we could soon face eerily hyperpersonalized prices—funeral-goers, for example, could be charged more for plane tickets. This summer, the U.S. Federal Trade Commission said it was seeking information from eight companies on so-called surveillance pricing.

It’s difficult to discern just how many companies are employing A.I. to help set wages because such internal processes are usually kept hush-hush. In formal employment outside of gig work where certain rights are guaranteed, the process likely looks a bit more circuitous.

Workplaces are increasingly adopting A.I.-powered management software, which could indirectly influence pay by allocating projects and shifts based on employee data, Antonio Aloisi, an associate professor at IE University Law School in Spain, told me. A.I. is also being used to gauge salaries, benefits, and bonuses. Essentially, computers are starting to take on the role of managers.

This is already taking place everywhere from warehouses to hospitals and offices, which are ramping up digital inspection of worker behavior and in turn amassing loads of information that may affect pay. In fact, the New York Times found that 8 of the 10 largest private U.S. employers track the productivity of individual workers, often in real time. “All this information creates a virtual twin of the worker that is entirely quantified, in terms of achievements, performance, time spent at work, commitment, numbers of emails processed, and so on,” Aloisi said.
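What such a “virtual twin” might look like as a data record is easy to imagine; the sketch below mirrors the categories in Aloisi’s quote, with a made-up composite score of the kind that could later feed bonus or scheduling decisions.

```python
# Illustrative only: the fields track Aloisi's examples; the weights are invented.
from dataclasses import dataclass

@dataclass
class WorkerTwin:
    hours_logged: float        # time spent at work
    emails_processed: int
    tasks_completed: int       # achievements
    performance_rating: float  # 0.0 to 1.0
    engagement_score: float    # "commitment," however the vendor defines it

    def composite_score(self) -> float:
        return (0.3 * self.performance_rating
                + 0.3 * self.engagement_score
                + 0.2 * min(self.tasks_completed / 50, 1.0)
                + 0.2 * min(self.emails_processed / 200, 1.0))

twin = WorkerTwin(hours_logged=41.5, emails_processed=180, tasks_completed=37,
                  performance_rating=0.82, engagement_score=0.60)
print(round(twin.composite_score(), 3))  # 0.754 under these made-up numbers
```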

In the health care sector, hospitals are embracing employee-tracking tools equipped with A.I. to schedule shifts and assign tasks. One startup, Navenio, offers monitoring of real-time staff location and movements with “inertial, pressure, signal, and GPS data.” A.I.-powered scheduling has also made its way to other industries, such as retail and hospitality.

This theoretically allows employees to work at their ideal times, and it could prevent over- and understaffing and ensure that less desirable shifts are distributed equitably. Still, algorithms may not accurately predict demand for workers. So far, the practice has encouraged short-notice and fluctuating shifts, which can cause financial stress because income becomes unpredictable. And because certain shifts may be compensated differently, it may be up to algorithms to decide who gets the higher-paying work.
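Here is a stripped-down, invented example of what that last step can look like: when a model’s score decides who gets the premium shift, it is effectively deciding who earns more that week.

```python
# Invented scores and weights, for illustration only.
workers = {
    "A": {"reliability": 0.92, "avg_speed": 0.80},
    "B": {"reliability": 0.75, "avg_speed": 0.95},
    "C": {"reliability": 0.60, "avg_speed": 0.70},
}

def score(stats: dict) -> float:
    return 0.6 * stats["reliability"] + 0.4 * stats["avg_speed"]

premium_shift = max(workers, key=lambda w: score(workers[w]))
print(premium_shift)  # "A": the algorithm just decided who gets the better-paid shift
```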

“This is the way in which the very extreme and dystopian model of the gig economy has been penetrating the conventional economy,” Aloisi said.

Offices aren’t safe from invasive algorithms, either. Since COVID-19 ushered in widespread remote work, extensive employee monitoring has ramped up. Some companies go so far as to track employees’ keystrokes, snap screenshots of their computers, or even record faces and audio. Such detailed scrutiny could help shape bonuses, promotions, or other financial outcomes down the line.

This practice is likely more prevalent for roles such as customer service and data entry where performance is defined by, for example, how many calls someone takes or the number of words they type per minute, Rekha Gurnani Chowdhury, a compensation consultant in the tech industry, told me.

But this type of monitoring has a major flaw: It doesn’t account for working time away from the keyboard, such as brainstorming, reading old-fashioned hard copies, and talking to clients or co-workers offline. These tracking systems may end up docking employees’ pay when they’re actually working. (This was the claim in a 2021 lawsuit brought against a group of business software companies in a Texas court.)
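A toy calculation makes the flaw obvious. Assume, hypothetically, that the monitoring software can see only typing and calls; everything else it logs as idle, even when it is real work.

```python
# Hypothetical numbers; the point is what the metric cannot see.
def activity_scores(minutes_typing, minutes_on_calls,
                    minutes_in_meetings, minutes_reading_offline,
                    shift_minutes=480):
    tracked = minutes_typing + minutes_on_calls                       # visible to the software
    actual = tracked + minutes_in_meetings + minutes_reading_offline  # real working time
    return round(tracked / shift_minutes, 2), round(actual / shift_minutes, 2)

# Six hours of genuine work over an eight-hour shift, but only three show up:
print(activity_scores(120, 60, 120, 60))  # (0.38, 0.75)
```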

Advocates are urging unions to get ahead of algorithmic wage discrimination. Union contracts have started to specify how A.I. can be used in relation to members’ work, and union members can collect and compare their pay information to identify unequal treatment. But in nonunion workplaces, A.I.-driven pay differences could stop organizing efforts in their tracks.

“The hypercustomization of [worker] treatment is by default a strategy to fragment the workforce—to convey the message that there’s a kind of salvation awaiting you if you behave the way you are instructed by the algorithm, rather than coalescing and joining forces with other workers,” Aloisi said.

Organized labor has also started cracking open the A.I. “black box”: In 2020, unionized IBM employees in Japan claimed the company refused to explain how it uses A.I. to help evaluate staff and decide on salary increases. This kicked off an investigation by a labor relations commission that ended in a settlement this past August, when IBM agreed to reveal to the union the 40 data categories involved.

Ultimately, experts say that specific legislation is likely the best bet to curb algorithmic wage discrimination. These tactics are currently considered legal in the U.S., Dubal wrote, as long as they don’t violate minimum wage or antidiscrimination laws. Teachout thinks these types of tactics could eventually be deemed unfair trade practices under state or federal law, but to her knowledge such cases haven’t been brought in court yet.

Federal and state laws technically protect us from workplace discrimination based on factors like race and gender, but Teachout has noted in her research that A.I. may incorporate data that acts as a proxy for these identities. An algorithm could find, for example, that workers with less savings are less likely to leave a job even when paid low wages. This could disproportionately affect Black and Latino workers, who tend to have less savings than other groups.
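A toy simulation shows how the proxy effect works even when race is never an input. Everything below is fabricated for illustration: the savings figures, the “quit risk” model, and the raise formula are stand-ins, not data from any employer.

```python
# Fabricated illustration of proxy discrimination: race never appears as a
# feature, yet outcomes split along group lines because savings correlates with it.
import random
random.seed(0)

def quit_risk(savings: float) -> float:
    # Hypothetical model: workers with less savings are predicted less likely to quit.
    return min(savings / 20_000, 1.0)

def offered_raise(savings: float) -> float:
    # The optimizer gives smaller raises to workers it predicts won't leave.
    return 0.01 + 0.04 * quit_risk(savings)

# Two simulated groups with different savings distributions (made-up numbers).
group_a = [max(random.gauss(15_000, 4_000), 0) for _ in range(1_000)]
group_b = [max(random.gauss(6_000, 4_000), 0) for _ in range(1_000)]

avg = lambda xs: sum(offered_raise(s) for s in xs) / len(xs)
print(f"average raise, group A: {avg(group_a):.1%}")  # roughly 4 percent
print(f"average raise, group B: {avg(group_b):.1%}")  # roughly 2 percent
```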

Having experienced this firsthand, Valdez sees the potential for widespread harm as corporations continue to prioritize profits. This scheme will continue if most people are still kept in the dark, he says, and change is unlikely to come from the powers perpetuating A.I.-driven wages.

That endangers not just gig workers but everyone else, too. “In any industry, it’s a danger,” he said, “because it’s just an ever-shrinking wage when everything around us is going up in price.”