Algorithmic wage discrimination: Not just for gig workers

Interview Algorithmic wage discrimination, as described in an academic paper last year by UC Irvine law professor Veena Dubal, involves “the use of granular data to produce unpredictable, variable, and personalized hourly pay.”

And when these algorithms are not disclosed, they’re referred to as “black box” algorithms, as they can’t be directly scrutinized.

There are millions of workers who perform freelance work as independent contractors – 64 million in 2023, representing 38 percent of the US workforce, according to freelancing service Upwork. McKinsey in 2022 found that 36 percent of employed survey takers – extrapolated to 58 million Americans – identify as independent workers.

Some of these workers offer their services through so-called gig work platforms. And in many cases, workers have little insight into how their interactions with the platform and its services could affect their pay.

One example is Shipt, a delivery service acquired in 2017 by retailer Target. As recounted by Dana Calacci, assistant professor of human-centered AI at Penn State’s College of Information Sciences and Technology, the delivery service in 2020 introduced an algorithmic payment system that left workers uncertain about their wages.

“The company claimed this new approach was fairer to workers and that it better matched the pay to the labor required for an order,” explained Calacci. “Many workers, however, just saw their paychecks dwindling. And since Shipt didn’t release detailed information about the algorithm, it was essentially a black box that the workers couldn’t see inside.”
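Workers’ main recourse in a situation like that is to pool their own pay records and compare them against the older, published formula. Below is a minimal sketch of that kind of worker-led audit in Python, with a hypothetical record format and an illustrative commission-style baseline – not Shipt’s actual formula:

```python
# Worker-led pay audit: compare observed algorithmic payouts against an
# older, transparent baseline to flag orders whose pay dropped.
# All field names and the baseline formula are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Order:
    order_value: float   # pre-tip order subtotal, in dollars
    observed_pay: float  # what the opaque algorithm actually paid

def baseline_pay(order_value: float) -> float:
    """Hypothetical legacy formula: flat fee plus a commission."""
    return 5.00 + 0.075 * order_value

def audit(orders: list[Order]) -> None:
    cut = [o for o in orders if o.observed_pay < baseline_pay(o.order_value)]
    total_delta = sum(baseline_pay(o.order_value) - o.observed_pay for o in cut)
    print(f"{len(cut)}/{len(orders)} orders paid below the old formula, "
          f"${total_delta:.2f} less in total")

# Example with pooled (fabricated) worker-submitted records:
audit([Order(80.0, 9.10), Order(45.0, 8.90), Order(120.0, 11.75)])
```

Pooling records matters because no single worker sees enough orders to distinguish bad luck from a systematic pay cut.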

Inscrutable algorithms may also have indirect effects on compensation, and not just for gig workers.

Wilneida Negrón, director of policy and research at labor advocacy group Coworker.org, told The Register that it takes extensive investigation and conversations with workers to identify algorithmic wage discrimination.

“But by all accounts, this is a prevalent practice and in the past few years, we have been involved in advising a few agencies on this issue,” said Negrón, who anticipates action from government agencies in the months ahead. “It’s a problem that extends beyond gig work and touches on other industries such as home health care, retail/manufacturing, transportation, and logistics.”

With regard to gig work, Negrón said workers for Instacart and Uber-owned Postmates have been expressing concern since 2019. Workers with food delivery apps GoPuff and Favor (a Texas-based delivery app), and pet-sitting app Rover, have also voiced complaints, she said.

“We’ve had conversations with home health workers from Honor, Visiting Angels, and Amedisys who say that new algorithmic pay systems have exacerbated what were already problematic pay-per-visit schemes,” Negrón added.

And similar concerns exist for those working with transportation and logistics firms.

“FedEx and UPS drivers continue to deal with the integration of AI-driven algorithms into their operations, which affects their pay among other things,” said Negrón. “The new UPS contract not only increases wages, but also gives workers a bigger say in the new technologies being introduced and their impact.”

The situation is slightly different, said Negrón, in the banking and finance industries, where workers have objected to the use of algorithmic performance and productivity measurements that indirectly affect compensation and promotion.

“We’ve heard this from Wells Fargo and HSBC workers,” said Negrón. “So, two dynamics here: the direct and the indirect ways that algorithmic systems can impact wages. It’s a growing problem that is slowly affecting white-collar industries as well.”

Negrón said these systems make preexisting wage and labor concerns worse, and she would like to see the issue addressed through minimum wage guarantees or fair pay standards in industries where these algorithmic practices are common.

“That’s what you are seeing in some cities like NYC with gig workers, and while it’s encountering some issues, with workers’ pay being cut or cuts being passed on to consumers, it’s still too early to say if it’s impactful or not,” said Negrón.

“We also want to see employers have to abide by some level of algorithmic transparency, and regular reporting standards that require companies to report pay data showing the distribution of wages among workers, to identify and address disparities caused by algorithmic decisions.”
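Such a report is mundane to compute once the data exists. Here is a minimal sketch of the kind of pay-distribution summary Negrón describes, assuming a hypothetical records format; nothing here reflects any real platform’s data:

```python
# Sketch of a pay-distribution report: per-group wage percentiles that
# would surface disparities produced by personalized pay algorithms.
# The data layout is a hypothetical assumption.

from statistics import quantiles
from collections import defaultdict

def pay_report(records: list[tuple[str, float]]) -> None:
    """records: (worker_group, effective_hourly_wage) pairs."""
    by_group: dict[str, list[float]] = defaultdict(list)
    for group, wage in records:
        by_group[group].append(wage)
    for group, wages in sorted(by_group.items()):
        p25, p50, p75 = quantiles(wages, n=4)
        print(f"{group:>10}: median ${p50:.2f}/hr  (p25 ${p25:.2f}, p75 ${p75:.2f})")

# Fabricated example: veteran vs newly onboarded workers.
pay_report([
    ("veteran", 18.0), ("veteran", 21.5), ("veteran", 19.2), ("veteran", 20.1),
    ("new", 11.0), ("new", 13.4), ("new", 9.8), ("new", 12.2),
])
```

A regulator comparing medians across groups in output like this would spot a gap that no individual worker could detect on their own.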

There’s some irony in looking for legislation to address the issue, seeing as state laws like California’s Proposition 22 helped make the situation worse. Prop 22, approved by voters in 2020, created a specific labor law exemption to allow ride sharing and delivery firms to classify their workers as independent contractors rather than employees. As Dubal asserts in her 2023 paper, Proposition 22 “legalized the practice of algorithmic wage discrimination…”

The current US administration has signaled its willingness to push back on industry efforts to treat gig workers as independent contractors, but the situation is far from perfect.

The Register spoke with Saiph Savage, assistant professor of computer science at Northeastern University and director of the Northeastern Civic AI research lab, to better understand how algorithmic transparency, or the lack of it, affects workers.

Savage: The problem is that, currently, most gig marketplaces are black boxes. So they do not provide feedback about the types of algorithms that exist on their platforms. This means that the platforms can potentially manipulate workers.

For example, a platform could tell workers, ‘hey, you know what? Our algorithm has detected that if you continue working for two more hours, you’re going to increase your wages by 60 percent.’ And so the worker might then work those two more hours, but they might not actually see a 60 percent increase in their wages. The problem is that the platform might not tell them the likelihood of that happening, because there isn’t any transparency.

That makes it easier for the platforms to manipulate the workers, [as if dangling] a carrot on a stick to steer them in whatever direction the platform wants.

Or workers might experience a problem within the platform. Let’s say that they’re constantly encountering scammers. This could be the case on, for example, Upwork or Toloka. Given the lack of transparency, they don’t have the data to know whether the problem they are experiencing is a one-off thing, or if it’s a systemic problem.

It is similar with wages, since workers don’t have transparency into the wages of other workers. For example, is everyone on the platform making less than minimum wage? Or is it just that, because I’m a newbie, I’m not getting high wages?

The lack of transparency makes it easy for the platforms to manipulate the workers. And it also makes it hard for the workers to know when they are facing systemic problems.
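To put numbers on Savage’s 60 percent example, here is a toy simulation, using entirely assumed figures, of the gap between a promised surge and what a worker can actually expect when the platform withholds how often the surge pays out:

```python
# Toy model of an opaque incentive prompt: the platform promises a
# 60 percent wage bump for two more hours, but never discloses how
# often that bump actually materializes. All numbers are assumptions.

import random

BASE_HOURLY = 15.0        # assumed normal hourly earnings
PROMISED_MULTIPLIER = 1.6
P_BUMP = 0.25             # hidden probability the bump is real

def two_more_hours() -> float:
    mult = PROMISED_MULTIPLIER if random.random() < P_BUMP else 1.0
    return 2 * BASE_HOURLY * mult

random.seed(0)
trials = [two_more_hours() for _ in range(100_000)]
promised = 2 * BASE_HOURLY * PROMISED_MULTIPLIER
print(f"promised for two hours:  ${promised:.2f}")
print(f"average actually earned: ${sum(trials)/len(trials):.2f}")
# With the hidden 25 percent probability, the expected bump is only
# 15 percent, not 60, and the worker has no way to know.
```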

The Register: This has been documented on platforms like Target’s gig delivery subsidiary Shipt back in 2020. Is it still an issue on other platforms that treat their workers as independent contractors?

Savage: Yes. The book Uberland documents this very well. [Author Alex Rosenblat] discusses, for example, cases where Uber would try to get workers, their drivers, to just continue working throughout the night, and it would send them messages like, ‘oh, yeah, Uber on. You’re going to increase your wages by 60 percent if you continue working.’ The problem the book documents is, again, that workers don’t have data about the likelihood of that actually happening. And since they don’t have the data, it’s easy to manipulate them.

The Register: So why is this happening? And why have labor laws failed to address this?

Savage: I think that the fact that these platforms are black boxes helps prevent regulation. If policymakers cannot quantify the problem, how can they create good regulation?

What my lab has been focusing on is creating tools that allow workers to collect their own workplace data and join forces, so that we can inform policymakers about what is taking place within these platforms. For example, we were able to measure hourly wages on Amazon Mechanical Turk, and from there we showed that a great percentage of workers were receiving less than the US minimum wage.

And from that we have connected, for example, with US senators who have sent letters to these big tech platforms to question them about the type of salaries that these digital workers are receiving. So I think a step forward is to facilitate transparency.
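The arithmetic behind that kind of measurement is simple; the hard part is collecting the data. A minimal sketch, assuming a hypothetical pooled log format that, like published wage studies, counts unpaid time such as searching for tasks toward hours worked:

```python
# Effective hourly wage from pooled task logs, the kind of measurement
# Savage describes for Amazon Mechanical Turk. The log format is a
# hypothetical assumption; unpaid search time counts toward hours worked.

FEDERAL_MINIMUM = 7.25  # US federal minimum wage, $/hour

def effective_hourly(earned_usd: float, paid_minutes: float,
                     unpaid_minutes: float) -> float:
    return earned_usd / ((paid_minutes + unpaid_minutes) / 60)

# Fabricated per-worker logs: (total earned, minutes on paid tasks,
# minutes searching for tasks or doing rejected work).
logs = [(14.50, 240, 90), (6.20, 100, 80), (22.00, 300, 60), (3.10, 70, 110)]

wages = [effective_hourly(*log) for log in logs]
below = sum(w < FEDERAL_MINIMUM for w in wages)
print(f"{below}/{len(wages)} workers below ${FEDERAL_MINIMUM}/hr minimum")
print("hourly wages:", [f"${w:.2f}" for w in wages])
```

Including the unpaid minutes in the denominator is the crucial step: a wage that looks acceptable per task can fall below minimum wage once search time is counted.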

The Register: Are there credible legislative efforts to address the issue?

Savage: Senator Markey (D-MA) last year was questioning [PDF] the big tech platforms, precisely about how much they were paying these gig workers. Also in Massachusetts, there has been a lot of movement around the classification of gig workers. And I think we need to inform policymakers about what is taking place so that they can better regulate it.

It’s important also to realize that it’s a new type of digital labor where, since everything is automated by algorithms, new dynamics are emerging. So we still need to make efforts to educate policymakers about what is taking place.

The Register: Have any of the gig work platforms taken steps to prevent data gathering? That’s been an issue with Meta, which has made it harder for misinformation researchers to study its ad platform and content moderation.

Savage: We have created our own independent tools so that if the platforms are not providing us with the data, we’re collecting our own data.

And right now we have been working with lawyers, precisely for pushing forward regulation that can allow workers to always be able to collect their own data about their workplace, because you’re right – the gig platforms could make it against the terms of service for workers to collect their own workplace data.
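In that spirit, here is a minimal sketch of a worker-owned work log, not Savage’s lab’s actual tooling: each job is appended to a local file that stays under the worker’s control, so pooled records can later be analyzed independently of the platform.

```python
# Sketch of a worker-owned work log: each job is appended to a local
# CSV the worker controls, so pooled records can later be analyzed
# independently of the platform. Not any real tool's implementation.

import csv
from datetime import datetime, timezone
from pathlib import Path

LOG_PATH = Path("my_work_log.csv")  # stays on the worker's machine

def log_job(platform: str, minutes_worked: float, paid_usd: float) -> None:
    new_file = not LOG_PATH.exists()
    with LOG_PATH.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["timestamp_utc", "platform", "minutes", "paid_usd"])
        writer.writerow([datetime.now(timezone.utc).isoformat(),
                         platform, minutes_worked, paid_usd])

# Example entry: a 40-minute delivery that paid $6.75 (fabricated).
log_job("example-gig-app", 40, 6.75)
```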

The Register: Is there a worst offender at the moment?

Savage: All the gig platforms are making it difficult for workers, because costs that were traditionally paid by businesses are now imposed on the workers.

And the lack of transparency makes it hard to detect just how bad the situation is. Gig workers are forced to do a lot of invisible unpaid labor. And because we don’t have the data about that, it’s hard to understand just how bad it is. ®
