Categories
Blog

Platform drivers: From algorithmizing humans to humanizing algorithms

[By Pallavi Bansal]

I remember getting stranded in the middle of the road a few years ago when an Ola cab driver remarked that my trip had ended abruptly and he could not take me to my destination. Frantic, I still requested him to drop me home, but he refused, saying he could not complete the ride since the app had stopped working. On another unfortunate day, I was unable to find a cab back home as the drivers kept refusing what they saw as a long ride. When I eventually found one, the driver complained throughout about how multiple short rides benefit him more. I tried to tip him after the ride, but he instead asked me to book the same cab again for a few kilometres, as that would reap more rewards. While I wanted to oblige, I could not find the same driver on the app, even though he had parked his car right outside my house. In yet another incident, I spent the entire night at the airport as I was terrified to book a cab at that late hour. I regretted not checking the flight timings before confirming the booking, having overlooked the fact that women need to be cautious about these things.

Image credit: Pixabay / Pexels

Although my first response was to blame the cab drivers for what I saw as an unprofessional attitude, it slowly dawned on me that they have their own constraints. In the first scenario, the app had actually stopped working, so the driver could not complete the ride for fear of getting penalized, and he earned a bad rating from me on top of it. In the second situation, I wondered why the algorithms reward shorter rides rather than longer ones. Moreover, how do they assign drivers if proximity is not the only factor, and why was my driver not aware of that? In the third instance, why could I not be assigned a woman driver to make me feel safer when travelling late at night?

I spoke to a few senior managers and executives working at popular ride-sharing apps in India to find the answers.

Constant tracking

A senior manager of a well-known ride-sharing platform explained their tracking practices on condition of anonymity:

“The location of driver-partners is tracked every two-three seconds and if they deviate from their assigned destination, our system detects it immediately. Besides ensuring safety, this is done so that the drivers do not spoof their locations. It has been noticed that some drivers use counterfeit location technology to give fake information about their location – they could be sitting at their homes and their location would be miles away. If the system identifies anomalies in their geo-ping, we block the payment of the drivers.”

While this appears to be a legitimate strategy to address fraud, there is no clarity on how a driver can produce evidence when there is a genuine GPS malfunction. Another interviewee, in a top management position at a ride-sharing company, said, “it is difficult to establish trust between platform companies and driver-partners, especially when we hear about drivers coming up with new strategies to outwit the system every second day.” For instance, some drivers brought a technical hacker on board so that bookings could be made via a computer rather than a smartphone, or artificially surged the price by collaborating with other drivers and turning their apps off and on again simultaneously.
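
The spoof detection the manager describes can be sketched in a few lines. This is a hypothetical reconstruction, not the platform's actual logic: if two consecutive geo-pings imply a travel speed no real car could reach, the ping is flagged as an anomaly. The function names, the 150 km/h threshold, and the ping format are all illustrative assumptions.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two (lat, lon) points, in kilometres.
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flag_spoofed_pings(pings, max_speed_kmh=150.0):
    """Flag geo-pings that imply an implausible travel speed.

    pings: list of (timestamp_seconds, lat, lon) tuples, ordered by time.
    Returns the indices of pings that 'teleport' faster than max_speed_kmh.
    """
    flagged = []
    for i in range(1, len(pings)):
        t0, lat0, lon0 = pings[i - 1]
        t1, lat1, lon1 = pings[i]
        dt_h = max(t1 - t0, 1) / 3600.0  # elapsed hours, guarding against zero
        speed = haversine_km(lat0, lon0, lat1, lon1) / dt_h
        if speed > max_speed_kmh:
            flagged.append(i)
    return flagged

# The driver 'jumps' roughly 12 km in 3 seconds between the 2nd and 3rd ping:
pings = [(0, 12.9716, 77.5946), (3, 12.9717, 77.5947), (6, 13.08, 77.60)]
print(flag_spoofed_pings(pings))  # [2]
```

Note what this sketch cannot do: distinguish a spoofed ping from a genuine GPS glitch, which is exactly the evidentiary gap drivers face when payments are blocked.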

Though the ‘frauds’ committed by drivers are out in the public domain, it is seldom discussed how constant surveillance reduces productivity and amplifies frustration, prompting these ‘clever ways’ to fight it. The drivers are continuously tracked by ride-sharing apps, and if they fail to follow any of the apps’ instructions, they get penalized or banned from the platform. This technology-mediated surveillance can intensify drivers’ negativity and have adverse effects on their mental health and psychological well-being.

Algorithmic management

Algorithms control several aspects of the job for the drivers – from allocating rides to tracking workers’ behaviour and evaluating their performance. This lack of personal contact with the supervisors and other colleagues can be dehumanizing and disempowering and can result in the weakening of worker solidarities.

When asked if the algorithms can adjust the route for the drivers, especially for women, if they need to use the restroom, a platform executive said, “They always have the option not to accept the ride if there is a need to use the washroom. The customers cannot wait if the driver stops the car for restroom break and at the same time, who will pay for the waiting time?”

Image credit: Antonio Batinić / Pexels

While this makes sense at first glance, in reality the algorithms of some ride-sharing platforms like Lyft penalize drivers in such cases by lowering their assignment acceptance rate (the number of ride requests accepted by the driver divided by the total number of requests received). Lee and her team, human-computer interaction (HCI) scholars at Carnegie Mellon University, explored the impact of algorithmic management on human workers in the context of ride-sharing platforms and found:

 “The regulation of the acceptance rate threshold encouraged drivers to accept most requests, enabling more passengers to get rides. Keeping the assignment acceptance rate high was important, placing pressure on drivers. For example, P13 [one of the drivers] stated in response to why he accepted a particular request: ‘Because my acceptance rating has to be really high, and there’s lots of pressure to do that. […] I had no reason not to accept it, so […] I did. Because if, you know, you miss those pings, it kind of really affects that rating and Lyft doesn’t like that.’”

Uber no longer displays the assignment acceptance rate in the app and states that it does not affect drivers’ promotions. Ola India’s terms and conditions state that “the driver has sole and complete discretion to accept or reject each request for Service”, without mentioning the acceptance rate. However, Ola Australia indicates the following on its website: “Build your acceptance rate quickly to get prioritised for booking! The sooner and more often you accept rides (as soon as you are on-boarded), the greater the priority and access to MORE ride bookings!”
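
The arithmetic behind this pressure is simple, which is what makes it so effective. A minimal sketch, assuming a hypothetical 90% cut-off (no platform publishes its actual threshold), shows how quickly a handful of missed pings drags a driver below the bar:

```python
def acceptance_rate(accepted, received):
    """Assignment acceptance rate: requests accepted / requests received."""
    return accepted / received if received else 0.0

def meets_threshold(accepted, received, threshold=0.9):
    # The 0.9 threshold is an illustrative assumption, not a documented value.
    return acceptance_rate(accepted, received) >= threshold

# Out of 50 requests, rejecting 5 keeps a driver exactly at the bar;
# rejecting a 6th pushes them under it.
print(acceptance_rate(45, 50))   # 0.9
print(meets_threshold(45, 50))   # True
print(meets_threshold(44, 50))   # False
```

Because every rejection, however legitimate (a restroom break, an unsafe-looking request), counts against the denominator equally, the metric cannot distinguish shirking from self-care.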

This lack of information, coupled with ambiguity, complicates the situation for drivers, who end up trying not to reject rides under any circumstances. Moreover, the algorithms are designed to exert persistent pressure on drivers through psychological tricks, as Noam Scheiber pointed out in an article for The New York Times:

“To keep drivers on the road, the company has exploited some people’s tendency to set earnings goals — alerting them that they are ever so close to hitting a precious target when they try to log off. It has even concocted an algorithm similar to a Netflix feature that automatically loads the next program, which many experts believe encourages binge-watching. In Uber’s case, this means sending drivers their next fare opportunity before their current ride is even over.”

Algorithmic decision-making also directs our attention to how rides are allocated. The product manager of a popular ride-sharing app said:

“Apart from proximity, the algorithms keep in mind various parameters for assigning rides, such as past performance of the drivers, their loyalty towards the platform, feedback from the customers, if the drivers made enough money during the day etc. The weightage of these parameters keep changing and hence cannot be revealed.”
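
One plausible reading of this description is a weighted scoring function over the parameters the product manager lists. The sketch below is entirely hypothetical: the weights, feature names, and linear combination are my assumptions, and, as the interviewee stresses, the real weightings are secret and keep changing.

```python
# Hypothetical weights over the parameters named in the interview.
WEIGHTS = {
    "proximity": 0.4,      # closer drivers score higher
    "performance": 0.25,   # past ratings and record
    "loyalty": 0.15,       # tenure / hours on the platform
    "feedback": 0.1,       # recent customer feedback
    "earnings_gap": 0.1,   # boost for drivers who earned little today
}

def driver_score(features):
    """Combine normalized features (each in [0, 1]) into one ranking score."""
    return sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)

def assign_ride(drivers):
    """Pick the highest-scoring driver for a ride request."""
    return max(drivers, key=lambda d: driver_score(d["features"]))

drivers = [
    {"id": "d1", "features": {"proximity": 0.9, "performance": 0.6,
                              "loyalty": 0.5, "feedback": 0.7, "earnings_gap": 0.2}},
    {"id": "d2", "features": {"proximity": 0.6, "performance": 0.9,
                              "loyalty": 0.9, "feedback": 0.9, "earnings_gap": 0.9}},
]
print(assign_ride(drivers)["id"])  # d2 — the nearest driver does not win
```

Even this toy version shows why a driver parked right outside a passenger's house may not be assigned the ride: proximity is just one term in the sum.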

All four people interviewed said that the number of women driving professionally is considerably low. This makes it difficult for the algorithms to match women passengers with women drivers. It may also delay ride allocation for women passengers, as the algorithms would first try to locate women drivers.

A lack of understanding of how algorithms assign tasks makes it difficult to hold these systems accountable. Consequently, a group of UK Uber drivers decided to launch a legal bid to uncover how the app’s algorithms work: how rides are allocated, who gets the short rides, and who gets the nice ones. As reported in The Guardian, the drivers’ claim says:

“Uber uses tags on drivers’ profiles, for example ‘inappropriate behaviour’ or simply ‘police tag’. Reports relate to ‘navigation – late arrival / missed ETA’ and ‘professionalism – cancelled on rider, inappropriate behaviour, attitude’. The drivers complain they were not being provided with this data or information on the underlying logic of how it was used. They want to [know] how that processing affects them, including on their driver score.”

The fact is that multiple, conflicting algorithms affect drivers’ trust in algorithms, as elaborated in an ongoing study of ‘human-algorithm’ relationships. The researchers discovered that Uber’s algorithms often conflict with each other while assigning tasks: drivers were expected to cover the airport area, yet at the same time they received requests from a 20-mile radius. “The algorithm that emphasizes the driver’s role to cover the airport was at odds with the algorithm that emphasizes the driver’s duty to help all customers, resulting in a tug o’ war shuffling drivers back and forth.” Similarly, conflict often arises when drivers are in a surge area and get pings to serve customers somewhere out of the way.

Ultimately, we need to shift from self-optimization as the end goal for workers to humane algorithms: ones that centre workers’ pressures, stress, and concerns in the gig economy. This would also change the attitudes of passengers, who need to see platform drivers as human drivers facing challenges at work, like the rest of us.


Platformizing women’s labour: Towards algorithms of empowerment

[By Pallavi Bansal]

As the fifth-born daughter of a poverty-stricken couple in a small village in Karnataka, Rinky would consider herself fortunate on days she did not have to sleep on an empty stomach. Her parents pressurised her to take care of her younger brother while they struggled to make ends meet. As the siblings grew up, the brother started going to a nearby school, while Rinky managed the household chores along with her sisters. A curious teenager, Rinky persuaded her brother to teach her every now and then, including how to operate the smartphone the family had recently acquired. When she turned 19, she mustered the courage to move to Bengaluru in search of a better life. She survived at first by doing menial jobs such as cleaning houses, cooking and washing dishes. She earned a mere Rs 15,000 (about 200 USD) a month, just enough to get by. She always felt disrespected, having to deal with constant humiliation, until someone in the neighbourhood advised her to learn driving and partner with the ride-hailing platform Ola.

Image credit: Renate Köppel, Pixabay

While there were initial hiccups in procuring the vehicle and learning how the app works, the move dramatically changed her life, turning her into a micro-entrepreneur with a lucrative take-home income of Rs 60,000 every month. Besides allowing flexible work hours, the job gave her a sense of independence that was missing when she worked for others. She was not really bothered about how rides were assigned to her, though she always worried about her safety when picking up male passengers. At the same time, she could not comprehend how some of her colleagues earned more than her despite driving a similar number of hours.

“Rinky” is a composite character, but she represents the stories of many women for whom the platform-based economy has opened up a plethora of employment opportunities. The interesting aspect is that women workers are no longer confined to stereotypical jobs as salon or care workers; they are venturing into hitherto male domains such as cab driving and delivery services as well. The Babajob platform recorded a 153 per cent increase in women’s applications for driver jobs in fiscal 2016. According to the Road Transport Yearbook for 2015-16 (the latest such report available), 17.73% of the 15 million licensed female drivers drive professionally. Though no distinct figures are available for how many women are registered as drivers with Ola and Uber, ride-hailing app Ola confirmed a 40% rise every quarter in the number of female drivers on its platform. Moreover, cab aggregator Uber announced a tie-up with a Singapore-based company to train 50,000 women taxi drivers in India by this year.

Clearly, the ride-sharing economy is helping Indian women break the shackles of patriarchy and improve their livelihoods. However, the potential of these platforms cannot be fully realized unless researchers turn their attention to the algorithms that govern them. These algorithms not only act as digital matchmakers assigning passengers to drivers but regulate almost all aspects of the job, from monitoring workers’ behaviour to evaluating their performance. They often fail to treat workers as humans: people who can fall sick, need a leisure break, socialise with others to stay motivated, take a detour to pick up their kids from school, attend to an emergency at home, lose their temper occasionally, or come to work after facing physical abuse at home. In a normal work environment, employers tend to understand their team members and often respond with compassion during tough times.

Image credit: Satvik Shahapur

Research shows how these data-driven management systems, especially in the context of ride-sharing apps, affect human workers negatively because they lack human-centred design. Researchers discovered that female drivers sometimes declined male passengers without profile pictures at night, only to be penalized by the algorithms later. Moreover, drivers complain of rude passengers, which platform companies seldom take into consideration; it only lowers the drivers’ acceptance rates and ratings.

Technology creators need to ask themselves how to ensure that algorithms are designed to enable workers, not just optimized for customer satisfaction. Alternatively, they need to see workers’ satisfaction as an extension of customer gratification, given that the two realms reinforce one another. By sensitizing themselves to the needs of women like Rinky, who are perhaps stepping out into this male-dominated world for the first time, programmers could create a more empowering pathway for such workers. With entrenched gender norms burdening women with familial duties and limiting their access to education and skills training, intervention by platform designers holds the promise of genuine change. While cultural change often takes a long course, designers can accelerate this shift by placing women at the centre.

More concretely, what if platform companies did the following:

  • Create a feedback/resolution system that accounts for rejections and safeguards ratings when women drivers decline passengers they consider a potential threat.
  • Institute flexibility for drivers who want to go home early, without translating it into ‘lower incentives’; after all, flexibility is the premise of the gig economy.
  • Aim AI at promoting workers’ well-being: following a demanding or intensive piece of work (a long ride, in this case), the system could recommend a relatively easier task.
  • Ensure transparency in how wages are allocated to different people and how the autonomous systems affect ratings, along with a system of redressal that allows for corrections.
  • Encourage a community-building culture rather than individualism: social incentives could be given to drivers who pick up rides when an assigned driver is unable to reach the destination, instead of penalizing him or her.
  • While the in-built GPS system in apps can help drivers find public toilets and other places for restroom breaks, algorithms could be trained to adjust routes according to drivers’ needs and the availability of amenities.
  • Moreover, popular ride-sharing platforms like Ola and Uber could consider assigning women passengers to women drivers, especially at night. This move could make both parties feel secure, considering women dread boarding a taxi in an ‘unsafe’ country like India.
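
A couple of the ideas above can be expressed as small adjustments to whatever base allocation score a platform already computes. This is a speculative sketch, not any platform's method; the bonus and penalty values, field names, and thresholds are all invented for illustration:

```python
def match_score(driver, passenger, base_score, night=False):
    """Adjust a base allocation score with worker-centred bonuses.

    Hypothetical adjustments sketching two of the suggestions above:
    - prefer pairing women passengers with women drivers at night;
    - ease off drivers who have just finished a long ride.
    """
    score = base_score
    if night and passenger["gender"] == "F" and driver["gender"] == "F":
        score += 0.3  # safety-oriented pairing bonus (illustrative value)
    if driver.get("last_ride_km", 0) > 30:
        score -= 0.2  # nudge long-ride drivers toward an easier next task
    return score

driver_a = {"gender": "F", "last_ride_km": 5}
driver_b = {"gender": "M", "last_ride_km": 5}
passenger = {"gender": "F"}
print(match_score(driver_a, passenger, 0.5, night=True))  # 0.8
print(match_score(driver_b, passenger, 0.5, night=True))  # 0.5
```

The point is not these particular numbers but the design stance: worker well-being enters the objective function directly, rather than being left to drivers to defend at the cost of their acceptance rates.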

Across disciplines, if we brainstorm on reimagining these platforms as cooperative instead of competitive spaces, as human-centred rather than optimization-centred, and as feminist-oriented and not just male-oriented, there may be more promise for our digital wellbeing.