Centered but invisible – On the contradictions of service design at Urban Company

[By Sai Amulya Komarraju]

Ting! A beauty worker checks her mobile. A 'lead' appears on her screen from the platform aggregator she has registered with. She accepts it, calls the customer through the platform that has helped her become a microentrepreneur, confirms the request and the customer's location, and rides off. She rings the bell. Once the customer greets her, the worker does what has become routine since the onset of COVID-19: she sanitises her hands and dons a fresh pair of gloves, a face mask, and a face shield before entering the house. She sets out her products neatly and gets to work. Once finished, she sprays everything she has touched with sanitiser, from the doorbell she rang to the tap she used in the customer's washroom to fill water for a pedicure session. She packs her belongings and collects soiled products (used wax strips and such) to dispose of on her way to the next gig.

Meanwhile, the now relaxed customer is asked by the app to rate the beauty worker on her hygiene: Did she wear a mask? What about gloves? Did she leave any used products behind? In short, how successful was the worker in her attempt to disappear without leaving any proof of her physical presence?

Behind all this is a Standard Operating Procedure that regulates the worker's behaviour and is monitored by the app with the help of the data the customer provides. Based on this feedback, the worker receives a hygiene rating. Moreover, Machine Learning (ML) is used to recognize, from pictures the worker has to upload before the gig, whether she was wearing a mask and gloves.
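To make the mechanism concrete, here is a minimal sketch of how such a photo check might work, assuming a binary image classifier has already been trained. The model, preprocessing choices, and class labels are illustrative assumptions; Urban Company's actual pipeline is not public.

```python
# Hypothetical PPE check on a pre-gig selfie, using PyTorch/torchvision.
# `model` is assumed to be a trained binary classifier; class 1 stands
# for "mask and gloves visible", class 0 for "non-compliant".
from PIL import Image
import torch
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def ppe_compliant(image_path: str, model: torch.nn.Module) -> bool:
    """Return True if the classifier judges the photo PPE-compliant."""
    image = Image.open(image_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)  # add a batch dimension
    model.eval()
    with torch.no_grad():
        logits = model(batch)
    return int(logits.argmax(dim=1)) == 1
```

Whatever the real implementation looks like, the output of such a check feeds directly into the worker's hygiene rating, which is what gives it its disciplinary force.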

The above describes a day in the life of service partners (who provide services and are variously referred to as service partners, providers, or professionals) and customers (who avail services through the platform) associated with app-based, on-demand platform aggregators. On-demand platforms (like Urban Company and Housejoy) match service partners or 'pros' with customers in need of home-based services, such as cleaning or salon treatments, through leads. For this, they charge a commission. Any hitch within the service partner app or the customer app can break down the entire ecosystem. This is where the Software Development Engineers step in: they ensure that the entire experience, from booking a service to giving feedback, remains seamless. These engineers must remain alert at all times to whatever complaints arise, from service partners or customers, even while working to eliminate manual intervention elsewhere. I spoke with a couple of Software Development Engineers (on the condition of anonymity) working for Urban Company to gather insights about their role within the organization and the place of service partners and customers in the process of designing technologies.

Image credit: ivabalk / Pixabay

Role of Software Development Engineers (SDEs)

On-demand platforms are geared towards maximising the customer experience (which has long been established as a brand on its own). This is also reflected in the vocabulary of industrial design and innovation, with terms such as 'experience economy' or 'service economy'. To keep up with such a fundamental organizational change, companies turn to the concept of 'service design'.

Speaking about what companies expect Software Development Engineers to do, SDE 2 explains:

“we translate all the business fundamentals, business logics into tech solutions. Essentially, automate the entire process. So, this is what the expectation is from you when you are working as a software developer.”

But this is not the only requirement. The idea, another SDE from Urban Company says, is to make sure that the service partners and customers (who book services on the app) are comfortable with the environment provided for them within their separate apps:

“[…] for instance, we need to create a solution to the problem of auto-suggestion of products. If a service partner working in the beauty segment is ordering products, we have to work with the team that predicts market trends and make sure that their suggestions appear at the top of the page. Then we must take into account if pros are comfortable with that placing. Should it appear right at the very top of the page, or when they go to the particular product’s page, is that where the prompt should go?” (SDE 3)

The SDEs I spoke with agree that creating smooth environments for service partners or pros is more complicated than designing the flows customers encounter; more engineers therefore work on the service partner app. SDE 4 notes that interface design requires taking into account what service partners make of any new feature launched, both in terms of understanding what it does and ease of use. SDEs must also coordinate with the other teams most likely to be affected by changes they make, and adhere to the company's business goals in order to create something that works, fixes problems, and reduces the burden of manual intervention. Still, the SDE says, "you cannot always predict how something might turn out to be, but that is what makes it exciting as well". It is the SDEs who do the mostly invisible work of making sure that features accomplish all of this: enhancing customer experience, reducing manual intervention, helping service partners make decisions, and, above all, serving the company's profit-making business logic.

Asked if engineers undergo any training since they design technologies for those who are marginalized due to multiple factors (gender, class, type of work they are engaged in), I received no definite answer.

The Urban Company ecosystem. Image credit: Sai Amulya Komarraju

Service design: From productivization to servitization

The concept, or rather the philosophy, of service design is broadly understood as the activity of planning and organizing a business's resources, i.e., people (in the case of the platform ecosystem: service partners, employees, customers), props (AI- and ML-based algorithms), and various other processes (workflows, Standard Operating Procedures, and the other dimensions involved in ensuring smooth services) to directly improve the employees' experience (which here would include both SDEs and service partners). Every component is laid out and thought through in detail to keep the ecosystem running smoothly. Ecosystems are best understood as collaborative environments where the various resources of the company work together to co-create value.

The philosophy of service design shines through in what my interviewees explain: UC assumes that SDEs take the views of service partners into account during all stages of a feature's development. SDEs 1 and 2 report that UC follows a 'win-for-all' approach. In fact, a recent study by Fairwork India found that UC tops the list of companies that provide "fairwork" based on five principles: 1) fair pay, 2) fair conditions, 3) fair contracts, 4) fair management, and 5) fair representation. Confirming this, SDE 3 states that engineers regularly call partners (personal information is encrypted and not shared with anyone) to check if a particular feature seems okay to them: "It is common sense, you know, I mean you are making something for someone, whom to call, if not the recipient?" SDE 2 says that it is easier to guess what a customer wants "because you are one yourself… we have all availed services… but understanding the POV of the pros is difficult… we all call and talk with pros as and when required". In fact, SDE 2 also admits that when she joined the platform, she was uncomfortable with the "round the clock tracking" of service partners. However, when the service providers themselves expressed that this was an acceptable trade-off, she made her peace with it.

"I think the idea is you want them [service partners] to succeed as well. They do really work hard. So, again, no one tells you to do it, but you think about it, how do we give them the best chance to succeed and then create a feature," says SDE 4. For instance, SDEs collaborated closely with the business team to anticipate "sprees" (such as the sudden demand for roll-on waxing), so that service partners could stock up on the products needed for such services. However, this view must be balanced against the fact that the business logic of profit-making is supreme; in the face of it, even long-term, scalable tech solutions must take a backseat, accruing what SDE 2 refers to as "tech debt".

This logic inevitably organizes the relationships within the ecosystem in a hierarchical fashion. Customers, their experience, and their satisfaction are placed at the apex since they bring business, and software engineers enable "extra-legal" mechanisms (rating, tracking, etc.) to monitor service partners through the app in order to ensure quality of service. Even though service partners are considered a crucial resource (SDE 3), the oversupply of workers relative to demand, and control mechanisms in the form of ratings and reviews, serve to maintain power asymmetries between the platform, the customer, and the service partner.

The inadequacy of service design

In some sense, when SDEs speak of developing Standard Operating Procedures to provide a holistic experience for the customer, they move beyond thinking about the mere productivity of service partners. But this does not take away from the fact that workers are still expected to display skill and dexterity at work. They are expected to take a minimum number of leads (which can be read as a measure of a particular partner's productivity), and their ratings and continued association with the platform depend on customer satisfaction.

The aim of service design is to move beyond thinking in the narrow terms of providing "goods" to the broader concept of offering services. In short, not productivization but servitization is the goal. However, this necessarily requires productizing the worker's skills, and we need to problematize this move from goods-dominant to service-dominant logic. The burden of delivering the actual experience ultimately falls squarely on the shoulders of service partners. This is especially so in home-based services such as beauty and wellness, where the worker's physical labour in performing beauty-work contributes the most to creating a feeling of wellbeing for customers. The burden is reinforced by the fact that their work is constantly supervised by both the app and the customers. The multitude of problems and the high degree of precarity gig workers in the home-based sector face are well documented. Therefore, despite the human-centric focus of service design, the burden of delivering customer satisfaction with the goal of generating profit is felt first and foremost by the service partner.

My interviews reveal that SDEs do think about the service partners and feel a modicum of care towards them. Still, much is left to be desired in terms of ensuring that all resources are equally empowered within the ecosystem. For human-centric design to live up to its name, it is imperative that businesses adopt an ethics of care within design that could help balance the logics of business and technology with the needs of workers.

Is feminist design a solution to platform workers’ problems?

[By Pallavi Bansal]

Imagine a scenario in which you do not get shortlisted for a job interview – not because you are underqualified – but because the algorithms were trained on data sets that excluded or underrepresented your gender for that particular position. Similarly, you find out that you are consistently paid less than your colleagues in a sales job – not because of any inability to fetch clients or customers for the company – but because the rewarding algorithms favoured clients belonging to a certain religion, race, or ethnicity. Further, you are asked to leave the company immediately, without any notice or opportunity to interact with your manager – not because you committed a mistake – but because clients rated you low based on prejudice.

While such bias, favouritism, and discrimination could soon become a reality in mainstream workplaces due to the exponential growth of decision-making algorithms, they are already causing disruption in the online gig economy. Recently, researchers at George Washington University found social bias in the dynamic-pricing algorithms used by the ride-hailing platforms Uber, Lyft, and Via in Chicago, US. The study found that "fares increased for neighbourhoods with a lower percentage of people above 40, a lower percentage of below median house price homes, a lower percentage of individuals with a high-school diploma or less, or a higher percentage of non-white individuals." The authors of the paper, Akshat Pandey and Aylin Caliskan, told the American technology website VentureBeat: "when machine learning is applied to social data, the algorithms learn the statistical regularities of the historical injustices and social biases embedded in these data sets."
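The mechanism Pandey and Caliskan describe can be demonstrated in a few lines. The sketch below uses entirely synthetic data with an invented "surcharge": a regression trained on historically biased fares quietly carries the bias forward, even though no one wrote an explicit discriminatory rule.

```python
# Synthetic demonstration: a pricing model trained on biased historical
# fares learns the bias. All numbers here are invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(seed=1)
n = 5000
distance = rng.uniform(1, 20, n)        # trip length in km
nonwhite_share = rng.uniform(0, 1, n)   # neighbourhood demographic feature

# Historical fares: mostly distance-based, plus a biased surcharge.
fare = 2.5 * distance + 3.0 * nonwhite_share + rng.normal(0, 1, n)

X = np.column_stack([distance, nonwhite_share])
model = LinearRegression().fit(X, fare)
print(model.coef_)  # ~[2.5, 3.0]: the learned pricing reproduces the bias
```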

Such irregularities have also been spotted in relation to gender. A study by Stanford researchers, using a database of a million Uber drivers in the United States, documented a 7% pay gap in favour of men. The researchers attributed the gap to differences in experience on the platform, constraints over where to work (drive), and preferences for driving speed – but in doing so they highlighted an even bigger problem. Cambridge University researcher Jennifer Cobbe told Forbes, "rather than showing that the pay gap is a natural consequence of our gendered differences, they have actually shown that systems designed to insistently ignore differences tend to become normed to the preferences of those who create them." She said the researchers shifted the blame onto women drivers for not driving fast enough and ignored why performance is evaluated on the basis of speed rather than other parameters such as safety. Further, in the context of women workers in the Indian gig economy, it is imperative to understand whether these biases are socially inherent. For instance, if platform companies segregate occupations by gender, the resulting labour pool will inherently lack gender variation. This also compels us to ponder whether the concentration of female labour in beauty and wellness services, cleaning, or formalised care work is the result of an inherent social bias or a technical one.

To make sense of all of this and understand how we can improve the design of these digital labour platforms, I spoke to Uday Keith, a Senior AI Developer with Wipro Digital in Bengaluru. His responses drew my attention towards informatics scholar Shaowen Bardzell's feminist human-computer interaction design paradigm, which I use to contextualize them.

Illustration by Pallavi Bansal

PB: How can we overcome biases in algorithms?

UK: First of all, algorithms are not biased; it is the datasets that are biased. Imbalances in a dataset can be corrected via a method known as SMOTE (Synthetic Minority Over-sampling Technique), where the researchers recommend over-sampling the minority class and under-sampling the majority class. To achieve this, we need to bring diversity to our training datasets and identify all the missing demographic categories. If any category is underrepresented, then models developed with this data will fail to scale properly. At the same time, it is essential for AI developers to continuously monitor and flag these issues, as population demographics are dynamic in nature.
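For readers unfamiliar with SMOTE, here is a minimal sketch using the open-source imbalanced-learn library on synthetic data; the class skew is invented for illustration, and real demographic rebalancing would of course involve far more care.

```python
# Rebalancing a skewed training set with SMOTE (imbalanced-learn).
from collections import Counter
from sklearn.datasets import make_classification
from imblearn.over_sampling import SMOTE

# Simulate a dataset where the minority class is only 5% of samples.
X, y = make_classification(n_samples=2000, n_features=10,
                           weights=[0.95, 0.05], random_state=42)
print("Before:", Counter(y))

# SMOTE synthesises new minority-class samples by interpolating
# between existing minority-class neighbours.
X_res, y_res = SMOTE(random_state=42).fit_resample(X, y)
print("After:", Counter(y_res))  # the two classes are now balanced
```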

This points us toward two of the core qualities proposed by Bardzell: pluralism and ecology. According to her, it is important to investigate and nurture the marginal while resisting a universal or totalizing viewpoint, and she stresses the need to consider cultural, social, regional, and national differences when developing technology. The quality of ecology further urges designers to consider the broadest contexts of design artifacts while remaining aware of the widest range of stakeholders. This means AI developers cannot afford to leave any stakeholder out of the design process, and should also consider whether their algorithms would reproduce any social bias.

PB: Can there be a substitute for the gamification model?

UK: To simplify the process and ensure equity in the gig economy, platform companies can advise AI developers to introduce a "rule". This would mean fixing the minimum number of rides or tasks a platform worker gets in a day, which can also help ensure a minimum wage and provide a certain level of income security. The introduction of a fixed rule can even eliminate social biases, as no particular gender or social group would end up with less work. Further, the reward system can undergo a major overhaul. For instance, rather than incentivizing drivers to drive more and indulge in compulsive game-playing, platform companies can build algorithms that provide financial rewards when drivers follow traffic rules and regulations, drive within permissible speed limits, and ensure a safe riding experience. In fact, we can even give customers the option of discount coupons if they allow drivers to take short breaks.
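As a hedged sketch of what such a "rule" could look like in code: the daily floor, data shapes, and tie-breaking below are my assumptions, not any platform's actual allocation logic.

```python
# Sketch of a minimum-tasks rule: before any other optimisation, route
# new jobs to available workers who are still below a daily floor.
DAILY_MINIMUM = 8  # assumed floor; a real platform would tune this

def assign_job(workers):
    """Pick a worker for the next job, favouring those under the floor."""
    eligible = [w for w in workers if w["available"]]
    if not eligible:
        return None
    under_floor = [w for w in eligible if w["jobs_today"] < DAILY_MINIMUM]
    # Equity first: anyone below the floor outranks proximity or rating.
    pool = under_floor or eligible
    chosen = min(pool, key=lambda w: w["jobs_today"])
    chosen["jobs_today"] += 1
    return chosen["id"]

workers = [
    {"id": "w1", "available": True, "jobs_today": 2},
    {"id": "w2", "available": True, "jobs_today": 9},
]
print(assign_job(workers))  # -> "w1", the worker still under the floor
```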

Elaborating on participation, Bardzell suggests ongoing dialogue between designers and users to explore understandings of work practices that could inform design. This means that if platform companies and AI developers are oblivious to the needs and concerns of labour, they may end up designing technology that unintentionally sabotages its users. Secondly, an advocacy position should be taken up carefully. In the earlier example, "driving fast" was treated as the performance evaluator rather than "safety", which happens because designers run the risk of imposing their own "male-oriented" values on users.

PB: How can work allocation be made more transparent?

UK: Well, deep learning algorithms used by various companies have a “black box” property attached to them to a certain extent. These algorithms are dynamic in nature as they keep learning from new data during use. One can only make sense of this by continuously recording the weightage assigned to the pre-decided variables.

The quality of self-disclosure recommended by Bardzell calls for users' awareness of how they are being computed by the system: the design should make visible the ways in which it affects people as subjects. For instance, platform companies can display on the worker's smartphone screen the variables, and the corresponding algorithmic weightage, behind each assigned task. So, if a platform driver has not been allocated a certain ride due to his past behaviour, the technology should be transparent enough to reveal that information to him. Uncovering the weightage given to the various decision-making variables enables platform workers to reform their behaviour and gives them a chance to communicate back to the companies in case of discrepancies or issues.
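One way to read this suggestion as code: surface each decision's variable contributions to the worker. The sketch below deliberately uses a logistic regression, whose weights are directly inspectable; the feature names and synthetic training data are illustrative assumptions, not any platform's real variables.

```python
# Explaining an allocation decision by exposing per-variable weightage.
import numpy as np
from sklearn.linear_model import LogisticRegression

features = ["acceptance_rate", "avg_rating", "distance_km", "cancellations"]
rng = np.random.default_rng(seed=0)
X = rng.random((500, len(features)))
# Synthetic "allocation" labels, for demonstration only.
y = (X[:, 0] + X[:, 1] - X[:, 3] > 1.0).astype(int)

model = LogisticRegression().fit(X, y)

def explain_allocation(x):
    """Print how much each variable pushed this decision, largest first."""
    contributions = model.coef_[0] * x
    for name, c in sorted(zip(features, contributions),
                          key=lambda pair: -abs(pair[1])):
        print(f"{name:>16}: {c:+.3f}")

explain_allocation(X[0])
```

Deep models would need post-hoc explanation tools instead, but the principle is the same: the worker sees which variables drove the decision, not just its outcome.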

PB: How can we improve the rating systems?

UK: The platform companies have started using qualitative labels that could help users rate workers better, though we need to check whether sufficient options are listed and suggest changes accordingly. Moreover, if we want to avoid the numerical rating system entirely, we can ask users to always describe their feedback in a sentence or two. This can be analysed using Natural Language Processing (NLP), a subfield of Artificial Intelligence that helps machines understand human language and derive meaning from it.
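A minimal sketch of that idea, using NLTK's off-the-shelf VADER sentiment scorer; the review texts are invented, and a production system would need models tuned to local languages and the specifics of service work.

```python
# Turning short written feedback into a sentiment score with NLTK's VADER.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

reviews = [
    "She was punctual, careful, and left everything spotless.",
    "Service was fine but the app kept delaying the booking.",
]
for text in reviews:
    # `compound` runs from -1 (very negative) to +1 (very positive).
    score = sia.polarity_scores(text)["compound"]
    print(f"{score:+.2f}  {text}")
```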

Bardzell writes about the quality of embodiment with respect to meaningful interactions with technology: acknowledging the whole humanness of individuals in order to create products that do not discriminate based on gender, religion, race, age, physical ability, or other human features. This concept should also be applied to how users rate workers and whether they discriminate on the basis of appearance or other factors. Hence, there is a strong need to include qualitative rating systems alongside quantitative ones.

Additionally, Uday Keith recommends defining "ethics" and conducting frequent ethics-based training sessions, since data science teams are formed from a diverse set of people yet comprise only roughly 10% women, even in an urban Indian city like Bengaluru. He concluded by remarking that the issues in the platform economy are more a fault of system design than of algorithmic design – the companies consciously want to operate in a certain way and hence do not adopt the above recommendations.

These pointers make the case for the adoption of a feminist design framework that could bring about inclusive labour reforms in the platform economy. As Bardzell says, “feminism has far more to offer than pointing out instances of sexism,” because it is committed to issues such as agency, fulfilment, identity, equity, empowerment, and social justice.

Making opportunities inclusive for first-time digital users

[By Shrinath V]

A couple of years ago, our house help came in early. She brought her daughter with her. The daughter was working at a nearby fashion store as a salesgirl after her graduation. The previous night, she had arrived home from work, distraught and weeping. The mother could not understand what she was upset about. She thought my wife could help calm her.

After having some coffee, the daughter calmed down a bit and spoke about what had happened the previous day at the store. The store owner was unhappy with her using her mobile during working hours. He had threatened to put up photos of her slacking on Facebook. This had terrified her, as she thought her reputation amongst her friends and her local community was at stake. She could not sleep that night, fearing that she and her family would lose face.

After she narrated the incident, we checked whether the sales manager was a Facebook friend. He was not. She later confessed that she had deleted her Facebook account a while ago. Why was she so agitated then? She assumed that any photos of hers posted there would be seen by all her friends. Facebook's privacy settings had been too complex for her to understand, so she assumed the worst. We had to reassure her that once she deleted her Facebook account, no one could tag her or make any content about her public. Even if she had not deleted her account, she could have removed tags from photos others posted before any of her friends saw them. It took her a while to be convinced of this, but when she left, she was a lot calmer than when she had arrived.

This incident got me thinking.

I live in Bangalore, often called India's Silicon Valley. Bangalore has a huge population working in the technology domain, and most college students carry smartphones. Here was a college-educated salesgirl in an urban fashion store; we would assume she would be comfortable with social media. And yet, she was so confused by the controls on the site that she thought it was a threat to her reputation. Finer aspects like abuse of power and violation of privacy were tough for her to comprehend. A threat about posting photos on Facebook, from someone who was not even her Facebook friend, had turned her into a nervous wreck.

Image credit: Victorgrigas 

The truth is that this girl is representative of many first-time digital users across growth economies. Thanks to cheaper smartphones and data plans, many users are getting their first taste of the internet. But many aspects that seem trivial to long-time technology users are seen very differently by such new adopters.

New opportunities and challenges

Smartphones have fueled the imagination of many who have just started understanding the power of the internet. In some ways, this has been timely: the world post COVID-19 will rely a lot more on digital technologies. As we shift to transacting more online, we will see a larger number of gig jobs. From entertainment to education, smartphones, apps, and online services will play a greater role in the lives of the new digital initiates.

A lot of this, no doubt, will improve the lives of billions. Going online opens new vistas for exploration and provides new opportunities. Thanks to smartphones, new entrepreneurs and business models abound. Housewives post extra plates of lunch on WhatsApp groups for others in their locality to order. Local teachers take to Telegram to coach students appearing for exams. Drivers-on-hire get you and your car safely back home after a late night at the bar, so you need not drive drunk.

And yet, there are unexpected challenges. Internet-driven models and services are largely designed for the digitally literate, with a lot of assumptions baked into how they are designed and delivered. As first-time digital users start using these services, many of those assumptions no longer hold.

As digital technologies are likely to play a bigger role in the future of work, here are some points to consider.

Better terminology & representations

Websites and apps often have different privacy and consent policies. These are difficult enough for us to understand but can be befuddling to first-time digital users. Most are written in legal language that is difficult for common users to understand. They are made easy to click through so the apps can claim they received approval from users. As these vary per app or website, it is often easy to lose track of what one has agreed to. A more inclusive design could involve a common set of representations for terms like privacy and consent, preferably with videos explaining what the users are signing up for. For gig workers, this could greatly improve their understanding of what permissions the business asks of them. For example, knowing that you are being tracked only when you are on the job and not otherwise can be reassuring.  

Better explanation of downside risk

Many first-time digital users sign up for gigs based on referrals from friends, but often the downside risks are not well understood. A while ago, I took an auto rickshaw (tuk-tuk) to the office. As I chatted with the driver, I realized that he had earlier signed up as a cab driver with one of the many ride-hailing apps. As part of the deal, he purchased his taxi on a loan arranged by them. After a few months, he wanted to take a vacation and go home, so he parked his taxi at their designated garage. When he returned, he was told that he had to pay a huge per-day parking charge before he could take his vehicle. This shocked him, but the company agent said it was part of the initial agreement he had signed. He did not know enough to debate them. After a few days, he realized his negotiation was going nowhere, and the taxi loan payments were due.

Image credit:  Andy Gray

He finally opted to forego the taxi and the money he had paid for the loan as he felt there was no other choice. This made him wary of gig opportunities in the future, and he decided to take up a safer, though less remunerative, option. He would have understood things much better if the downside risks had been better explained. This could again be done by using tools like video in languages that gig workers are comfortable with.

Better avenues for grievance redressal

A food delivery executive I spoke to recently complained about a late-night delivery he had made a couple of days earlier. He had picked up the food but was accosted by local bullies on the way. He managed to keep his phone, but they grabbed the food. When he rang up the food tech firm, he was told that the cost of the stolen food would be deducted from his remuneration. As online businesses grow, many such grievances will come up. First-time digital users may not be aware of the grievance redressal mechanisms in place; more education on these, and better policies, will help.


Shrinath V is a product management consultant and founder of The Better Product Studio. In his last corporate role, he was the Head of products for location services on Nokia’s phones built for the Next Billion Users. He has been a mentor to various startups building for this segment over the last several years. 

Platformizing women’s labour: Towards algorithms of empowerment

[By Pallavi Bansal]

As the fifth-born daughter of a poverty-stricken couple in a small village in Karnataka, Rinky would consider herself fortunate on days she did not have to sleep on an empty stomach. Her parents pressured her to take care of her younger brother while they struggled to make ends meet. As the siblings grew up, the brother started going to a nearby school while Rinky managed the household chores along with her sisters. A curious teenager, Rinky coaxed her brother into teaching her every now and then, including how to operate the smartphone the family had recently acquired. When she turned 19, she mustered the courage to move to Bengaluru in search of a better life. In the beginning she survived by doing menial jobs such as cleaning houses, cooking, and washing dishes, earning a mere Rs 15,000 (about 200 USD) a month, just enough to get by. She constantly felt disrespected and humiliated, until someone in the neighbourhood advised her to learn driving and partner with the ride-hailing platform Ola.

Image credit: Renate Köppel, Pixabay

While there were initial hiccups in procuring the vehicle and learning how the app works, the move dramatically changed her life: it turned her into a micro-entrepreneur with a lucrative income of Rs 60,000 to take home every month. Besides allowing flexible work hours, the job gave her a sense of independence that was missing when she worked for others. She was not really bothered about how rides were assigned to her, though she always worried about her safety when picking up male passengers. At the same time, she could not comprehend how some of her colleagues earned more than her despite driving a similar number of hours.

"Rinky" is a composite character, but she represents the stories of many women for whom the platform-based economy has opened up a plethora of employment opportunities. The interesting aspect is that women workers are no longer confined to the stereotypical jobs of salon or care workers; they are venturing into hitherto male domains such as cab driving and delivery services as well. The Babajob platform recorded a 153 per cent increase in women's applications for driver jobs in fiscal 2016. According to the Road Transport Yearbook for 2015-16 (the latest such report available), 17.73% of the 15 million licensed female drivers drive professionally. Though no distinct figures are available for how many women are registered as drivers with Ola and Uber, the ride-hailing app Ola has confirmed a 40% rise every quarter in the number of its female drivers. Moreover, Uber announced a tie-up with a Singapore-based company to train 50,000 women taxi drivers in India by this year.

Clearly, the ride-sharing economy is helping Indian women break the shackles of patriarchy and improve their livelihoods. However, the potential of these platforms cannot be fully utilized unless researchers turn their attention to the algorithms that govern them. These algorithms not only act as digital matchmakers assigning passengers to drivers but regulate almost all aspects of the job, from monitoring workers' behaviour to evaluating their performance. These machines often fail to treat workers as humans: people who can fall sick, need a leisure break, socialise with others to stay motivated, de-route to pick their kids up from school, attend to an emergency at home, lose their temper occasionally, or come to work after facing physical abuse at home. In a normal work environment, employers tend to understand their team members and often respond with compassion during tough times.

Image credit: Satvik Shahapur

Research shows how these data-driven management systems, especially in the context of ride-sharing apps, impact human workers negatively because they lack human-centred design. Researchers discovered, for instance, that female drivers would sometimes decline male passengers without profile pictures at night, only to be penalized by the algorithms later. Moreover, drivers complain of rude passengers, which platform companies seldom take into consideration; declining such passengers only lowers the driver's acceptance rate and ratings.

Technology creators need to ask themselves how to ensure that algorithms are designed to enable workers and not just optimized for customer satisfaction. Alternatively, they need to see workers' satisfaction as an extension of customer gratification, given that the two realms reinforce one another. By sensitizing themselves to the needs of women like Rinky, who are perhaps stepping out into this male-dominated world for the first time, programmers could create a more empowering pathway for such women workers. With entrenched gender norms burdening women with familial duties and limiting their access to education and skills training, intervention by platform designers can promise genuine change. While cultural change often takes a long time, designers can accelerate this shift by placing women at the centre.

More concretely, what if platform companies did the following:

  • They create a feedback/resolution system that accounts for rejections and safeguards ratings when women drivers decline passengers they consider a potential threat (a minimal sketch of such a safeguard follows this list).
  • They institute flexibility, so that wanting to go home early is not translated into 'lower incentives'; flexibility is, after all, the premise of the gig economy.
  • The AI should aim at promoting workers' wellbeing: after a demanding or intensive piece of work (a long ride, in this case), it could recommend a relatively easier task for the driver.
  • Another aspect is transparency about how wages are allocated to different people and how the autonomous systems affect ratings, together with a system of redressal, i.e., one that allows for corrections.
  • Algorithms should encourage a community-building culture rather than an individualistic one: social incentives could be given to drivers who pick up rides when an assigned driver is unable to reach the destination, instead of penalizing him or her.
  • While the in-built GPS system in the apps can help drivers find public toilets and other places that could be used for restroom breaks, algorithms could be trained to adjust routes according to drivers' needs and the availability of amenities.
  • Moreover, popular ride-sharing platforms like Ola and Uber could consider assigning women passengers to women drivers, especially at night. This would make both parties feel secure, considering women dread boarding a taxi in an 'unsafe' country like India.
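To make the first suggestion concrete, here is a minimal sketch of such a rejection safeguard. The field names and the bare-bones acceptance-rate metric are my assumptions; real platforms track far more state.

```python
# A rejection carrying a safety flag is logged but never counted
# against the driver's acceptance rate.
from dataclasses import dataclass

@dataclass
class Driver:
    accepted: int = 0
    penalised_rejections: int = 0
    flagged_rejections: int = 0  # kept for review, never penalised

    @property
    def acceptance_rate(self) -> float:
        total = self.accepted + self.penalised_rejections
        return self.accepted / total if total else 1.0

def record_response(driver, accepted, safety_flag=False):
    """Update the driver's metrics for one ride offer."""
    if accepted:
        driver.accepted += 1
    elif safety_flag:
        driver.flagged_rejections += 1  # no hit to the rate
    else:
        driver.penalised_rejections += 1

d = Driver()
record_response(d, accepted=False, safety_flag=True)  # threat perceived
record_response(d, accepted=True)
print(f"{d.acceptance_rate:.0%}")  # 100%: the flagged rejection cost nothing
```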

Across disciplines, if we brainstorm on reimagining these platforms as cooperative instead of competitive spaces, as human-centered rather than optimization-centered, and as feminist-oriented and not just male-oriented, there may be more promise for our digital wellbeing.