REPORTS

B’AI Global Forum Launch Event Report on “Gender Justice in the AI Era: A dialogue on engagement and activism”

Lim Dongwoo (Research Assistant, B’AI Global Forum)

・Date: March 17, 2021 (Wed.) 18:30–20:00
・Venue: Zoom & YouTube
・Language: English (with Japanese interpretation)
・Organizer: B’AI Global Forum, Institute for AI and Beyond, the University of Tokyo

On Wednesday, March 17, 2021, the B’AI Global Forum held its inaugural event, “Gender Justice in the AI Era: A dialogue on engagement and activism,” via Zoom and YouTube. The event marked the official launch of the B’AI Global Forum. It was held in English, with Japanese simultaneous interpretation provided so that a wider range of participants could take an active part in the discussion.


The event consisted of two sessions. In the first, Anita Gurumurthy gave a keynote lecture on the theme of “Towards feminist futures in the intelligence economy.” Gurumurthy is an activist who has conducted collaborative research on the political economy of the Internet environment in India, gender justice, and women’s participation in engineering. In the second session, Asumi Saito, co-founder of Waffle, and Akihiro Nakao, Professor in the Graduate School of Interdisciplinary Information Studies at the University of Tokyo, joined a panel discussion. The main points of each session are summarized below.


In the first session, Gurumurthy examined today’s capitalism, which operates under the influence of algorithms, and argued that algorithms elevate capitalism from an economic principle into a social order. She also presented several examples of how work organized through platforms is restructuring social relations of power and gender via algorithms.


The first example was the case of Damini, who works for Urban Company (UC), an online platform offering household and repair services. Gurumurthy pointed out that, from the standpoint of the service providers (the workers), it is unclear who will be assigned to a task or how assignments are made. She also criticized the fact that the method for calculating commission points is not disclosed and that service providers cannot see customer reviews.


The second example was the case of Sushila, who is registered as a driver with the ride-hailing services Uber and Ola (services that match drivers with passengers); here, too, the opacity of the matching process was highlighted. Specifically, even when a driver’s registration is suspended or deactivated by the algorithm, no reason is given.


The next example concerned Jayashree, who works on Amazon Mechanical Turk (AMT), a platform developed by Amazon for contracting out low-paid, low-skill computer work. Gurumurthy noted that tasks were assigned preferentially to US residents, with only about 20% allocated to workers in India, amounting to discrimination based on users’ nationality. Workers registered with AMT constantly worried about whether their accounts would be suspended, and they could not even appeal the algorithm’s decisions.


The last example concerned Sakshi and Diya, members of the Self-Employed Women’s Association in India, which was approached by Amazon for a business alliance through a platform called Saheli. The problem here was that the algorithm determined where and how products were displayed, while the relationship between prominent placement and sales remained unknown. It was also criticized that, although both product providers and buyers can write reviews, it is unclear whether bad reviews reduce a product’s visibility.


Drawing on these examples, Gurumurthy described algorithms as powerful actors that transform society. First, she characterized platform labor as a self-monitoring system: the fear of arbitrary penalties imposed by algorithms leads workers to lose their independence and to constantly monitor and discipline their own behavior in order to secure work and income. Gurumurthy also explained the social factors that constrain workers’ options, pointing out that algorithmic labor is legitimized by the traditional world of patriarchal social structures. The conditions of labor embedded in the algorithm and the social environment outside it thus reinforce each other and entrench the harsh realities of labor in India.


According to Gurumurthy, two things are needed to rectify the platform ecosystem. First, workers should be given the right to access data. Specifically, in platform working environments, affirmative action for female workers and sellers is necessary, along with the design of gender-equitable algorithms. Second, the digital framework itself must be decoupled from its current colonial perspective. Algorithms should be grounded in new social relationships, creating conditions that lead to fundamental social change.


In this keynote, Gurumurthy examined human rights violations on algorithmic platforms through women’s narratives. She concluded that we must strive to help the most marginalized people obtain new subjecthood, and that what has been considered mere “computer work” must be reorganized as a political activity grounded in a new institutional ethics.


In the second session, a panel discussion was held with Asumi Saito of Waffle and Professor Akihiro Nakao of the Graduate School of Interdisciplinary Information Studies, the University of Tokyo. Ms. Saito and Professor Nakao first shared their impressions of Gurumurthy’s keynote lecture. Professor Nakao, in particular, explained the connection between his research and Gurumurthy’s activities by introducing his project applying ICT (information and communication technology) to fishery activities in Hiroshima and the results obtained from it. He added, “Through ICT, we can remove prejudice and rationally restructure the labor market without putting people at risk.”


A Q&A session followed. The first question from the audience was, “I understand from Gurumurthy’s talk that correcting algorithmic bias can actually bring about economic equality, but how do we achieve that in practice?” Gurumurthy replied that it is difficult for individuals to disconnect from platforms and work alone; the algorithms themselves must therefore be built anew, and companies in particular should fulfill their social responsibilities through self-regulation. Saito agreed with Gurumurthy, adding that because the AI industry in Japan is not yet very mature, there is still time to develop relevant policies and systems. Professor Yuko Itatsu, Associate Director of the B’AI Global Forum, then asked a follow-up question: “Is there any organization that educates the general public about algorithmic bias?” Saito cited MIT Technology Review and fast.ai.


The second question was, “Is there an opportunity for a wider group of people, including men, to participate in this activism?” Citing her own educational activities for trade unions, Gurumurthy emphasized that the goal of feminist analysis is not to exclude men but to analyze how and why oppression works. Professor Itatsu also asked Professor Nakao how local 5G differs from public 5G, and, for example, whether local 5G can correct injustice. Professor Nakao replied, “It is difficult to say exactly what local 5G can do that public 5G cannot, but for the time being it is important to collect a lot of data and to make its use transparent.” Professor Itatsu then asked, “Is it really possible to change the ethics and behavior of giant technology companies?” Professor Nakao said it is important to raise the issue, because the big tech companies are not aware of it. Responding to the same question, Gurumurthy said that rather than trusting the companies, external pressure needs to be applied; for example, academia and civil society should analyze the situation more sharply and publish their findings. She added that public infrastructure, such as public cloud services, is also needed.


The final question was how education could help restructure the IT industry and mitigate algorithmic bias. Professor Nakao replied that it is important to encourage many students to take on challenges and work hard to solve problems. Gurumurthy said that we need to teach students how to work with data by having them conduct AI experiments using small datasets. Saito pointed out that most of today’s education is designed by men and other privileged people, which may create structural disparities in the next generation. She added that it is important to incorporate the perspectives of people from diverse backgrounds into technical education.


Finally, in his closing remarks, Professor Yujin Yaguchi, Executive Manager of the B’AI Global Forum, emphasized that seeking to understand the social significance of technology from the perspective of gender and social justice is the spirit of the B’AI Global Forum, a core perspective that explores not only gender equity but social equity as a whole. The event was of great significance in bringing together experts from various fields to share specific cases and multifaceted perspectives.