In an effort to increase gender diversity in the field of artificial intelligence (AI), the second annual AI For Social Good Lab initiative launched on May 14 in Montreal. The program gave 30 undergraduate women from across Canada the opportunity to use artificial intelligence to address a social issue of their choice. Currently, fewer than 25 per cent of all employees in the tech industry are women, and women make up less than 5 per cent of tech startup owners.
The six-week lab, held in part at McGill University, was initiated by the OSMO Foundation, McGill’s Reasoning and Learning Lab (RLLAB), and the Montreal Institute for Learning Algorithms (MILA), and co-organized with DeepMind. After two weeks of lectures, an open hackathon, and a week of industry workshops, the participants presented a total of eight projects at the AIForGood closing event on June 21 at the Notman House before a crowd of students, sponsors, and other AI enthusiasts.
The projects included a variety of web, Android, and iOS apps. While participants could choose to address any social issue close to their hearts, many of the projects tied into the program’s theme of inclusivity. The MoodMap team presented their live emotion-recognition system as a useful tool for those with emotional blindness, or those who struggle to understand emotional cues because of conditions such as Autism Spectrum Disorder (ASD), PTSD, and brain trauma. An audience favourite, Biasly, flags implicit gender bias in outgoing text messages.
Summer Lab Diversity Coordinator Jihane Lamouri believes that having a variety of perspectives, as the program encourages, is crucial to the development of AI: If a society is biased, so, too, are its machines. At the Lab’s closing event, Lamouri referenced the alleged sexism that machine translation services like Google Translate or Microsoft’s Bing Translator exhibit. When translating phrases from gender-neutral languages like Finnish or Turkish, machine translators may assign gendered pronouns to the English translation, reflecting an underlying gender bias. Users have complained that, in the hands of a machine translator, the phrase “they are engineer” becomes “he is an engineer,” whereas the phrase “they are a nurse” becomes “she is a nurse.” Lamouri hopes that having more women in the industry will lead to the identification of gender bias in AI.
Marin Ito and Wan-Chun Su, U3 and U2 McGill students in Computer Science, were part of the team behind AEyeAlliance, a Braille-to-English text converter. Their aim was to provide an AI tool that could help make translation services more inclusive.
“[We wanted] an app where [anyone] can take a picture of Braille and get an instant translation,” Ito said during her closing event pitch.
The project was inspired by Ito’s experience as an international student cooking with her host mother, who was blind and used Braille to label her kitchen supplies.
“Currently, our model is able to recognize and convert not only Braille letters, but also words, sentences, numbers, and symbols into English text,” Su said in an interview with The McGill Tribune. “If time permits, we want to implement a [functionality] that will convert Braille into other languages as well.”
Many of the participants appreciated being surrounded by other women, who are underrepresented in the tech industry.
“I feel so welcomed by all these women,” Chloe Xueqi Wang, U3 Science, said.
Many of the organizers, including Doina Precup, associate professor in the School of Computer Science, hoped that the Lab would help bolster inclusivity in AI over time.
“All of us in AI, or science more generally, recognize that we should do a lot of work to increase diversity in our field,” Precup said. “It’s not just good for society, it’s also good for our discipline […and] our research. This is a first step, but we hope that there will be a voyage in machine learning for everyone here.”
Other projects from AIForGood included ShouldIEatThis?, BSafe, EnergyForGood, and MR_AI.