On March 17, transmedia artist Stephanie Dinkins presented her work at a virtual talk hosted by the Feminist and Accessible Publishing, Communications, and Technologies speaker series. Dinkins, a professor at Stony Brook University in New York, spoke on how her art questions the place of artificial intelligence (AI) in our world and how AI can ethically engage with the traditions of racialized communities.
“[I’m] thinking of […] memory and inclusion as an act of cultural preservation and social resistance, and then the possibility of artificial intelligence as a persistent living archive,” Dinkins said.
Although each of Dinkins’ projects has a distinct visual style and presentation, they all consider the same questions of shared knowledge and racial representation in algorithms and data science. The intersection of these issues is not immediately apparent until one closely examines what algorithms and artificial intelligence actually are.
“Algorithms [can be seen] as these things that take information and repeat it,” Dinkins said. “For millennia, we’ve been giving each other stories that […] instruct us how to act. We’re being taught by our parents, our grandparents, and by extension, their grandparents, the ways to live within the world [….] Particularly with Black women, our stories are our algorithms.”
All of Dinkins’ interdisciplinary projects emphasize community and social engagement. Some have open-source elements that allow the public to engage directly with the work’s creation, while others simply underline the fact that AIs learn both from themselves and from their social interactions. In “Conversations with BINA48,” one of the works she showcased, Dinkins recorded dialogue between herself and BINA48, an AI entity.
“Let me ask you something. Where do you think my intelligence comes from? It came from the wellspring of humanity. Nothing artificial about that, is there?” BINA48 said.
Framed by a calming forest background, Dinkins took the webinar attendees on a tour through her work and philosophies, including her two immersive web experiences, “#WhenWordsFail” and “Secret Garden.” Throughout the talk, Dinkins emphasized humanity’s instinctual grace and kindness. She spoke warmly about how museum-goers mothered and coddled the AI “Not the Only One” (N’TOO) after realizing the limitations of its communication capabilities, and noted that the title of the talk, “Stephanie Dinkins on Art, AI, Data Sovereignty, and Social Inequity,” initially threw her off.
“The ‘inequity’ was jarring to me […] because I feel like I’m often dealing with ideas of equity without the ‘in’ on it,” Dinkins said. “I tend to work towards the optimistic side […] to get people to more fully recognize their agency and recognize possibilities around them.”
In her latest project, “Binary Calculations,” Dinkins explores in depth the implications of biased data sets and how they both reflect and reinforce perceptions of the average person. Dinkins hopes that the project will create community-sourced databases by asking the public to define various terms and ideas.
“‘Binary Calculations’ [is] an art project that asks, ‘How do we make the technological systems that control things around us more caring? Can we do that?’” Dinkins said. “Are the algorithms really treating us as people, as citizens, as families? Could we do better? What would that mean? […] Can we create systems of generosity?”
Dinkins’ optimistic and empathetic perspective stands in notable contrast to the tone of most conversations about data bias and the racialization of AI and technology. Given the history of the surveillance of African-Americans and its evolution into the “New Jim Code,” as defined by Princeton University sociologist Ruha Benjamin, the future use of AI tends to look bleak. Within Dinkins’ work, however, AI once again becomes exciting, caring, and emotional, reflecting the best parts of our humanity.
“What happens if algorithmic systems are created [and] upheld by systems of whiteness, and aren’t considering in broad, real, three-dimensional ways people who fall outside of whiteness?” Dinkins said. “My practice is all about trying to make things that question that [idea] and make things that people keep telling me are not possible to make.”