Last month, I had the honor of sharing my research on educational technology, algorithmic racism, and justice-oriented approaches to AI at the National Institute of Mental Health (NIMH) Summit on the Impact of Technology and Digital Media on Child and Adolescent Development and Mental Health. I included a short clip of my remarks below, but I encourage anyone interested in ethical and critically conscious approaches to AI, digital media, and mental health - especially as it relates to research with and about young people - to check out the entire summit! Link to the full recording here: https://lnkd.in/gFpCMZJg
Transcript
Hi everyone, I'm Tiara Tanksley, and I want to start by saying that my work is at the intersection of education and information studies. As an educational researcher and someone who does AI ethics, my points today are going to focus on the conceptual frameworks that I think are important. To give you an overview of some of my work: I look at how technology and digital media shape the lives and schooling experiences of Black youth, starting from my dissertation, where I studied Black girl activists' experiences with #BlackLivesMatter to understand how social media activism was impacting not only their socio-emotional experiences but their educational experiences. That study actually uncovered how algorithms were working in ways that commodified and hyper-circulated images of Black death and dying. That research was important because it moved my lens up from thinking about race, gender, and identity as the source of vulnerability to thinking about the systems and infrastructures: racism, sexism, and anti-Blackness become technologically codified within these systems, and that is what creates the vulnerability. More recently, I also look at ed tech - educational technology and artificial intelligence in schools - and how the proliferation of these systems into schools actually creates educational inequity for Black students because of algorithmic racism, but also because of the growing entanglements between AI and the carceral system: realizing, in other words, how algorithmic racism supercharges the school-to-prison pipeline. From those studies, I want to extract a few points. The first one I already mentioned: thinking about vulnerability not as folks' minoritized identities, but as the systems of power that become computationally encoded.
And I think that's important because of where it puts the focus. My research looked at how Black girls experience hyper-targeting, how their content is demonetized, and how they are flagged as unethical users of social media while threats and digital harassment against them go unflagged algorithmically - because at the time of my study, Meta's content moderation protocols did not protect Black children, they only protected white men. That was an algorithmic decision that made this group vulnerable. If I had only looked at how Black girls experienced PTSD, insomnia, and other harms from seeing viral images of Black death and dying, the focus would have been solely on how to make them more resilient in an anti-Black system. Instead, the call to action becomes the systems themselves: how do we change content moderation systems, how do we change commercialized search, and so on, so that our young people don't have to be so resilient all the time? So that's the first thing. The second - and I think you mentioned this - is thinking about how technology is impacting our research: what are the effects of technology and digital media on our own research? When I think about the tools we're using to measure or study young people's experiences with social media or other technologies, I'm thinking not only about algorithmic racism, sexism, and related harms, but also zooming out from the user-tool relationship to understand the implications of these technologies for the environment. As we continue to engage in the AI hype and use more and more AI technologies to study things that could perhaps be studied in other ways, we are contributing to the global climate crisis.
We know that a single conversation with ChatGPT uses roughly the same amount of water as a bottle of water, that carbon emissions from the use of AI are unprecedented, and that global drought has been exacerbated by AI - not to mention the human labor costs. So it's really important to think about this. I know that research is perhaps not supposed to be politicized and activist, but in education we're allowed to do that. So I'm often thinking about how we ensure that our work is just and that we're not contributing to harm. I think Dr. Mimi Ito said this yesterday: if we're not thinking about institutions of power and our role within the global system, our research can actually reproduce and exacerbate harm for historically marginalized communities, not only here in the Global North and the imperial core, but also in the Global South. And then finally, my research looks at resistance, joy, and critical literacies. I started a program called Race, Abolition, and Artificial Intelligence at UCLA; we're currently in our fourth year this summer. It's a critical AI literacies course that focuses on Black youth in Southern California. Essentially, it raises their literacies around what AI is, where it is, how it's designed, and how to subvert it when it is centering harmful things - when it's reifying harmful systems. The point of this is not only to make Black youth more resilient, so that they know how to avoid encountering viral images of Black death and dying, how to protect themselves, and how to change the algorithms by subverting them from the inside, but also to prepare them to computationally reimagine and rebuild technologies that foster hope, futurity, and wellness.
What's important for me in that research is really preparing youth to dream beyond the world they inherited - that it actually doesn't have to be like this, and that the technologies we engage with every day don't have to be like this. It's recognizing that youth are everyday computer scientists and everyday hackers. As researchers, we often believe that we know more about technology and wellness than they do, but actually they are the experts, and we should invite them not only to be at the table but to be co-conspirators. They are expert researchers, and I think if we look at it from that frame, we can have a more robust, justice-oriented, and perhaps transformational understanding of technology in the lives and schooling experiences of youth. Thank you.