Sneha Revanur, 19, is the founder and president of Encode Justice, a group she established when she was 15 to mobilize high school and college students to ensure AI is aligned with human values. Together with Adam Billen, Encode's 22-year-old director of policy, she hopes to shape legislation at the federal and state levels and beyond.

Q: What are some of the top AI concerns for young people?

Revanur: For a young person navigating the digital world, there's a whole host of things you have to worry about that previous generations didn't. We have seen young people turning to chatbots when they should be turning to friends, family, or mental health professionals. That's obviously very concerning, because sometimes these chatbots aren't equipped to navigate mental health emergencies. I really worry that that will impact the fabric of our society and lead to a collapse of the bonds that sustain us.

Q: There's a big focus right now on how addictive and manipulative social media is for Gen Z. How does AI play into this?

Billen: People are starting to recognize that the basic algorithms on these platforms are at the crux of a lot of what is driving toxic patterns of attention and the associations young people form with themselves and their friends. It's driving eating disorders and CSAM [child sexual abuse material]. All of these issues are being driven partially because the fundamental profit mechanism of these companies is to push whatever gets clicks and keeps people on the platform.

Revanur: I would say it's really important to shift the blame from individual users to these larger companies, which could make very minute design choices that wouldn't really impact their bottom line but would have a dramatic impact on user experience.

Billen: I personally have paid a lot of money for an app on my phone that keeps me from using certain apps for more than a set amount of time each day and during certain hours. My phone is in black and white all the time. It's taken me years just to figure out those things.

Q: Do you think that companies are doing enough to let people opt out of using their algorithms or tools? What should they be doing to give us more choices?

Revanur: They're not, and we're asking them to let us opt out in our AI 2030 agenda.

Billen: Yeah, absolutely. Two key examples of this are Snapchat and Instagram. Snapchat's AI bot is glued to your home screen and is the top thing whenever you open the app, and there's no way to easily opt out of that. And on Instagram, we've seen that the Meta AI search is extraordinarily annoying. Sometimes you search for something and it pops up with the Llama chatbot screen instead of just searching for what you actually want.

Q: At the end of Encode's ambitious AI 2030 agenda, there's a line where you talk about going against the blurring of human and machine. Can you give me some examples of the harmful ways you think it's blurring? And what values from humanity would you like to see preserved for future generations?

Billen: If you're interacting with a chatbot, for example a customer service bot, you should know that you're interacting with a chatbot. We don't want to live in a world where you're talking to someone on the phone and have no idea whether you're talking to a human or a machine. Given the current technology, it's going to take real work to make these machines actually reflect human values.
We don't want to see a world where they're built entirely on the premise of appeasing us, especially when chatbots interact with young people or in romantic relationships.

Revanur: We want a world where trust, community, connection, creativity, and critical thinking are not just preserved but revitalized. That future is possible with AI, but it's not the one we're headed toward right now. Those are the core values that make human society resilient and strong, and that's what I want to keep fighting for.