They’re not necessarily the guys you might expect, Apollo Knapp told me.
These are 6-foot-tall high school athletes, kids who are outgoing and popular. “They’re the kind of people who are friends with everyone, who get stopped in the hallway every two feet,” said Knapp, an 18-year-old high school senior in Ohio and a board member of SafeBAE, a sexual violence prevention nonprofit.
But at their school, these are the guys who use AI to help them talk to girls. They’ll paste their texts into ChatGPT for feedback before sending them, he said. Or they’ll send their own photos to ChatGPT and ask, “Am I cute?” Or they’ll simply ask for moral support when they’re “too scared, perhaps, to approach women.”
Girls and nonbinary teens don’t need to rely as much on ChatGPT, Knapp said; they’re more likely to have a circle of friends ready and willing to workshop their texts. But boys are more isolated, socialized to believe that it’s weak to talk about their feelings.
Worse yet, they’ve grown up on a steady diet of media telling them that “if you say the wrong thing” to a girl, “she’ll accuse you of something,” Knapp said. Even if those messages aren’t accurate, they get into teens’ heads, making them feel like they have to run everything through ChatGPT to make sure it’s okay.
The alienation of boys and young men from everyone else in American society has been a persistent theme in recent years. The fear is that guys, especially straight guys, will get sucked into manosphere podcasts and grow increasingly alienated from the girls and women they theoretically want to date. That narrative is oversimplified, and there’s reason to believe that boys and men are more connected, and more interested in connection, than the nastier corners of their listening material might suggest.
But in talking to teens and experts about AI and relationships, I got the sense that kids need better outlets for their feelings than we give them. And while ChatGPT can help some kids in some circumstances, teens of all genders need a more reliable support system—one that doesn’t require an electricity-hungry data center to answer a question.
After all, Knapp said, “what’s going to happen if you have no power and you have a girlfriend?”
Teens use AI for dating. The question is how.
It’s difficult to know exactly how many young people are talking to ChatGPT about relationship problems, as research on youth and AI is in its infancy. In a recent Pew survey, 57 percent of teens said they had used AI “to search for information,” while 12 percent said they had used the tools “to get emotional support or advice.” It’s possible to imagine dating queries falling into either category.
Anecdotally, experts and teens alike say that young people are turning to ChatGPT for everything from low-stakes questions about texting to serious concerns about what might constitute sexual assault.
Val Odiembo, 19, advises his university classmates on healthy relationships. As a peer educator, he’s used to getting questions like “What do I do when my girlfriend says this?” or “Is this consent?”
But lately, those questions have been tapering off. Odiembo, a nursing student and SafeBAE board member, believes students are asking ChatGPT instead.
“My students told me, ‘I asked Chat what I should say to this guy,’” Odiembo told me. When that happens, “I die a little inside.”
Some teens are using chatbots “to try being flirty or romantic or a little sexy and see how the chatbot responds to that,” Megan Moreno, a professor of pediatrics at the University of Wisconsin-Madison who studies technology and adolescent health, told me.
That type of experimentation may be more common among boys, who generally engage in riskier behavior online than girls, Moreno said.
Using technology to experiment with flirting and romance is nothing new. Millennial teens turned to chat rooms and AOL Instant Messenger for the same purpose. It could be risky (my classmates spent plenty of time catfishing one another avant la lettre) or downright dangerous, if teens ended up chatting with adults.
But, as Moreno points out, at least the people you were chatting with online back then were real humans, who could tell you off if you said something rude.
Chatbots, on the other hand, “are programmed to be incredibly responsive and fawning,” Moreno said. “Even if you say something incredibly inappropriate, the chatbot will respond in a way that reinforces it.”
This is even more fraught when the issue is sexual violence. Young people are increasingly turning to chatbots after sexual encounters to ask whether they may have committed assault, Drew Davis, director of strategic initiatives at SafeBAE, told me. The responses he has seen have sometimes been unhelpful, he said, offering legal defenses or reassurance rather than encouraging accountability.
SafeBAE is developing an interactive tool that helps young people think about sexual situations that may have been confusing to them, such as those where both parties were drinking, and connects them with resources to help them take responsibility and apologize if necessary.
The goal is to “give them a language, give them tools to be able to do this, that doesn’t come from AI,” Davis said. “It’s connecting them to other people.”
Why teens turn to AI in the first place
It’s possible to imagine AI driving young people even further apart than they already are. The big question is whether kids are using AI to practice human relationships or to replace them, Moreno said. In a recent survey, one in five high school students said they or someone they knew had been in a romantic relationship with an AI.
It’s not hard to see why teenagers (or adults, for that matter) might be drawn to a voice that always has answers but never criticism. When talking about thorny topics like sex and consent, “I think there’s a lot of shame,” Odiembo said. Teenagers “feel comfortable turning to AI, because AI won’t judge them.”
But some teens also see value in the inevitable challenges and frictions of human relationships.
“You need to be called out every once in a while,” Knapp said. “That’s how humans grow.”
Some experts believe that with better guardrails (like a willingness to say, “Hey, don’t talk to me like that!”), AI could still be a useful companion for teenagers learning to talk to one another. A chatbot could be trained to coach kids on social skills, for example. Part of me wonders how much less awkward my adolescence would have been if I could have tried out my jokes on a robot before taking them to the crucible of the high school classroom.
It’s also worth noting that AI models are constantly changing and, in some ways, improving. After speaking with the SafeBAE team, I tested ChatGPT and Google Gemini by posing as a teenager worried about having crossed a line with a girl. Both models did a decent job, at least in their initial responses, asking follow-up questions about the situation and encouraging me to take responsibility.
But the young people I spoke to for this story don’t want better chatbots; they want better humans. They want teachers better trained to discuss difficult topics like consent and assault. They want coaches and other adults who can model healthy masculinity for boys rather than reinforcing stereotypes. And teens of all genders want supportive places to open up about feelings and relationships, some of the most important and confusing aspects of human life.
“I wish people would feel a little more comfortable having uncomfortable conversations,” Odiembo said.
Families continue to report concerning conditions at the Texas immigration detention center where 5-year-old Liam Conejo Ramos was held, including a worm in a child’s food, water that causes rashes and stomach pains, and staff failing to provide medical care.
Teens and tweens want to see more depictions of “parents who enjoy being parents” and “parents who show love to their children” in movies and television, according to a recent UCLA survey. In this, as in everything, the answer is Bluey.
The New York Times delved into AI videos aimed at children. It’s still unclear whether endless videos of adult mammals hatching from eggs are harmful to children, but they certainly are strange.
Currently, my oldest son is obsessed with the Ham Helsing series, graphic novels about a pig who hunts vampires.
After writing about kids’ recent obsession with the phrase “chicken banana,” a reader wrote in to inform me of a much earlier coinage. “Maybe it’s my age (almost 80), but when we were teenagers, my age group regularly heard the Chiquita Banana jingle,” he wrote. “Naturally, we corrupted ‘Chiquita banana’ into ‘chicken banana.’”
“Sorry to crush the illusion of chicken banana’s current uniqueness, but we old-timers used the term ‘chicken banana’ a long time ago,” he added.
As always, if you have any questions or would like to share a story about children today or in the past, you can contact me at anna.north@vox.com.

