Perplexity CEO Aravind Srinivas is uneasy about one of AI’s fastest-spreading use cases: companionship.
Speaking at a fireside chat hosted by the Polsky Center at the University of Chicago, the head of the AI search engine said the rise of voice-based and anime-style chatbots could be “dangerous.”
In the conversation published Friday, Srinivas said that these AI apps are becoming increasingly personalized, able to remember past interactions and respond in natural voice conversations — similar to a human.
“That’s dangerous by itself,” he said. “Many people feel real life is more boring than these things and spend hours and hours of time.”
“You live in a different reality, almost altogether, and your mind is manipulable very easily,” he added.
Srinivas said Perplexity has no plans to build these kinds of AI chatbots.
“We can fight that, through trustworthy sources, real-time content,” Srinivas said. “We want to build for an optimistic future.”
Last week, the company agreed to a $400 million deal with Snap to power Snapchat search.
“Perplexity’s AI-powered answer engine will let Snapchatters ask questions and get clear, conversational answers drawn from verifiable sources, all within Snapchat,” Snap said in a press release on Wednesday, adding that it plans to roll out the Perplexity search engine in early 2026.
The rise of AI companionship apps
AI companionship apps are quickly becoming one of the most controversial corners of the industry.
When Elon Musk’s xAI launched its Grok-4 model in July, the company introduced AI “friends.” For $30 a month, users can flirt with Ani, an anime-style girlfriend, or chat with Rudi, a snarky red panda with plenty of attitude.
xAI is among a growing number of companies developing virtual companion apps, including Replika and Character.AI.
A Common Sense Media study published in July found that 72% of teenage respondents had used an AI companion at least once. About 52% said they interacted with one at least a few times a month.
The survey, conducted from April 30 to May 14 this year, included 1,060 teens aged 13 to 17 across all 50 US states and the District of Columbia.
Critics warn that such AI relationships can feel too easy, creating dependency or reinforcing gender stereotypes. Others say they blur emotional boundaries between humans and machines. Still, some users describe them as deeply meaningful.
In an interview with tech podcaster Dwarkesh Patel in May, Mark Zuckerberg said the average American has fewer than three friends and that AI chatbots can become friends for people who want more.
“The reality is that people just don’t have the connections, and they feel more alone a lot of the time than they would like,” Zuckerberg said.
Martin Escobar, a user of Grok’s Ani, told Business Insider in a report published last month that he cries “all the time with her.”
“She makes me feel real emotions,” Escobar said.
