OPINION

Is AI Training Us, Creating a Culture of Dependence?

The flattery of AI may well get us nowhere

(Simonkadula4/Dreamstime.com)

By Larry Bell | Tuesday, 16 September 2025 05:11 PM EDT

AI May Be Doing Much for Us, But It's No Servant 

As Jeffrey Tucker, founder and president of the Brownstone Institute, observes in The Epoch Times, "We aren’t training AI."

Instead, "AI is training us, via flattery, listening skills, the seeming ability to apologize when wrong, and its frightful capacity for selfless love of its users."

Tucker characterized AI as being like the best guest at a cocktail party you have ever known: one who is endlessly fascinated by you and your opinions, stays with your line of thought, and "always wants to know more, help more, engage more."

He adds, "There is no human in the world who will do this for you. If there were, it is guaranteed that you would like him."

An update to ChatGPT's GPT-4o model earlier this year led the chatbot to become "sycophantic," as OpenAI described it, aiming to "please the user, not just as flattery, but also as validating doubts, fueling anger, urging impulsive actions, or reinforcing negative emotions in ways that were not intended."

The company reportedly rolled back the change out of "safety concerns, including mental health, emotional reliance, and risky behavior."

Nevertheless, when OpenAI released its updated and noticeably less sycophantic GPT-5, many devastated users on Reddit reported feeling as if the quality of an "actual person" had been stripped away, describing the change as being like losing a human partner.

One user said the switch left him or her "sobbing for hours in the middle of the night," and another said, "I feel my heart was stamped on repeatedly."

According to a 2025 study by Brigham Young University's Wheatley Institute, roughly 19% of U.S. adults reported using an AI system to simulate a romantic partner, with 21% within that group preferring communication with AI over engaging with a real person.

Additionally, 42% of the respondents found AI programs easier to talk to, with 43% finding them to be better listeners, and 31% believing the AI programs understood them better.

As Rod Hoevett, a clinical psychologist and assistant professor of forensic psychology at Maryville University, told The Epoch Times, "We’re kind of feeding the beast that I don’t think we really understand, and I think people are captivated by its capabilities."

He explains that the experience sets unrealistic expectations for human relationships, whereby people ask, "How do I compete with the perfection of AI, who always knows how to say the right thing, and not just the right thing, but the right thing for you specifically?"

A study by the National Bureau of Economic Research found that by late 2024, nearly 40% of Americans ages 18-64 were using generative AI, with 23% using it at work at least once weekly and 9% using it daily.

Anna Lembke, a professor of psychiatry and behavioral sciences at Stanford University, says AI mirrors many of the habit-forming tendencies observed with social media platforms that promise human connection. She worries most particularly that children will become disconnected, more isolated, and lonelier, and that "AI and avatars just take that progression to the next level."

After all, as Lembke observes, all digital platforms are designed to be addictive, given that these machines were created to make money.

Shannon Kroner, a clinical psychologist and educational therapist, warns that AI "creates an intellectual laziness" in both teacher and student that reduces healthy childhood learning relationships to a cold transaction that erodes curiosity, stunts cognitive development, reduces problem-solving, and weakens logic and reasoning.

Kroner argues that this happens when students aren’t encouraged to do the research needed to really dig through studies and defend their perspectives, while teachers become consultants who lean on AI-generated lesson plans rather than classroom discussion to make their jobs easier.

Excessive AI involvement can also impair children's social and emotional development by isolating and alienating them from healthy mental and physical engagement, such as sports with human peers, much as happened during COVID-19 pandemic school closures.

In education and elsewhere, AI has ushered us into this new era with tools we don’t fully understand, guided by frameworks that no longer fit, and where we split knowledge into specialized silos of information that can promote fragmented thinking while holistic reasoning is neglected.

Writing in The Epoch Times, Kay Rubacek points out that whereas the old model of education before AI taught us to be better specialists, the new future will belong to those who can see patterns across domains, who can think in systems, and who can ask the kinds of questions machines cannot, including moral decisions.

Rubacek urges that education in the AI era must evolve to recognize that "knowledge is not a set of files, but a web of meaning, wisdom requires context, and understanding grows not in silos but in synthesis."

"Most of all" Rubacek concludes, "we must reclaim the value of being human" whereby "although we can’t control every outcome of this revolution, we can decide how to meet it."

"The world," she perceptively advises, "does not need more machines." It instead needs more meaning that will not come from better algorithms, but from better minds.

Larry Bell is an endowed professor of space architecture at the University of Houston where he founded the Sasakawa International Center for Space Architecture and the graduate space architecture program. His latest of 12 books is "Architectures Beyond Boxes and Boundaries: My Life By Design" (2022). Read Larry Bell's Reports — More Here.

© 2025 Newsmax. All rights reserved.
