“All you have to do to understand the future of AI is to understand the incentives.”
I recently listened to a podcast where Oprah Winfrey interviewed a representative from the Center for Humane Technology. That line stood out to me. If I’m being honest, the entire conversation stood out. But this particular thought lingered.
I’ve found myself using AI more frequently as a thought companion, a research assistant, an elevated version of Google, if you will. The idea of using AI to fully write articles or books still makes my writer’s heart seize a bit. But as a tool, it’s undeniably useful.
There is, however, something that began to catch my attention. If you’ve used AI, you’ve likely noticed it too. The conversation never quite ends. There is always another question. Another suggestion. Another “Would you like to explore that further?” It’s subtle, but persistent. I noticed it with AI tools like ChatGPT. I also noticed it recently with Amazon Alexa. I used to ask Alexa a question, perhaps about a recipe, and she would provide the answer and politely tell me to enjoy my meal. Now she often follows up with another question, encouraging continued interaction. It makes for a strange, entertaining conversation of sorts, and at first, I will admit, I found it mildly annoying. But after listening to that podcast, I started connecting the dots. What I now understand does not make me happy.
To understand the future of AI, you have to understand the incentives. Social media companies do not primarily make money through subscriptions or direct payments. They do earn some revenue that way, but something has to happen first: attention. They make money through attention. The longer you stay on a platform, the more content you consume, the more advertisements you see, and the more valuable you become. Your time, your engagement, and even your emotional responses become part of a business model built around keeping you involved.
Around 2012 and 2013, something shifted. Social media platforms moved away from being simple communication tools and began evolving into attention-optimizing systems. This is often referred to as the “arms race for attention.”
Companies began competing not just to attract users, but to keep them engaged longer and bring them back more frequently. The competition grew more sophisticated. Notifications became less informational and more psychological. A like, a comment, or a new follower taps into something deeply human: our desire for belonging, validation, and connection. These small signals encourage us to return again and again. And we do. Happily.
At the same time, platforms removed natural stopping points. Infinite scrolling and autoplay replaced clear endings. One swipe led to another. The experience became continuous, designed to keep users engaged without interruption.
Algorithms also began learning from behavior. They tracked what users clicked, watched, and lingered on. Over time, content became increasingly personalized. The system learned what held our attention and delivered more of it.
Artificial intelligence takes this even further. AI doesn’t just respond to behavior; it predicts it. Which is, honestly, frightening. It analyzes patterns, adapts in real time, and creates experiences that become increasingly difficult to disengage from. I understand that the goal is not necessarily harm. The goal is engagement. But when engagement drives revenue, technology naturally evolves to maximize it. The end result can, in fact, be harmful.
This becomes particularly important when we consider children and teenagers. Developing brains are especially sensitive to novelty, reward, and social feedback. Adolescents are navigating identity formation, belonging, and comparison. When platforms are designed to hold attention, young users are especially vulnerable.
What educators and parents are seeing today (decreased attention, shifting motivation, increased emotional reactivity) is not random. It is the harm occurring in real time. Students are growing up in an environment where their focus is constantly being competed for by increasingly sophisticated systems.
Understanding incentives helps explain why this is happening and why it will likely continue. If attention remains the currency of digital platforms, AI will continue to evolve in ways that capture more of it. The technology will become more personalized, more adaptive, and more effective at holding users’ attention. This should concern you. It concerns me. Deeply.
It also shifts the conversation. The issue is not simply that kids are distracted. It is that they are growing up inside systems designed to capture their attention. That distinction matters. It moves our conversation away from blame and toward understanding. When we understand the environment students are growing up in, we can respond more thoughtfully. Parents, educators, and communities can focus not just on limiting technology, but on helping young people develop awareness, resilience, and balance in a world increasingly designed to compete for their attention.

Kristi Bush serves as a national education consultant and social media safety advocate. She is a licensed social worker with more than 15 years of clinical practice and healthcare experience. She attended Troy University and Auburn University, where she studied social work and counseling. Kristi travels nationally and has spoken with thousands of children, parents, professionals and organizations about the benefits and threats associated with social media. You may reach Kristi through her website at www.knbcommunications.com.
