The Mind and the Making of Reality: Hallucinations, Shared Experience, and Consciousness at the Edge
July 12, 2025

It begins innocently enough. You finish watching a documentary on YouTube, and before the credits even roll, another one starts—hand-picked, the platform says, “Just for you.” Spotify knows your mood before you do. Amazon whispers, People like you also bought… Google quietly auto-completes your questions before you’ve finished asking them.
And suddenly, you’re no longer navigating. You’re being navigated.
We’ve created a digital infrastructure so intuitive, so frictionless, that it now anticipates our needs, our cravings, our fears—even our identities. It’s a marvel of engineering. It’s also a metaphysical nightmare.
Because here’s the question that’s been gnawing at me:
In a world run by predictive algorithms, do we still have free will—or are we simply passengers in a car we built, but no longer steer?
Prediction ≠ Control… Or Does It?
The defenders of algorithmic life often say: “Relax—it’s just recommendations.” They point out that we choose to click. But that’s not quite honest, is it?
We don’t operate in a vacuum. We’re suggestible creatures, especially when tired, stressed, lonely, or bored. Our cognitive bandwidth is finite. Our choices are shaped—often invisibly—by what we’re shown, when we’re shown it, and how often.
The algorithm doesn’t hold a gun to your head—it just makes the alternatives invisible.
In philosophy, this is the difference between negative freedom (freedom from coercion) and positive freedom (freedom to act meaningfully). What we’re losing isn’t the right to click something else—it’s the awareness that something else even exists.
Welcome to the Feedback Loop
Let’s get a little technical. Algorithms learn by tracking patterns. The more you engage with content—be it news, products, memes, or conspiracy theories—the more the system feeds you similar things. This closes the loop: your clicks reinforce the algorithm’s model of you, and the algorithm’s model of you then shapes what you’re allowed to see.
This is called a reinforcing feedback loop. Cyberneticists warned us about this in the 1950s. Behavioral economists warn us now. But we seem hell-bent on ignoring them.
It’s not just a loop—it’s a funnel.
You enter it with a universe of possibilities. You emerge in a narrow corridor of identity, behavior, and belief, all optimized for maximum engagement. In short: You become the version of yourself the machine can most easily predict.
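The funnel can be sketched in a few lines of code. What follows is a toy simulation, not any real platform's algorithm: the category names, the probability-matching recommender, and the single-interest user are all invented for illustration. The system starts with no opinion of you; every click sharpens its model; the model decides what you see next.

```python
import random

random.seed(0)

categories = ["news", "sports", "music", "cooking", "travel"]
# The system's model of you: one engagement weight per category.
model = {c: 1.0 for c in categories}  # uniform -- a universe of possibilities

def recommend(model):
    """Show a category with probability proportional to past engagement."""
    total = sum(model.values())
    r = random.uniform(0, total)
    for c, w in model.items():
        r -= w
        if r <= 0:
            return c
    return c  # floating-point edge case: fall back to the last category

def user_clicks(shown):
    """A user with one stable interest (real preferences are messier)."""
    return shown == "music"

for _ in range(5000):
    shown = recommend(model)
    if user_clicks(shown):
        model[shown] += 1.0  # the click reinforces the model: the loop closes

total = sum(model.values())
for c in categories:
    print(f"{c:8s} {model[c] / total:.3f}")
```

After a few thousand rounds, one category's share of the feed approaches 1.0 while the rest vanish to rounding error. Nothing here is coercive—the user "chose" every click—yet the corridor narrowed all the same.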
I ask you again: who’s driving?
Digital Determinism and the Illusion of Autonomy
There’s a dangerous idea quietly surfacing in tech circles: that with enough data, all behavior is predictable. The human mind, they say, is just another system—complex, sure, but ultimately deterministic. Give us the right inputs, and we can forecast your every decision.
Let me be blunt: this is psychological totalitarianism wrapped in a hoodie.
It’s also metaphysically lazy.
Because free will—if it means anything at all—demands unpredictability. Not chaos, not randomness, but the genuine capacity to choose otherwise.
But in a system designed to minimize surprises and maximize “engagement,” where exactly does the unpredictable self go? It gets optimized out.
You become an echo of your past behavior.
The Algorithmic Self: A Mirror or a Mold?
I’ve taught literature, film, and philosophy for decades. One of the oldest questions my students have wrestled with is: Who am I, really?
The new version of that question: Am I the person I think I am—or the person the algorithm believes I’ll become?
There’s a profound difference between a mirror and a mold. The former reflects. The latter shapes. And increasingly, our digital tools are not just reflecting our desires—they’re shaping them.
Think of TikTok’s “For You” page. That’s not just a feed. That’s a hypothesis about your identity. And it’s being tested, refined, and reinforced every few seconds.
This isn’t about paranoia—it’s about philosophy. Who authors your life?
Because the more we outsource choice to machines trained on our past, the less we cultivate the one thing that separates us from those machines: the mystery of conscious agency.
Reclaiming the Wheel
So what do we do? Smash our phones with hammers and move to the woods? Not quite.
But we need a new kind of digital literacy—one rooted in philosophical awareness. Not just “how to use the tool,” but “how the tool is using me.”
Start asking:
- Why is this being recommended?
- Who benefits from this prediction?
- What choices am I not seeing?
Philosophy, consciousness studies, and AI ethics all converge on this truth: To be awake is to resist being automated.
We cannot afford to sleepwalk into algorithmic determinism. If we still want to be drivers—and not just passengers—we have to grab the wheel, even when the road gets foggy.
The stakes are nothing less than our freedom, our identity, and our future as self-directed beings.
Author’s Note:
I write about the intersection of consciousness, philosophy, AI, and the examined life. If you find yourself wondering who’s driving these days—or if you’ve accidentally become your browser history—follow me or drop me a message. Let’s think it through, together.