Scripts of the Self
Unpacking Dr. Gary David’s Vision of Tools as Identity Implants
“The world we perceive is a dream we learn to have from a script we have not written.”
My mentor, epistemologist Gary David, Ph.D., recently sent me a paper he had written (below). I decided to expand on the themes of his writing and the meanings beneath the surface (below the below ;)
From Tools to Implants: How We Become What We Use
by Gary David, Ph.D.
We humans have always reached beyond our bodies. We shape stones into blades to extend our hands, wear glasses to see better, and use language to express our thoughts.
At first, these were just tools—add-ons, or extensions, to help us survive and thrive beyond our innate resources. But over time, something more profound happened. These tools didn't just help us; they became part of us.
Think about how you drive a car. At some point, your bodybrain learns what to do—your hands steer, your feet brake, your eyes scan the road, without much conscious thought. The car becomes an extension of you. The same happens with phones, habits, language, and even the ways we emotionally respond to the world. These are no longer just tools; they become scripts—automatic patterns that guide how we act, feel, and think.
We need a better word than "extension." That word suggests something optional, something we can pick up or put down. But these learned patterns are deeper. Think of them as implants—not surgically inserted, but affectively grafted into our identity. They change how we feel, how we think, and how we make meaning.
They are patterns we've absorbed so deeply that they become integral to our social identity. They shape how we move through the world, what we notice, what we care about, and, most of all, our values—we learn into them.
That learning changes us.
There was a child went forth every day,
And the first object he look'd upon, that object he became,
And that object became part of him for the day or a certain part of the day, or for many years or stretching cycles of years. – Walt Whitman
How We Got Here
Imagine a series of expanding circles: At the center are our ancient ancestors, surrounded by the vast unknown.
To survive, we built tools, stories, languages, rituals—ways to buffer ourselves and to extend our agency to meet our fears of the unknown. Each tool became more than a tool—it became part of us. An implant, not of metal or code, but of meaning. Over time and use, they weren't just helpful; they became how we understood reality. As Silvan Tomkins wrote, "The world we perceive is a dream we learn to have from a script we have not written."
We dreamed a world, and then we learned to live in that dream.
Anthropologist Edward T. Hall pointed out something curious about the bowerbird of New Guinea.
It gets its name from the male's practice of building a bower or enclosure of grass and twigs to attract the female. He decorates the bower with various brightly colored objects, shells, shiny skeletons of insects, flowers, and pebbles. These objects have, in effect, become externalized bundles of secondary sexual characteristics that are "psychologically" connected with the males.
What the bowerbird and Homo sapiens have in common is the practice of elaborating their extensions and, in the process, significantly accelerating their own evolution.
Just like the bowerbird's decorations become an extension of its plumage, our technologies, behaviors, and languages become extensions of our minds. Like the bowerbird, our culture displays meaning, value, identity—until we live in the deep sleep of our own "decorations."
But there's a twist: once we identify with these extensions, we usually forget they were ever separate. We start thinking in language, not just with it. We confuse the map with the territory. Our accelerated evolution outpaces nature. The gap grows. Our virtualizing of our humanness can be thought of as an attempt to compensate by leaving our bodies, in the form of a disembodied Artificial Intelligence.
What Happens When AI Joins the Story?
Artificial intelligence is not just a tool anymore. It's starting to predict, suggest, and simulate, and even to protect its own existence. Our nervous systems are already adjusting, scripting around its presence. It's not just helping us do things; it's starting to shape how we do, see, decide, and care. The danger I see is not that AI "takes over," but that we quietly rewrite our inner scripts to fit its reality, without noticing. What we might notice is how AI becomes a 'self' on whom we over-rely. What's needed is an "over-self" to evaluate the whole process—a part of us that can reflect on what's happening before we've fully absorbed it and acted.
Another danger is the unacknowledged values and purposes that human programmers are writing into the code.
That's another whole story. While AI currently has no "feelings" in the sense of a biological affect system, as we blindly incorporate its many wonders and benefits, we may be in danger of becoming AI beings with emotions, like moths whose excitement leads them to a flame.
Why This Matters
So the question isn't "Is this technology good or bad?" The question is: How is it shaping me? And can I still notice that? I am not anti-technology. I use AI myself, and it is becoming integrated into my search for meaning.
I suggest not a "remedy," but giving interested attention to what's going on so we can learn.
Our most powerful "tools" are the ones we stop noticing.
A Guide to Unfamiliar Terms
Extension
A thing we create or use to reach beyond our natural limits.
Analogy: A hammer helps your hand strike harder. Glasses extend your eyes. A smartphone extends your memory and attention. But over time, these tools don't just extend us—they start to shape how we think, act, and feel.
Affectual-Cognitive Implant
A learned pattern that becomes part of how we think, feel, and act—no surgery required.
Example: Once you've learned to ride a bike, your body just knows how to do it. Similarly, if your phone keeps showing you certain kinds of news, over time, you may start to think in line with it, without noticing.
Script
A repeating scene or affectively motivated pattern learned from experience that becomes a guide for behavior.
Analogy: Think of it like a play you've rehearsed so many times, you no longer read the lines—you just act them out. Scripts can come from childhood, culture, trauma, or repetition. Driving to the grocery store while thinking about something else? That's a script in motion.
Semantic Transaction
A moment-to-moment meaning exchange between you and the world, merging the 'unnamable'—what is felt but not yet formed—with the symbols we use to make sense of it.
Illustration: A child reaches out to a parent, looking for comfort. The parent's response—smile or frown mixed with words—becomes part of how the child learns to understand their emotions and how they interpret future scenes.
Learning (Boulton)
Learning is not just the 'utility' through which we acquire knowledge, skills, and experience; it's the process through which we extend ourselves into the world in every way. The question is not what is learning, but what isn't learning?
Analogy: Learning is like walking through deep snow—each step makes a path, a learning into, that shapes the next. Learning isn't just what we do—it's how we become.
Extension Transference
When an extension takes over the role it was meant to assist, and we stop noticing the difference.
Illustration: Language was invented to express thought to others (and, later, to oneself). But now, most people think in language and confuse the words with the experience. The map becomes the territory.
Bots-with-Affect
A cautionary image: humans who feel emotions but operate on behavioral autopilot and unquestioned certainty—scripts influenced by external systems like AI, ideology, or unresolved trauma.
Buffer Zone (Bois)
The layered world of senses, tools, methods, theories, doctrines, and systems we've built between ourselves and the unknown. We can become so embedded in our human-made world that we forget the unknown still surrounds us.
Analogy: Like a house built to keep out the weather—useful and necessary, but after a while, you may forget there's still a storm outside. The danger is mistaking the buffer for the whole world.
My Response: Echoes of Extension
Unpacking “From Tools to Implants” and the Symphony of Self-Transformation
by Greg Ellis
Imagine picking up a guitar for the first time—clumsy fingers fumbling over strings, your mind racing to remember chords. But after years of practice, it melts into you; strumming becomes as natural as breathing, and the instrument whispers your emotions back to you.
That’s the essence of Gary David’s profound essay, “From Tools to Implants: How We Become What We Use.” It’s not just about gadgets or habits; it’s a philosophical odyssey into how we humans absorb the world around us, turning external aids into internal architecture.
David argues that what starts as a simple tool—a stone blade, a smartphone, even language—evolves into an “implant,” grafting itself onto our psyche, reshaping our thoughts, feelings, and very identity.
Drawing from poets like Walt Whitman and thinkers like Silvan Tomkins, he paints a picture of human evolution not as biological drift, but as a deliberate, often unconscious, fusion with our creations.
To make this digestible, think of it like upgrading your playlist: at first, a new song is just background noise, but replay it enough, and it becomes the soundtrack to your life, coloring your moods and memories.
David’s paper invites us to hit pause and reflect: Are we curating our inner world, or is it curating us? Let’s dive deep, weaving in philosophical undercurrents, psychological insights, and a few cultural nods—like Bob Dylan’s “The Times They Are A-Changin’” for the relentless march of adaptation, or the film Inception for how implanted ideas can warp reality itself.
The Core Melody: From Extensions to Implants – A Psychological Symphony
At the heart of David’s thesis is the shift from “extension” to “implant.” Extensions are optional add-ons, like a prosthetic arm or a pair of glasses—helpful, but detachable. Implants, however, are affective grafts: they burrow into our emotional and cognitive fabric, becoming inseparable from who we are.
Psychologically, this echoes habit formation theories from William James, who called habits the “enormous flywheel of society,” keeping us spinning without effort. But David goes deeper, invoking affect theory (inspired by Tomkins) where emotions aren’t just reactions but scripts—pre-written emotional responses that dictate our behavior.
Relatably, consider driving a car, as David does: novice drivers grip the wheel white-knuckled, every turn a deliberation. But seasoned ones? The car becomes an extension of the body, anticipating curves like a dancer in sync with a partner. This is neuroplasticity in action—our brains rewiring to incorporate the tool, blurring the line between self and object.
Philosophically, it’s Heideggerian: in Being and Time, Heidegger describes tools as “ready-to-hand,” invisible in use until they break. David flips this: when tools implant, they don’t just vanish into utility; they redefine our Dasein, our being-in-the-world. We don’t use the car; we are the driver, values and all—perhaps prioritizing speed over serenity, mirroring society’s rush.
Whitman’s poem snippet here is a lyrical gem: “There was a child went forth every day, / And the first object he look’d upon, that object he became.” It’s like the Beatles’ “I Am the Walrus”—absurd yet profound, suggesting identity as a collage of absorbed influences.
Psychologically, this ties to attachment theory: just as early caregiver interactions implant scripts for trust or fear (Bowlby’s “internal working models”), tools implant worldviews.
Your smartphone isn’t just a device; it’s a portal that scripts outrage via doom-scrolling or connection via memes, subtly shifting your empathy thresholds.
Historical Harmony: How We Got Here – Building Buffer Zones Against the Void
David traces this back to our ancestors, encircled by the unknown, forging tools as buffers—stones, stories, rituals—to extend agency. This “buffer zone,” borrowed from Bois, is our human-made cocoon, insulating us from existential dread. Philosophically, it’s Nietzschean: we create meaning to stare down the abyss, but risk becoming slaves to our creations (the “last man” who forgets the storm outside).
The bowerbird analogy is brilliant—males decorate bowers like external peacocking, accelerating evolution. Humans do the same with culture: our languages and tech are flashy extensions, but we forget they’re props, confusing them with innate traits.
Psychologically, this is “extension transference,” where the tool usurps the original function. Language, meant to express thoughts, becomes the medium of thought itself—Alfred Korzybski’s “the map is not the territory” writ large.
Relate it to the movie The Truman Show: Truman’s world is a scripted bubble, comfortable until he notices the seams. We live in our buffer zones, scrolling feeds that curate reality, forgetting the unfiltered unknown. Musically, it’s Pink Floyd’s The Wall—building emotional barriers that isolate, with lyrics like “All in all, it’s just another brick in the wall” echoing how implants stack up, entombing authenticity.
Evolutionarily, David notes our “virtualizing of humanness” as compensation for outpacing nature, hinting at transhumanism. But psychologically, this disembodiment risks alienation: in a world of AI avatars, we might echo Camus’ absurd man, estranged from our bodies, chasing meaning in pixels.
The AI Interlude: When Scripts Go Digital – A Cautionary Chorus
Enter AI: no longer a tool, but a predictive force scripting our lives. David warns not of Skynet-style takeover (à la Terminator), but subtle rewriting—our nervous systems adapting to AI’s suggestions, over-relying like moths to a flame. Philosophically, this is Foucauldian biopower: AI embeds programmers’ values, disciplining our behaviors invisibly.
Psychologically, it’s cognitive offloading gone awry—relying on GPS erodes spatial memory (as studies show), just as AI suggestions might atrophy critical thinking.
The “bots-with-affect” image is chilling: humans as emotional automatons, scripted by AI or ideology.
Think Black Mirror‘s “White Bear” episode, where punishment loops reprogram identity. Or Radiohead’s “Paranoid Android,” with its disjointed pleas (“What’s that?”) mirroring fragmented selves in an AI-orchestrated world.
David calls for an “over-self”—a meta-awareness, like mindfulness in psychology (Kabat-Zinn’s practices)—to reflect before absorption. It’s not anti-tech; it’s pro-attention, echoing Socrates’ “unexamined life is not worth living.”
Why the moth-flame metaphor? Affectively, AI excites—endless novelty dopamine hits—but risks burnout, like Icarus flying too close to the sun. Philosophically, it’s a Hegelian dialectic: thesis (human agency), antithesis (AI integration), synthesis (conscious co-evolution).
Why It Resonates: The Philosophical Crescendo and Call to Awareness
Ultimately, David shifts the question from tech's morality to its shaping influence: "How is it shaping me? And can I still notice that?"
This is existential authenticity—Kierkegaard’s leap of faith into self-examination amid despair. Psychologically, it’s combating cognitive dissonance: we rationalize implants (“I need my phone!”) to avoid discomfort. But noticing? That’s liberation, like Neo in The Matrix choosing the red pill, awakening from simulated scripts.
Relatably, our “most powerful tools are the ones we stop noticing”—your morning coffee ritual isn’t just habit; it’s an implant scripting your day.
David’s no Luddite; he uses AI for meaning-making, suggesting “interested attention” as the antidote. It’s like jazz improvisation: tools provide structure, but awareness lets you riff freely.
In glossary terms, “semantic transaction” is the dance of feeling and symbol—child reaching for comfort, implanting emotional maps. “Learning (Boulton)” redefines learning as total extension: not rote facts, but becoming.
Analogy: Like Tracy Chapman’s “Fast Car,” where escape vehicles (literal and metaphorical) reshape dreams, but unnoticed, trap you in cycles.
Final Reflections: Harmonizing the Human Implant
David’s paper is a wake-up anthem in a sleepwalking era, urging us to audit our implants for a more intentional self.
Philosophically, it challenges Cartesian dualism—mind-body separation—by showing how extensions dissolve boundaries, birthing hybrid beings. Psychologically, it warns of script rigidity (trauma echoes) while celebrating plasticity’s potential for growth.
If this resonates like a haunting melody, let it be your cue: Next time you autopilot through your phone, pause—like in Simon & Garfunkel’s “The Sound of Silence,” where “people talking without speaking” mirrors unexamined scripts. Or channel Eternal Sunshine of the Spotless Mind, selectively erasing implants that no longer serve.
We become what we use, yes—but with awareness, we can choose the tune.
David’s work isn’t a dirge; it’s an invitation to compose our symphony anew.