Tell Me What I Want, Tell Me Who I Am
Fantasies of recognition in an AI world
By Cara Maniaci
I went to bed angry last night. I was talking to my husband about our 6-year-old son changing, losing his babyness. I was mourning that. I observed how, as he’s finding his way to boyhood, Ben has been more boisterous with me, less snuggly. More trying to gross me out with fart jokes than seeking my affection. And my husband—probably tired and not feeling sentimental—talked over me, contradicting me before I had a chance to finish my thought. I petulantly responded that I’d tell him what I thought if he wanted to know, and he said nothing. Then, predictably, looking for soothing, looking for some kind of connection, I reached for my phone and scrolled.
For at least the last five centuries, we have deliberated over whether technology will save or destroy us. The printing press, the transistor radio, the automobile—each promised to bring us together, to carry some burden for us. The discussions surrounding the eruption of AI into everyday life range from panic to celebration about how it will transform us. Is there anything new about these concerns?
Maybe what we want from technology is not all that different from what we want from each other.
The Computer’s Gaze: JenniCam
In 1999, I attended a lecture at the University of Chicago while I was in graduate school there. Conceptual artist and writer Victor Burgin presented a talk, later published in the journal Critical Inquiry, about one of the first livestreams of the pre–social media era: the JenniCam. Burgin argued that the JenniCam was actually not very new at all; in fact, it spoke to an ongoing wish to be known, to feel satisfied and comforted, to feel recognized.
JenniCam was a website created by a young woman named Jenni Ringley in her last year of college, on the eve of her 21st birthday. It was low-tech by today’s standards: a webcam attached to her computer took still images at 3-minute intervals all day and uploaded them to her website. At the time, much was said about Ringley’s exhibitionist tendencies. Burgin took a different tack, noting the importance of the developmental moment at which this innovation took place. Ringley, at 21, was just embarking on adulthood and leaving the comfortable bubble of undergraduate life. She desired to be known, recognized, as she was about to become a little fish in the big wide ocean. Among the audience at the 1999 lecture was Homi Bhabha, known for his contributions to postcolonial critical theory, who offered this thought during the Q&A: Perhaps for Ringley the eyeball of the webcam was a transitional object, like a toddler’s security blanket, replacing the containing gaze of parents so she could become her own woman out in the world, where there are other pleasures to have and discover. Bhabha was referring to Donald Winnicott’s notion that in human development, transitional objects smooth the friction of our movement from the seamlessness of intrauterine life to the (hopefully) good-enough mother to weaning to being a subject in the world among other unpredictable, frustrating subjects.
What happens if we remain in the world of the transitional object? What happens to our sense of self when we are relating to an object that mirrors back simply what we want to see?
Talking to Ourselves: The Candy House
In The Candy House, a novel published in 2022 by Jennifer Egan, another device, this one fictional, speaks to the fantasy that technology will provide holding and recognition. One of the characters, a Mark Zuckerberg–like tech founder named Bix Bouton, creates a technology that can upload the entire library of a person’s memory—conscious and unconscious, affect and thought—into a cube. One can then plug something like a VR set into the cube to “watch” scenes from one’s life, or project them onto a screen for group viewing. Characters use it to return to childhood memories they thought were happy but realize are more complex. Throughout the novel, Egan traces the interlocking stories of many characters who are seeking connection or knowledge—some embracing the technology, others rejecting it. But sometimes the device knows too much, as when Roxy discovers her father’s affair while using it to access his memories of a trip they took together to London.
Some characters attempt to evade the technology using AI-designed “proxies” that have access to every online utterance a person has made. Mothers, for example, deploy proxies to fool their children into thinking they are communicating with them directly. “A proxy’s job isn’t deception so much as it is delay,” not unlike the way a blanket or pacifier allows the infant to hold out until the breast, ready to feed, appears. “Proxies succeed because people want to believe,” Egan writes.
Egan’s proxies are not unlike the AI voice agent employed by real-life trickster Evan Ratliff who, in his October 2024 op-ed in The New York Times, describes the bewilderment and alienation that ensued when he unleashed it on his family and friends. The real satisfaction derived from transitional objects is but a hallucination of relief, and one that we can crash down from hard. Ratliff writes, “That sense of loneliness—the base reality that, fundamentally, you are only talking to yourself—may be the most lasting result of all these A.I. conversations.”
No Surprises: Mrs. Davis
Another fantasy of AI is depicted in the show Mrs. Davis, which aired for one season on Peacock in the spring of 2023. This one is truly the embodiment of a perfectly attuned mother—anticipating needs, providing before the slippage into hunger or boredom—and in this story it has transformed the world. The AI is named Mrs. Davis in the United States, but in Italy they call her Madonna, and in the UK she’s Mum. Mrs. Davis is seemingly benevolent: There is no longer war, humans have a sense of purpose, there is no loneliness. But as in the dystopian utopias of adolescent science fiction—Camazotz in A Wrinkle in Time by Madeleine L’Engle, the society of sameness in Lois Lowry’s The Giver, George Orwell’s world in 1984—everything is eerily too perfect. In a world where one’s desires are anticipated and known, there’s no mystery. Nothing to discover, no way to be discovered. No surprises or being surprising.
In Mrs. Davis, we follow the protagonist, Simone, a nun on a mission to destroy the AI because it killed her father, a second-rate magician performing in Vegas. Her quest conceals another, hidden purpose, one that reveals cosmic and divine magic. Along the way, a more mundane sort of magic occurs when one of the guides Simone encounters hands her a sandwich—honey, butter, and beef bologna on white bread—whose ingredients are unexpectedly delicious in combination. She didn’t know she wanted it until she had it.
This is the trouble with the technological transitional object: It provides a comfort, a familiar state of being. It restores us to ourselves when the world is too much. And yet AI is all too familiar with us. Its very purpose is to know us, based on what is on the internet, much of which we have put out there ourselves. The algorithms learn the specific brand of indie folk music we enjoy, then calculate how old we are, our income, and whom we follow on social media to determine whether we might, for instance, want to purchase a new pair of shoes. There is a sense of comfort in feeling gotten by it—when a familiar song you didn’t save comes next on the playlist, or your social media feed shows you a clip from a cartoon you liked as a kid. But it is an uncomfortable comfort—too on the nose. It misses something incalculable: the meaning that comes out of the friction between humans, and between humans and the world.
Growing Up Digital
When he was three or four, if he was hungry and didn’t know what he felt like eating, my son would ask-command me, “Tell me what I want.” As mother, my job is knowing. But as anyone who has tried to feed a young child knows, they do not simply eat what is in front of them. There is a little dance one does until one gets it right. He thinks I know, and I’m figuring out the formula of his likes and dislikes, but there is also the mutual delight of having him try something new that he finds delicious.
To know ourselves means knowing others as selves. Through Hegel, Lacan, and the baby-watchers like Beatrice Beebe and Daniel Stern, psychoanalyst Jessica Benjamin has formulated her own concept of transitional space, Thirdness, which takes us from “tell me what I want” to the existence of two desiring subjects, each with their own experience of self. It begins with the parent knowing the meaning of the child’s cry. It leads to the child knowing the parent not simply as provider, but as a self with desire and mystery all her own. Recognizing this, the child learns to know her own subjectivity. We go beyond the transactional nature of “tell me what I want” to “I know what you like.” You like your coffee light and sweet. You like to be wrapped up to your neck in a towel after bath. You don’t like to feel sandy. You feel sad on your birthday. You need space when you look like that.
But when we have everything we think we want, and everything is known, we lose a sense of our potential for magic and mystery. No one can surprise us. We know how the magician engineered the trick. It’s a world of Harry Harlow’s wire monkey-mothers—providing just the milk we need. The rhesus monkeys in those famous studies of parent-child attachment chose the soft cloth mother, even though it did not feed them, because something else happens in the softness: bodies merge. Is that my skin or yours? What do we make together? As Adam Phillips wrote in Giving Up, “anyone who can satisfy us, anyone who can make us feel better is going to be the same person who frustrates us and makes us feel worse.”
Artificial intelligence may fill in the gaps, soften the edges of frustrated desires, but what happens to the self when we relate to objects that are designed to be frictionless? What happens if we remain in the world of the transitional object, relating to something that mirrors back simply what we want to see? Not only does the world come to feel one-dimensional, as it did in Mrs. Davis; we also lose something of ourselves as subjects in this smoothing away of the pleasure of surprise and the discomfort of the unknown. What happens when we seek answers from technology that, as Yuval Noah Harari describes in his 2024 book Nexus, is designed not as a tool so much as an independent agent, acting on behalf of corporations or governments? What happens to the freedom and mastery of our own minds when we relinquish our intelligence—our power to create, think, remember—to artificial intelligence?
If you were to look online for the JenniCam website, you’d have no luck: It is long gone. Like a well-used lovey, it served its purpose: The unconsciously planned obsolescence of these transitional objects ensures that they do not last beyond their role in the developmental arc of a human life. Ringley abandoned the project in 2003 and has since disappeared from the public eye, avoiding social media and giving only sparse interviews in the years since she closed down the site.
Unlike Millennials, Gen Z, and now Gen Alpha, who have been born into a “Look at me!” online culture, Ringley and I started out offline. Born in 1976, Ringley is about my age, creeping toward 50, no longer the camgirl playing hide-and-seek with her audience. Perhaps, like me, Jenni is also making lunches, stealing teeth from under pillows, being passive-aggressive with her partner … playing other kinds of hide-and-seek. Perhaps she no longer feels the need to be known so transparently. Perhaps she can relish the mysteries of her own desires. Perhaps she is growing up.
Cara Maniaci, MA, LCSW, is a candidate in the Advanced Program in Psychoanalysis at the Westchester Center for the Study of Psychoanalysis and Psychotherapy. She has written for ArtNews, FlashArt, and NY Arts. She has a private psychotherapy practice in Pleasantville, New York.
Published January 2025