This room is an imagining of everything you have ever touched. Everything you have ever bought or made. It is full of symbols, and pictures, and memories of every person you have ever loved. There, on the far left wall, is your social network; friends you made in high school and have now forgotten, colleagues from work who you talk to every day. There on a shelf is a printer, spitting out every paper or passport or rating or license or verification that has represented some part of you to some system at some point in time. Your bank card is stuck to the right wall, at the back. Your credit rating is framed near the door. On the screens are the media you consume and the events of the world that have happened around you and shaped the reality in which you lived out your life. This is a room of all your connections and effects. No matter their size, whether memorable, quantifiable, or traceable, they are real and they are part of the person that is you.
Earlier this week, I posted a question on the xtended substack: how would you feel if your loved ones trained a chatbot to talk like you, after your death? One reader reacted with unabated horror. She foresaw an invasion of privacy: the collation of messages, emails, maybe private thoughts and half-finished manuscripts from a lifetime’s worth of Google Docs.
Google Docs. That would be the place to start if you were training a chatbot on my words. I’m a Google Docs hoarder. The profusion of docs and sheets associated with my Google account is intimidating at those times when I think I really should neaten things up. But it’s also nostalgic. I find ideas in there that I had when I was twenty, and in reading them I am transported back to a minuscule student flat with a furry black and white bedspread and a time when it was genuinely exciting to be adult enough to buy your own forks. My Google account is the digital equivalent of the memory boxes that I stuff with ticket stubs and lanyards, postcards and polaroids. I’m not a journal-keeper; I go through phases of journaling, but it consistently feels like a chore, while externalising my experiences and memories in artifacts is an intuitive and lifelong practice.
If you were tasked with putting ‘me’ into a room, you really should include my box of ticket stubs. And my Google account. Taking just my body feels incomplete: a snapshot of flesh that is probably mostly concerned about whether the container she now finds herself in provides any access to coffee. And if there’s enough air.
You are more than your body. I think that’s true regardless of where you sit on discussions of consciousness and soul. You are also the cumulative effect of your presence and agency in the world: everything you have touched, bought, sold, created; everyone you have hugged, spoken with, fought against, birthed, mentored, befriended. I don’t know how intuitively accurate this seems to other people. If I were to put you into a room, what would I find in there?
So. Back to chatbots. The first I heard of a chatbot being trained to speak in the style of a specific person was in 2018, when I read about a man named Roman, a dear friend of software developer Eugenia Kuyda. When Roman died, suddenly and tragically, Eugenia fed into a neural network the decade of messages that Roman had sent her, plus others from friends who had also loved him. The program that emerged seemed, to Eugenia, to maintain Roman’s presence in her life, and she found herself keeping it updated with her news and daily thoughts, continuing the conversations she had had with her friend when he was alive.1
It is the conversational aspect of a chatbot that makes it different from a portrait or a photograph, a passive record of a person’s presence. A chatbot trained on the written records of a person’s presence is an independent agent in the world, without the driving motor of a living body. Is this a fearful emulation? Or an extension of selfhood? A sort of digital tendril, like a new shoot that splits from the mother stem and creeps across the soil to grow into something new?
Personally, I am not gripped by horror at the idea of my loved ones training a chatbot to speak or write like I would. Weirdly, when I imagine it, I remember reading Harry Potter and the descriptions of photographs that waved and smiled and reacted to the world as their subjects would have. I’d like to ask a kid their thoughts on what those photographs really are, and whether the picture of the person waving back at them is the same as the person who was photographed.
My own intuitive feelings aside, Ruth Sterling, the reader who does not like the idea of becoming a dataset for an AI program, raised an excellent point about privacy, and about whether she should add a clause to her will prohibiting such a use of her digital assets.
I spent some time googling “who owns my google docs after I die” and asked ChatGPT whether I could make a will that prevented my beneficiaries from using my data to make a chatbot. This felt ironic, as I have a suspicion that my data, including everything I’ve ever written in a Google Doc, is already up for grabs by the online platforms that train large language models, like ChatGPT.
Digital asset ownership, data privacy, the now very normal phrase ‘my data’... it confirms, at least for me, that the thing that is me is distributed: a lot of it in a chair at my desk writing this, some floating in documents I wrote ten years ago. My digital life, my digital self, seems real, albeit difficult to perceive.
And do you know what kind of technology is really great at displaying things that are real but otherwise difficult to perceive? Extended reality technology. Yep. That was a weird segue into the beginnings of part II of this essay, about why I think extended reality, or XR, tech is theoretically very exciting. Although I have yet to be excited about strapping a computer to my face.
I recently posted a segment of an essay that I wrote some years back, which also talked about chatbots and death. I keep returning to these topics because I find them fascinating and have not yet fully worked through my feelings about them. In day-to-day life I do actually talk about lots of other things and am really quite a cheerful person.
I agree that the body represents only a portion of a person. The prospect of a chatbot extending a person’s relationships beyond biological death does not strike me as uncomfortable. But the filters and distortions applied to their presentation would be different from those applied before death. And the question arises as to whether a person generates any new information, in biological or chatbot life, or is actually just an operating system that processes the inputs it encounters.
I agree. Perhaps what we think of as newly generated content from ourselves is no more than a product of the same process that goes on in a chat algorithm, albeit one more complex than the artificial algorithms that exist now. I don’t think this devalues what we are in our own estimation, however, nor does it necessarily invalidate any of the wonder there is in ‘persons’ and their features, even feelings and beliefs.