Our first episode of Season 2 features Julie Carpenter, author of Culture and Human-Robot Interaction in Militarized Spaces: A War Story. Julie is a social scientist based in San Francisco, and her work explores how humans experience emerging technologies. Listen in as we delve into the relationship between humans and robots, exploring everything from love and intimacy to the bonds humans form with robots deployed in military settings.
Content warning and disclaimer: We talk about adult themes in this episode, so it may not be one to share with minors. We also produce this podcast for your education and enjoyment only. Please don’t take anything in this episode as advice specific to your situation.
Love the episode? We are so glad! Please help others discover our podcast by sharing this episode with friends, family, or colleagues. You can also help us game the algorithms by giving us a great review on Apple Podcasts.
Episode Credits
Guest: Julie Carpenter
Hosts: Zena Assaad and Liz Williams
Producers: Zena Assaad, Liz Williams, and Martin Franklin
Transcript
Liz: Hi everyone, I’m Liz Williams.
Zena: And I’m Zena Assaad.
And this is season two of the Algorithmic Futures Podcast.
Liz: Join us as we talk to technology creators, regulators and dreamers from around the world to learn how complex technologies may shape our environment and societies in the years to come.
…
Zena: Today, we’re joined by Julie Carpenter. Julie is a social scientist based in San Francisco, and her work explores how humans experience emerging technologies that include some form of artificial intelligence.
Liz: If you’ve been listening to our first season, you already know the term “artificial intelligence” – or AI for short — defies a single definition. AI, as Julie describes it, “is a communication and storytelling medium” – one that has the capacity to generate emotions within us. Her work focuses on technologies that have been designed to interact with humans responsively, and with some level of autonomy – like robots, or voice assistants.
Zena: And her interest? It’s in the humans in this relationship, and the stories we tell about the emotions these technologies evoke in us. She brings the patterns in our stories to light as a way of understanding our interactions with technology – and predicting how these patterns might impact us and society more broadly in the future.
Liz: Julie is the author of Culture and Human-Robot Interaction in Militarized Spaces: A War Story, which was released in 2016, and is currently working on her latest book, The Naked Android: Synthetic socialness and the human gaze. We are so pleased to have Julie help us kick off season 2 of the podcast.
Liz: Hi, Julie. Thank you so much for joining us today. We’re really, really excited to have you on the Algorithmic Futures Podcast.
Julie: Thank you. Thank you for inviting me. I’m really excited to get to talk to you both.
Zena: Thank you for joining us, Julie. You and I have known each other for a few years now. I think the first time we met was when I sent you a cold email to ask you to come present for my students online. We were still in the height of the pandemic at the time. So I think it was 2020. Is that right?
Julie: That sounds about right. Yep, that makes sense. I remember that.
Zena: And then we’ve kind of collaborated a little bit since then.
Julie: Yeah. Well, obviously we have a lot of overlapping interests, and I think you found me through some of the military research that I had done in human-robot interaction. I had a real blast getting to talk to your students. That was a fun experience, even though it was virtual.
Zena: No, they were a really fantastic cohort. I think I first came across your research because I read your book about the work that you did in the military, and I believe it was specifically about the explosive ordnance disposal robots.
Julie: Yes, it was. So I talked to what are formally known as explosive ordnance disposal personnel in the US military. I spoke to 23, which is actually a fairly large number, because it’s a very small cohort of highly specialized, trained people. The acronym is EOD; you’ll hear me say EOD a lot. They are the people who dispose of unexploded ordnance, or gather exploded ordnance for forensic use. So that means anything from rendering safe nuclear and chemical weapons to dealing with IEDs, which became incredibly insidious.
Liz: IEDs are “improvised explosive devices.” These are basically homemade bombs, and can take on many different forms and range from very simple to very complex. They can also be planted pretty much anywhere.
Julie: So we know that there could be potential ordnance in air, land, sea. And if it’s land, it could be desert, jungle, all of these places. It could be in human spaces like marketplaces or libraries or homes. It could be out in open spaces. So, strategically, this is a really complicated job.
Liz: I’m really curious to find out what brought you to doing that work? What was the path that you started off on and how did you end up there? I know that’s actually not where you stayed, so can you tell us a little bit about what your journey in this space has been so far?
Julie: Yeah. It’s been a winding one. I think that’s really common for people who do human-centered research, or what’s typically called user research, or user experience, or social science research that specifically looks at how people are interacting with technology. We’re sort of this first wave when you think about emerging technologies, not that technologies didn’t exist before.
So now there are university programs for these entire paths, but my path and my colleagues that are in my cohort, we all had very winding paths because we couldn’t say, “I want to take a human robot interaction graduate program” at the time. So I actually started with film school, of all things. Not because I thought I was going to be a filmmaker, though I really enjoyed the editing process, but I was very drawn to people’s personal stories and what shapes them and then how that informs how they make decisions.
I get so many of my interests—I can directly just go right to my dad, who was a sociologist, who introduced me to science fiction. He himself had a love of technology and encouraged things like bringing home electronic circuit kits, would buy home stereo systems for himself because he just… Not because he was like a stereo freak, but because it was the latest emerging technology.
We had a VIC-20 computer. I tell this story because if you’re under a hundred years old and you didn’t have one of these home computers, before you could just open your laptop and have it turned on, back in my day, a computer came with a stack of books as tall as I was as a child. And that fit my weird little personality, I guess, to go through books like that and it fascinated me. I was lucky that I had access to these things.
I was privileged that way. I didn’t have a privileged childhood in many ways, but often because dad would spend our last penny on technology. So there’s a balance. But I have to really trace a lot of those interests back to that. And then like I said, I went into film. What are you going to do with a film degree when you live in Wisconsin in the United States?
That’s not exactly known as the film capital of the world, especially in the late ’90s. So my first job out of school, I won’t go through my whole CV, but this is sort of where I transitioned. My first job out of school was in public relations. It was at that time where people were developing their own corporate websites and they had engineers on staff, but they didn’t have anybody that could do anything front facing.
I was able to do that. I knew the basics of doing webpages. So all of a sudden I became the web admin literally for this institution’s significantly important websites. And that’s how a lot of people started in web work then. They fell into it. There was not a direct path.
Liz: Julie told us later that she started out in this field when the paths into the field were changing. She did have an opportunity to get a Master’s degree in human-computer interaction, or HCI – even though this program didn’t exist when she had finished her undergraduate degree just a few years before. In short, the field of HCI – and all the degrees associated with it – emerged quickly, as a result of a clear need for people who could do this kind of work.
…
Zena: So Julie, you’ve done a lot of work around the deeper connections between humans and robots. And in particular you explored elements of love and of intimacy. So when you were researching this space was there a common thread that you found that contributed to the formation of these kinds of relationships between humans and machines?
Julie: Boy, that’s a good question — because the umbrella of my work, as you just said, is an interest in people’s emotional attachment to many technologies, but especially robots. I’m interested in AI in particular, in many forms, but I certainly have a body of work on robots. So the common thing is the attachment, which means I’ve looked at what seem like a lot of odd topics if you looked at them as pillars: the work I did with the explosive ordnance disposal people and their robots in the military, versus research I’ve done with people who love robots in a romantic sense.
So what’s the common thing? At some point they develop what is a sense of… And this is always changing, because culture is always a changing thing. How we interact with robots every day, if we’re going to stick to robots, for example, is always changing. But there’s a point, and I call it the robot accommodation dilemma, where there’s a tension: you recognize that the robot is a robot. You understand that it has limitations and capabilities, though you may not be completely sure what those are. But also, in some sense, you regard it as a social other. And that doesn’t happen with every human-robot interaction. So, like you were saying, what makes attachment special? Well, that’s sort of like human-human attachment; I would love to have that magic recipe, because then I could put it in a bottle and we would all figure out who to love and that sort of thing.
But I go back to the framework: I would describe attachment as an affectional tie. Often you can compare the relationship in human-robot interaction to how people relate to pets, and there’s a reason that’s an especially good comparison. Now, I’m going to go way off into human-animal interaction here, but remember when I said we don’t regard all robots as a social other? Well, we don’t regard all animals as pets, either.
So if you eat meat, then some animals are those that you might consume. Other animals are those that you might domesticate. You have these categories in your mind of how you’re going to regard them in relation to yourself. Similarly, with humans and robots, you’re not always going to have robots that you regard as a social other, but for some you will. And what can trigger that is a really good question.
It’s some of the same things that would trigger your attachment to any other social being — human, animal, whatever: having a sense of safety or comfort or familiarity with this other. Sometimes proximity. If you feel that that thing is responsive in a positive way to you, if you feel that you’re working towards common goals.
We also know attachment isn’t something that’s immediate. Attachment, when I say that, is something that happens over time. So, whether we’re talking about robots or AI or whatever it is, it’s usually something that you interact with frequently. Those are some of the commonalities that can create a situation where someone can feel that a robot is a social other. And that social other might be like a mascot.
It might be regarded in a pet-like way, or it might be regarded like a friend, or even, like I said, in a romantic way. So there are many ways, and new, emerging categories that I couldn’t even define yet. Right?
Zena: Specifically thinking about the more intimate relationships, so maybe like the friendships or the romantic relationships, often with human-human relationships, I think what facilitates or builds them is that there’s a reciprocal transaction, I guess, between people. If it’s a friendship, you’re each giving each other time, you’re giving each other support. With a human-robot relationship, particularly a more intimate one, I would imagine that it would be quite a one-way relationship, and that there wouldn’t be a lot of opportunities for reciprocal transactions within that relationship.
So how is it that these more romantic and intimate bonds can still be formed in a relationship that from the outset kind of looks to be a one-way relationship?
Julie: So that’s a great question. So first of all, let’s put aside the fact that robots can be used as a communication medium themselves. So it could actually be someone speaking through a robot, but that’s not what you’re talking about. But I did want to address that. So of course we see AI getting more human-like in the way it can process language all the time. So it doesn’t really take much for us to treat it in a human-like or an animal-like way. That can be movement, that can be that reciprocity you were talking about.
In the case of something that has, I think what you’re trying to say, a fairly obvious limited intelligence, or limited feedback that would otherwise be there in a human-human relationship, what’s required really is for the person to project a lot of narrative onto the robot, which is a dynamic that some people are very uncomfortable with. But I would also, again, remind people: we project narratives onto our pets. Big time. Big time. And frankly, we project narratives onto our everyday relationships. We don’t necessarily see our spouses, our children, clearly. “My kid is the best”, right? Kind of a thing. So we’re always projecting narratives. And it’s not a spectrum; it’s just a new kind of relationship. When you’re talking about something romantic and you have an expectation of reciprocity, then certainly there is more of what you might call emotional and creative work that the person does. It very much is, and sometimes even starts out as, a creative outlet for people.
Zena: So then, what would be some of the safety or ethical considerations of those kinds of relationships, especially considering this narrative that people tend to put onto robots or machines?
Julie: Yeah. I think people have specific concerns. So one of the concerns people talk about often is the narrative that immediately jumps to mind, frankly, for many people when they think of this: a human man and a female-presenting type of human-like robot. So a cisgender, sort of straight male who’s often socially awkward – this is the image people have in their minds, right? – interacting with a female-like robot. That’s often the media narrative, or the popular narrative, that you get. People have this fear, and I’m bringing this up as an ethical concern, that it’s going to lead to further objectification of women. So that’s one concern people have. To that, I would say: that’s very interesting, and I’m not going to toss it aside as a concern. But going back, again, to looking at this as a communication medium of sorts: we saw similar panic about video games, violence in movies, the internet. None of these things cause people to do things. But if you’re looking for a bias, if you’re looking for something to confirm your beliefs and you search for it and set that scenario up, then it’s going to confirm your beliefs. Here’s the thing. First of all, that model of the straight awkward guy with the female robot is not always the typical model.
Robots can be incorporated sexually into all sorts of relationships that people can have — very consensual, fun, creative relationships. The other thing I would say is that if people are concerned that someone would treat a robot – a female robot in particular is what comes up – one way in the home, and then go out in public and try to treat real women like that, I would say real women are going to push back: “I’m not a robot. So if you try to talk to me like one, I’m going to correct you.” Again, people learn social categories, and because they understand those categories implicitly, that person sought out the robot as an outlet for creative sexuality, or for whatever reasons it’s fulfilling to them at that time. So those are some of the common ethical pitfalls that people talk about.
Another one I hear is about robots replacing people in relationships. And to that, my response would be similar: people have different social needs, and robots will have many different functions. Humans are excellent at forming social categories, because we learn them from watching, observing, and absorbing everything in the culture around us. Right?
Liz: So it seems like, from what you’re saying, we’ve got these social narratives that are shaping the areas we choose to focus on from an ethics perspective when we’re considering relationships like these. What you’re saying is that things are much more diverse, and that we already have different ways of thinking about people and pets and all of these other categories: ways of dealing with these things that we should keep in mind.
Julie: I would say that summarizes it. But I will also acknowledge the first wave of people who have created, for lack of a better term, sex bots, and there are not many out there. The term is really loose, because they’re not robots in the sense that I have not seen any that are independently intelligent, interacting with their environment in a way where they can sense and learn and react, like Zena was just saying.
So the term in itself is really loose. But I will acknowledge that that first wave, which is still out there working towards building sexualized robots, has very much been men developing robots that appeal to a lot of that sort of Westernized, playboy aesthetic: exaggerated, if you will, for lack of a better term, in the robot sense. But that’s the first market.
It’s complicated, but I’m hopeful that as long as the technology and the concepts and the ideas are out there that we’re going to get creative with sex and AI, and robots. I mean, there are so many interesting positive things that could come out of this technology, but there are ethical concerns. Yeah, I agree.
Liz: Are there ways you might recommend that we think about the ethics issues with regards to this in a way that takes into account some of that diversity in terms of our interactions, but also in terms of how we’re even thinking about creating these things?
Julie: I mean, first of all, like I said, for all of the media interest in the idea of sexualized robots, it’s really not a burgeoning industry yet. There really just have not been functioning robotics in that area. I guess I’m hesitating because it’s sort of hard to say right now; they’re talking dolls, essentially, if you can afford one. So that’s a very different experience than something that can move and interact with you.
So as the technology changes, as the supply chain for them changes and the prices come down, that changes the market. That’s capitalism. That’s how it works with everything. As they become more accepted, as they become more common, all of these things impact how culturally we look at them. So will it go from we’re looking at them… I mean, think of the Terminator narrative or the Frankenstein narrative that we very much had about robots 25 years ago in the West, which I think has changed a lot.
As I’ve watched it, there is fear of some emerging robot scenarios, like sexualized robots or people who could fall in love with robots, because it’s new. But as these things become more common, and as we can observe how people are interacting, instead of trying to predict damages, we’re learning how people are navigating it as it’s happening. I mean, that’s my view as a cultural scholar, really.
…
Zena: Julie got in touch with us later to note that there is one issue that she thinks is very important to mention here: the data produced through our intimate interactions with technology is a matter for concern. There are real questions about who should own (or even be able to access) this data – and these questions are really important to consider, both for those designing and distributing these technologies, and for those who are choosing to interact with them.
Now, back to our conversation with Julie.
…
Julie: I’m very much an observer. Again, that’s the ethnographer in me. I’m sure there are plenty of people in this area and in ethics who would have a much more active response. But I view my work as heavily one of observation and reporting: again, patterns in how I see people interacting with these technologies. So I’m sort of a conduit of information, if you will. I’m a medium myself, if that makes sense.
Zena: The only thing I want to add here is that you mentioned a lot about creativity and how people are creative, and so we’re going to be creative with the ways that we use technology. Then you also spoke about the pitfalls, I guess, of living in a capitalist society. And the example I always like to use is social media. Social media to me is still a relatively new thing. It came about when I was in high school, so for me, it’s still relatively new. But if you had told us back then that you could build a multimillion-dollar business through social media and be known as what’s called an influencer and earn money through that platform, I don’t think anyone would’ve believed that.
But I think because we are quite creative and there are usually a lot of opportunities to monetize different technologies and different capabilities, social media turned into basically a brand new advertising platform. Right? It turned into a new media market and it created a new industry.
Julie: And notice, too, that people are much more savvy about it now, I think, I hope, whether they’re generations that have grown up with it or just those of us who were introduced to it later, and they’ve further categorized what’s what. We have to be aware, or we should hopefully be aware, when someone’s an influencer. Really, in my mind, a lot of this comes back to trust. Right?
Trust affects how we interact with and through so many mediums. If you think about trust, again, I think I said this earlier: it’s you and this other working towards a common goal. Trust is fluid. It can be built and broken and repaired. Social media is such an interesting medium. Like you were saying, you can blow up and literally be recognized by millions of people around the world, and earn money just for being a version of yourself.
…
Liz: I’m wondering if we might go back to the military domain for our next question, where you were exploring the relationship between soldiers and robots. And again, you were talking about the space of explosive ordnance. You’ve got a book on this, your first book, Culture and Human-Robot Interaction in Militarized Spaces: A War Story, which we’ll link to in our show notes. Can you share with us what got you started working in the military domain?
Julie: Yeah. It was very synergistic. It really was. For some reason, the military always fascinated me culturally, again, as a cultural scholar. When you take a bunch of people, it’s a strange situation. It’s literally a strange… You take a bunch of people who had never met and you throw them together, and you sort of, not break them down mentally, but you start from square one and you basically say, “We’re going to teach you how to do everything all over again. You’re going to learn how to shine your shoes a certain way. You’re going to learn how to make your bed a new way. You’re going to learn a new way of talking, because you have to learn all the acronyms and language. You’re going to be risking your life. And so you have to learn to trust all of these strangers who come from all kinds of socioeconomic and diverse backgrounds.” And that to me has always just been such an amazing situation in and of itself, and one I’ve never experienced. I started reading. I read one or two stories, including, I think it was a Washington Post story, where they were talking about the first generation of explosive ordnance disposal robots. They were talking about a testing ground where a colonel had actually stopped the test because of the robot, which does not look human at all; it resembles a small tank, maybe the size of a backpack, this particular model.
So it’s tracked. It doesn’t look like a human. It doesn’t resemble an animal. But it had been harmed during this exercise and was sort of crawling across the field, and the colonel stopped the exercise because he couldn’t bear to watch this robot dragging itself across the field. So there were anecdotes like that coming through, and that happened to be explosive ordnance disposal work.
And then when I was getting ready to do my dissertation, I literally knew somebody, a former Marine, who knew somebody at an army base near me, who said, “You know who you need to talk to? Our EOD. Let me start introducing you to people.” And that was an ethnographer’s dream. Research starts in all kinds of ways. That was very organic. I don’t know quite how I got onto that path, that I actually made it happen, that I knew somebody who knew somebody, but I was like a laser on that path for a long time.
So to me, it was a combination of things. In your mind, if you haven’t yourself experienced it, and I have not been in the military, you imagine people who have to learn to emotionally compartmentalize to a certain degree in order to function in a job that can be physically and emotionally risky. And then to think of them interacting with these robots, and experiencing such a negative emotional reaction to seeing one harmed… all of these researchy questions started popping up in my head. Because I had also been reading, and this again was public information, but because I was interested in it, it was something I was focusing on, that the US military, like other militaries around the world, including Australia’s, was working towards what they were calling synergistic human-robot interactive systems. In other words, building a lot more robots that were going to be working in the field: drones, wearable robotics, and things like the robots that explosive ordnance disposal folks used. So all of these things together put me on that path.
Liz: What was it like to experience that and to engage in that kind of work in this particular space, particularly as an outsider coming in and trying to become one of the team, I guess?
Julie: I really appreciate that question. You are maybe one of two people in the last 10 or so years who have thought to ask that, because I was not a participant observer, as you would say. Right? I’m very much on the outside. Actually, I have a funny little story about that, because I was trying to do one-on-one, in-person talks with these… I’ll say soldiers, since that tends to be an army term… with the soldiers when I could. And one of them I happened to meet in an informal situation, at a cafe, and he asked if his wife could come along, because she had heard that he was meeting some woman at a cafe from the university and she had something in her mind that was, you know… So, it’s very unusual to have a wife or a spouse at an interview, and I said, “I’m totally cool with that. Is it fine if she sits at the next table, just so we have a sense of discussion? But she can listen in or whatever.” And that worked out. So that was sort of interesting. I really wasn’t an outsider to them, but what was also interesting was what came out of it: I had a very good talk with this person about their experiences. Afterwards, his wife came up to me and said that she had never heard some of the stories before, and her feelings were almost hurt. And I said to her, “It’s because he’s protecting you.”
It was easier for him to tell me emotional things, especially things that are negative and can be grisly in that line of work, and to talk to me where he had a sense of safety and semi-privacy, with no reason to have shared that with her before. So it is interesting, right? They’re bizarre, unusual scenarios. How did I get in? That introduction, like I said, that synergistic introduction of knowing somebody within the community: a former Marine, who knew someone who was active at a military base, who then in turn introduced me.
And this is very common in ethnographic work, especially when you’re outside the community. Again, that’s where trust comes into play. I had to earn, like you said, the trust of these different leaders. Some I got to meet in person, some over email. Another interesting experience is, when you’re a PhD student, waking up in the morning and getting a very scarily worded email from the US Navy’s legal corps saying, “Who are you? We heard that you’re asking soldiers something, et cetera, et cetera.”
So there’s that aspect, like you were saying. So there’s the earning trust. I was very grateful that once I did earn trust, it snowballed and participants would then recommend the experience to other participants. So that’s commonly called snowball recruitment. And so that means it was working and that they trusted me and had all in all a positive experience and felt safe with it. So that’s great.
But besides the interviews, I do want to acknowledge that a lot of work went into things like Freedom of Information Act requests and seeking to look at all of these kinds of artifacts, watching a lot of videos that soldiers take on the field of explosions and things like that, sort of lurking on specialized bulletin boards and social media groups. So it was much more than those interviews, and I actually really appreciate that question.
Liz: Oh, it’s fascinating to hear. I’m curious. I mean, that’s one of the things I have challenges around: you do interviews, you build a relationship with people, and then you think, “Well, I have to make a research output that I make public at some point.” There is a negotiation, at least internally, but also with those whose stories you’re sharing and analyzing, about what to make public, what to put out there. How did you manage that?
Julie: Well, of course, that was official university research, so it’s all informed consent and all of that sort of thing. You definitely need to develop a rapport. The work that they do, as with any military person, can put them in the way of harm. They shared some really painful stories, but they knew that anything they shared with me might be used in the book, and it very much was an area of two-way trust.
Anytime I speak with, you know (loosely, I’m going to use air quotes), “a participant”, it’s an honor, really, for them to share that with me. If somebody loves a robot, that’s not common, and that’s culturally frowned upon at the moment, right? So they’re making themselves really risky and emotionally vulnerable with me, and they’re saying, “I’m sharing this not just with you, but I’m trusting you, again, to be that medium, to share parts of it with the world.” Often I think people are eager to share their experiences. A lot of what I’m doing is asking questions that nobody has asked them.
Why love a robot? Why treat a robot socially in the middle of a wartime situation? Why – you know, all these why questions that nobody, outside of their own brain, has really stopped to make them think about.
Zena: When you were talking about the interview you did with the military person who brought his wife along, and how afterwards his wife was, I guess, upset or disappointed that he had said things to you that he hadn’t shared with her, it made me think about the relationships that we form with robots, and particularly the romantic and intimate ones. And I kind of focus on those because I’m the most curious about them. It made me wonder if maybe these robots are actually filling a space that other people can’t fill for you. I think that was the thing with this person that you interviewed, right? You were filling a space that his wife couldn’t fill for him.
So with your new book, which is titled The Naked Android: Synthetic Socialness and the Human Gaze, I’m wondering if this is something that you explore in this new book, kind of delving a little bit deeper into why we are forming these more intimate relationships with technology?
Julie: The Naked Android has always been about how we regard robots in relation to us and how that is changing and why. I have collected and am still in the process of collecting some amazing stories of a wide variety of people who interact with robots or develop robots or work in AI.
So for example, I interviewed Takeshi Koshiishi, who designed Honda’s ASIMO robot, one of the first wave of social robots. If you think of a small, 4-foot-11, white humanoid robot, you’re probably thinking of ASIMO. I’ve interviewed people for the book who love robots romantically; filmmakers who have made films about robots; and an actor who embodied a robot, who described how, from within a robot costume, she could communicate non-verbally through her body what the robot was trying to express. Right?
That’s interesting, because we watch how robots interact with us in that human-like way. Right? I talk to science fiction writers and influential people who work in AI. And I’m asking them a variety of questions, but mostly about what has influenced them, sort of like what you were doing with me with that question, when I was talking about my dad and this winding path. I’m getting all of these fascinating stories that are, of course, each very individual and deeply embedded in their own culture. But some of the common threads we find are, for example, fiction heavily influencing our expectations of what technology will do, and therefore influencing how people build and design, based on the things we have implicitly learned to expect from the stories people tell. There’s a circle here: culture inspires and pushes technologies and stories forward, and then the stories inspire and push technology forward.
That’s sort of where The Naked Android is at right now. I hope it’s read as interdisciplinary as well. I try to write in very plain language, because I want everybody to be able to ask these questions that you’re asking and try to figure it out. This is so much to get your mind around. I mean, this is a topic, again, that I’ve been focused on for the last 25 years. There are no concrete answers, but what I can do in the book is say, “Here are some examples of how people went on their journey of deciding how they’re going to live with robots in their world, and their different positions.” People can take away from that whole new ideas about how to interact with technology. I hope, I hope people will see different viewpoints.
Liz: That sounds fascinating. You might not want to answer this question because it sounds like you’re still doing a lot of work on the book, but do you have an idea about when you might want to put it out into the world?
Julie: Well, my publisher would’ve liked it out yesterday. Hopefully, hopefully in this next year would be lovely. It did have a pandemic interruption. It did, as many projects did. And then, like we were just saying, navigating this research: there’s a lot to that qualitative back end that people don’t see, and it’s very time consuming. But hopefully in this next year. I’m very excited about it, though. I’m excited not because I’m the one writing this book; I’m excited because, again, I have the honor of feeling like a museum curator of these stories. I get to preserve and also share.
Liz: Oh, I’m looking forward to reading it when it comes out. I guess the one last question I was going to ask is how can our listeners find you? How can they find the work that you’ve already put out there? Or how would you like to be found?
Julie: Yeah. So I’m not doing a ton of social media these days. But I have a website, jgcarpenter.com. You can even send me an email through the website if you’re curious or want to connect. But yeah, I try to keep that updated with my activities.
Liz: Wonderful. We’ll make sure that’s available in the show notes when we put them up on our website.
Julie: Thank you.
Liz: Thank you so much for taking the time to meet with us today. This has been a fascinating conversation and I’ve had so much fun getting to know a bit about your life and your work. Thank you, Zena, for inviting Julie along.
Zena: Thank you for agreeing. I always love chatting with you, so it was really nice to be able to transition this into one of our podcast episodes.
Julie: Aw, thank you, Zena. Like you said, it’s always lovely to chat with you, and it was great to finally meet you, Elizabeth. Thank you.
…
Liz: Thank you for joining us today on the Algorithmic Futures podcast. To learn more about the podcast and our guests you can visit our website algorithmicfutures.org.
Website featured image: Possessed Photography on Unsplash