Digitality and its consequences

with Robin Boast

Download this episode in mp3 (41.31 MB) or all episodes in a zip folder (1.38 GB).

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Robin Boast is the Professor of Information Science and Culture at the Department of Media Studies, University of Amsterdam, the Netherlands. Previously, he was the Deputy Director and Curator for World Archaeology at the Museum of Archaeology and Anthropology at the University of Cambridge. He has published widely in the field of information and the culture of the digital.

In particular, relevant to this podcast, he is the author of "The Machine in the Ghost: Digitality and its Consequences" (Reaktion Books, 2017). Everything has its consequences, including digitality. But what are they? Are they positive or negative? And what is digitality in the first place? I asked Robin during one of the best conversations I've had making this podcast and also outside the podcast: I feel lucky that my path crossed that of this fine scholar and delightful man. And I have to thank Dr. Frederick Baker for this, a guest on this podcast (episode #12)! I hope you enjoy this episode!

We live in a digital age, within a digital economy, continuously engaged with digital media. Digital encoding lies at the heart of our contemporary mobile-obsessed, information-heavy, media-saturated world, but it is usually regarded – if it is thought of at all – as something inaccessible, virtual or ephemeral, hidden deep within the workings of our computers, tablets and smartphones. It is surprising that, despite the profusion of books on the history of computers and computing, little has been written about what makes them possible.
Robin's book "The Machine in the Ghost: Digitality and its Consequences" navigates the history of digitality, from the earliest use of digital encoding in a French telegraph invented in 1874, to the first electronic computers; the earliest uses within graphics and information systems in the 1950s; our interactions with computers through punch cards and programming languages; and the rise of digital media in the 1970s. [Paragraph adapted from:]


This episode was re-published on EuroScientist in November 2019 (see episode on EuroScientist). On that occasion, I visited Robin in Amsterdam again and interviewed him for another half hour! A list of highlights is available in the description of the video on YouTube and below on this page.

At minute 3:45: Files and folders were an information technology invented at the beginning of the 20th century.

At minute 4:03: Melvil Dewey's "The Library Bureau."

At minute 4:51: Steve Jobs and IBM are mentioned.

At minute 5:16: The digital workspace with files and folders has turned us all into secretaries.

At minute 8:00: A physical folder and a folder on your computer are the "same" because they were "made to be the same," but behind all these layers, the medium is fundamentally different.

At minute 8:32: It "took work" to make the physical folder and the digital folder "seem very similar."

At minute 10:13: Ted Nelson is mentioned.

At minute 10:46: Doug Engelbart is mentioned.

At minute 11:25: The flexibility of digitality was intentionally "locked down", so that it would be "narrowly mediated."

At minute 13:47: It takes a lot of work to make a digital text look like paper. Current designs "keep us from using text as though it was digital." About Adobe PDF.

At minute 14:04: It takes a lot of effort for the computer not to be digital, to be just like paper.

At minute 19:38: The current design of digital devices is the consequence of an economic situation; it has nothing to do with digitality and its potential.

At minute 19:58: You don't get a lot of choice because you get what they decide is valuable for them to sell.

At minute 20:25: The lack of choice we have comes out of an economic system; it has nothing to do with digitality.

At minute 21:19: MIT and the first computer graphics (and video game) are mentioned.

At minute 23:31: Xerox PARC and Alan Kay are mentioned.

At minute 23:53: Smalltalk (programming language) is mentioned.

At minute 25:00: Logoscript: an implementation of Logo (programming language) in JavaScript.

At minute 25:28: Steve "Woz" Wozniak and the Lisa computer are mentioned.

At minute 26:02: IBM and Microsoft, "a small startup from Seattle," are mentioned.

At minute 26:13: QDOS (Quick and Dirty Operating System)

At minute 27:17: A lot of what we wind up with now is a consequence of those "technological commitments" that were made in the 1970s and early 1980s.

At minute 28:00: With mobile devices, with smartphones, computing is shifting... we're moving away from a "typewriter with a screen," which is the model that has been dominating since the late 1940s.

At minute 29:45: Why do the new business models demand that we forget [previous ways of doing things, possibilities that were not considered valuable for the business]?

At minute 30:38: Should we build our own machines? #breakTheMould

At minute 31:54: The digital revolution hasn't even begun and will not begin until we all program (paraphrasing Alan Kay).

Go to interactive wordcloud (you can choose the number of words and see how many times they occur).

Episode transcript

A list of topics with timed links is available in the description of this episode on YouTube.

Download full transcript in PDF (136.82 kB).

Host: Federica Bressan [Federica]
Guest: Robin Boast [Robin]

[Federica]: Welcome to a new episode of Technoculture. I'm your host, Federica Bressan, and today my guest is Robin Boast, Professor of Cultural Information Science at the Department of Media Studies at the University of Amsterdam in the Netherlands. Previously, he was Deputy Director and Curator for World Archaeology at the Museum of Archaeology and Anthropology at the University of Cambridge. He is the author of many publications, among which the book The Machine in the Ghost: Digitality and its Consequences. Welcome, Robin.

[Robin]: Thank you for having me.

[Federica]: Your last book, The Machine in the Ghost, published in 2017, attracted my attention especially for the subtitle: Digitality and its Consequences. We often hear we live in a digital age or digital technology has revolutionized our lives, and certainly I agree that there is a kernel of truth in these statements, but what Technoculture wants to ask is: How exactly have our lives been revolutionized? How does the digital impact our daily lives, our social interactions? Like how are we different than the people that came before us? So, in other words, what are the consequences of digital technology? Now, is digital technology the same thing as digitality in your book?

[Robin]: It's kind of hard to say. I mean, the problem is, like any word, I mean, they mean a lot of different things, so one thing the book is trying to do is trying to situate, you know, the idea of digitality within a broader historical context and trying to move it away from its, what I see as a partial myth, not a complete myth, but a partial myth of digitality being about computation, you know, that digitality is computation, and I wanted to move it away from that and show that what we usually think of as digitality (which I think of in terms of encoding, you know, binary encoding) is something much older and has always had to do with media, which it did. Well, you know, according to my story; if you buy my story, that's the story. So I wanted to situate encoding very much within the context of media, of media production, of message making, you know, of communication and creating multiple, you know, forms of media performativity, and how the idea of encoding itself allowed that much more than computation. Computation came from a slightly different place (although it, you know, as you know, being a computer scientist, it kind of underlies the basics of what goes on there), but very quickly that becomes a process of writing or drawing about things, which is, to me, media. And so I push, you know, encoding right back into the 19th century, which is where binary encoding arose, as a kind of, in association with the telegraph. You know, and in fact, you know, ASCII, the basic capital Latin letters of ASCII, were exactly the same encoding as the capital Latin letters of the telegraph that was used almost universally from, you know, the turn of the 20th century. Actually, the encoding itself comes from the 1870s, but it was modified somewhat in 1901 to 1902, and those letters became how we did telegraph. All telegraph was binary, was digital in that sense, from the First World War on.

[Federica]: When you say 'digital encoding', do I understand correctly that we are already one step ahead of just binary arithmetic, it's already an application, so we're doing something with it?

[Robin]: Yes, yes, exactly. The Baudot code (where we get the word 'baud' rate) comes from Émile Baudot, a telegraph engineer in France who, on his own, [back 00:04:26] created a telegraph in the early 1870s. It was more or less finished by 1874 and was in use by the French by 1876, you know, on a line from Paris to Rome. Actually, there were a lot of different telegraphs, but I won't go into that. There isn't just Morse. Morse wasn't binary, by the way, of course; it's one of the great myths. It's a five-phase encoding, so, you know, it can't be binary, but Baudot's coding was actually binary. I mean, it was a five-bit encoding, you know, that was punched onto tape as a five-bit encoding and, in fact, continued to be a five-bit encoding. By the turn of the century, someone put a typewriter on it and changed the encoding slightly, but it was literally a five-bit encoding, a binary encoding, in the sense that not only did the holes matter, but the non-holes mattered. Right? So it was literally a binary encoding of each of the letters, and being five-bit, it could accommodate 32 code points, doubled with shift codes. And there was a base version, and there was even a Cyrillic version. And what people don't realize is that, you know, when you see things like Good Morning, Vietnam, right, with the teletypes, that's using Baudot's code. Those were using the five-bit encoding. And all telegraph, practically, except over radio (you know, long-distance radio, which did continue to use Morse), all other telegraph, from 1918 or 1920, used this binary code (it's called the TTY code). It was officially enshrined as an international telegraph code in the 1920s, and everybody used it. I mean, all telegraphs traveled by this five-bit encoding, and in fact that's why, when they started doing typing on the computers, they used that, because it was already there, had been around for 80 years.
You know, it already existed; the technologies for encoding and reading already existed. The computing side got attached mostly by the cyberneticists, and although that's kind of true, it's not completely true. So, for instance, the word 'digital' itself came out in a special memo in 1942, I think it was (and I forgot the guy's name, which is terrible), by an engineer who went on to work for the Moore School, which was very important to early electronic computing in the U.S. He wanted to distinguish the kind of computing that was beginning to emerge during the war. It was a classified thing. And he wanted to distinguish analog from what they were doing. Right? And so he called it 'digital' because the first models like ENIAC and EDVAC and so forth, the models for calculation (and these were literally just calculators), were based on base ten; they were not binary. There was one binary computer under development in the U.S., the Atanasoff–Berry computer, which never really worked. The others were literally counting on their fingers. I mean, they had decimal rings. It was decimal computing. They had decimal rings that would count from zero to nine and then carry; literally, they had decimal rings. It wasn't binary at all. It isn't how the chips work today with the, you know, half adder and full adder. It doesn't work like that at all. It was literally counting on your fingers, and that's where the word 'digital' itself comes from, digitality, right? Counting on your fingers. Of course, you can count binary on your fingers as well. I do it with my son sometimes. He's very keen that I can count on my fingers in binary.
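Robin's description of a five-bit tape code, where "not only did the holes matter, but the non-holes mattered," can be sketched in a few lines of Python. With five bits there are 2**5 = 32 code points, and Baudot-style systems roughly doubled that with letter/figure shift codes. The code table below is invented purely for illustration; it is not the historical Baudot/ITA2 assignment.

```python
# Illustrative sketch of a five-bit binary character code, in the spirit of the
# Baudot telegraph encoding described above: '1' marks a hole in the tape and
# '0' marks the absence of one (the "non-holes" matter just as much).
# NOTE: the code assignments here are invented for clarity; they are NOT the
# historical Baudot/ITA2 table.

TOY_TABLE = {letter: index
             for index, letter in enumerate("ABCDEFGHIJKLMNOPQRSTUVWXYZ")}

def encode(text):
    """Return each character as a 5-bit string ('1' = hole, '0' = no hole)."""
    return [format(TOY_TABLE[ch], "05b") for ch in text.upper()]

def decode(bits):
    """Invert the table to recover the text from the punched pattern."""
    reverse = {format(v, "05b"): k for k, v in TOY_TABLE.items()}
    return "".join(reverse[b] for b in bits)

tape = encode("HELLO")
print(tape)          # ['00111', '00100', '01011', '01011', '01110']
print(decode(tape))  # HELLO
```

Because both bit values carry meaning, decoding only works by agreed convention between sender and receiver, which is exactly the point Robin makes later about digital media being "narrowly mediated" by convention.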

[Federica]: So you have clarified what you mean by digitality, digital encoding, and you have talked about some history of digitality and its technological applications. But your book is not a history of digitality. It says Digitality and its Consequences. Am I correct that the consequences are not technological, the consequences are social and cultural?

[Robin]: Absolutely, and I think that's also part of it, because part of the story also was that the first computers in Europe were binary. Colossus, for instance (which, you know, remained a massive secret until really the 1980s, I mean, you know, it was classified until then) was a binary computer, and it actually worked in binary. It used Baudot encoding because it was built to decode German messages, so they literally encoded the messages, the letters, in TTY, which is Baudot coding, and used binary logic like we do today. But immediately after the war, you had those two different models kind of circulating around, and the kind of structures we have today emerging through, you know, Wiener and others trying to find the actual architecture that we have today. Those were kind of emerging in this soup of computation, which is true, it was mostly computation, but in Europe there was much more of an emphasis on processing information as binary information, and the efficiencies of that came over to the U.S. It took some time, but then slowly... But very quickly, people started realizing that as soon as you're dealing with binary, because it could encode things (which we'd known for 80 years), you could very quickly start using these machines not just for calculations, but for literally presenting the results. The Williams tubes, which were little screens that were pervasive after the war because they were used for radar... Of course, you know, one of the great inputs to the growth of industry after the war was war surplus. There was so much stuff around. I mean, I saw an article once of a U.S. government ad trying to get farmers to buy flamethrowers just for weed control or something. It was crazy; they were trying to get rid of all this stuff.
You know, so when people were using Williams tubes as storage, you know, as RAM, basically, because of the delay of the phosphorus on the screen, they could be used as temporary storage, written and read. Then, very quickly, it was Whirlwind, actually, which was an MIT computer, where they started using the Williams tubes to monitor the tubes, you know, the logic tubes which they had instead of transistors, because they didn't have transistors then. And they realized very quickly that they could actually use the screen interactively. And it's 1948 and '49. Very, very early. And so they thought, 'Oh, this is interesting. You know, we can do stuff like this.' And one of their engineers went away for a week and invented a light pen. They initially called it a light cannon because it was so big, but then they narrowed it down to the light pen. And they were literally working with data on the screen by 1949, you know, and literally creating graphics. There's a famous demonstration with Murrow, the famous American reporter who reported on the Blitz, where they had him in and they were showing, and he was sitting there, and it was writing text on the screen (this is 1952). It was drawing graphics, you know, and so forth, and imaging came very quickly. And the point I make is, and the fact is, as soon as you realize that this is actually a way of encoding media, that the two work together and it isn't just about computation, then that's when the computer really started to expand. And very quickly in the '50s, you have people writing music on computers. You had them doing psychological experimentation. You had them doing art. You had them making games. All of these things started just emerging from the universities and collaboration very, very quickly, because it was clear it was a media machine.

[Federica]: What I can't wrap my head around is: How do you go from observing the evolution of technology to the changes in society, which are much more subtle? What came [unclear 00:13:24] new concepts, ways of living, behaviors, ideas, and were these already present in society and actually produced the technology and then maybe reverberated back, creating a loop? How do you trace a history of technological evolution and make inferences on how society evolves?

[Robin]: Well, I think, first of all, that is a very, very complex problem, but at the same time, if we think of it as, if we think of these things as computational devices, we wind up not attending to their role as media devices. And if you start attending to them as media devices, I believe, then you start noticing the kind of traditions that come out of media that actually are having much greater influence. I think the other thing is, I don't think we should think of technology as evolving. It doesn't evolve. It goes and things happen or they don't happen, and it's all kind of chance. There's no plan. And so at any point as an historian, I can see there's a whole number of possibilities that things could have gone down that route, but they didn't; they went down this one. And sometimes it's not a good route. Sometimes it's not the best of the lot. Now, to play the game of, you know, 'what if' in history is a very dangerous game, because you can't actually know. It's very easy to know what did happen. Well, it's kind of easy to know what did happen. But it's impossible to know what could have happened. You just don't know. It's too complex, but you can see at any moment that if there was a plan, they would have gone, they would have done something else. But I think, especially in terms of looking at contemporary media... Well, contemporary, let's think in terms of social media, for instance. Well, what we're looking at there is something fairly familiar to media theorists and media historians, because it's media. You know, if you think about what Google and Facebook and these do, they provide a service that captures our attention. A media service. It's a media service. I mean, it's much more individualized. It's something that we interact with. It's something that we provide content to, not the only ones, but it's structured for us to engage and continue to engage, and that's pretty much what media always does, from newspapers to film to television to radio. 
It's drawing you in. Where before it was simply broadcast to you, now it's much more interactive, okay, but it's still... I mean, they're still media companies at the end of the day, because what is their business model? To engage you and try and add a level of data to try and understand your desires so that people can sell you stuff.

[Federica]: Are you saying that Facebook and social media are not qualitatively new, are not very different than what we've had before?

[Robin]: I'm saying, it's a pretty standard media business model. It's, you provide us, you provide an engaging service to someone so that someone else will pay you to put their adverts in front of that person to engage them, you know, to try and get them to buy things. That's pretty much the television and radio model. I mean, it's, you know... Okay, the kind of thing that's going on, we don't want to be naive either and say, 'Oh, it's just the same thing.' It's not just the same thing. Digitality actually (and we can get into this a little later) is actually a very different kind of medium, and that's one of the consequences I drew out in the book or started to draw out. It is a very different kind of medium, and that does have very significant consequences. But if you think... One of the interesting things that I think about contemporary digital media is that we've moved from a position in the '70s or certainly the '80s and early '90s or mid-'90s, say, from a place where digitality, I think, was beginning to open up opportunities because it's digital to a situation that is pretty much locked down and corporate media as it was before. Okay, we have a bit more interaction, but not a lot. I mean, we have no control over how, or very little control, how we can use or what we can do on Facebook or Twitter. We're pretty much locked into their model because they want us locked in because that's how they generate their data to generate advertising revenue. That's the goal. That's the goal.

[Federica]: We're sitting here doing a podcast interview. This is actually allowed by digital technology because the equipment today is portable, it's light, it's affordable, it's affordable to set up a podcast. Actually, it is like a radio program, but I, as an individual, can produce it and put it out there for the world to listen. This is a consequence of this type of technology and how affordable it is today. So is the podcast a... No, a podcast is the technology, so are you and I sitting here the social consequence?

[Robin]: Yes, absolutely, because, of course, this is one of the differences that digitality has brought. It has lowered the bar massively to actually create media, but of course this also makes the point, which is that it is a media machine. But what is done is not without cost. You have to realize there's an enormous infrastructure there, an expensive infrastructure, that allows us to do this nonetheless, you know, and an infrastructure that wasn't really there before. It kind of was, because we're still kind of piggybacking on telephones, but not really, but to a degree that's kind of what we're doing, and you can say, 'Okay, this is just radio in the end,' but of course it's encoded. It's not just radio, it's not like analog radio, you know, and okay, Wi-Fi is just radio, but it's digital. It's a digital signal, but it's radio. So there's a lot of overlap, and those technologies have consequences. The fact that they are older technologies does have a consequence for what this medium is. But I go back to my basic thing, which is, what makes digitality different is encoding, because, unlike all other media (yeah, all other media, when you get down to it, even television), if you think of most media (whether it's printing or television or film or radio or photography), what you have there is a very material link between the medium (which is the carrier, whatever that is, whether that's radio waves or a tape or a film surface, the surface of a photograph, or a paper that's printed) and the message, in a sense. This is, of course, you know, McLuhan's point. There's a physical link there, and you can modulate it and do things to it, but pretty much that's it. You're stuck with that mediation, to a degree.
You can modulate it and do things with it, but you're stuck with that mediation and, in a sense, we're still kind of stuck with that to a degree, and we don't want to forget that level of infrastructure either. You know, when you're sending digital signal over radio, there's still problems of noise and so forth that Shannon helped sort out and so forth, but, you know — and doing it as binary rather than analog, you actually have a better sound because it's easier to error-check against noise. This is Shannon's contribution, you know, Claude Shannon's contribution. You can compress it. You can do things like that, but this is also the point about digitality that is different. There is no absolute link between the medium, the binary encoding, and what you perform as a medium, you know, as a media performance. There's none, except convention. Right? I mean, I think we maybe discussed this once before at a time, but, you know, the point is, if I take, you know, if I, as I do all the time, if I take an image from my phone, you know, something, whatever I want, okay, the fact I then send that to you by WhatsApp and so forth, and you open it and you see an image, okay. One, it's not the same image. You know, it's a different image. It's been processed differently depending on the settings on your phone, the make of your phone and so forth. It's going to look different. The differences may be inconsequential from our normal operation, but it's different. The bits, how that image is constructed on your screen, how the color balance, all that will be different. So the interesting thing is, it's different from each other, where in other forms of media it tends to be more or less the same. Now, of course, there's theory about how even those are different, but... You know, each print of a film is different somehow, and that's true also, but this is fundamentally different. 
The other thing is, of course, that, you know, that the point is, the only reason that I take a picture and I see an image on this (I send it via WhatsApp, and you see an image) is convention, is because we have a series of agreements in those construction of those apps and those technologies, you know, those transport technologies and so forth, that it maintains its identity as an image through that process. There's nothing saying that we have to do that.
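Robin's remark above that a binary signal is "easier to error-check against noise" can be illustrated with one of the simplest error-detecting codes, a single even-parity bit. This is a toy sketch of the general idea only, not Shannon's actual coding scheme.

```python
# A toy illustration of why binary signals are easier to error-check against
# noise: append one even-parity bit so the receiver can detect that a single
# bit was flipped in transit. This sketches the general idea of error
# detection; it is not Shannon's actual scheme.

def add_parity(bits):
    """Append an even-parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def check_parity(bits):
    """Return True if the word passes the parity check (no error detected)."""
    return sum(bits) % 2 == 0

word = [1, 0, 1, 1, 0]
sent = add_parity(word)       # [1, 0, 1, 1, 0, 1]
print(check_parity(sent))     # True: clean channel, parity holds

noisy = sent.copy()
noisy[2] ^= 1                 # noise flips a single bit in transit
print(check_parity(noisy))    # False: the receiver detects the corruption
```

An analog signal has no such discrete structure to check against, which is why digitally encoded audio can survive a noisy channel that would audibly degrade an analog broadcast.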

[Federica]: Speaking of sharing, we can send pictures through WhatsApp, like you just said, and we can communicate with our friends and extended circles almost any time, so there is some truth in what we hear sometimes, that we are this hyperconnected society. I have been wondering if this hyperconnectivity might have impacted the way in which we think. I mean really the structure of our thoughts, which is something so huge to ask, I know, but I noticed such a change in myself, comparing how I am now with how I used to be before we were connected like all the time, because I'm old enough not just to remember that, but to have had quite some experiences then, like traveling, for example. One major trip that I did was in 2001, through Canada and the United States. Internet was there, but I did not have a smartphone. I didn't even have a digital camera, so I had to stop in internet cafes, and communication was anything but constant. There was not much of it with home, let alone with friends. And now when I travel, I always have the smartphone with me, and don't get me wrong, I find it brilliant. Like it's amazing, this ability to share the small things every day with our dear ones. I find it brilliant, but sometimes I think that some moments of how I felt far away, alone, then, I cannot reproduce today even if I leave the smartphone at home. It's not a matter of having the device with me or not. It's more a matter of knowing that now we have that possibility, and not being able to forget about it if I just leave the device home. And that bothers me somehow. Like I'm so glad I have lived that experience then, because it's been phenomenal and I cannot reproduce it now.
So sometimes, when we think of today and then sometimes we feel like we're talking about the Greeks like, 'Oh, I wonder how it was to be on earth then,' but we're talking about ourselves, and I found myself unable to reproduce, in a way, the feeling of certain experiences that I have lived before the internet and before the smartphone. We've heard this, like it's impossible — it's not just me — it's impossible to think how it was before. Why? How? How has this impacted the way we think? I always feel my friends close to me, you know, in a way. I have this ongoing conversation, so wherever I go, this changes the way I look at the world. This is what I want to say, although it's very confusing, and I'd like you to say something about this social change, which is not just technological...

[Robin]: It's also technological. It's also technological. Sorry, [unclear 00:26:21] but, you know, it's also technological in the sense that, because you have the possibili... I mean, sharing is something humans do anyway. We share, we communicate. We're communicating animals. Almost all animals communicate in one way or another. They even have language, to a degree. You know, so it's not unique to us. I read a theory the other day that the only thing unique to humans is cooking. The only thing we do that no other animal does is cook. You know, so that's interesting. But, you know, the thing is that, of course it has social consequences, because it opens up whole new social possibilities, or at least levels of possibilities. You can't say that necessarily... I mean, in a way, you know, you could — and I think this is dangerous, so I'm going to say something that I'm then going to contradict — but in a way you could say that the way we communicate now is pretty much the way we communicated a hundred years ago. We write letters (i.e. texts or whatever), you know, we write things to someone else. They write back. We send images to each other, because, you know, when there was photography you did, or at least you sent drawings and things. You know, okay. But we communicate visually. And every technology has, yeah, its own consequences on, you know, on how you see the world, of course, and that's why I'm contradicting myself. It's not that simple. But at the same time, you know, it's not like we're doing anything radically different. We are doing it at a scale that's radically different. It is a hyper-mediated world, and it is something that is immediate, you know, but those immediacies are also relative. I mean, think of the immediacies of what happened once the telegraph came in at the end of the 19th century, especially the early 20th century, when it started becoming a little bit more affordable, you know, for someone...
If one of the members of your family went off to Australia, you know, at the end of the 19 — or in the mid-19th century, you know, for them to send you a letter took two months, and then for you to read it, and if you sent it back the next day, it would be another two months before they got it. So the message and return message took a third of a year before you could communicate with each other. Still, people wrote a lot of letters. They wrote a huge number of letters, actually, but it took that long. I mean, you know, as much as you could afford, because it was expensive, also. Then the telegraph came in, and you could send a telegraph and receive it within a day or two. Okay, that was an incredible collapse of the time of communication, and the same kind of talk existed about, you know, its effects on society. This is partially one of Kittler's main points, you know, and, you know, there was the same kind of talk around how it was destroying our ability to communicate, how it was destroying the art of letter-writing, how it was positive as well, collapsing communication, [made 00:29:25] bringing people closer together as well, you know, and the same conversation happened when the telephone became more ubiquitous, and there were the same problems also then of teenagers spending too much time on the phone. You know, there were national debates, and debates in government, about how much time teenagers were spending talking to each other on the phone rather than going out and doing things, just like we say now of kids spending time on computers, playing games, you know, or texting each other. The same kind of conversations happen. That doesn't necessarily mean, you know, that both aren't detrimental per se, but you've had this ongoing collapsing of connectivity and this ongoing, you know, almost exponential growth in mediation.
I mean, mediation — literally producing levels of media — and, to a degree, a democratization of that. Of course, we also see the dangers of that. Right? I mean, a democratization of information also leads to a lot of misinformation, or information that is at least difficult to assess — it doesn't mean it's necessarily misinformation, but it's difficult to assess its validity. You know, and some people play on that, of course. Of course, newspapers have played on that for 300 years, so it's not a new thing either. But one thing that has changed a lot (which you put your finger on also) is what we call convergence. Before — like when you traveled — you had to have a camera. Maybe you had a recorder; you might not have a dictaphone or something, but some people did, so they could record their voice. You had to have a pencil and paper. You had to stop in places and engage with a different kind of technology in order to get those things sent. We used to have to stop at post offices, or, in the days when I traveled in India, even into the '90s, if you wanted to call home, there were little shops: you went and gave your number, you went into a little booth with a phone, they'd dial it up, and when you got an answer you were able to talk, and you paid for the time. It was the only way you could call home unless you were in a very expensive hotel. So that's how you communicated.

[Federica]: Yeah, it feels like we're talking about a different world, but it was just a couple of years ago. The trip I mentioned to Canada and the U.S. was in September 2001. So on the 11th, I was in Los Angeles, and in order to let my dad know that I was alive, I had to find a public phone and queue, and the lines were so busy that it wasn't obvious that I would get to speak to my dad. So communicating was not just difficult, it was not even guaranteed.

[Robin]: Yeah, communication was possible, but it was much more difficult. And of course, it wouldn't have developed this way... Not that it's planned — the point is, it isn't planned — but the opportunities that grew were the opportunities that enabled corporations to do things, because that's the model we work in at the moment. You have to remember, you know, that there were very good connected internets in many countries in the 1980s. Think of Minitel — there was also a Canadian one that was very good — where, at least in France, the government literally gave anyone who walked into the post office with a bill proving their address a terminal for free. You could book tickets on it. You could get the news. There were the pink rooms, which were a kind of chat sex room. It had all those. It had apps; people were allowed to develop apps for it. They had all that kind of stuff in the 1980s, and it was government-run. It was run by the national post office, which was part of government. So you had those situations existing, but we moved to a different model, mostly for ideological reasons: liberalisation, corporate private ownership, smaller government, less government control. So the system that's developed is one of interconnectivity which benefits their business models, and so we do have massive connectivity, because it makes them money.

[Federica]: So far you have stressed what is not new, what we still share with our past, what has not changed — but then what is new? What are these consequences of digitality?

[Robin]: Well, that's interesting. I think what's new is how we use the media. The fact that there is convergence does have social consequences — the consequence that we are being asked to engage in very specific ways. We don't have a lot of freedom. Alan Kay makes this point even today (you know, luckily, knock on wood, the great man is still alive): until we're all coders, none of us will be free of this medium. Only coders have freedom in this medium, because they can literally make the things. And this is my point earlier, which I think is really the fundamental liberating aspect of digital media: there is no connection between the encoding and what it encodes. You can do anything with the encoding. And, of course, this is the power that the corporations use it for. It's not just that your words are data, it's not just that the images you take are data — they're data because they're encoded, because they can literally take that and turn it into data very, very easily. So an image is no longer necessarily a picture of something — which is not a representation anyway; I'm very anti-representational theory, in any media — but it's no longer necessarily a picture of something. It's a set of encodings that you can process in any way you want: whatever your problem is, however you want to deal with it, whatever algorithm you want to bring to it, whatever model of knowledge you want to bring to it about how images make meaning. You can pull all those 32-bit pixels apart, pulling the bits out of the pixels that you want, deciding what matters and what doesn't, whether you want to deal with that as a direct reference to color or something else.
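Robin's phrase "pulling the bits out of the pixels" is quite literal. A minimal sketch (illustrative only, not from the book): a 32-bit RGBA pixel is four 8-bit channels packed into one integer, and any of them can be extracted with a mask and a shift.

```python
# A 32-bit pixel is just four 8-bit fields packed into one integer.
# "Pulling the bits out" is masking and shifting.

def unpack_rgba(pixel: int):
    """Split a 32-bit pixel (0xRRGGBBAA) into its four channels."""
    r = (pixel >> 24) & 0xFF
    g = (pixel >> 16) & 0xFF
    b = (pixel >> 8) & 0xFF
    a = pixel & 0xFF
    return r, g, b, a

def pack_rgba(r: int, g: int, b: int, a: int) -> int:
    """Reassemble the four channels into one 32-bit pixel."""
    return (r << 24) | (g << 16) | (b << 8) | a

pixel = 0xFF8040C0  # an arbitrary example pixel
r, g, b, a = unpack_rgba(pixel)
print(r, g, b, a)                   # 255 128 64 192
print(hex(pack_rgba(r, g, b, a)))   # 0xff8040c0
```

Once the channels are loose integers like this, any model of "what matters" can be applied to them — which is the arbitrariness of the encoding that Robin is pointing at.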
There's a good theorist in the U.S., Lev Manovich, who plays with this all the time, and he has a wonderful graphic that he did, which is just a series of lines of different colors — each line a different set of colors, lined up by proportion. What he did was take the covers of Time Magazine over the last 80 years, strip out their color proportions, and make each one into a line, so you can see the last 80 years of how Time Magazine used color. Okay. Is it meaningful? Probably. Is it very meaningful? I'm not sure, but you can do anything like that, and that's something that is not impossible in other media — you could do it with photographs, but imagine the work. It is effectively impossible, because it would just take far, far too much work. And also, the medium itself doesn't lend itself to it, because it's not encoded in that sense; it's literally just looking at what the colors are. Or, for instance, you can say of an image: 'Okay, what happens when we turn this image into sound, and then take that as data, to see what we understand about images when they're processed and analyzed as sound?' You can do that. It's easy. In digitality, it's super easy.

[Federica]: Sometimes we speak about society as one thing, and even in history society goes through these transformations almost like a lady whose personality changes, but in fact society is made of generations of people that are born and die and have their limited experience in time. So what about digital natives, this generation that was labeled in a way that seems to stress a fracture with what came just before? Are they a thing for you? Is it a category that you consider?

[Robin]: Well, I think you can't not consider it. I mean, it's like asking whether the people from the 1920s on, especially in the middle of the 20th century, were different from their parents or their grandparents, who didn't grow up going to see films three or four times a week, with the kind of sociality that that created. Of course this makes an impact. But what digital are they born into? It's not the digital I was born into. I've been working on digital technologies... Okay, not since I was born. When I was born, there were probably only a couple of thousand computers in the world, but I have been working on computers and digital technologies since 1973. It's a long time, and the world I grew up in in the '70s and the '80s is very different from today. The digitality I was working with was much more artisanal than it is now. I think they're less of a digital generation now. If I think of my son, who's expert on the laptop or on the iPad, or with the phone as well — he doesn't engage with digitality. He engages with the media. He's engaging with the apps and the media performances, and in that sense that's different, because it's gestural, it's interactive. There's interaction with other people online, so it's not the same as engaging with a film. Definitely not. It's a completely different kind of engagement, although sometimes he's being film-like, when he's watching YouTubes and things like that. So there are differences, but are they the digital generation? Are they actually doing anything with digital technology? Or are they just very good media people, very good at dealing with media?

[Federica]: So you don't really buy this label 'digital natives', when we see six-month-old babies holding smartphones or iPads and playing with something, and we look at them and kind of think that they must be different creatures from us. You don't really buy that.

[Robin]: No, I don't buy it. It's not that it isn't having a different social impact. We can also talk another time about how, as you say, society isn't one thing. There's also a multitude of cultures around the world that engage with digitality differently. They have different ideas of what technology is for and why they use it and so forth. So there's also an enormous diversity there that we could tap into — but that would take us months and months and months. But no, I mean, it's no different from the photos I saw as a child of the two-year-old in front of the telly changing channels, or the two-year-old changing records on the record player — like I was, when I was a three-year-old, changing records on the record player — which my grandparents didn't do. They didn't have that experience. They had it later, but not earlier. And of course that has different social consequences, but all of it has social consequences, and all of it is about having skills over a shared technology. Having skills over a shared technology, whatever it is, does two things. Not only does it mean that you're changed by that engagement with technology, but of course you become part of a club. There's also socialization there. To be able to do that means that you're joining other people who do that, and that's quite old.

[Federica]: I think I lost you again because, again, you are stressing what has not changed. You say with each generation there has been new technology (the record player, then the TV, and now the iPad), but each of these changes has had consequences. So, actually, can we name these consequences?

[Robin]: You can name some of them. Yeah, you can name some of them. The problem is, the consequences are very, very broad, so we try and pull out certain things. But if we looked at something like the phonograph: what it did was change society because, to a degree, it stopped people singing and playing music. Before, especially from the 19th century, one of the great growths in the middle classes and even the working classes was the ability to play music and sing, which happened in the homes all the time. It was almost pervasive. The phonograph took some time, but in the end — there were debates about it, and people warned about this happening — it did happen, to a degree. Not completely, but to a degree. The number of people learning to play instruments, the number of people singing, the amount of singing and playing music in the home, dropped off dramatically. So socialization changed. The way you socialized around music changed radically. I mean, really radically. It wasn't that nobody played music anymore. It wasn't that nobody sang or played music at home. Of course they did. Otherwise, we wouldn't have any music. Right? But the degree, the pervasiveness of it, changed completely. People stopped being able to read music, because they just didn't play instruments anymore — or at least the vast majority of people. So what happened in the homes then was that people listened to other people's music, because before it was difficult to listen to other people's music: maybe you heard of someone, or you saw someone in a film perhaps, or, you know, you might have heard them...
Well, in the early days of radio, probably not, but you'd have seen them in a film or something, or you might have heard about them, or they became popular — maybe you never heard the performer at all, but other people got the music. There was a huge industry of printing music: you'd buy the music, you'd go home and play it and sing it, and you'd hear your neighbors doing it, and songs would become popular, and you'd share music like we do today, but through your performance, not someone else's. You might never hear the people performing, depending on where you lived, so that performativity was very, very grassroots. It was really diverse, and it created a kind of community and sharing that hadn't existed before — although music had been around; that kind of playing of music is as old as we are. But when the phonograph came in and became affordable, then immediately people were able to hear those artists, so they stopped playing — not completely, of course, but they stopped playing. And then you had a different sociability. You had a sociability around other people's music, not yours.

[Federica]: So whether these consequences stay on the surface or go deep — whether they act at the social level or the individual level, as deep as the way we think — this is fundamentally unfathomable.

[Robin]: Ultimately, yeah, because it's very complex, and also, like you say, you could say that for every person it's different. We share things at one level, but at other levels we don't — and this is, of course, one of the great problems of social media. One of the great problems at the moment is that these corporations, even though they're still banking on it quite a bit, are slowly recognizing that online advertising doesn't work very well. It actually works probably slightly less well than traditional broadcast media, and that's because you can't characterize someone, and also because targeting someone... Well, you do it; I do it. All those adverts that get targeted — do you read them?

[Federica]: Hmm. No.

[Robin]: The vast majority, no. And we know this. I was talking to an advertising executive I was flying with one time, and he said that the standard they work to in online advertising is 2% noticing that the advert's there at all — not actually looking at it and figuring out what it's saying and whether it's something interesting for them; that's an even smaller percentage. Just 2% even notice the ad is there. So that's not very good, is it? That's pretty bad. The point is, one, we're good at filtering things out, but also we all engage with media differently. We have different desires, we have different needs, and those change moment to moment. They don't stay steady. So this is a problem also: it's hard to characterize people. Again, the knowledge model that's used by media — early media and also the media of today — is largely that your knowledge, what you think, is something that sits somewhere inside the gray matter of your head, behind your eyes, encoded in there like a hard disk. Right? And so all you have to do is somehow look at how you're representing yourself — that's the representational model — and you can get the clues and build up what's in there. Well, there are different schools of thought, and there's a representational school of thought that believes that, but most contemporary neuroscience and philosophy of knowledge thinks this is just nonsense. We don't think like that. It's not what we do. Our knowledge is something we do. It's habitual. It's also something that is constantly changing. Our memories are something we construct on the fly.
That doesn't mean we don't remember, but it's not a picture that we hold in our head. We reconstruct memories, and over time they can change. Any policeman or investigator knows this inside out: it's their job to try and figure out what people actually saw, which is often different from what they thought they saw.

[Federica]: When we say 'the consequences of something,' for example, a diet too rich in sugar, we somehow mean the negative effect of that thing. In your book, do you distinguish between positive and negative consequences of digitality?

[Robin]: Yeah, to a degree, yes. I mean, consequences are always positive and negative. There are always goods and bads. Connectivity has brought a huge number of benefits to our lives, both in being able to stay connected with friends and in being able to socialize more broadly. It has also isolated people, which is a negative downside, because it depends on how you wind up using it, how it impacts you. But, you know, the fact that if I'm worried about my wife, I can just call her. I don't have to wait, not knowing... Okay, you could say I'm a little obsessive, but there you go. I'm a worrier, so I do worry, but I can just call her and say, 'Are you okay? Yeah, you're good. You're okay.' Good — because, being Italian, she said she was going to be home an hour and a half ago and she's still not home. [laughs] So I call: 'Where are you? Are you okay? Good. Okay, just wanted to know.' But I can do that, and even when she's traveling, it's easy. And I can record my life. I have a library in here — I mean, I have a huge book library as well, because I like books — but the library I have access to on my phone and online, synced up with everything, is enormous. It's bigger than my university library when I was a very early undergraduate. My university library wasn't that big; I had to go use another one. So these are advantages, and we have access to information, news and so forth really quickly, and we potentially have access to different opinions.
At the same time, because there's so much and it's so easy, and because there are algorithms in there constantly trying to determine what we should and should not know, or what we need and don't need to know, it's much harder to actually pull out of there what is significant and meaningful for us, what's valid for us, what's actually a valid point of view or not. And I think, you know, as university professors or university people with higher degrees, we've been trained to a degree to be critical, but most people don't have that criticality anymore. I think one of the greatest impacts that we're going to live through over the next thirty years is the social consequences of automation. I don't think we've even seen the beginning of it, and when you look at all those guys — mostly guys and a few girls — sitting there at Davos, their number one agenda item is automation, not only of corporations but of education. More and more, systems are coming in to regiment algorithmically how we work, and that's a prelude to automation, and so the automation of all these tasks which were seen as higher-knowledge tasks is going to have a very negative effect on society. I mean a really negative effect, because knowledge is not algorithmic. Algorithms are very powerful, and they do great things for us and have done. All technology to a degree is algorithmic, in some degree or another, so it's a powerful tool. But when you start automating — when a few people are deciding what the model for knowing is, and that becomes automated throughout society — you're in a very dangerous place.

[Federica]: So if I heard what you said correctly, we don't have good consequences and bad consequences. We just have consequences and each of them can be looked at from two different perspectives. For example, we have lots of news. This is not good or bad. This is good because we have lots of news, but this is bad because we are overwhelmed by the amount of news, so ultimately technology is neutral, like some say, and what is good or bad is just the use we make of it. Is this correct?

[Robin]: I think that's a little bit of a simplistic model. It's not that technology is neutral. It isn't. Every technology offers you a set of possibilities. In media studies we call them 'affordances'. 'Possibility' is another term; some people speak of 'agency', that these systems have their own agency. Latour has written a lot about this; John Law has written a lot about this. So there are these affordances, if I use the media studies term: a set of possibilities, things you can do, and a set of limitations, things you can't do. There are things it allows and things it doesn't allow. But those affordances can be used positively or negatively. That's a social choice — a choice by governments, but also by us: how we use it, what we use it for, what we do within those affordances, what we allow those affordances to do, where we allow control on those affordances and where we don't. This will have social consequences, because those are social actions. Those are social choices. The technology is not making those choices. The technology does not make a choice. Digital technology does not make a choice about what an algorithm does. The writer of the algorithm makes those choices. The algorithm plays out; even if it's a machine-learning algorithm, the designers of that algorithm are the ones who made those choices. Those are social choices — choices about what is and is not knowledge, what is right and what is wrong, what is acceptable and not acceptable, what is good for society and bad for society. And they're making those choices, and that's where we have to get these things disentangled. Yes, the technology has limitations and possibilities, and so forth.
Of course it does, and the affordances can drive toward negative effects. You could say that a gun, ultimately, is for shooting a projectile at high speed, and if that projectile hits something living, that thing is likely to end up dead. Okay, it doesn't have to be used that way, but that's kind of the reason for it. That's a strong agency of a gun, and so, if you have guns, chances are they're mostly going to wind up fulfilling that agency — again, in someone's hands, or because someone's put them in an automatic system or whatever. But at the same time, when we create technologies, when we put them into settings and distribute them and put them to use in society, these are social choices. These are not technological choices. These are social choices that people are making, somebody somewhere. And that's where the good and bad comes from, because they are good or bad social choices.

[Federica]: Concerning the use of technology for social control, ultimately, you've just said that you don't see good things happening in the future. Are you a pessimist, or do you actually think there is hope — that we have the possibility to make social choices that somehow save us from this looming scenario?

[Robin]: I'm a little bit of both. I have a great faith in people, ultimately. In the short term, not always. People can do really stupid things. I've done really stupid things; I'm no different from anyone else in that way. We can do really stupid things. But ultimately, I think people as a whole wind up trying to bring society back, because we're social beings, we like society to work. We like people to be happy together, mostly.
There are dysfunctional people as well, but generally we're happy with that. We prefer that to conflict, and so I think ultimately people try and bring it back to some sense. I'm probably pessimistic about the ability of our leaders — be they in the corporate world, in the governmental world, or even in the governance of universities — to do that for us. I'm very pessimistic about that. I don't think they have the competence to do it, and I think their desires do not align with the needs of people. So, in that sense, I think it's going to get worse before it gets better.

[Federica]: So people, stay awake. It's even us discussing this right now.

[Robin]: Exactly.

[Federica]: Can you explain the title of your book?

[Robin]: Yeah. The reference is very straightforward. I'm not sure I'm happy with the subtitle, Digitality and its Consequences — that was a kind of compromise between me and the marketing people at the publisher's. 'The machine in the ghost' is a direct reference to Gilbert Ryle's philosophy, to the ghost in the machine, which is still one of the best critiques of Enlightenment dualistic philosophy, idealism — the idea that there's the world and there's some ephemeral thing which is our minds, and the two never meet except through some veil they talk through, and representational theory is based on that. And that's why I chose it, because Gilbert Ryle very successfully critiqued and debunked the representational theory of knowledge, and he talks of the ghost in the machine. Right? Our body is the machine, the mind is the ghost, and this is just nonsense. So I wanted to keep the nonsense of it but turn it around, because we talk of digitality in the opposite way. We talk about digitality as being virtual: everything that goes on in the digital world is non-material, it's kind of ephemeral, it's conceptual. Which is nonsense. Okay? You're a computer scientist. You know this is nonsense. It's all a machine. You know...

[Federica]: I also work with intangible cultural heritage. Dances, traditional knowledge, cuisine.

[Robin]: Yes, intangible. I mean, it's more tangible than the tangible heritage, which we take out of context so that it winds up being dead. So intangible heritage is the more tangible heritage. Right? It's something people are actually doing. It's performance; it's in the world. And so I wanted, to some degree, to debunk that idea that there's this ephemeral thing, this ghost — I wanted to put the machine back in that ghost, so I said 'the machine in the ghost,' which is the reverse of Gilbert Ryle. He talked about the ghost in the machine; I want to put the machine back in the ghost of virtuality. These are machines. They're doing mechanical work (okay, electromechanical), and there's a process going on there. And the point is that it's invisible to us; the only ephemerality of it is that it's invisible to us, but it's not ephemeral. It's very physical. It's material, in the world, even if it's electrons flying down. It's all control, it's all switches. At the end, it's just a bunch of switches. Right? We know that. The transistors are just a bunch of switches going on or off in certain logical orders, but then the cleverness comes in how you build on top of that, how you write to screens, how you make sound. It's all incredibly clever. It's phenomenally clever, but it's just a machine.
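Robin's "just a bunch of switches going on or off in certain logical orders" can be made concrete with a small sketch (illustrative, not from the book): a NAND gate is essentially two switches in series, and everything else — here a half-adder that adds two bits — is built by wiring NANDs together.

```python
# Everything below is built from one primitive "switch arrangement": NAND.

def nand(a: int, b: int) -> int:
    """Two switches in series: output drops to 0 only when both are on."""
    return 0 if (a and b) else 1

def xor(a: int, b: int) -> int:
    """Exclusive-or built from four NAND gates."""
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def half_adder(a: int, b: int):
    """Add two bits: returns (sum, carry). AND is NAND followed by NOT."""
    carry = nand(nand(a, b), nand(a, b))
    return xor(a, b), carry

# The full truth table of one-bit addition, out of nothing but switches.
for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "->", half_adder(a, b))
```

Chaining half-adders gives multi-bit addition, and the "cleverness built on top" that Robin describes is exactly this kind of layering, repeated billions of times.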

[Federica]: You really ruin the party for those who like to come up with these labels. Did you know that besides intangible there's also e-tangible heritage?

[Robin]: [laughs] Yes, they love these things, don't they? Yeah, because it fits that knowledge model, right, which Gilbert Ryle debunked ages ago. When did his book come out? I'm trying to remember now — it would have been in the '60s or '70s; I forget these things all the time. But it fits the representational model. So if you accept the representational model — that there's some mind in there, that knowledge is this ephemeral thing that exists outside of materiality — then e-tangible heritage makes sense. But if you don't believe in that, it just sounds like nonsense, because if knowledge is something we perform, something we do — which doesn't mean it's easy, doesn't mean it's not conceptual; you can talk of concepts as something we do — then it becomes a very different game altogether.

[Federica]: I have one last question for you that broadens the scope of our conversation in a way. Tell me if you're comfortable answering this. When we speak of technology and digitization of society and all of this, we normally think of ourselves, we lead the transformation, etc., but the world is a big place and the world is a diverse place. So not necessarily in your book, but in general, do you consider how the consequences of digitality might not be there yet for some people, or rather might be impacting them differently because of the conditions in which they live, because of who they are?

[Robin]: I do deal with that — not in the book, but in other spaces where I work, to a degree. One of the things we have to realize is the degree to which, even in other countries, even in other places, even in poorer societies, digitality has pervaded those spaces, especially with cheap Chinese phones and so forth. They often work in peer-to-peer spaces, Bluetooth spaces — they don't have access to the Internet, so they're working in those. But it's incredible how much digitality has really pervaded even many of those spaces, which is a major change. That really is quite broad.

[Federica]: We go crazy when we see a picture of a Buddhist monk with a smartphone, don't we?

[Robin]: Well, I don't. Why? Why wouldn't he have a smartphone?

[Federica]: Well, because it seems out of context.

[Robin]: Yeah, but that's our perceptions of what they should be. That's one of the other areas of my work: working with Indigenous people and Indigenous knowledge systems, because of our expectations of who they should be, you know. Well, they're modern people like us. They're the same... Okay, they have different cultural traditions. You know, they come from, often, a very difficult history that no one's still willing to talk about in many cases, and those histories need to be told, and sometimes digital technology's helping them tell that, or they're using it as a medium to help themselves tell those stories or archive them as well and things like that, but, you know, at the same time, they're just modern people. I mean, we don't... My mother was Irish. I mean, we were colonized. We fought a colonial war. We were ethnologized. You know, my wife is from southern Italy. There were still people doing ethnographies in southern Italy in the 1970s. You know, okay, it's not the same thing as what Australian Aboriginals or the Native Americans or First Nations people of North and South America suffered. That's on a scale completely different, you know, and I'm not pretending that that's somehow equal, but, you know, we don't look at an Irishman or an Irish woman on a phone as though that's something so weird. Yet the jokes about the Irish that you had at the beginning of the 20th century would have played that same game. Irish were classified in the United States until the 1930s as Black.
So it's our continuing colonial expectations about these people, you know. The person in Papua New Guinea in the backcountry has a mobile phone because they can. They have absolutely no possibility of access to the Internet because they're too far away from anything, unless they could get satellite (and some of them do, by the way, have satellite), but they often use it to communicate by Bluetooth when they're crocodile hunting, and they use the lights also because they're really useful for dazzling the crocodiles. Why shouldn't they be using this modern technology?

[Federica]: An overview, even a history, of how digitality is impacting different cultures around the world, and how it is being received, would be very interesting. Because the trajectory of some technologies seems to go from the First World to the rest of the world, do you think, and this is a provocative question, that it can be seen as a form of white male colonization through technology, a technological colonization?

[Robin]: No, it's not that simple, although you cannot completely separate the fundamental aspect of new digital technologies as archive from the colonial program. The archive is a technology of colonialism and, of course, modern surveillance and modern... You know, even the business model of Google and Facebook and so forth is ultimately a colonial program. It is gathering information about people so you can control them, and it was a colonial invention of how to use the archive. And the archive was, okay, a paper archive, paper and, you know, images and drawings and maps and so forth, but the archive was a central technology of colonialism. Colonialism couldn't have happened without the archive. So in that sense, the technology itself is still culpable, very much culpable, in the colonial program. But at the same time, you know, that doesn't mean that it can't... Like the archive itself also was able to... You know, the archive itself, even in what we think of as its normal form, right (the building with papers and so forth), also acted as a point of liberation for many people. So yeah, I mean, it's this complicated thing, but no, it isn't so simple. You know, the Internet is not... Well, maybe the Internet, you know, but I mean, digital technology is not just white. It's not just male, though these days... For a period of time it was heavily dominated by the coders in that sense, but we also remember, you know, through the '50s and '60s and even into the '70s, there were a huge number of women involved in computing, a massive number of [unclear 01:09:10]. Not unproblematically. It's not that they were treated equally. There were some very brilliant women who made some incredible breakthroughs, important contributions at that time. Were they treated equally and fairly? No. No, they weren't, but, you know, still they had that... Because, you know, computing was mathematical, and women had more of a chance.
It soon became engineering, and, you know, of course, what woman in her right mind would go to an engineering department? [chuckles]

[Federica]: [chuckles]

[Robin]: So that changed. I'm not casting aspersions in your direction, but I think you understand what I mean. That's changing also now, but I don't think it's... I don't think we can make it that simplistic. But yes, does what we see going on in social media today have strong ideological and social connections to colonialism? Yes. Yes, it does, and it still does, and we still have not resolved that socially in what we're doing. Absolutely. Thank you.

[Federica]: Thank you so much for your time.

Thank you for listening to Technoculture! Check out more episodes at or visit our Facebook page at technoculturepodcast, and our Twitter account, hashtag technoculturepodcast.

Page created: November 2019
Last update: October 2020