Maarten Van Dyck is a philosophy professor at Ghent University in Belgium, where he also directs the "Sarton" Centre for the History of Science. In October 2020, Maarten and I sat down at the brand new Ghent University Museum to talk about science communication in a time of crisis, like the COVID-19 pandemic (see link at the bottom of this page).
During that interview, Maarten touched on the role of trust in the advancement of science as we know it today, i.e. an "institutionalized social practice". In a time when skepticism and even conspiracy theories are spreading over the internet, Maarten challenged the notion that science does not need trust because it is all about facts, it is evidence-based, and therefore people should simply "believe in science". It turns out that the pillars that allow science to work are facts and proofs AND trust within the scientific community.
Maarten distinguishes two ways of thinking about trust and science: trust IN science, and trust WITHIN science. Trust within science is the trust that scientists have in each other, based on the assumption that 1. they act in good faith and 2. they are intelligent. This means that they rely on other scientists' results not just because they "believe" them, but because there is a "good reason" to believe them. And the good reason is that science as an institutionalized social practice has a mechanism designed to distribute credibility. Very few scientists go and check their colleagues' results, because a basic level of trust is the premise for the system to advance. "It's very hard to imagine how science would be progressing, if you would always have to doubt other people," says Maarten (see full quote at min. 15:38).
What about trust IN science? On what grounds should non-scientists accept and rely on what scientists tell them? Maarten says that "if people want to trust science, they need to be able to trust in these institutes [of science]" (see quote at min. 19:30).
Interestingly, Maarten does not necessarily agree that today we are seeing a crisis in the trust that people have in science. As he said in our previous interview a year ago, the internet and social media in particular have drastically changed the way we communicate, the way we receive and exchange information, and this may be a game-changing factor that is only bringing to the surface a pre-existing situation. He also stressed the importance of time: trust is developed by observing someone's behavior over time, and today's communication technology has shrunk time frames in a way that obstructs this process.
After clarifying the difference between trust and faith (in science education), Maarten gives his opinion on how to engage with skeptics, be they anti-vaxxers or flat-earthers. He says that it is very important not to enter the conversation already knowing where we want it to end (e.g. to "convince" the other party that we are right, that science is right), because that is a "very dangerous way of entering a conversation" (see quote at min. 34:26). There is always a rationale behind skepticism, and it should be addressed from a rational point of view. However, Maarten adds, we are never going to win an argument by "lecturing" the other party and insisting that they should "believe" because science is based on "facts". Trust is a far more complicated issue that may require us to ground our position in a broader system of values, one that we consistently embody over time and that has desirable outcomes (a better life, a better society).
Thank you Maarten for this insightful conversation, which I hope we can continue one day!
Quotes and questions from the interview
Science depends on trust.
Science as a social practice depends on ways of dealing with trust in other people.
5:43 Science is some kind of an organized skepticism.
You're only a scientist if you're wanting to doubt everything. But of course you need to know what to doubt. And to be able to develop interesting doubts you also need to decide what kind of things you're not gonna doubt, and what kind of things you're gonna trust, and what other people you're gonna trust. And so what makes science so powerful is that it's not just skeptical, but that it's also a way of dealing with trust.
The fact that someone is a scientist comes with a certain expectation within the practice of science, that whatever they are saying, they are not just saying it without good reasons. And that they can be trusted. Which doesn't mean that you cannot doubt part of what they're saying, but that you cannot put into doubt a basic level of trust. And so you can look at science as we know it as an institutionalized way of distributing trust, credibility, in a way that allows this skeptical process of doubting, proving, looking for proofs, to get off the ground, because there is an infrastructure of basic trust.
What's the difference between trust and faith?
There is an element of dogmatism or faith within the structure of any scientific education.
It's very hard to imagine how science would be progressing if you would always have to doubt other people, always have to go to their lab to check what they've been doing, always have to re-start all their calculations because you're not sure whether they are intelligent enough to do the calculations, etc. Well, science would become a morass, and not a progressive activity where people can really build upon each other's work.
When we say that trust in science is in crisis today, do we actually mean science or scientists? Is there a difference there?
Science today is an institutionalized social practice.
If people want to trust science, they need to be able to trust in these institutes [of science].
Maarten says that more people than we think don't buy into the simplistic narrative that science is all about proofs and therefore it doesn't need trust.
Do scientists "just" trust other people, or do they trust other people *for good reason*, i.e. within a very well specified context (the institute of science) that is designed to allow mutual trust and has a rather well developed system to distribute credibility? However, this institute needs to EARN trust from the outside.
Do you agree that today there is a problem with trust in science? That today trust in science is in a crisis? That trust in science decreased in the past years or decades, and it's at a low point now?
The importance of time in developing and maintaining trust.
How do you engage a skeptic, someone who shows some resistance to trusting what science says - about COVID, for example?
When you talk to a skeptic to "convince" them of something you strongly feel about, for example that everybody should get vaccinated, you already know where you want the conversation to end, and "that's a dangerous way of entering a conversation".
Previous interview with Maarten Van Dyck
YouTube playlist with full interview and selected clips: https://www.youtube.com/playlist?list=PLwJ5DHOMVJqU2Po_uzdlrTAv2lyZlgU9S
Go to interactive wordcloud (you can choose the number of words and see how many times they occur).
Download full transcript in PDF (103.09 kB).
Host: Federica Bressan [Federica]
Guest: Maarten Van Dyck [Maarten]
[Federica]: Welcome to a new episode of Technoculture. I am Federica Bressan, and today my guest is Maarten van Dyck, professor of philosophy at Ghent University in Belgium and Director of the Sarton Centre for History of Science at Ghent University. Welcome, Maarten. It's so nice to have you on the show.
[Maarten]: Thank you. Hi, Federica.

[Federica]: About a year ago, you and I had a conversation on science communication, and specifically science communication in a time of crisis like a pandemic, for example, and this led us to talk about trust in science, or a crisis of trust in science. That conversation inspired the interview, the conversation, that we are going to have today, which focuses on this issue of trust in science, trust and science, and that is so worth developing further. The links to the interview that we had last year will be in the description of this video. And I would like to start from where we left off in the other interview by giving voice to this common belief about science that you challenged last year. The common belief is that science doesn't need trust because science is all about proof. Science can verify things. It's fact-based, evidence-based. There is no trust needed when you can have proofs and facts. And you challenged this notion, and I found it really interesting, so I would like to give you the floor by starting right there and telling you, look, science doesn't need trust. Isn't it all about proof? Why would it need trust?
[Maarten]: Yeah. Well, so I think it's useful to make a distinction between two ways you can think about trust and science. And one is the starting point for the conversation we were having back then. That is a question of trust in science: people who are not scientists but who are asked to trust science and results being represented by scientists. But there is this other idea which has to do with trust within science and the extent to which science as a practice itself depends on trust. And of course, you're perfectly right in what you said that observation, analysis, trying to find proofs is crucial for what scientists are doing, but then the crucial thing to notice is that this process of trying to find proofs, of analyzing facts, etc., itself is always also a social process. Scientists are never on their own in what they're doing. Even if a scientist is working in his or her own laboratory and can observe the results of what he or she is doing, well, in interpreting what these results mean, they will always depend on things that have been observed by other scientists and that this scientist never has observed his or herself. So there is this, of course, crucial dependence on observation, but equally crucial is the dependence on others, people's observations, on what other people have observed in other circumstances. And there, of course, you need to have this basic trust in the way they have reported their observations. So if you cannot trust another scientist's observations, you don't know what the implications of your observation are. Right? If there are no other observations with which you can put your observation into some kind of relation, you're not dealing with a fact; you're dealing with something different. 
You can call it a factoid or whatever, but it's just an isolated observation that can only get the status of being a fact, a factual - and the observation can translate in a factual statement that has wider relevance because you can put it into this relation with other observations. And to be able to do this, as I just put it, you need to be able to trust the other scientists, the other people. So without a basic notion of trust, the whole process of analyzing, of proving things, just simply does not get off the ground, and I think this is really a very basic point that is, well, you don't need very complicated philosophical analyses of how proof in science works to see that this is, yeah, of course it is. It's so basic that it's almost trivial, but it's not trivial. Right? Because this then really shows to what extent science as a social practice also depends on ways of dealing with trusting other people, and there is this notion that you also alluded to in your introduction that what makes science so powerful is that it is a critical activity. It's some kind of an organized skepticism. You're only a scientist if you're wanting to doubt everything, but of course, you need to know what to doubt and to be able to develop interesting doubts, you also need to decide what kind of things you're not going to doubt and what kind of things you're going to trust and what other people you're going to trust. And so what makes science so powerful is that it's not just skeptical, but it's also a way of dealing with trust. 
And I would say that what the history of science of the past few centuries, let's say starting from the end of the 17th century, has shown us is a way of building institutes - universities, academies, other kinds of institutes - that allow ways (smart, intelligent ways) of dealing with this problem of trust, of distributing credibility in a way that gives you, as it were, the markers about, "Oh, yeah, I'm going to trust what these people say because they're also scientists." You see, the fact that someone is a scientist comes with a certain expectation within the practice of science that these other people, whatever they're saying, they're not just saying it without good reasons and that they can be trusted - which doesn't mean that you cannot doubt part of what they're saying, but that you cannot, as it were, put into doubt a basic level of trust. And so you can look at science as we know it as an institutionalized way of distributing trust, credibility, that allows this skeptical process of doubting, proving, looking for proofs to get off the ground, because there is this, as it were, infrastructure of basic trust.

[Federica]: Does it help that, in any case, in science, if I wanted to, I could go and check? Because otherwise, I would ask you, what is the difference between trust and therefore believing something because it comes from a credible source, for example, and faith, which is an argument that was made in the past that science is taught in school, in elementary school for example, just like religion. There were dinosaurs and all sorts of crazy stories, if you think about it, on the history of the world that nobody goes out and checks. So what is the difference between teaching science and religion? What's the difference between trust and faith, then?
[Maarten]: Yeah. That's, it's, of course, a very important question, and there are a few things coming together, right? The one is the question about education, and there I do think that you're perfectly right that an education in science cannot start if the pupil is not willing to believe certain things that the teacher is saying just because the teacher is saying that. Right? That you need to start from a basic assumption that the teacher knows things and that you're going to accept them, and only if you've gone through this kind of process, you will end up being in the position where you can start to doubt the teacher. And of course, this does not mean that the interaction cannot be an open interaction where pupils are encouraged to express their questions, but these questions will not be the kind of questions that put into question the authority of the teacher, but rather, "Oh, but how do you deal with this, and if it's like that, how about that?" And then this is, as it were, an invitation for the teacher to further explore the implications of what he or she is teaching to the pupils. Right? So in a way, there is an element of you can call it dogmatism, you can call it faith, within the structure of any scientific education, but - and this is then what leads us also back to the initial question - the important thing is that the end result of this process will be someone, if a person becomes a scientist him or herself, someone who's able to go back and, as it were, question what he or she has learned, etc., and engage in a very thoroughly critical way with the knowledge that's being presented. But you started by saying, "Oh, there's always the option of going and checking and looking for yourself," but how often does a scientist do that? How often do scientists leave their own lab and go to another lab just to check? That's not how science is done. Right? Scientists work on the presumption that they don't need to do that.
Of course, there are cases, there are spectacular cases, when doubt arises with respect to the results coming from a specific lab, people starting to wonder to what extent they really can trust these results, and then they will go and then they will check for themselves. But this is really the exception, the exception that proves the rule that normally you don't do it because you don't need to do it. And I would say that this is really one of the basic rules of science. One of the basic rules of science is that you don't need to check the other. Of course, if someone presents a result that's very surprising, then there is... It's interesting. So there is an incentive to try to reproduce it yourself because maybe, "Wait a second, if such interesting things really do happen, then we need to find out more about it," and then you go on. But that's not really because you're not trusting the other person, but that's because you're seeing an opportunity for interesting research being done. And from time to time, the end result will be that the initially interesting result doesn't hold up, but then, again, this will not be interpreted, except for some very rare cases, as having the implication that the trust in the initial scientist was misplaced. It will just mean that these scientists had overlooked something, that they maybe were a bit quick in their interpretation or maybe that they just had bad luck, that something weird happened in their lab that they wouldn't have known. It's only in very rare cases that this will really mean that the giving of trust itself is put into question. So I do think that this is really important that you see what's the rule and what is the exception, and that this really tells us something about science as a social practice. Yeah? There are actually two - and I think it's useful to see that there are two dimensions to this relation of trust in what other scientists are doing. 
And one is really a more moral dimension that you expect the others to be, well, not trying to deceive you, that they're telling it as they think it is, that when they report an observation that they at least themselves believe that this is really what they saw. Maybe they are mistaken, but there is this basic moral presumption that people are not trying to deceive each other, that they are open about speaking - well, that what they are saying is, in a way, honest. There is a second dimension, and the second dimension in a way makes it more interesting and more complicated, and this is the presumption that the other people can be trusted in the sense that they are intelligent, they are smart people, that when they say something, they don't just say - well, they say because they believe it; that's the first dimension - but that if they believe it, then there's probably a good reason to think this. So, and I think that that, again, when you observe science as a social practice, that you can see that both dimensions are always present, that scientists will trust other scientists, and, again, some they will trust more than others. There are of course gradations there, but that the whole practice depends on this shared trust along these two dimensions that you're engaged in a practice with other people where the other people are both honest and intelligent. And it's this presumption that allows science to progress. I think if you would take away both notions of how you can depend on others, it's very hard to imagine how science would be progressing if you would always have to doubt other people, always have to go to their lab to check what they've been doing, always have to restart all their calculations because you're not sure whether they're intelligent enough to do the calculations, etc., etc. 
Well, science would become a morass and not a progressive activity where people can really build upon each other's work.

[Federica]: It's really interesting what you said because scientists, the people, need to show good faith and are deemed to be intelligent, so we're talking about the people. I just would like to clarify something. When we talk about a crisis of trust in science today, do we actually mean science, or do we mean scientists? Is there a difference there? I believe there is.
[Maarten]: Yeah. That's indeed an important question, and I think the things we're discussing now indeed is closely related to that issue. And the way I would put it is that science is a social practice, an institutionalized social practice, in the sense that we do have institutes like universities that take care of the basic organization of this practice, that this institutionalized practice is supposed to have a clear effect on the scientists, and in some way, again, if you think about the two dimensions, it's having an effect in the sense that if you are going through the process of becoming a scientist, it will become deeply ingrained in you that you have to be honest, that's, as it were, faking results, etc., is, it's kind of the worst thing you can do. There's a strong moral code that is implicitly in the first place ingrained in science students. And there's also the second component of the selection, so there is an aspect of, as it were, disciplination, of really making these scientists and people have some kind of moral codes, at least when it concerns issues having to do with how you disseminate results and observations, and then on the other hand the selection in terms of something like competence. Right? That only the people who show that they have the kind of intelligence, the kind of practical know-how that's needed to contribute to the discipline will get the entry ticket, as it were. And so in this way, the institute takes care of making sure that the preconditions are met for this presumption of trust, so it's not... Scientists know that they can trust, so they have good reasons to trust because the institute has taken care of making sure that the game, that the, yeah, that the game can be played according to certain rules because everybody who's in there agrees with the rules and has passed certain tests that show that they can play the game. 
So there, it's really the institute that's doing crucial work, and I do think that, so if we then go back to the issue of trust in science as compared to its trust within science, so it's really important to see that if people want to trust science, they need to be able to trust these institutes, because it's... I do think that people looking at science, ordinary citizens, whatever stripe, being confronted, for example, over the past few years with scientists giving them results, talking about things, etc., that these people to a large extent understand that science depends on trust, that these scientists cannot have done all the calculations themselves, that these scientists cannot have done all the observations themselves. They somehow, I would say, see that science is more complicated than that. They understand that the two easy stories about science as being just about the facts and proofs does not really hold up because they notice that whatever scientist it is that's on the television talking about things will not have seen all these things, will not have done all the calculations, so that he is also just trusting other people. But here is, of course, the main issue. Is he just trusting other people, or is he trusting other people for very good reasons? And there, I think, if we think about issues having to do with trust in science, it's important to, on the one hand, stress the role of the institutes in organizing this trust and in this way showing that these scientists are not "just" trusting other people, that they are trusting other people within a very well-specified context where maybe issues of trust play out differently than they play out outside of this institute. So that's one important thing. And the other is that you do then also have to realize that, well, these institutes have to earn this trust from the outside. 
So on the one hand, you can answer to skepticism being expressed towards what scientists are saying with this move of saying, "Oh, but they're not just trusting each other. They're trusting each other within a very special context that's actually designed to allow this mutual trust and that has rather good developed mechanisms of credibility, of distributing credibility, because that's what the institute is; it's a machine that produces and then further allows the circulation of credibility. And it's quite a complicated machine that's rather good in doing that." So part of the answer needs to be pointing this out. But the other part of the answer, I think, is taking seriously the challenges towards this machinery and taking seriously then the issues that have to do with, "Oh, wait a second, so you do have this special machinery for trust and credibility, but who's actually allowed in, and what are the kind of issues that you are actually investigating, and what are the kind of questions that you're not asking?" etc. Which are more questions, again, at the level of the institute than at the level of the individual scientist. But I do think, again, that ordinary citizens do have a rather good sense of the fact that science as this kind of social practice is asking certain questions and it's maybe not asking other questions, and I do believe that it's important to see this not just as distrust in, or that that this can give rise to something that we can call distrust in science, people not just trusting science, but that there is a rationality to that kind of distrust that it's a kind of question that needs to be asked and that needs to be debated. What are the issues being investigated? How are the questions framed? What are the issues not investigated, and what's the relevance of what's not being investigated for how we interpret the conclusions that scientists come to? 
So I would say that, again, focusing on questions on this institutional level is really where the interesting debates can start.

[Federica]: So to recap a little bit, scientists trust each other because they play within this special context that is designed to distribute credibility, so there are rules, and this all makes sense. From the outside, regular citizens who are not scientists, regardless of their background, if they show the signs of skepticism in the science, which can manifest in not wanting to get a vaccine, for example, this year, and being flat earthers or buying into QAnon narratives, etc., what do you tell them - "them" being outside of this mechanism - should they trust the institutions because they know that the mechanism just exists, even if they're outside of it? And I would like to preface your answer by just asking you, would you agree that there is a crisis in trust in science today, or, you know, we're speculating, or this is something real, these citizens that display these signs are there and there's lots of them out there, so it's an urgent discussion we're having?
[Maarten]: Well, I'm not sure to what extent we have more distrust in science now than in any other periods, but I would agree that it's probably becoming more visible and more... It presents itself as a more urgent matter, but I wouldn't be too sure whether this somehow shows that the basic attitudes that people have towards science have changed or have shifted. And one thing - and it's something we have not talked about yet, but that's maybe important to add to the discussion - one thing that I think is really crucial in how relations of trust are built and maintained in whatever kind of context - not just within science and not just trust in science, but also the way that an individual or within a group, whatever human social group you have, trust will be a crucial component of the group that you know that... And, again, on both levels, that you know that the other people that you're dealing with can be trusted in the sense that they're to some extent honest, not trying to deceive you, and also that they're to some extent intelligent, that if they say something you're going to take it seriously. Maybe you're not going to agree, but you're going to take it seriously. This kind of, these very basic attitudes we have towards each other, which depends on trust. And one thing that I think is really important in the question how these relations of trust are built, maintained, and also lost - because, again, within every kind of social group, trust can be lost as well (sometimes justifiably, sometimes not) - but what is really a crucial factor is time. Trust is something that it's, it comes from observing other people's behavior over time. It comes from interacting with other people within time, saying something and the next day you meet them again, and also it's just having, you know, the phenomenon that having a conversation again and again and not really much new is added in terms of content, but somehow it starts to have an effect. 
What the other people are saying starts to have an impact on you - not because they've said something new now, but because you've spent time thinking about it, you've spent time, as it were, somehow taking it within you, and maybe answering something and then seeing what happens with the answer, so I would say that in very general terms, relations of trust really depend on ways of dealing with time. And I do think that this is something that nowadays has shifted rather drastically. And this is, of course, not a new analysis, but that's social media and the way temporal framing happens on social media where everything is instantaneous, that this is really having an impact on these issues having to do with trust. And there, so my main claim probably would be that it's not because people are not rational enough and it's a question of just explaining better or explaining more, but that it really has to do with the temporal framing of how discussions are being played out, how information is circulating, etc. I'm not really able to put it sharper than that, so in a way it's a very basic intuition, but I have the strong feeling that this is really what's happening with us today and that we're, as it were, too closely connected to this phenomenon to really see with a good distance the effects that this temporal shift in how we communicate has, what kind of impacts that has had on us.

[Federica]: We are almost running out of time, but I would like to ask you one more question inspired by all you've said so far that made me wonder about the relation between trust and, like you said, builds over time and rationality. Not just because you're a philosopher and a historian of science, I assume that you know how to engage one of the skeptics or even non-believers, what we want to call them, say someone who this year didn't want to get vaccinated. Sorry for the example, but it's just the obvious example this year.
We know how frustrating it can be to talk to one of those people - and trust me, I don't think of myself as being better or in a higher position than them, because I have had my own doubts on the vaccine this year. Eventually, I got vaccinated and all of that, but I'm just saying I recognize that there would be space for legitimate concerns. I don't believe that there's cat pee in the vaccine or some substance to sterilize humanity. I'm just saying they kind of developed it quickly. So it depends on what doubts one has, but practically speaking, you know, we've talked - you've talked beautifully about all of these things, but when you have this real person in front of you that doesn't show willingness to let go, there's a sort of resistance to let go and trust, based on the fact that you just said trust is developed over time, complex, is there any way in which this person could be engaged at the rational level by saying, "Tell me what doubts you have, and I will respond," as if, again, one would imagine, if you tell me what doubts you have and I respond, I have a rational answer, based on facts and all of that, that eventually I will have to convince you - which is not the case, is it? So how would you, or how do you (if it's happened to you), have this real conversation? How do you engage with someone like that in real life?
[Maarten]: Yeah, so what I like about how you put it is that you put it as a question of "how do you engage with." And I think that that's already part of what I think is not an answer, but the starting point. I don't think that it's most fruitful to think about, "How am I to convince people?" Then you're immediately in this framing of, "Okay, you give me your reasons, I will give you my reasons, and I will show you that my reasons are better than your reasons," and once you're into that game, you're not going to win. Neither of you is going to win, actually, because you're not starting a conversation; you're starting something else. And of course, debates can be useful. I'm not saying that there is not a very important place for debating issues, but I do think it is dangerous to think about all kinds of interactions we can have on the model of a debate, and I do think that at least starting to think about issues as conversations, as joint explorations - whatever the right framing is - opens up another space that, because it opens up a joint space, can have more effect in moving people in the end. And indeed this is a difficult case, because on the social level things depend on the outcome. People feel very strongly, and I, on a personal level, also feel like, "Well, everybody should get vaccinated, because it's in everybody's, or in our joint, best interest that as many people as possible do," and you start from a rather strong feeling about this. Which means, of course, that in a way you already know where you want the conversation to end. Right? And that's kind of a dangerous way to get into a conversation. So I don't really have an answer to your question, "How do you do this?" Apart from...

[Federica]: Has it happened to you? Oh, yeah, you were giving the answer, sorry.
[Maarten]: No, no, so apart from not seeing it as a debate, really. I think that's the most important general idea I have about this: these are not the issues that, at least when you are engaging with other people, you should model on a debate. It's more about... I think it's, for example, much more fruitful to show other people why you're doing what you're doing, to give them, as it were, an opportunity to follow your thought process, but also your own doubts, and how you integrate this within a broader way of living: that you're getting vaccinated because you hold certain values dear, because you care about other people, because you know that you've done other things. That's really giving at least examples of how the decision to get vaccinated holds together with other values, in your case. And I think trying to show these things, in general, in any kind of human relations, often goes deeper and is more meaningful than someone just explaining, right? Someone who's explaining, "This is how you should live. This is what you should do" - that's not a very attractive person to be talking with. It's much more satisfying on a human level to talk with someone who's not just explaining, but who's showing, who's inviting you within their way of thinking. So it's not a recipe for how to deal with things like this, but it's an invitation to think about it in a, yeah, somewhat more humane way, maybe.

[Federica]: Absolutely interesting. So don't engage in a debate, but rather embody your positions and ground them in the broader set of your values. Which reminds me of one of the many nice quotes about science: "I believe in science because it works." In just the same way, I believe, for example, that everybody should get vaccinated because of a set of values which leads to a society where I believe we live better.
That's a strong argument, in fact, based on where our actions lead us. Absolutely interesting, and I would like to keep talking to you, or listening to you talk about these things. I would like to give you the opportunity to add something in conclusion, or any special remark you would like to make, before wrapping this up.
[Maarten]: I guess the main thing I'd like to stress is probably, again, an invitation to everybody listening to notice the absolute importance of trust. Once you notice this in yourself, it gives you another view on humans, and also on rationality: that rationality is not just a question of being smart and being able to reason well, but that it's also a deeply social notion - and that this is a good thing.

[Federica]: Thank you so very much for your time today. In the description of this video, I will include the links to our interview last year on science communication, in particular science communication in a time of crisis. Thank you again for being with us today. I wish you a great day ahead.
[Maarten]: Thank you, Federica. Bye.
Thank you for listening to Technoculture! Check out more episodes at technoculture-podcast.com, visit our Facebook page at technoculturepodcast, or find us on Twitter at @technoculturepodcast.
Page created: October 2021
Last update: March 2022