Transcript: Azim Shariff
My conversation with the UBC professor and Canada 150 Research Chair
People on the left have been reaching out to me for years, saying that they feel like the mainstream media has become politicized, is biased in favour of the left, and they no longer trust it to report the news fairly. Now, a group of international scholars has published an interesting research paper that addresses this dynamic. It finds that when people perceive institutions to be politicized, they lose trust. Even if they happen to agree with those institutions' politics.
Azim Shariff is a professor and Canada 150 Research Chair of moral psychology at the University of British Columbia. The pre-print paper we discuss today is titled “Even When Ideologies Align, People Distrust Politicized Institutions.”
This is an edited transcript for paid subscribers. You can listen to the interview for free here.
TH: Azim, welcome to Lean Out.
AS: Thank you, Tara, how are you?
TH: I’m great, it’s really nice to have you on the program today. I learned about your work, as you know, from a recent talk that Jonathan Haidt gave at UBC, as part of the Phil Lind Initiative. I was really pleased to hear about your recent pre-print paper, which covers a really important subject. This paper — which, we should stress, is not peer-reviewed yet — is titled “Even When Ideologies Align, People Distrust Politicized Institutions.” To start today, tell me how this international group of scholars came together, and what was the motivation for looking into this particular topic?
AS: Great question. So, as you can see, the paper kind of gives away the findings right there in the title. And it was interesting the way that it got worked into Jonathan’s talk. So, he and I had just had lunch the prior day, and my collaborators and I had just posted that pre-print of the paper — as you mentioned, a pre-print that has not yet undergone peer review. We just posted it earlier that week. I told him about it, and within however many hours it was between then and the talk, he managed to incorporate it in. We’ve gotten a positive response, because I think it is, as you stress, a topic that a lot of people care about.
As for the collaborators on the project, it’s led by Cory Clark, whom I’ve collaborated with since she was in graduate school. As well as Jim Everett, who I also think I’ve collaborated with since he was in graduate school. And then Calvin Isch, who is a student who’s working with Cory. The group of us have worked together for quite a while, and we’ve been discussing this issue for quite a while. Eventually conducting some studies on it, starting, I think, late in 2022. And then putting together this pre-print as fast as we could, because it seems like a perennially pressing topic. But maybe one that’s increasingly pressing.
TH: Absolutely. This paper covers three studies examining attitudes towards 40 institutions, organizations, and groups of professionals, ranging from the World Health Organization to scientists, the Supreme Court, pharmaceutical companies. You tested associations between perceived ideological slant, perceived politicization, and public trust — or, the willingness to support and defer to the institution’s expertise. Can you briefly walk us through the findings of this paper?
AS: Sure. I will, and I’ll give you some background on these ideas as I do. So, in part, this was inspired — speaking of Haidt — by a famous talk that he gave at the big social psychology conference, maybe 10 years ago now, where he pointed out that the disproportionately high numbers of political liberals in social psychology can be problematic. Since then, there’s been a lot of researchers who have pointed out how overwhelmingly liberal academia tends to be. Not uniformly across disciplines, but on the whole it tends to be slanted towards liberals. There’s many reasons for that, and there’s been a wide discussion about that. What we hypothesized is that the slant itself matters less than the degree to which people’s politics affects the work.
The example we use in the paper is a catering company that has a very conservative slant. It probably wouldn’t bother you that much to be catered by a group which has a slant one way or the other. It’s when those things start affecting the work. Academic research is, I think, one area where it probably does affect the work quite a bit. And we go through it in various different organizations where that might happen. What we find is, unsurprisingly, when a organization is perceived to be politicized against the party that you associate with — so, if you are a liberal and you think that the organization is slanted towards conservatives and their conservatism affects their work — you’re going to distrust that organization. And the same the other way around, right?
What was more surprising — and I don’t think we predicted one way or the other beforehand — but what was more surprising is that even if it’s on your own side, even if you perceive that an organization is slanted towards your own political party, you still find that the more politicized you perceive it to be, the less you trust it.
So, I’ll put my cards on the table right now. If I was a liberal, which I am, and an academic, the more politicized you see academia, the more you would distrust it. I think that’s probably something that a lot of people are experiencing. That even though they politically align with the side that they perceive the institution to be on, they don’t gain more confidence when they see the politics seeping into the work that’s being done.
TH: I think it’s so interesting and something I can certainly relate to. In terms of fully understanding the differences here, what’s the difference between an ideologically slanted institution and a politicized one? How do we measure that politicization?
AS: It’s simply that question about how much do you perceive that the politics is affecting their work? Let’s take your industry. In journalism, you could imagine people, the vast majority of journalists, being to the right or to the left. But you could imagine that they put up a pretty good firewall between their own politics and the way that affected what they wrote about, especially what they would report on. In that case, you’d have a slanted organization, but you wouldn’t have a politicized organization. I don’t know your industry as well as I know mine. I suspect that it is an industry where people’s politics actually does seep into the work quite a bit.
Similarly, in my industry, you can imagine a situation where, yes, we’re liberal and we’re liberal for many reasons. There’s personality characteristics that draw more liberal people to issues of exploration. There may be something about being within the universities for a long period of time that makes people more liberal. There may be all these reasons. But if you are able to put up a firewall and you say, “Okay, well, I’m not going to let my politics affect the math that I do, or the chemistry that I do, or the psychology that I do,” you would have a slanted organization, but not a politicized organization.
TH: It makes so much sense to me. I’m also on the left, and journalism is overwhelmingly left-leaning. I can’t actually think of a single instance in maybe 21, 22 years of media where I’ve worked with someone who describes themself as a conservative. That’s how left-leaning it is. But definitely where I started to get very concerned about it is when the work itself became overtly politicized. So, this makes so much sense to me. Can you tell me a little bit about how you’re feeling about academia right now — and what your concerns may be about that?
AS: Yeah. I would mention, at the outset, that I don’t think that there’s never a legitimate reason for an organization to be politicized. I think that there’s many examples historically of situations where taking political stances has been a smart and probably right move. What we are trying to test in this paper is that there may be trade-offs involved, when it comes to issues of trust and deference. In terms of deference, another measure that we had is, “Would you listen to recommendations from this group?” Say the WHO is making recommendations about whether you should, you know, get vaccinated or something. And you perceive them to be slanted, or you perceive them to be politicized. You’re going to defer to them less if you perceive them to be politicized.
Within academia, especially within science, I think that it probably is good to erect as as high a firewall as you can between politics and the research that we’re doing. Because it’s so easy for the politics to pervert the scientific method, right? The idea of the scientific method is that you want to manage confirmation bias. You want to have a set of structures, a set of norms and rules and institutions and procedures, that minimize the chance that you get something wrong because, for whatever reason, there’s a desire for one outcome to be more right than the other.
One of the things that my field has gone through over the last 10 or 15 years is the replication crisis. This has been discussed widely enough that I think people outside of academia have heard about it. A lot of what social psychologists, psychology in general — social psychology is where it started — a lot of what we had thought we’d known has turned out to be not replicable. It turns out to be what we call false positives. We thought that a finding was true, and it turned out that, when you did the methods properly, it was not. One of the reasons that happened is because it’s pretty easy for an experimenter to let their biases unknowingly guide their research towards a particular outcome. We try to have systems, such as peer-review, to manage that, to prevent that from happening. But it happens. And we have to be constantly vigilant to try to make sure that what we are discovering is the truth, and not just something that our biases are leading us to believe. That is falsely true.
I think that politics can create a very strong motivation for a particular outcome to be true. I’m not saying that this happens purposefully. I think that our underlying beliefs can nudge us unconsciously towards doing science a particular way, asking certain questions, preferring certain analyses, doing any number of what we call “researcher degrees of freedom,” which nudge us towards a certain outcome. I think that compromises our ability to get things right. And I think that probably justifiably compromises the trust that the public should have in us.
One of the nice things is that, within psychology, we’ve started a number of new initiatives and rules and norms — to try to clean up our act. To try to strengthen the scientific method and try to reduce our biases. So, that’s good. And that should mitigate the impact of any reason for favouring a particular outcome, including political reasons. So, that’s good.
But I do think that, yes, academia has become politicized. It probably always was politicized. It’s probably more acute now because we’re in a moment of higher partisanship. I think that that’s had some pernicious consequences.
TH: It’s interesting. Reading the paper, I was thinking about how, as you’re doing this research, all these real world examples are unspooling. One of the ones you mentioned is the scientific journal Nature endorsing Biden for president. The results of that were not just lowered trust in that particular scientific journal, but lowered trust in scientists in general. I was thinking also about a recent Canadian example of overt politicization, and this has to do with Jordan Peterson. I know you started your career way back as a student of his. This is regarding the College of Psychologists of Ontario. They have ordered Peterson to undergo media training, saying some of his tweets were degrading to the profession. This was perceived by some as politicized, given the fact some of these tweets had to do with Pierre Poilievre, leader of the opposition, and Justin Trudeau. What are your thoughts on that particular situation, related to this research?
Keep reading with a 7-day free trial