As an information literacy instructor, I spend a lot of time talking to students and library patrons about authority. As the ACRL Framework tells us, authority is both constructed and contextual. So someone who’s considered an authority in one area or situation might not be one in others. And who we bestow authority on also depends on a number of indicators, both emotional and objective. Like, not only what credentials this person has but also whether or not we feel like we can trust what they say.
Authority has become a complicated thing over the last few years as mistrust in expertise keeps rising, but I think this year has been especially challenging. With COVID, a big reason people stopped trusting the experts is that what the experts were saying kept changing. Take masks: first we didn’t need them, then they became important, and now they’re considered essential.
Of course, there are political reasons behind at least some of this inconsistency. It doesn’t help that the person in the United States who is supposed to be viewed as most authoritative of all is…well, we know who he is and what he’s like.
But there’s also the fact that authorities and experts changed their advice over time because our understanding of COVID changed over time. The whole reason COVID was such a threat at first was that we didn’t know how it worked or how to treat it. But the experts had to tell people something. So they worked with the information they had and gave what recommendations they could: wash your hands, sanitize surfaces, etc.
And I don’t know about you, but having some information about what steps I could take to help protect myself made me feel a little less anxious at the time, even though the advice we’re being given has changed a lot over time.
Like a lot of people, I was frustrated when the recommendation suddenly changed from things I could do fairly easily (assuming I could find soap and/or sanitizer anywhere) to something that was a lot harder to do. Here in New York, things shifted pretty quickly from being warned against buying masks (since they were in such short supply), to masks being recommended, to masks being required. I spent an entire weekend cutting up old pillowcases to try to make a mask for myself because there were none to be found anywhere. It wasn’t until a friend with sewing skills dropped off a mask she had made on my doorstep that I felt better about the whole thing. Now I have an entire basket full of them next to my door so I can grab one any time I go out.
When I first started wearing a mask, I thought I was doing so mostly out of respect for other people who are more at risk than I am, because that was basically what the experts said I was doing. Now they say that wearing a mask actually helps protect both me and the people around me.
To some people, these changing recommendations seem suspicious. Why would experts recommend one thing and then change their minds later? Shouldn’t they know what they’re talking about?
In a familiar, predictable situation: yes. In an evolving one where our understanding of what’s happening changes over time: no.
This is an interesting challenge for information literacy. In talking to students about research and evaluating information, we talk about how knowledge can change over time, but we’re mostly talking about the long term. Like how people used to believe that radiation was good for you, but now we know better. There are situations, though, where our understanding changes much more quickly than that, and I think we need to talk more with our students and patrons about how to deal with those situations.
Because it’s not just a crisis like COVID. It’s pretty much any important news story.
An example that springs to mind is Columbine. (Or any mass shooting incident in the United States, really. There are plenty of examples.)
If you dig up the earliest news stories about the shootings at Columbine High School in 1999, you’re likely to find a lot of misinformation about what happened and why it happened. This isn’t deliberate misinformation—it’s just that it took time to find and understand the facts. In fact, one of the best sources of information about the Columbine massacre, a book by Dave Cullen, didn’t come out until ten years after the incident. Because that’s how long it took someone to sort through all of that evidence and figure out how to put it all together to tell the story of what actually happened.
The thing about Cullen’s book is that the story it tells is a lot different from the one the experts were telling at the time. That’s not because the experts were trying to mislead anyone (mostly). It’s because the popular understanding of the massacre evolved over time. (And then kind of stopped evolving once certain myths had been cemented into place, probably around the time Bowling for Columbine came out. Cullen’s book actually does a lot to refute the most well-known myths about the shooting.)
So in information literacy we teach that authority can be flexible, but we don’t talk as much, if at all, about how even experts we have every reason to trust might get things wrong or change their minds, especially in situations where they themselves are still figuring out what the hell is going on. It can be frustrating when they change their minds about something, but that doesn’t mean we shouldn’t listen to what they say. If anything, their willingness to change gives us even more reason to trust them: if they kept making the same recommendations in the face of changing facts, that would be a bad thing. The trick is that we have to evolve with them.
I’m not saying that talking about this more will solve or prevent any sort of present or future information-related crisis (much less a public health one), especially not with all the politics involved. But I still think there’s value in creating awareness around how knowledge can evolve and change and what the consequences can be if we don’t evolve and change with it.