Trust in Research Undertaken in Science and Technology Scholarly Network (TRuST) launched to combat growing trend of disinformation
The rising trend of “fake news” came to prominence over the course of the COVID-19 pandemic as people turned to social media channels to read and share content that often fell far short of reliable information or verifiable data. The unchecked spread of misinformation led to serious harm for many individuals, especially those who decided to forgo scientifically proven treatments to combat the novel coronavirus.
It’s time we find ways to combat the growing tide of disinformation. We need governments, the research community, private industry, and citizens to come together and create innovative policies and practices to ensure that existing and new technologies don’t come with unintended harms.
I doubt the engineers who first built those social media platforms were aware of how their products could one day be weaponized in campaigns of damaging — and deadly — misinformation. We need to find a way to bridge the gap between the people who design and build new technologies and the public who are the users of those technologies.
Here at the University of Waterloo, we looked at several surveys that measured how Canadians’ trust in science, academia, health, technology and government has changed over the years. While there have been relatively few surveys measuring trust in science, the most consistent trend we found is that trust in most institutions and individuals, especially the government, rose at the beginning of the pandemic but has since fallen back to near pre-pandemic levels.
A report published in January by the Council of Canadian Academies, an Ottawa-based independent research organization, found that misinformation related to the spread of COVID-19 resulted in the loss of at least 2,800 lives and led to $300 million in hospital expenses over nine months of the pandemic.
Are Canadians suffering a crisis of trust across institutions? The data is troubling enough to spur me and some of my colleagues into action.
We cannot afford to sit on the sidelines and let the trust that Canadians have in science and academic institutions continue to erode. That’s why my Waterloo colleagues, Nobel Laureate Donna Strickland and Canada Research Chair Ashley Mehlenbacher, and I created the Trust in Research Undertaken in Science and Technology Scholarly Network (TRuST).
TRuST, the first multidisciplinary research network of its kind in Canada, aims to combat the growing trend of disinformation by better understanding why some people deny, doubt or resist scientific findings and explanations.
TRuST will explore how engineers, scientists and researchers can embed trust into the technologies they are currently building. We hope this will lead to fuller consideration of the intended, as well as the unintended, consequences of those technologies.
It won’t be easy, but researchers and governments need to work together and think about how policy can shape the way we evaluate future technologies and online tools to prevent the spread of damaging misinformation.
New pharmaceuticals must undergo rigorous study and clinical trials before they are brought to market. A similarly measured approach could be adopted when introducing new technologies into the wild. Before a company launches a new technological product into the marketplace, that product could first be trialled with a small group of people so that any unintended issues can be identified and addressed before it is rolled out more widely.
Another approach could be for governments, in partnership with industry, non-profits and academia, to introduce a series of ethical standards that all technology companies would have to adhere to if they want to make their products available to the public. This builds upon the work that Waterloo professor Marcel O’Gorman, founding director of the Critical Media Lab, has done alongside the innovation hub Communitech and the Rideau Hall Foundation to create a set of guiding principles that advise governments, businesses and organizations to use technology for the good of humanity.
While these suggestions may appear to go against the grain of conventional thinking, we need to begin, and continue, this conversation about how to regain trust in science and technology.
We have already seen how avoiding this direct approach has created an environment of distrust toward researchers, scientists and policymakers in this post-pandemic period. Tackling this challenge now is critical to ensure that future ideas and technological advances won’t suffer a similar fate.