Research in the Time of COVID

As a Research Intern at the Center for Health Enhancement Systems Studies (CHESS) at the University of Wisconsin–Madison, I joined a cutting-edge team tackling the emerging topic of COVID misinformation.


During the summer of 2020, the COVID pandemic was running rampant throughout much of the world. While isolating and looking after my health, I was facing another concern: I worried that my internship – at the Center for Health Enhancement Systems Studies (CHESS) at the University of Wisconsin–Madison – would be cancelled. The internship went ahead, however, and it now offered a unique research opportunity: one of the first concentrated efforts on the topic of COVID misinformation.

As an Associate User Researcher, much of my work involved interpreting results, finding real examples of misinformation being interacted with on Twitter, and helping develop corrective-measure features. The project's aims were as follows:

Aim 1: Identify high-priority misinformation claims most widely circulating in WI counties with higher concentrations of vulnerable populations.

Aim 2: Experimentally test whether visual enhancements (infographic, visual topic cue, visual illustration) could improve the correction of identified misinformation on top of text-only correctives. Identify the most effective visual enhancement strategy, respectively, for the elderly, racial minorities, and rural residents.

Aim 3: Employ the identified best-performing visual enhancement in our ongoing fact-checking efforts via the CWC app throughout 2020 and into 2021.

To meet these goals, I was placed in the unique position of having access to Twitter's firehose COVID-19 stream. From there, I worked with the team to integrate the Poynter fact-check dataset with the Twitter data, geolocating misinformation and identifying Wisconsin counties where it was spreading widely. We focused on two text-based correction strategies – both of which we wanted to carry a hope appeal – factual correction and coherence-based correction. An example of the factual correction we created:
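To give a flavor of this kind of workflow, here is a minimal sketch of matching tweets against known misinformation claims and tallying hits per county. This is an illustration only, not the project's actual pipeline: the claim IDs, keywords, and tweet records are all invented, and real fact-check matching is far more sophisticated than substring search.

```python
from collections import Counter

# Hypothetical claim keywords, in the spirit of entries from a
# fact-check database such as Poynter's (illustrative only).
CLAIM_KEYWORDS = {
    "mms-cure": ["mms", "chlorine dioxide", "miracle mineral"],
    "bleach-cure": ["drink bleach", "inject disinfectant"],
}

def match_claims(text):
    """Return the IDs of claims whose keywords appear in the tweet text."""
    lowered = text.lower()
    return {claim for claim, kws in CLAIM_KEYWORDS.items()
            if any(kw in lowered for kw in kws)}

def tally_by_county(tweets):
    """Count claim matches per (county, claim) pair.

    Each tweet is a dict with a 'text' field and a geolocated
    'county' field (here assumed to be pre-resolved)."""
    counts = Counter()
    for tweet in tweets:
        for claim in match_claims(tweet["text"]):
            counts[(tweet["county"], claim)] += 1
    return counts

sample = [
    {"text": "MMS cured my neighbor, look it up!", "county": "Dane"},
    {"text": "Do NOT drink bleach, it cannot cure COVID.", "county": "Dane"},
]
print(tally_by_county(sample))
```

Note the second sample tweet is itself a correction, yet naive keyword matching still flags it – one reason real pipelines need human review of matched examples, which was part of my role.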

“[In fact, the main composition of MMS, chlorine dioxide, is a powerful bleach normally used in industrial processes such as the manufacture of textiles. And drinking or injecting any bleach or disinfectants can potentially cause a series of life-threatening side effects including nausea, diarrhea, and severe dehydration that can lead to death.

More importantly, bleach and disinfectants CANNOT cure COVID-19. There is no scientific evidence that consuming any type of bleach or disinfectants can prevent or treat any diseases. Both the Food and Drug Administration and healthcare experts have released multiple warnings urging people NOT to drink or inject bleach and disinfectants. The effect of drinking higher amounts of chlorine dioxide from MMS kits has been documented in medical journals and by doctors who have treated patients who drank it. For example, a 75-year-old man in New York who tried to treat his prostate cancer with MMS ended up spending four days in the hospital and received a blood transfusion.]”

Contrasting this was the coherence-based correction, which read:

“[The reason why some might believe MMS or similar products can cure COVID-19 is that chlorine dioxide is used as a disinfectant in municipal water treatment, so people can ingest trace amounts of the chemical. But, importantly, the Environmental Protection Agency has set a maximum allowed level of 0.8 milligrams per liter. The amount of MMS recommended by pseudoscience advocates is about 200 to 500 times the maximum amount set by the EPA, which could potentially lead to extremely severe health issues. Similarly, various types of disinfectants are produced to kill germs or viruses on hard surfaces, which give the impression that they are “antivirus,” but the important thing is, they are not safe to drink or to be injected.]”

During the later stages, I helped source and interview local graphic designers to create the visual media we wanted for the project, providing ideas for the initial designs as well as feedback on the finished products. At this point, we were able to evaluate our visual and textual correctives (a) across multiple identified misinformation claims, (b) in a Latin-square survey experiment through Qualtrics, and (c) with oversampling of targeted vulnerable populations. Below is an example of an infographic we produced using factual correction.
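The Latin-square design mentioned above can be illustrated with a short sketch: each respondent group sees every corrective format, and each misinformation claim is paired with every format exactly once across groups, so no format is confounded with a particular claim or group. The formats come from Aim 2; the simple cyclic construction below is a generic example, not necessarily the exact assignment scheme we used.

```python
def latin_square(conditions):
    """Build an n-by-n Latin square by cyclically shifting the
    condition list: each condition appears exactly once per row
    (respondent group) and once per column (misinformation claim)."""
    n = len(conditions)
    return [[conditions[(row + col) % n] for col in range(n)]
            for row in range(n)]

# The corrective formats tested in Aim 2 (plus the text-only baseline).
formats = ["text-only", "infographic", "visual topic cue", "visual illustration"]

square = latin_square(formats)
for group, row in enumerate(square, 1):
    print(f"group {group}:", row)
```

Balancing the design this way lets a single survey wave estimate format effects while averaging out differences between claims and between respondent groups.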

Ultimately, my internship ended before the project concluded. As I was leaving, we began collecting and analyzing the Qualtrics data. What we found may not surprise you: coherence-based correction produced more positive results, proving more effective at correcting misinformation via Tweets. We also found that the spread of misinformation across Wisconsin counties was strongly linked to political affiliation. After leaving, I continued to sit in on occasional meetings to follow the team's progress and offer feedback and thoughts.

This was a fantastic opportunity to get the “guerilla” style of user research experience the moment demanded. The entire team and proposal had been a last-minute decision, as we knew we would only get one shot at doing early COVID research. COVID itself was widely known, but the science behind it had not yet reached the same level of public awareness.

We had to work fast to recruit user bases that fit our specs, which meant that even though I was just a student worker, I was given significant independence and authority in my work. There was no time for anyone to double-check my work, and while this frightened me at first – what if I turned in something wrong! – it quickly built my confidence in myself and my work. When UX researchers hand you tasks, completely confident you will finish them quickly and cohesively, it's thrilling. It made all of my studying seem worth it.

Another fascinating aspect of this research was the psychological measures we had to learn about. Logically, if someone is incorrect and somebody corrects them with the right information, the first person would simply update their beliefs. Well, if you know anything about humans, and especially humans on Twitter, this is only true in theory. In practice, we had to familiarize ourselves with people's defense mechanisms and the hoops they mentally leap through to avoid changing their ideas. I had never before considered the psychology behind these decisions. So much of UX research and design is rooted in how people act, yet I had never asked why they act that way.

Overall, this experience helped me develop as a researcher and learn to be highly adaptable. Not only did the internship move online due to COVID, but I had already relocated to Wisconsin for the summer to attend it. It was a fantastic opportunity to develop as an individual, and it is one of the major factors that gives me confidence as I start my job search across the country. In truth, I don't think you can be an adaptable researcher unless you're an adaptable person. My time in Madison helped me grow, both professionally and personally.