By Nina Kegelman | Staff Writer

This week’s common hour speaker was Professor Julie Smith, MS, of the Webster University School of Communications. A worldwide speaker and author of the book Master the Media: How Teaching Media Literacy Can Save Our Plugged-In World, Smith presented her talk “How to Spot Disinformation in Today’s Media” to the common hour community. Packed with silly GIFs and ironic visuals, the presentation explored the challenge of finding reliable sources in a politically polarized era and offered suggestions for staying safe and judging credibility online.

For Smith, media literacy boils down to critical thinking. She walked the crowd through the knowledge and skills necessary for critical engagement with the media. First, she defined what her field calls “sticky” media: media that makes money from the attention it receives. Facebook, for example, is sticky media. In questioning the credibility of our news sources, Smith encouraged us to determine their “stickiness.” Pointing to today’s media frenzy over the coronavirus, she warned that the media has nothing to gain by downplaying the public health threat; by revving us up, it ensures we “tune in and stay tuned in.”

Smith then turned to the importance of committing to the quest for authentic information. Amid political polarization like today’s, we tend to stay within certain “ideological bubbles,” and the media is no exception. Smith claimed that we tend not to seek out media we disagree with, which leaves us blind to potential misinformation. Regardless of political affiliation or preferred news outlet, she urged us to always check for authenticity.

A well-designed website, for example, is not necessarily a credible source. Smith shared some of her students’ work for an assignment that tasked them with inventing realistic-looking fake news headlines. “Queen Elizabeth Dead at 93,” read one design, an uncanny imitation of NBC’s digital imagery. The student who created it, Smith said, told her the assignment scared him because it made him feel so powerful.

Aside from not falling prey to slick-looking sites, Smith listed behaviors we can practice to spot disinformation. Self-checking, for example, means gauging the emotional response a headline or story provokes in us. If it makes you feel very strongly, be it happy, sad, or upset, chances are it is not a very credible source, Smith warned. Self-checking also demands that we check our own “baggage”: What opinions or biases do we already hold about this issue? How are we shaping our own communication experiences?

Next, Smith explained the various kinds of visual manipulation pervasive in today’s media. With platforms like Facebook and Twitter, which were never intended to be news sources, gaining popularity, digital trickery is practically inevitable. Smith showed examples of misleading charts and graphs, advising viewers to check sources and always read the statistics closely. She also distinguished between “deep fakes” and “shallow fakes.” Shallow fakes are pieces of media, like videos, whose genuine content is altered for a desired effect; to illustrate, Smith played a video of a speech by Nancy Pelosi that had been slowed down for dramatic effect, making Pelosi appear drunk. Deep fakes, on the other hand, fabricate content outright, making a subject appear to say or do something that never happened. Smith recommended looking for facial discoloration, strange lighting, blurriness around the neck, and unusual blinking patterns to spot deep fakes. And of course, if the content is just too weird to have really happened, it is probably fake.

Smith also emphasized Google’s reverse image search tool. Whether it is a fishy Twitter post or a bot account’s profile picture, the tool can scan the internet for other sites using the same image. She also recommended running your own photos through a reverse image search to be sure other sites are not using them without permission. Bots on Twitter, Smith added, are easy to spot by their lack of followers, the kinds of accounts they follow (only women, perhaps), or the strings of numbers trailing their usernames. If a bot chats you up, it will likely slip up on verb conjugation, article usage, and capitalization.

Smith finished by listing several useful fact-checking tools, including Google’s Fact Check Explorer, snopes.com, watchframebyframe.com, trendsmap.com, and hoax-slayer.com. She left the crowd with questions to ask when examining the media: How could this message be interpreted differently? What information does it leave out? How else could I look at this?

The question-and-answer portion of the talk included discussions about the media and political debates, how to call out friends for posting misinformation, how to find objective sources, and Smith’s opinion of satirical news like The Onion, The Daily Show, and Saturday Night Live.

Smith recommended reading transcripts rather than watching political debates, pointing to the famous example of the 1960 Nixon vs. Kennedy debate: appearances can be deceiving, especially with carefully constructed footage. As for calling out friends and family, Smith underscored the importance of exposing disinformation even when it is uncomfortable. As long as it is done with respect and in private, calling out friends who spread misinformation is the best way to stop it. Lastly, Smith said that some news is better than no news when it comes to satirical sources, which purposely misinform for comedic effect. Still, cautioning that there is no “one truly objective source,” she encouraged collecting news from a variety of sources and drawing conclusions independently.

Junior Nina Kegelman is a staff writer. Her email is nkegelman@fandm.edu.
