What the Facebook Hearings Remind Us about Research
My twenty-something sons have a keen interest in—okay, an obsession with—the history of media. My older son teaches film and is currently offering a course on the future of film. That requires, of course, looking back at the past and considering how digital media have changed how films are distributed. My younger son is intrigued by the idea that his even younger cousins have never lived in a world in which YouTube didn’t exist. Having been a child when VHS tapes still had to be protected from melting in a hot car, he feels like the “old man” of media who can educate the younger generation about a world they never knew. They both use their iPhones or laptops constantly to research movies.
Is it obvious to note, though, that they don’t use Facebook to do research? Or that we don’t expect our students to cite Facebook as a source in a documented essay? My older son uses Facebook to disseminate his opinions about films, just like he uses a blog. Facebook is, after all, a social media platform, a social networking service. How, then, have we reached the point of congressional hearings examining Facebook’s role in disseminating disinformation harmful to America’s youth?
Facebook creates communities of users, some much larger than others, who exchange updates on their lives and information they think will be of interest to their online community. However, the information shared is only as reliable as the community member who shares it. Facebook was never meant to be a news source, except as far as personal news was concerned. Then came the Trump administration, telling its followers that the mainstream media were not to be trusted as sources of national and international news. Some people began to put more faith in what a “friend” shared on Facebook than in what a major news network reported. All of us have probably been guilty of sharing information on Facebook without thinking too critically about where that information came from. Sometimes we are glad to see someone out there reinforcing what we believe, and we pass it along without thinking about whether it is even true. Publishers of print media have been, and continue to be, aware of the danger of printing libelous content. Now those who allow disinformation in digital form to go unchecked are facing some of the same type of scrutiny.
Those who run Facebook have tried to restrict what gets passed along as truth. Frances Haugen, the whistleblower who has released thousands of pages of Facebook documents, has testified about those efforts but argues that they fall far short of what it would take to eliminate the dissemination of misinformation. She points out that Facebook did tighten restrictions about what users could post in the days leading up to the 2020 Presidential election, for example, but relaxed those restrictions once the election was over—even in light of the events of January 6th—because it was profitable to do so.
A portion of Haugen’s testimony has been about the lies and conspiracy theories being spread about COVID-19 via Facebook. What is posted on a social media site can seldom be considered a matter of life and death, but lives are literally at stake if readers of a post believe that ivermectin is a cure for COVID or that vaccinations are a Democratic conspiracy.
Mark Zuckerberg and the other higher-ups at Facebook can try to put in place a plan to block disinformation. But in their daily lives, as they argue politics in the heated atmosphere that currently exists in our country, Facebook’s users still must bear the responsibility that anyone who constructs an argument must bear: checking the reliability of their sources. In arguments made in the context of their academic or professional lives, the rules of research and documentation haven’t changed. An argument in support of a claim is only as good as its sources and the warrants that build a bridge between claim and evidence, no matter how funny the meme or how convincing the Facebook post.
Image Credit: "facebook is dead" by Book Catalog is licensed under CC BY 2.0.