In the wise words of John Oliver: memes aren’t facts.
The truth is, though, a large number of people get their information on current events from easily digestible sources on websites they already visit anyway. I'm certainly guilty of doing that, and I'm well aware that those sources might not be entirely accurate. As our readings this week pointed out, though, there's just too much out there to sift through. At some point, you just have to accept what's convenient and move on with your day.
Another issue that came to mind is how sites like Facebook and YouTube use algorithms to "suggest content you'll like," but ultimately end up creating an echo chamber for the opinions you already have. 60 Minutes did a piece a few months ago that discussed, in part, how YouTube's suggested-videos feature helps spread conspiracy theories, false information, and the messages of hate groups.
This is a systemic issue. Social media platforms and popular websites seem unconcerned with how much unvetted information they put out there. As long as they get traffic and advertising revenue, it doesn't matter to them if an individual falls for conspiracy theory nonsense; they have no incentive to curate what information is spread. Worse, if these sites did start to monitor the accuracy of the information being shared, they'd be accused of violating someone's First Amendment rights.
All of this is to say… I legitimately have no idea how to approach this issue. You can tell individuals to "be vigilant" or "corroborate your sources," but even then, people will take the easier option if the information in question isn't of dire importance to them. The only permanent solution would be to monitor what gets uploaded and distributed, and there's no way of doing that ethically.