Over the past several weeks, people across the world have flocked to media and news outlets for information on the current conflict between Russia and Ukraine. Apps such as Twitter, Facebook, and TikTok have become primary news sources for millions of users. However, the algorithmic nature of social media raises a journalistic dilemma. Specifically, relying on social media for news can dilute real reporting, prioritize the posts that AI deems most likely to be engaged with, and, in turn, spread fake news.
This is not a new phenomenon; it has existed since the implementation of AI in social media. In 2018, the Rohingya in Myanmar were subjected to abysmal living conditions, state-endorsed violence, and, ultimately, ethnic cleansing pogroms, all amplified by Facebook (now Meta). Now, a legal campaign is underway to hold Meta liable for the part it played in the genocide of the Rohingya people. The campaign includes allegations that Facebook’s algorithm amplified the spread of hate speech and violent content targeting the Rohingya, and served as a deciding factor in the content people were exposed to while scrolling on their phones.
While this case may seem extreme and isolated to Meta, recent research has shown that TikTok has followed suit. A viral video that racked up hundreds of thousands of views across social media claims to show the final moments aboard China Eastern Airlines Flight MU-5735, seconds before the plane crashed on Monday, killing all 132 people on board. In reality, the footage shows no such thing; it is taken from a video game. The clip that circulated was first uploaded to YouTube a little more than three years ago, and its description explicitly states that it is a computer simulation of a 2019 Ethiopian Airlines crash. This case of fake news is one of many that spread like wildfire in the digital sphere because of their propensity to earn maximum engagement and likes.
Similarly, a video that spread on Facebook in late February purported to show a Ukrainian fighter pilot known as “the Ghost of Kyiv” shooting down Russian aircraft. The footage was actually from a free online video game called Digital Combat Simulator, PolitiFact reported, and the “Ghost of Kyiv” is likely a viral myth that has propagated during the war. “This footage is from DCS, but is nevertheless made out of respect for ‘The Ghost of Kiev,'” the YouTube video’s description stated.

Recent investigations conducted by the anti-misinformation outlet NewsGuard found that misinformation about the war in Ukraine can surface mere minutes after creating a TikTok account. Within 40 minutes of scrolling through the “For You” page, analysts found that the algorithm had curated content containing numerous false claims and videos about the war. “Toward the end of the 45-minute experiment, analysts’ feeds were almost exclusively populated with both accurate and false content related to the war in Ukraine – with no distinction made between disinformation and reliable sources,” the research team wrote.
The instances listed above raise the question of why AI sorts through content and places posts containing false claims and violent speech at the forefront of user feeds. Algorithms are designed to boost posts that are likely to draw the most engagement, likes, and comments, yet there is a recurring theme behind which posts are widely circulated and which are not. Does the code behind social media algorithms blatantly treat hate speech as an indicator of high engagement? Is it simply that emotion-provoking content, whether true or not, creates opportunities for digital discourse, directing online users toward the original ethos behind social media? Or is there a much more nuanced answer that sheds light on the truth of human nature?
In any case, as the world continues to spiral toward a bleak and uncertain future, the digital sphere has become a playground for leaders in various industries to spread misinformation and propaganda, all while turning a profit.
This blog post is part of the CIMA Law Group blog. If you are located in Arizona and are seeking legal services, CIMA Law Group specializes in Immigration Law, Criminal Defense, Personal Injury, and Government Relations.