Frames of Death: The Rhetorical Goals of Official Social Media Posts in the Russo-Ukrainian War

 

Miles B. Vance

Cinema and Television Arts, Elon University

Submitted in partial fulfillment of the requirements of an undergraduate senior capstone course in communications


Abstract

The prevalence of social media in daily life has had a substantial impact on the fields of propaganda and population manipulation. This phenomenon has manifested most significantly to Western audiences in the context of the 2016 United States presidential election, where Russian misinformation campaigns were used to promote the efforts of eventual president Donald J. Trump. Additionally, in recent conflicts around the world, social media propaganda has been used alongside traditional propaganda as part of a psychological warfare campaign. The most recent example can be seen in the ongoing phase of the Russo-Ukrainian War, beginning with Russia's full-scale invasion of Ukraine in February 2022. This research examined examples of social media content posted by Russian, Ukrainian, and Wagner-affiliated social media channels and analyzed the content for rhetorical goals and devices. The study employed the analysis of two videos from each side for three different events: the fall of Bakhmut, the Kerch Bridge explosion, and the one-year anniversary of the invasion. This yields a total of 18 individual posts, with an additional two images analyzed concerning the advent of drone video posts. The results of this study show that the most common themes of social media content were legitimization, deflection, humor, and violence. All three sides appear to use similar tactics, although Russian channels often decline to post directly violent content. The motivations of these channels are to engage the viewer and either uplift or dismay them.

Keywords: Hybrid warfare, Russo-Ukrainian War, propaganda, social media
Email: mvance5@elon.edu


I. Introduction

This research analyzes the online social media content posted during the ongoing Russian invasion of Ukraine, which is part of the decade-long Russo-Ukrainian War. The research revolves around three distinct entities posting videos and images of the conflict: the Armed Forces of Ukraine (ZSU), the Armed Forces of the Russian Federation (VSRF), and the Russian private military company (PMC) known as the Wagner Group. Included in the selection of Ukrainian sources are units such as the Third Assault Brigade, a Ukrainian military division composed of soldiers belonging to the Azov Battalion, a far-right third-party militia in Ukraine that has been incorporated into the Ukrainian Armed Forces. The purpose of this research is to qualitatively examine the online content produced by these entities and analyze such material for elements of narrative, tone, framing, music, and internet memes, all of which construct a form of postmodern psychological warfare known as hybrid warfare or information warfare. In previous conflicts, propaganda leaflets were airdropped to enemy combatants and civilians. The internet provides a new medium for spreading propaganda and false information, which may greatly assist a country's war effort by influencing the civilians and soldiers of both its homeland and its opponents.

II. Literature Review

In the years since the conclusion of the Cold War, information technologies such as the internet, computers, and cell phones have rapidly advanced in ability, speed, and prevalence in daily life. It is now almost inevitable that any given person in a developed nation will be in possession of a device capable of connection to the internet. This era of hyper-connectivity creates new possibilities for the use of information in warfare. The concepts of hybrid warfare, information warfare, social media, and their roles in the ongoing Russian invasion of Ukraine will be discussed in this literature review.

According to Danyk and Briggs (2023), "Tools of information perception and manipulation can be used to achieve various political, economic, military, and other goals, which in some interpretations is a form of preventative defense" (p. 35). The integration of such "tools of information perception and manipulation" into warfare is part of a phenomenon referred to as "hybrid warfare." Hybrid warfare has existed as a phenomenon since the 1990s and will remain relevant in conflicts for the foreseeable future (Danyk & Briggs, 2023). Hybrid warfare is utilized by countries the world over, and it is useful because "The ambiguous nature of [hybrid warfare] and its use of ambiguous modes such as insurgency and terrorism allow a belligerent to minimalize targeting opportunities by denying the opponent the ability to utilize its advantage in firepower" (Brown, 2018, p. 62). Put simply, hybrid warfare seeks to circumvent material or numerical disadvantage through alternative means. These methods can include information warfare, cyber warfare, guerrilla-like infrastructure attacks, or any number of other unconventional attacks on an adversary. Russia has been noted for its use of the concept in its military interventions in Georgia and Ukraine (Brown, 2018), part of Russia's larger attempt at utilizing hybrid warfare "in its pursuit of dominance in the Eastern neighborhood" (Ratsiborynska, 2016, p. 18). This research focuses on the use of information and misinformation within hybrid warfare, particularly in the form of posts from Russian and Ukrainian sources on social media.

In the 21st century, social media is an increasingly important tool in agenda setting, the act of determining which issues will be important to the public (Feezell, 2018). Agenda setting on social media is particularly potent against those who are typically "uninformed" regarding political issues (Feezell, 2018). Prier (2017) identifies social media as a new and powerful tool in the field of information warfare, noting that the platforms "could be used as a weapon against the minds of the population… he who controls the trend will control the narrative… and, ultimately, the narrative controls the will of the people" (p. 81). Russia has long been a pioneer of this method, and perhaps its most visible efforts from a Western perspective can be seen in the 2016 U.S. presidential election, in which Russian information weapons systems directly targeted American users of social media websites to achieve the goal of spreading disruption, false information, and distrust (Hanlon, 2018).

Biały (2017), borrowing from Ben Nimmo, has identified four main ways in which Russia has used social media to spread misinformation and sow distrust: dismissing online commentators by attacking their credibility or factual statements, distorting facts with misleading depictions and contexts, distracting the audience by arguing alternative lines of reasoning, and dismaying the audience with frightening content. This final tactic of dismay has been used heavily in the ongoing conflict surrounding the invasion, with thousands of horribly gory videos of drone bombings, civilian deaths, and other realities of war proliferating across social media feeds. Ross and Rutland (2022) have identified everything ranging from "Words, tweets, TikToks, Instagram posts, drone recordings, and any other microtarget-enabling media deemed 'view-worthy'" (p. 222) as the primary "weapons" of information warfare. Ross and Rutland further urge the U.S. Army to develop "appropriate doctrinal changes related to information operations, public affairs, and cyber space operations" (Ross & Rutland, 2022, p. 222).

Russia has been extraordinarily proactive in its use of information warfare in the ongoing Russo-Ukrainian War, only increasing its efforts after the 2022 invasion of Ukraine. Mullaney (2022) has performed valuable work in quantifying and describing Russia's perceived goals in its social media war campaigns, noting that the current goal is to pressure the local population, distract civilians from facts, and actively deceive them with false information and skewed perspectives. Mullaney further explains how Russia targets users, remarking that "the Russian domestic audience remained the most important target throughout the war… [Russia] targeted vulnerabilities within its society and sought to legitimize the war" (Mullaney, 2022, p. 199). This shows that information warfare, especially when viewed through the lens of social media, can target multiple audiences. Propaganda is no longer a purely domestic effort, as the same posts can be viewed by both Russian patriots and Ukrainians who may doubt their government's legitimacy. Indeed, a popular topic addressed by Putin's regime seems to be the supposedly Nazi-infested parliament of Ukraine. This rhetoric is likely used because it targets both Russians and Ukrainians as an audience and appeals to a commonly recognized evil entity.

One interesting aspect of the Russian use of hybrid warfare in the post-invasion conflict is that Russia has not used cyberattacks to take down Ukraine's internet. Russia has occasionally targeted civilian power infrastructure and governmental infrastructure, but in a much more restrained and noninvasive way than theorists of cyber warfare had previously predicted (Lin, 2022). This lack of direct attack could be because Russia lacks the capability to engage in cyber warfare on the scale previously assumed, but it is far more likely that Russia wants Ukrainian civilians and soldiers to retain access to power and internet so that those individuals can remain exposed to Russian information warfare systems. If Kyiv were to "go black," the Russian efforts at dominating social media would be worthless. It is possible that Russia will use cyberattacks to target substations and internet and cell towers in a retreat scenario, as was seen in the Chernobyl area. However, this remains to be seen, and for the time being, Russia appears perfectly content to keep the Ukrainian internet active.

The current field of research is exhaustive in its examination of Russia's use of hybrid warfare, including the aspect of information warfare on social media. However, existing work has not fully examined Russia's invasion of Ukraine, the latest iteration of the Russo-Ukrainian War. Furthermore, current research neglects to consider Ukraine's efforts at social media information warfare and related hybrid warfare efforts. This may be because Ukraine has largely mirrored Russia in its efforts, but this research seeks to investigate Ukrainian voices alongside Russian ones. Finally, the current field of research has given little consideration to the rhetorical devices employed by those waging information warfare on social media platforms. This research seeks to fill the gap in existing scholarship by providing a current and relevant investigation of the unique rhetorical devices employed in social media content by Russian, Ukrainian, and third-party forces.

Research Questions

The following questions will be the focus of this research project:

RQ1: What specific methods and rhetorical tools are used by Russian, Ukrainian, and PMC forces to frame the war in video social media content?

RQ2: What similarities and differences exist among Russian, Ukrainian, and third-party forces' methods and video content?

RQ3: Why do these forces make these rhetorical decisions? What benefits might their unique utilizations of framing offer in advancing their narrative?

This research is timely because it seeks to analyze an ongoing conflict for the rhetorical goals present in social media videos. Previous research has examined Russia's engagement with hybrid and information warfare, but it has not fully examined the latest iteration of the Russo-Ukrainian War beginning with the 2022 invasion of Ukraine. Furthermore, existing research does not tend to focus on the specific visual and rhetorical elements utilized in propaganda posts. Finally, the current scope of research studying information in the Russo-Ukrainian War focuses greatly on Russia but fails to adequately explore Ukrainian and third-party attempts at propaganda and information warfare. For these reasons, this research addresses important and neglected topics in the fields of information warfare and propaganda.

III. Methods

This research uses qualitative content analysis to identify the specific visual, aural, and narrative elements comprising Russian and Ukrainian attempts at propaganda. Rosenberry and Vicker (2021) define qualitative content analysis as a form of research where "the researcher 'interacts' with the documentary materials to analyze the documents and better understand their meanings in context" (p. 228). Qualitative content analysis is useful for this research project because it allows the direct analysis of Russian and Ukrainian social media posts in the context of the larger conflict, which is part of a war stretching back almost a decade. Qualitative content analysis also allows the researcher to examine these posts within the context of the surrounding battles and "flow" of the conflict in the days and weeks leading up to each post, which influence the ways in which these entities choose to portray the conflict. Rosenberry and Vicker (2021) also state that the specific content selected for investigation must be "selected for conceptually or theoretically relevant reasons" (p. 228).

This project has selected videos posted in the aftermath of three events: one Ukrainian victory, one Russian victory, and the one-year anniversary of the invasion in February 2023. These have been selected to ensure a wide range of content to understand the ways in which each side frames (or ignores) victory and defeat. Deliberate selection within specific events is being used instead of simply selecting random posts to ensure that rhetorical elements are present in the videos. By remaining selective in determining the sample, this research attempts to avoid any indeterminate elements, such as leaked videos or heavily reposted and re-edited content. In this project, the specific qualities that are examined include the video, audio, editing, tone, and overarching narrative of each post. All these elements combine to create the unique profile of each post, which in turn aims to form or modify the audience's perception of the war.

The specific social media websites used in this research include, but are not limited to, Twitter, Reddit, Telegram, 4chan, and WhatsApp. Telegram and 4chan have been selected because they are the most accessible "Western"-based websites that still host pro-Russia propaganda videos. Telegram especially appears to host many Russia- and Wagner-affiliated channels. Twitter, Reddit, and WhatsApp are useful sources because they serve as content aggregators, with several pages, such as Reddit's r/CombatFootage, dedicated to reposting war footage and propaganda for the purpose of increasing public knowledge or spreading a narrative. However, many videos and images found on Reddit, Twitter, and WhatsApp seem to originate from Telegram, so videos selected from these sources will be provided in their original Telegram context. Many mainstream social media platforms, such as Facebook or TikTok, have strict policies forbidding violent content, making analysis on those platforms difficult. To ensure a degree of randomness in sampling, two posts from each perspective will be randomly selected from a larger pool of deliberately selected content in each of the aforementioned contexts (a Russian victory, a Ukrainian victory, and the invasion anniversary).
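The sampling step described above (drawing two posts per source for each of the three events from a hand-curated pool) can be sketched as follows. This is an illustrative sketch only: the pool entries, function name, and seed are assumptions for demonstration, not the study's actual corpus or tooling.

```python
import random

# Hypothetical candidate pool keyed by (source, event). The post IDs are
# placeholders standing in for deliberately selected Telegram posts.
pool = {
    ("Ukrainian", "Bakhmut"):     ["ua_bkh_1", "ua_bkh_2", "ua_bkh_3"],
    ("Russian",   "Bakhmut"):     ["ru_bkh_1", "ru_bkh_2", "ru_bkh_3"],
    ("Wagner",    "Bakhmut"):     ["wg_bkh_1", "wg_bkh_2", "wg_bkh_3"],
    ("Ukrainian", "Kerch"):       ["ua_krc_1", "ua_krc_2", "ua_krc_3"],
    ("Russian",   "Kerch"):       ["ru_krc_1", "ru_krc_2", "ru_krc_3"],
    ("Wagner",    "Kerch"):       ["wg_krc_1", "wg_krc_2", "wg_krc_3"],
    ("Ukrainian", "Anniversary"): ["ua_ann_1", "ua_ann_2", "ua_ann_3"],
    ("Russian",   "Anniversary"): ["ru_ann_1", "ru_ann_2", "ru_ann_3"],
    ("Wagner",    "Anniversary"): ["wg_ann_1", "wg_ann_2", "wg_ann_3"],
}

def draw_sample(pool, posts_per_cell=2, seed=2023):
    """Randomly draw a fixed number of posts from each curated (source, event) pool."""
    rng = random.Random(seed)  # fixed seed makes the draw reproducible
    return {cell: sorted(rng.sample(candidates, posts_per_cell))
            for cell, candidates in pool.items()}

sample = draw_sample(pool)
# 9 cells x 2 posts each = 18 posts, matching the study's stated total
total = sum(len(posts) for posts in sample.values())
```

Fixing the random seed is one way to make a deliberate-then-random sampling procedure auditable: anyone re-running the draw over the same curated pool would recover the same 18 posts.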

IV. Findings

This study found consistent and identifiable trends in how online content was posted during the Russo-Ukrainian War. Table 1 below briefly outlines the typical social media responses identified after a Russian victory, a Ukrainian victory, and the one-year anniversary of the invasion.

Table 1. Typical social media responses by source type and event

Russian victory (Battle of Bakhmut, around May 20, 2023)
Ukrainian sources: Ignored reports of Russian victory; sought to legitimize soldiers and capability.
Russian sources: Celebrated the victory in Bakhmut; praised Wagner and Yevgeny Prigozhin.
Wagner sources: Celebrated the victory and praised Prigozhin; utilized memes and humor.

Ukrainian victory (Kerch Bridge explosion, October 8, 2022)
Ukrainian sources: Celebrated the explosion; posted memes and boasted heavily.
Russian sources: Deflected to other events in the war; seldom mentioned the bridge.
Wagner sources: Posted recruitment advertisements and made loose references to the explosion.

One-year anniversary of the invasion (February 24, 2023)
Ukrainian sources: Posted images of dead Russian soldiers, reposted a Zelenskyy speech, and posted drone bombing footage.
Russian sources: Solemnly commemorated the war and defended its legitimacy.
Wagner sources: Posted images of dead Ukrainian soldiers and combat footage.

Rhetorical Goals and Devices

In response to the first research question, regarding the specific methods and rhetorical devices utilized on social media by Russian, Ukrainian, and Wagner forces, this study found that the most common themes across all sides were legitimization, deflection, humor, and violent content.

Legitimization

Figure 1. Post from Ukrainian Telegram channel "3-тя окрема штурмова бригада" (3rd Separate Assault Brigade). Text translates as: "Work at the headquarters on the front line near Bakhmut. Planning offensive actions of the 1st Assault Battalion."

Images depict Ukrainian soldiers in a bunker surrounded by maps, papers, and computer screens.

Figure 2. Post from Wagner Telegram channel "The Grey Zone." Text translates as: "WAGNER GROUP | BLOOD. HONOR. HOMELAND. COURAGE. Raising the Russian flag and the battle banner of the Wagner Group in Bakhmut after its liberation. This victory is like gratitude for the support of the Russian people and a tribute to the memory of those who gave their lives in battle along this path. Glory to Russia!"

Video depicts Yevgeny Prigozhin announcing the capture of Bakhmut.

In this usage, legitimization refers to a broad category of posts designed to bolster support for a particular force by showcasing its military might, emphasizing a nation's values, or utilizing examples of civilian support. One specific example comes from Ukrainian sources following the fall of Bakhmut. Bakhmut is a strategically important city located north of Donetsk in Ukraine's Donetsk Oblast. On May 20, 2023, Russian and Wagner forces captured the city after more than 200 days of street-to-street and building-to-building combat (Kullab & Litvinova, 2023). After the city fell, the Third Separate Assault Brigade of Ukraine posted a series of high-quality photos displaying soldiers hunched over maps and documents along with the caption "Work at the headquarters on the front line near Bakhmut. Planning offensive actions of the 1st Assault Battalion" (Figure 1). This post seeks to legitimize the war and the battle by showing the front lines hard at work.

Russian forces also sought to use tactics of legitimization in the context of defeat. On October 8, 2022, Ukrainian forces utilized a remote weapons delivery system to partially destroy the Kerch Bridge, a Russian-built structure connecting Crimea to mainland Russia. The bridge is an important symbol for Russian nationalists because it represents a tangible connection between Russia and Crimea, which Russia annexed in 2014. The destruction of the Kerch Bridge was hailed by Ukrainian forces as both a symbolic victory and a significant infrastructural blow. In the aftermath of the explosion, the pro-Russian Telegram channel Razved_Dozor posted a blurry video allegedly showing a Ukrainian civilian tearing down a Ukrainian flag in the city of Marhanets with the caption "In the city of Marganets (Dnepropetrovsk region), a citizen of Ukraine decided that the yellow-blue flag had an adverse effect on the residents of the city." Instead of directly addressing the Kerch Bridge explosion, the Russian source immediately sought to legitimize Russian goals and delegitimize Ukrainian morale.

In the aftermath of the Battle of Bakhmut, Russian and Wagner sources relied heavily on legitimization to justify the long and casualty-heavy campaign. While the long-fought siege resulted in a Russian victory, it came at a significant cost in funds, equipment, and manpower. One video that circulated through nearly every pro-Russian and pro-Wagner source, dated May 20, 2023, shows Wagner leader Yevgeny Prigozhin announcing the capture of the city (Figure 2). In the video, Prigozhin salutes his fallen comrades and touts Russian propaganda while emphasizing the supremacy of Wagner's capabilities. The Russian media mill generally lauded Wagner after the victory at Bakhmut, with the pro-Russian WarGonzo posting a video on May 20 of Wagner soldiers posing with the Wagner flag flying over Bakhmut. This fevered embrace of Wagner by Russian media sources after Bakhmut may ultimately have been the final motivation for Prigozhin to launch his ill-fated "rebellion" against the Russian government just a month later.

The use of informational and accurate posts as a form of legitimization typically occurred only when the facts of a scenario served to benefit the force posting the content. One way that this phenomenon is manifested is in the form of posts from official government channels. After the Kerch Bridge explosion, the Ukrainian Deputy of the VRU Committee on National Security, Defense, and Intelligence, Yuriy Mysyagin, posted "another beautiful video" of the heavily damaged Kerch Bridge on his personal Telegram account. After the one-year anniversary of the invasion, a Telegram channel affiliated with the Ukrainian ZSU posted a video of Ukrainian president Volodymyr Zelenskyy speaking to a G7 conference. These posts use Ukrainian public officials to legitimize the war effort, especially in the context of a perceived victory, such as at the Kerch Bridge.

A similarly official appeal to legitimization can be found in a Wagner recruitment post uploaded immediately following the Kerch Bridge explosion. Many Russian nationalists expressed displeasure that Russian security would allow such an attack, and Wagner sought to distance itself from the Russian army by emphasizing the force's good pay and "the best special effects at concerts" for anyone who would join, even those who "have a criminal record or are on the blacklist." This post is intriguing because it uses a Russian defeat to legitimize Wagner as a superior fighting force, rather than an allied force.

The final examples of legitimization on social media come in the form of long, text-based appeals to nationalist sentiment. This was prevalent in Russian channels following the one-year anniversary of the "special operation," as the war raged on long past its intended end date. The Russian Razved_Dozor Telegram channel posted a letter emphasizing that "everything that happened in a year had to happen," and that "We are our own people, Russia is our own civilization." This post relies on emotional appeal and patriotic sentiment to drive legitimization, in a format that would be familiar to propagandists of old.

Deflection

Figure 3. Post from Russian Telegram channel WarGonzo. Text translates as: "Today everyone is discussing the blowing up of the Crimean Bridge. Yes, this is a new twist in this war. Unfortunately, daily shelling of long-suffering Donetsk, which takes several lives at a time, has already become routine."

Figure 4. Post from Ukrainian Telegram channel "Головне управління розвідки МО України" (Main Directorate of Intelligence of the Ministry of Defense of Ukraine). Text translates as: "The enemy's transfer of additional reserves to the Bakhmut direction indicates the failure of their offensive actions - the Bakhmut fortress is holding."

All three sources examined in this study engaged in the practice of deflection or misdirection. In this context, deflection refers to a source creating content that ignores negative developments and emphasizes some other event or talking point to downplay the significance of a perceived loss. In response to the Kerch Bridge explosion, Russian sources sought to deflect attention toward other issues. In a post from October 8 (Figure 3), the pro-Russian war correspondent channel known as WarGonzo posted a video showing a smoldering building with the caption "Today everyone is discussing the blowing up of the Crimean Bridge. Yes, this is a new twist in this war. Unfortunately, daily shelling of long-suffering Donetsk, which takes several lives at a time, has already become routine." This post glosses over the attack on the bridge and emphasizes alleged war crimes occurring elsewhere in Russian-occupied Ukraine, thereby deflecting attention away from the explosion and minimizing its significance for the pro-Russian audience. Sources affiliated with the Wagner Group also engaged in deflection in the wake of the Kerch Bridge explosion. On the day of the explosion, the Wagner-affiliated Telegram channel The Grey Zone posted a long paragraph describing organizational changes occurring in the upper echelons of the Russian military structure. The post briefly mentions the explosion but does not discuss the event in any detail.

Ukrainian sources also engaged in deflection. In a post from May 21, 2023, a Telegram channel belonging to the Main Directorate of Intelligence of the Ministry of Defense of Ukraine posted an image of burning buildings with the caption "The enemy's transfer of additional reserves to the Bakhmut direction indicates the failure of their offensive actions - the Bakhmut fortress is holding" (Figure 4). This post came after the fall of Bakhmut had been reported by the Associated Press and by Russia itself. Despite this discrepancy, the Ukrainian source remained committed to emphasizing its allegedly strong hold on the city and the region, taking deflection to an extreme.

Humor

Figure 5. Post from Wagner Telegram channel "The Grey Zone." Caption translates as "Mr. Wagner Himself." Image depicts a satanic figure standing in front of a pile of skulls next to a sign reading "Bakhmut."

Figure 6. Post from Ukrainian Telegram channel "Оперативний ЗСУ" (Operational ZSU). No caption. Video depicts the "Bongo Man" meme with the burning Kerch Bridge in the background.

Biały (2017) writes that "Visual content has two major functions – to impress or to dismay. It rarely has a purely informative character. It is also interesting to note the significant role played in psychological warfare by humoristic drawings and pictures" (p. 85). Memes and humorous images or videos have long been a major part of internet culture, so it should not be a surprise that civilians have taken to propagating memes to cope with the toll of the war. However, Russian and Ukrainian forces have also taken to posting such content. Due to the popularity of memes, integrating humorous images with traditional forms of propaganda has been a relatively fluid process for social media propagandists. Interestingly, official Russian channels do not seem as interested in posting humorous content as their Ukrainian and Wagner counterparts, though this observation requires further investigation.

The first category of humorous content found online comprises images akin to traditional political cartoons. One example can be seen in a Wagner post following the fall of Bakhmut, where the victorious "Mr. Wagner himself" is portrayed as a horned devil standing over a pile of Ukrainian skulls (Figure 5). These posts are less about humor and more about using cartoons and hyperbolic imagery to transmit a message, in this case with the devil representing the lethal capability of Wagner mercenaries. The second category of humorous content, as alluded to earlier, takes the form of internet memes. Often, memes relating to the war attempt to capitalize on the meme trends of the day. One example can be seen in the Ukrainian response to the Kerch Bridge explosion, when the official ZSU Telegram page posted a rendition of the briefly popular "Bongo Man" meme with the smoldering bridge in the background of the video (Figure 6).

Violent Content

Figure 7. Post from Ukrainian Telegram channel "СОКИРА UA" (Axe UA) with caption translating as "Wagnerites near Bakhmut." Image shows rows of dead men.

Figure 8. Post from Wagner Telegram channel "The Grey Zone." Image shows armed Wagner soldiers standing over the bodies of Ukrainian soldiers near Berkhovka.

One recent phenomenon is the act of posting violent content online. This trend was pioneered by fighters of the Islamic State, who gained notoriety for posting videos of beheadings and bombings on their media feeds, striking fear into the hearts of civilians and opposing soldiers the world over. Official channels tend to shy away from posting content directly showing death, but in a war with so many third-party content aggregators and semi-detached units, large amounts of violent content have emerged online and become some of the most popular videos of the war, likely due to their shock value and prevalence as propaganda tools. Every side engages in posting violent content, but this study found that Wagner and individual Ukrainian unit channels were much more likely to do so.

Around the one-year anniversary of the invasion, a Telegram channel branding itself as the "Ax of the UA (Ukrainian Army)" posted an image of dozens of partially dressed dead soldiers with the caption "Wagnerites near Bakhmut" (Figure 7). At the same time, the Wagner channel The Grey Zone posted two images of dead Ukrainian soldiers near Berkhovka and a video of a skirmish near Vugledar, both located along the front lines in the occupied Donetsk Oblast, where most of the fighting has occurred for the duration of the war (Figure 8). The motivation for posting these videos appears very similar to the motivation for posting legitimizing content: a desire to emphasize one force's military might while portraying the opposing side as weak or ineffectual.

Figures 9 and 10. Post from Ukrainian Telegram channel "Aerobomber," allegedly tied to the Strike Aerial Reconnaissance 30th OMBR unit. The image on the left shows a frame from a drone's video camera as a grenade falls on a group of Russian soldiers. The image on the right is the unit's self-designed insignia, depicting an owl clutching a grenade in its talons.

During the war, modified civilian drones have been used, primarily by Ukrainian forces, to deliver explosive payloads, usually in the form of a grenade or small bomb. Because many of these drones carry cameras, recordings of these deadly attacks are some of the most common and intimidating videos on the internet. The message from Ukrainian posters is clear: death can come from above at any time, and Russian soldiers probably won't hear it until it is too late. The format of these videos is almost universal: a heavily watermarked video with hardbass-style music blaring in the background shows a drone's point of view as it hovers above a group of soldiers before ultimately dropping its payload, leading to an explosion and the often slow, painful deaths of the enemy. Entire units of the Ukrainian Army, such as the Strike Aerial Reconnaissance 30th OMBR, are dedicated to drone warfare and have become prolific in their posting habits (Figures 9 and 10). The logo of the 30th OMBR is an owl holding a grenade in its talons, with the pin already pulled.

Similarities and Differences Between Sides

The second research question that this study sought to analyze was the similarities and differences among Russian, Ukrainian, and Wagner posting patterns. One commonality between official Russian and Ukrainian channels was their reliance on content aimed at legitimizing the war effort. These videos and images are perhaps the most palatable and "appropriate" form of social media propaganda. The content is largely aimed at soliciting support, and therefore maintains an authoritative tone. All three sides similarly engaged in deflective tactics to maintain a morally superior tone while ignoring or minimizing losses and negative facts. Ukrainian- and Wagner-affiliated sources were more likely to engage in posting violent content and drone bombing videos than their Russian counterparts, but individual pro-Russian content aggregators were willing to do so. Finally, meme content was spread throughout all three sources, but Ukrainian- and Wagner-affiliated sources were again more willing to participate in this trend. Ultimately, it appears that Ukrainian- and Wagner-affiliated channels were more likely to post edgier and more violent content, while Russian sources attempted to maintain a balanced and authoritative tone.

Potential Benefits

Regarding the third research question, the perceived benefits of these different types of content vary. The case for legitimization appears straightforward: by emphasizing one's strengths and the enemy's weaknesses, propagandists seek to win hearts and minds for the cause. The invasion of Ukraine has resulted in mass conscription in both Ukraine and Russia, and as a result the popularity of the war has been in flux on both sides. According to a Gallup poll from October 2023, support for the war among Ukrainians decreased from 70% to 60% over the preceding year (Vigers, 2023). The popularity of the war among Russian civilians fell by a comparable nine points over almost the same period (Pribylov, 2023). In the examples reviewed by this study, legitimization ranged from images displaying military might to long letters conveying patriotic passion. One video allegedly depicting a Ukrainian civilian tearing down a Ukrainian flag displays the heart of the issue: Russian and Ukrainian propagandists believe that legitimization bolsters civilian morale and gives credence to their actions.

Deflection serves the opposite purpose of legitimization. By deflecting attention away from losses and toward other issues, militaries seek to maintain an image of superiority and strength through silence. This can take the form of either briefly acknowledging or flatly ignoring a negative fact, which is consistent with propagandist techniques of the past. Ukraine and Russia are far from the only nations to engage in this activity in war or peace, but it has become an entrenched tactic in the current conflict.

Humorous videos and images, along with internet memes, may serve to raise the spirits of the civilians and soldiers consuming content. Morale is a vital resource in warfare, and its depletion is tantamount to defeat in many scenarios. The use of memes as online propaganda can therefore be interpreted as an attempt to win the war on the mental front. Furthermore, memes spread quickly across the internet due to their replicability and popularity, so integrating propaganda into memes is an effective way of ensuring that a message reaches a desired audience. This tactic has been used heavily by political campaigns in the United States to raise awareness, and memes likely play a similar role in the Russo-Ukrainian War.

Violent content and drone bombing videos are a relatively new phenomenon in hybrid warfare. The initial impulse is to label these videos as an attempt to shock and demoralize the enemy through violent and jarring footage of their countrymen's deaths. However, the presence of fast techno music and a flashy editing style suggests that these videos may serve the primary purpose of energizing and exciting Ukrainian civilians and soldiers. More research is necessary to determine the exact motivation for this style of content, but its prevalence is certainly significant and noteworthy.

V. Discussion

Perhaps the most interesting aspect of the prevalence of social media propaganda in the Russo-Ukrainian War is how it reaches and interacts with Western audiences. Online communities, such as Reddit's r/NonCredibleDefense or r/CombatFootage, are dedicated to posting and compiling large quantities of footage and memes from all sides of the war, often with ambiguous intent. Accounts such as Instagram's atlas.news3 or ourwarstoday2 appear genuinely journalistic in their efforts and post content on a wide range of topics, including the Russo-Ukrainian War. However, individual accounts posting to r/NonCredibleDefense may share content simply because it is shocking and will generate large amounts of discussion and "upvotes." The most viewed and upvoted content on combat-focused subreddits is consistently footage of close-quarters combat, impressive aerial duels, or simply the biggest explosions. Western, English-speaking audiences on these platforms appear to be overwhelmingly pro-Ukraine, but not universally so. Many commenters appear to be there simply for the spectacle, and in an age when irony and apathy are dominant, it is difficult to obtain a genuine perspective on these consumers from their online comments and interactions.

The ultimate question of the hybrid warfare effort on both sides is whether it succeeds in turning the hearts and minds of domestic and foreign civilians and soldiers. This study has documented the rhetorical goals and devices employed by Russian, Ukrainian, and Wagner forces on social media, but the actual impact of these posts is unclear. Early in the war, reports surfaced of Russian soldiers surrendering en masse to Ukrainian soldiers and civilians alike, allegedly influenced in part by seeing their countrymen die on social media. Today, however, these reports are absent from major media cycles in the West. With no end to the war in sight, the continued efforts of social media propagandists increasingly resemble the trench warfare in which the conventional soldiers of Russia and Ukraine are mired. Finally, this research was limited by several factors that warrant evaluation in future work: the format of this research inherently limited its sample size, the scope of the research questions was ambitious, and the findings relied heavily on Google Translate rather than a more robust translation approach.

VI. Conclusion

The ongoing phase of the Russo-Ukrainian War has led to an escalation in the amount and intensity of social media posting from channels associated with Russian, Ukrainian, and Wagner forces. This study examined the rhetorical goals and devices employed by these channels and found that the four primary goals of social media content were legitimization, deflection, humor, and the conveyance of violence. This research further found that posting patterns were relatively consistent across all sides of the social media battle, but that Russian sources often declined to post as much violent content as their Ukrainian or Wagner-associated counterparts. Finally, this research offered qualitative interpretations of the perceived benefits of each type of rhetorical goal, all of which ultimately attempt to either bolster or dishearten the morale of civilians and soldiers.

This research has only scratched the surface of rhetorical goals and devices in hybrid warfare. The sheer amount of content not examined by this study demonstrates the need for further research focused on better categorizing social media posts. Additional research must also be conducted to ascertain the internal motivations behind such posting; however, without internal documents or whistleblowers, this data will be difficult to obtain. Similarly, more work must be done to determine how social media propagandists selectively target pro-Russian or pro-Ukrainian consumers.

Acknowledgements

I want to thank Dr. Daniel Haygood, whose mentorship and guidance were instrumental to the completion of this project. I also want to thank Dr. Michael Carignan for nurturing my interest in research and higher learning. Finally, I thank my friends and family for their unwavering support throughout the process of writing and presenting this research.


References

3rd Separate Assault Brigade [@ab3army]. (2023, May 25). [Work at a headquarters on the front line near Bakhmut: Planning the offensive actions of the 1st Assault Battalion] [Images]. Telegram.

Biały, B. (2017). Social media—From social exchange to battlefield. The Cyber Defense Review, 2(2), 69–90.

Brown, J. (2018). An alternative war: The development, impact, and legality of hybrid warfare conducted by the nation state. Journal of Global Faultlines, 5(1–2), 58–82.

SOKYRA UA. (2023, February 22). [Wagnerites near Bakhmut] [Image]. Telegram.

Danyk, Y., & Briggs, C. M. (2023). Modern cognitive operations and hybrid warfare. Journal of Strategic Security, 16(1), 35–50.

Feezell, J. T. (2018). Agenda setting through social media: The importance of incidental news exposure and social filtering in the digital era. Political Research Quarterly, 71(2), 482–494.

Main Intelligence Directorate of the Ministry of Defense of Ukraine [@DIUkraine]. (2023, May 21). [The enemy's transfer of additional reserves to the Bakhmut axis attests to the failure of their offensive actions; fortress Bakhmut holds on. About this, in a comment] [Image]. Telegram.

Grey Zone [@grey_zone]. (2022, October 8). [The "Wagner Group" is again holding a review of newly minted musicians for tours abroad! A worthy financial reward (as much as promised, unlike] [Image]. Telegram.

Grey Zone [@grey_zone]. (2022, October 8). [In connection with Ukraine's attack on the "Crimean Bridge," the reshuffles in the Ministry of Defense that were planned and already underway for several weeks have shifted] [Text]. Telegram.

Grey Zone [@grey_zone]. (2023, February 24). [Confirming footage from Berkhovka, which came under the control of PMC "Wagner" this morning. As a result of heavy fighting, the enemy could not withstand the onslaught] [Image]. Telegram.

Grey Zone [@grey_zone]. (2023, February 24). [Footage from near Vuhledar, where fighters of a naval infantry brigade of Russia's Pacific Fleet stormed an enemy fortified area. In one episode of the battle] [Video]. Telegram.

Grey Zone [@grey_zone]. (2023, May 20). [Mister Wagner in person] [Image]. Telegram.

Grey Zone [@grey_zone]. (2023, May 20). [WAGNER GROUP | BLOOD. HONOR. MOTHERLAND. COURAGE. The raising of the Russian flag and the battle standard of the "Wagner Group" in Bakhmut after its liberation. This] [Video]. Telegram.

Hanlon, B. (2018). It's not just Facebook: Countering Russia's social media offensive. German Marshall Fund of the United States.

Kullab, S., & Litvinova, D. (2023, May 23). Russia TV celebrates as it reports the capture of Bakhmut, comparing it to Berlin in 1945. AP News.

Lin, H. (2022). Russian cyber operations in the invasion of Ukraine. The Cyber Defense Review, 7(4), 31–46.

Mullaney, S. (2022). Everything flows: Russian information warfare forms and tactics in Ukraine and the US between 2014 and 2020. The Cyber Defense Review, 7(4), 193–212.

MYSIAGIN [@mysiagin]. (2022, October 8). [The propagandists have shown yet another wonderful video of what the Crimean Bridge looks like. Unfortunately, one lane survived, but only passenger cars are being let through] [Videos]. Telegram.

Operatyvnyi ZSU [@operativno]. (2022, October 8). [Video]. Telegram.

Operatyvnyi ZSU [@operativnoZSU]. (2023, February 24). [Volodymyr Zelensky: I took part in the meeting of the leaders of the "Group of Seven." Our meeting had two parts. In the first, public part, I thanked our partners] [Video]. Telegram.

Pribylov, S. (2023, September 6). Russia's shifting public opinion on the war in Ukraine. Voice of America.

Prier, J. (2017). Commanding the trend: Social media as information warfare. Strategic Studies Quarterly, 11(4), 50–85.

Ratsiborynska, V. (2016). When hybrid warfare supports ideology: Russia Today. NATO Defense College.

Razved Dozor [@Razved_Dozor]. (2022, October 8). [In the town of Marhanets (Dnipropetrovsk Oblast), a citizen of Ukraine decided that the yellow-and-blue flag was having an unfavorable effect on the town's residents] [Video]. Telegram.

Razved Dozor [@Razved_Dozor]. (2023, February 24). [Friends, many of you, like me, probably watched the president's address. It got me thinking. Everything that has happened over the past year had to happen. Russia must] [Image]. Telegram.

Razved Dozor [@Razved_Dozor]. (2023, May 20). [Bakhmut (Artyomovsk) is taken! Today, 20.05.2023, by midday Bakhmut was fully taken. Prigozhin's statement on the capture of the settlement of Bakhmut by the forces of PMC "Wagner"] [Video]. Telegram.

Rosenberry, J., & Vicker, L. A. (2021). Applied mass communication theory (3rd ed.). Taylor & Francis.

Ross, R. J., & Rutland, J. (2022). A military of influencers: The U.S. Army, social media, and winning narrative conflicts. The Cyber Defense Review, 7(4), 213–226.

Strike Aerial Reconnaissance 30th OMBR [@aerobomber]. (2023, February 24). [Roasted katsap macaques are returning to their primordial state: to the trees, onto four paws, and into the compost. Meanwhile, it continues] [Video]. Telegram.

Vigers, B. (2023, October 25). Ukrainians stand behind war effort despite some fatigue. Gallup.

WarGonzo [@WarGonzo]. (2022, October 12). [ZSU strikes on Donetsk have become a routine of the war. Today everyone is discussing the bombing of the Crimean Bridge. Yes, this is a new turn in this war] [Video]. Telegram.

WarGonzo [@WarGonzo]. (2023, February 24). [The "Tribunal" project turns one year old! We find and publish information about the crimes and torture committed by the Nazis of "Azov" and the ZSU against civilians] [Video]. Telegram.

WarGonzo [@WarGonzo]. (2023, May 20). [The flag over Artyomovsk/Bakhmut] [Video]. Telegram.