by: Carolyn Holmes, Mississippi State University
In the midst of the February 2021 winter storm that gripped much of the continental United States, a picture circulated widely on social media. While rolling blackouts and power grid failures plagued Texas, tens of thousands of people, including major players in Texas’ oil and gas industry, members of Congress, and other high-profile figures, shared and retweeted an image of a helicopter de-icing a wind turbine. The commentary accompanying these posts blamed Texas’ power outages on the failure of green energy infrastructure, like wind turbines. While later analysis has shown that the grid failures and outages were largely due to problems with natural gas-based generation capacity, state leaders, including the governor, and major figures in cable news commentary blamed the outages on frozen turbines. The central problem with this argument, however, is that the image that generated so much outrage was from Sweden in 2015, not Texas in 2021.
This kind of information, detached from context and misattributed, spreads easily and quickly online. Images from one context are applied in another, either deliberately, as disinformation, or inadvertently, as misinformation. Both disinformation and misinformation have been blamed for myriad social and democratic ills, including fueling xenophobia and violence, as well as degrading public knowledge and preventing coalition formation. These effects are well documented and of genuine concern for society. Yet they also provide a potential jumping-off point for a uniquely digital ethnographic immersion.
In the world of in-person fieldwork, wherein data are produced and documented through interaction between researchers and interlocutors, whether in structured interactions, participant observation, or fieldnotes, context is vital in producing information. In the digital realm, information can be dissolved from the context in which it was produced. A years-old picture from Sweden can become, for many people, an image of Texas. As with the chemical process of dissolving solids, like salt, into solvents, like water, the change that occurs is physical, not chemical. The salt that was crystalline is now a saline solution. The salt is still there, and it can now be used in different applications. The image from Sweden was not altered, but dissolved from its context and applied in another. It is therefore useful in understanding Texas in 2021, and the national conversations around green energy in general, in addition to being an artifact from a different time and place.
What is it about the image of a frozen turbine that resonated in the midst of one of the most acute power failures in recent memory? Why were people so quick to ascribe blame to the wind turbines, which provide a minority of the state’s total power supply? How does this instance of misinformation link with other critiques of wind turbines, from figures like Donald Trump, or with larger critiques of environmental initiatives or green energy?
In answering these questions, it is less important that the image is from Sweden. What is important is that so many people, including powerful voices in media and politics, thought of this image as being of, and from, Texas. To simply say that people were wrong in attributing the picture, or that this application of the picture is wrong, is to largely miss the point of its importance. Certainly, it was a picture taken from its context and applied in another. It was never actually a picture of Texas. But, in the minds and experience of the people sharing the image, it was (or is) Texas. It reflected a piece of their understood reality in ways detached from the origins of the image, and resonant with the ways they saw (or see) the world. The attribution of the picture is not (just, or only) a lie, it is data.
But this approach is not unique to digital ethnography. Path-breaking work by ethnographers like Lee Ann Fujii has delved into how to discern truth from lies in the field. Adam Ashforth’s fieldwork on witchcraft and democracy in South Africa delves into the interface between beliefs in the supernatural and actions in public. Jessica Allina-Pisano’s work has explored the ways that idiom can be used by interlocutors in ways that misrepresent historical relationships but shed light on present political realities. James Scott’s Domination and the Arts of Resistance similarly explores how people meaningfully conceal behavior, and act differently depending on the networks of power in which they are situated. In El Alto, Rebel City: Self and Citizenship in Andean Bolivia, Sian Lazar addresses the ways in which local and indigenous knowledge mediates relationships between citizens and the state. Indeed, ethnography offers not a “true picture…but an approach to such an understanding…a construction that as faithfully as possible reflects what the author learned” (Wessing). Whether in a positivist or interpretivist mode, ethnography has consistently grappled with the ideas of truth, falsehood, authenticity, and consistency in presentation.
What does seem distinct, however, about digital ethnography and the presentation of such misattributed information is the scope and durability of the information. Unlike in-person ethnographic interactions, digital misinformation often spreads in cascades. Cascades occur when an image, fact, or frame is shared widely, both in support and in opposition, in a short time. They can be observed either while they are happening, by documenting incidences of a frame, fact, or image in a digital space like a message board or a Twitter network, or after the fact, through searches of key terms or reverse image searches. Misinformation seems to travel in cascades in part because it evokes strong emotional reactions, both positive and negative, from receiving audiences, which inspire sharing and interaction with that information on social media. These cascades can be discrete events, as with the image from Sweden, which has been the subject of several unassociated cascades of social media prominence dating as far back as 2016, or a collection of incidents within a broader pattern, as with the false and misleading statements about wind turbines and their alleged links to cancer, as articulated by Donald Trump.
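The after-the-fact approach described above can be sketched in code. The following is a minimal, hypothetical illustration, not a description of any actual platform API: it assumes a researcher has already collected dated records of posts sharing the same image (here, invented dates) and simply flags days on which sharing spiked well above a baseline, one crude way to operationalize a cascade.

```python
from collections import Counter
from datetime import date

# Hypothetical dates of posts sharing the same image.
# In practice these would come from a scraped archive or platform data.
posts = [
    date(2021, 2, 14),
    date(2021, 2, 15), date(2021, 2, 15), date(2021, 2, 15), date(2021, 2, 15),
    date(2021, 2, 16), date(2021, 2, 16), date(2021, 2, 16),
    date(2021, 2, 20),
]

def find_cascade_days(dates, baseline=1, factor=3):
    """Flag days whose share count reaches `factor` times the baseline.

    A crude operationalization of a cascade: a short window in which
    the same image, fact, or frame is shared far more often than usual.
    """
    counts = Counter(dates)
    return sorted(day for day, n in counts.items() if n >= baseline * factor)

cascade_days = find_cascade_days(posts)
print(cascade_days)  # the days on which sharing spiked
```

The threshold here is arbitrary; a real study would fit a baseline to each account or community, since what counts as a "spike" depends on how active the surrounding network normally is.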
Returning to the image from Sweden, a reverse image search shows more than 1,400 incidences of the exact image being used on recently updated websites. So popular was the image that a suggested auxiliary search from Google was “Frozen Windmills in Texas.” Some of these websites debunk the application of this image to Texas, but more participate in its dissemination.
Such misattributed information tends to be quite durable. The influence of confirmation bias, in addition to a fractured media landscape, makes misinformation resilient in the face of correction for many news consumers. Individual cognitive factors, like the coherence of the information presented, its compatibility with existing beliefs, and source credibility, also contribute to the “stickiness” of misinformation. All of these factors are amplified when misinformation spreads on social media, through siloed ideological communities and networks of known interlocutors. The durability of such impressions is why fact-checking as a post-hoc solution to misinformation does not often change minds. Fact checking, retractions, debiasing, and corrections can sometimes be effective in correcting outright falsehoods, but they have difficulty reaching the whole audience of those who saw and propagated misinformation, and they are poorly suited to conveying nuance. The effects of fact checking are also substantially reduced in highly partisan environments, and positionality, in terms of race, gender, age, class, and partisan identification, conditions the effectiveness of fact checking as a corrective to misinformation.
Information dissolved from context, or even wholly fabricated, becomes part of the conversation in ways that accrete over time. The effects of misinformation in instances like the PizzaGate incident, or in the case of the “birther” movement questioning whether President Obama was a natural-born citizen of the United States, have compounded over time, escalating with passing events. The ripples of the original disinformation grow and, in saturated information environments, become self-reinforcing. Such effects can appear among extremist supporters of grand conspiracy theories or among adherents of less totalizing ideological commitments, like the anti-vaccination movement or climate change denial. The consequences of misinformation can be seen in a variety of different contexts, from vaccine hesitancy among older, white evangelicals in the United States during the COVID-19 pandemic to lack of support for green energy interventions in the United States.
Discerning truth from lies, to borrow a phrase from Fujii, is not necessarily about a pursuit of Truth as such. Studying misinformation does require a sense of the empirics behind claims. The turbine photo was taken in Sweden, for example. But discerning these kinds of baselines takes time, and tracing the dissemination of a fact, image, or frame and its debunking can be difficult. Just as with ethnography in in-person contexts, getting a sense of how information moves through a space and an audience requires immersion in that context. While ethnography has many virtues, it is not a method to be applied quickly. This is all the more true in a digital mode, where the density of information, the speed at which it travels, and the reach of the information far surpass the local contexts in which so much ethnography is conducted. But even in small instances, like the sharing of the frozen turbine photo, what is revealed in the virality of that image is the cognitive and political milieux in which such an image resonates.
Why did the misattributed image spread so far in the midst of a historic crisis of the electricity infrastructure in Texas? Because people wanted to believe it. That, in itself, is data.