Abstract

This article investigates a sprawling archive of memes (about “Shakira law,” “shari’a Barbie,” and the “jihad squad”) and incorporates analysis of the original Serial podcast (about the case of Adnan Syed) to look at the role of metadata and dataveillance in criminalizing and apprehending Muslims. Given technological innovations like autocorrect functions that “correct” conversations about the “racialization” of Muslims to the “radicalization” of Muslims (to give one example), algorithmic manipulations of data depend on sexualizing and racializing assemblages that tell a familiar story about the way Muslim lives are shaped by the discourses and representations through which they are figured and apprehended. The author explores the way that this archive of memes figures Muslims as a “measurable type”—whereby they are profiled into highly fraught categories, like “terrorist,” through algorithmic interpretations of their online activity—thereby enabling what John Cheney-Lippold calls “soft biopolitics.” Given the ability of this sort of data to materially shape a person’s life, the author looks at the roles of metadata and big data in apprehending Muslims, Arabs, and SWANA-identified people through a biopolitical framing of population, where apprehend is understood in both senses of the word—in terms of understanding Muslims as well as criminalizing them.

Shakira Law: Accidental Apprehension

Toward the end of Barack Obama’s second presidential term, a set of fairly widely circulating memes decrying the potential threat that “Shakira law” would soon be implemented in the global north demonstrates some of the networked circuits through which anti-Muslim sentiments travel, their virality, and their metaphorical overlap with the notion of contagion. What seems to be the original meme is simply a photo of President Obama standing to the right of DJ Khaled; there are blue curtains and two American flags in the background. Overlaid on the photo is text that reads, “Here is a photo of Obama and the leader of ISIS. They are plotting to steal a third term AND IMPLEMENT SHAKIRA LAW IN AMERICA” (fig. 1). The meme clearly relies on the idea of DJ Khaled as a generic bearded-Muslim-terrorist in order to be read as the “leader of ISIS.” Another Shakira law meme inexplicably depicts a donut with a cursive form of writing forming the literal icing on the donut, claiming that “free Islamic donuts” were being distributed to children in order to perpetuate “Shakira law in America” (fig. 2).1

Though likely a humorous parody of Islamophobic discourses about shari’a law, the messages in these memes were taken at face value in at least some of the contexts in which they circulated, and similar memes—like the donut example—have also circulated as solemn warnings about the dangers of shari’a law as a potential contagion in liberal-democratic societies.2 Sensationalistic popular understandings of shari’a law associate it with intimate forms of gendered oppression, such as the practice of taking multiple wives, women being forced to cover (i.e., wear the hijab), and even female genital cutting. In other words, shari’a law is often understood to frame the female (Muslim) body as belonging to men. Shakira, herself invoking the opposite of these gendered and sexualized stereotypes—i.e., connoting sexual freedom and even lasciviousness—creates an informative tension and even dissonance about the flexibility of stereotypical associations that cohere in seemingly lighthearted cultural productions. In these memes, the phrase “Shakira law” demonstrates how Muslims are often apprehended through flexible racialized and sexualized associations that can nevertheless have rigid and material consequences.

What can memes and their circulation tell us about how notions of contagion shape the context of surveilling Muslims in the United States? Memes are often discussed in terms of their ability to go “viral,” suggesting a link to contagion. In the case of memes that invoke Muslim subjectivities, they also depend on the socially embedded idea of “terrorism as contractible” (Cacho 2012: 100)—i.e., as a dangerous disease that needs to be contained. These contemporary associations build on the history and legacy of immigration exclusions, xenophobia, and racializations of Asian Americans, including Southwest Asia and North Africa (SWANA) diasporas (Shah 2001; Gualtieri 2020). In this essay, I build on these observations about the racist deployments of the metaphor of virality to look at how popular cultural productions—traveling through viral circuits—ultimately buttress an infrastructure geared toward apprehending, and essentially capturing, Muslims.

Given their economic composition, memes are able to synthesize particularly gendered, sexualized, and racialized modes of capture that materially impact US Muslims’ lives, as well as the life of anyone perceived to be such. Thinking about what it means for such memes to “go viral” also enables a consideration of contagion as a motivating metaphor that literally frames US/Muslim life. Moving beyond the humorous or ridiculous sentiments expressed in Shakira law memes and tweets, this essay explores the circuits of memes, internet hoax stories, and podcasts in relation to the technological (as embedded in a heteronormative and racialized) infrastructure of big data. Considering the enormity of the amount of data coupled with the tools to manipulate and collate it (including algorithms and metadata as aspects of this techno-universe), I look at how Muslims are apprehended, by which I mean interpreted and criminalized in order to justify representational and literal capture. Though notions of contagion are clearly entrenched in the logic of representational circuits, this essay will also foreground the metaphor of radiation to think in particular about the infrastructures of violence (embedded in the technological infrastructures of big data) as perpetuating “the fundamental violence of American inclusion, exclusion, and extraction” (Masco 2021: 9). The metaphor of radiation helps us think further about how these technologies—e.g., predictive algorithms that cull and shape data in motivated ways—can present themselves through the conceit of scientific objectivity while actually posing a permeating, silent, and deadly threat.

Glitches

While “Shakira” law seems to reference a clearly human mistake or slippage—whereby the word shari’a slips into the more familiar name of a popular singer—it mimics the kinds of slippages that are increasingly common throughout predictive technologies. Operating through algorithms, which are often presented as disinterested compu-mathematical formulas, the technological infrastructure of prediction can be deadly. Ruha Benjamin (2019: 11) describes algorithms as “a set of instructions, rules, and calculations designed to solve problems” while micha cárdenas (2018: 27–29) offers that “algorithms can be low tech. Their form is similar to a cooking recipe.”3 In these definitions, algorithms can potentially be abolitionist and world-making tools. Yet as these same scholars note, insofar as the dominant use of algorithms has been to wield “historical information to make a prediction about the future,”4 it is hard to miss the insidious consequences of prediction inherent in them. Predictive algorithms have been deployed toward deadly racist ends, particularly in state systems of capture, like domestic and military prisons (Benjamin 2016, 2019; Cheney-Lippold 2017; Miller 2019; O’Neil 2016; Stop LAPD Spying Coalition 2018, 2020).
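To make the “recipe” framing concrete, the toy sketch below (in Python, with entirely invented districts, counts, and scoring) follows the dominant pattern these scholars critique: it wields historical information to make a prediction about the future. It is a sketch of the logic only, not a reproduction of any actual predictive-policing system.

```python
# A toy illustration of an algorithm as "recipe": a few instructions that turn
# historical information into a prediction. All values here are invented.

historical_stops = {"district_a": 480, "district_b": 60}  # records of past police activity

def predicted_risk(district: str) -> float:
    """'Predict' next year's risk by extrapolating from past stop counts."""
    total = sum(historical_stops.values())
    return historical_stops[district] / total

# More past stops yield a higher "risk" score, which directs more patrols,
# which produces more stops: yesterday's enforcement patterns become
# tomorrow's forecast.
for district in historical_stops:
    print(district, round(predicted_risk(district), 2))
```

The point of the sketch is how little the “prediction” adds: the output simply redescribes the historical record, which is exactly why deployments of such recipes in systems of capture reproduce the racist patterns already present in their inputs.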

As the example of predictive technologies demonstrates, supposedly objective technologies are necessarily built upon a classification scheme that is deeply embedded in social institutions of sexism, racism, heteronormativity, and other systems of oppression. The example of the slip from shari’a to Shakira demonstrates the feedback loop between social categorizations and the way the technologies both reify and operationalize these classifications. One only has to look at a couple of examples brought to us by the “autocorrect” function built into most email and social networking platforms to see this sort of slippage at work more widely. For example, consider a tweet by Huda F @yesimhotinthis (2018) that reads, “My iPhone just autocorrected hijab to hijack.”5 This tweet parallels the Shakira example insofar as it transforms the Arabic word hijab into the term hijack, a verb literally and figuratively associated with Arabs-as-terrorists since (at least) the 1970s. Another is my (personal) experience of emailing colleagues about our shared research on the racialization of Muslims, only to find that the email program consistently—and aggressively, I might add, as it would re-correct even after I went back to fix it—refigured the phrase into the radicalization of Muslims, a term used often in the realm of security studies, which frames Muslims through the racist and Orientalist lens of extremism.6 Junaid Rana (2016: 120) uses the rubric of “racial infrastructure” to describe how the “racialization of Muslims is a flexible process that incorporates the portability of a number of race concepts, such as Blackness, Indigeneity, colonialism, genocide, immigration, and religion, in a system that appears contradictory and nonsensical.” In this essay, I build on Rana’s formulation of “racial infrastructure” to explore the way such infrastructures, perpetuated by technological infrastructures, lead to the apprehension and capture of Muslim subjects.

Theorizing the heteropatriarchal racial infrastructure exemplified in the Shakira law slip reveals that it depends on a tripartite structure: classification and categorization schemata are reinforced by the technological infrastructure of big data, which collects, arranges, manages, and flexibly manipulates huge troves of categorized data, while both of these schemata are motivated and bolstered by racial capitalism. Thinking about the social power of classification, Geoffrey Bowker and Susan Star (1999: 3) note that “the material force of categories appears always and instantly.” Adding to this a consideration of the tendency of algorithms to miscategorize through what John Cheney-Lippold (2017: 65) describes as “neocategorization,” the power of these social and technological infrastructures to shape our realities comes into greater focus. Taken together, these structures and processes coalesce in the ability to capture, a phenomenon that I will theorize through the idea of being apprehended.

Ruha Benjamin’s (2019: 80) Race after Technology invites us to think capaciously and creatively about “glitches” like those we find in the autocorrect examples above and, in particular, to think of them as generative and informative rather than a fleeting mistake: “Glitches are generally considered a fleeting interruption of an otherwise benign system, not an enduring and constitutive feature of social life. But what if we understand glitches instead to be a slippery place . . . between fleeting and durable, micro-interactions and macro-structures, individual hate and institutional indifference?” Below, I follow up on the suggestion to ask questions about how the flexible heteropatriarchal racism of glitches—and, in this case, autocorrects—is built into the infrastructure of technology.

Most companies decline to discuss their autocorrect software, and the one that did discuss the topic gave an indication as to why: “Surreptitiousness seems to be the operating philosophy here: ‘You do your best not to be noticed,’ says Scott Taylor, the vice president of mobile solutions at Nuance [the company that created T9]” (Manjoo 2010). On phones (at least in 2010, according to what I could find), the software works by comparing what you type against a built-in dictionary. This seemingly straightforward explanation raises several questions: Which dictionary does the phone use? How does the phone deal with context in terms of what to suggest? And finally, does the phone “machine learn,” and if so, how? In 2010, the indication was that phone autocorrect software was moving toward crowdsourcing, which is a more open model (like Google Suggest), in which case what you are typing on your phone would be open and available to the internet broadly. Though there are “substantial privacy concerns with this approach—you would essentially be sending everything you type to servers in the Web” and one article speculated in 2010 that “phone makers would likely incorporate them only on an opt-in basis, if at all” (Manjoo 2010), it wouldn’t be unreasonable to assume that autocorrect software now generally operates at that level.
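To see how even the simple dictionary-comparison model can produce the slippages described above, consider a minimal sketch in Python; the lexicon and frequency counts are invented, and no actual product’s code is being reproduced. Because hijab and racialization are absent from the toy dictionary, they are pulled toward their most frequent near neighbors.

```python
# A hypothetical sketch of dictionary-based autocorrection. The lexicon and
# frequency counts are invented; real systems are far more elaborate.
from difflib import get_close_matches

# Toy built-in dictionary with corpus-style frequency counts. "hijab" and
# "racialization" are missing, mirroring the observation that they are not
# treated as legitimate words.
LEXICON = {"hijack": 5_000, "radicalization": 12_000, "hijinks": 800}

def autocorrect(word: str) -> str:
    """Keep known words; replace unknown ones with the most frequent
    near neighbor in the dictionary."""
    if word.lower() in LEXICON:
        return word
    candidates = get_close_matches(word.lower(), list(LEXICON), n=3, cutoff=0.6)
    if not candidates:
        return word  # nothing close enough, so leave the word alone
    return max(candidates, key=LEXICON.get)  # rank near matches by frequency

print(autocorrect("hijab"))          # -> "hijack"
print(autocorrect("racialization"))  # -> "radicalization"
```

Whether the correction happens on the device or is informed by crowdsourced usage data, the effect is the same: whatever the surrounding corpus treats as frequent and legitimate becomes the default toward which unfamiliar words are pulled.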

Nevertheless, we can assume the two main examples named above—racialization and hijab—were already subject to this sort of sourcing, since they didn’t occur in phones but rather in more embedded systems, like web-based media or email. That would imply that radicalization and hijack were commonly used in mass online media. But we would also have to factor in the context-bound feature of autocorrect systems. The machine learning at work here, then, demonstrates a learned affiliation between Muslims and their associated characteristics (e.g., what they wear—the hijab) on the one hand, and radicalization or hijacking on the other.

These examples go beyond marking a quantitative fact about the frequency with which such terms are used and point toward a recognition of the ways that Muslims are usually figured in online discourses, coupled with the fact that the terms hijab and racialization do not appear as legitimate words in computer dictionaries. They chart an existing set of discourses associated with the populations they reference, in combination with a process—indicated by the term speculative thought—of creating the set of possibilities for apprehending these populations (here, understanding Muslims, Arabs, and others from SWANA through a biopolitical framing of population). I use apprehend intentionally to draw on its double meaning: both “to arrest someone for a crime” and “to understand or perceive.” In other words, it is connected both to creating knowledge and to framing criminality. In relation to framing Muslims, I would suggest that we can also add a third meaning of the root word apprehend, this time in relation to the noun apprehension. Criminalizing Muslims depends on a melding of the earlier two meanings (“understand” and “criminalize”) as well as the unease or anxiety that serves as a justification for capturing data in the first place. Ultimately, I argue that these three meanings coalesce around the core activity of capture—apprehending Muslims depends on a heteropatriarchal racial infrastructure that supports the cultural circuits of memes and other ephemeral products that circulate. Their circulation, in turn, creates knowledge and builds fear, helping to fuel the criminalization and capture of Muslims.

To Apprehend, Take 1: “Innocent Knowledge”

Setting the scene for how these three meanings (understanding, fearing, and capturing) of apprehend coalesce to frame Muslim life as criminal, the first part of this essay explores the game-changing National Public Radio (NPR) podcast Serial. The podcast focused on the case of Adnan Syed, who was accused of killing his ex-girlfriend Hae Min Lee and convicted, in large part, using “evidence” based on Islamophobic assumptions. Both the case and its sensationalized re-telling through a series of podcasts and other media demonstrate that through arbitrations of liberal notions of universality, “Whiteness is made through the Muslim” and that “the law is where white supremacy and racial violence get legitimated” (Razack, forthcoming). Here, the circuits I explore are the podcasts that retrospectively piece together the case. While the more well-known Serial popularizes Syed’s case, the follow-up podcast Undisclosed convincingly argues that Syed was apprehended—both criminalized and captured—based on anti-Muslim racism. In this case, then, the court is a site where the practice of apprehending Muslims through racist filters gets codified. Significantly, Syed’s case is also possibly one of the first court cases ever to employ metadata as part of the prosecution’s evidence. Given that metadata can be marshaled as scientific-objective data at the same time that they are inherently abstracted from the particular details and contexts they reference, they are a brilliant example of how the technological infrastructure of big data operates as a tool of capture and apprehension of Muslims through and with racist infrastructures.

In the first season of Serial, the motivations of alleged killer Adnan Syed are uncritically presented through the long-standing Orientalist idea that “Muslim rage” emerges from “feelings of resentment, jealousy, and impotency” (Sheehi 2011: 69). Even an academic essay exploring whether Syed was racialized falls under the sway of the romanticism of a Shakespearean frame and uncritically accepts the explanatory power of Syed’s “besmirched” honor as a compelling motivation for murder (Corredera 2016: 36). While the podcast purports to explore every possible facet of the case against Adnan Syed, a Pakistani Muslim teen who was later convicted of murdering his ex-girlfriend, Serial never questions the discursive framework of the honor-shame nexus as a compelling motivation for murder. Despite the journalist Sarah Koenig’s evident desire to exonerate Syed, she never questions the prosecutor’s framing of Syed’s alleged motivation—that he had sacrificed so much to be with his non-Muslim girlfriend that their breakup created an unbearable severing of his Muslim honor, and an anguish that drove him to murder.7 That the explanatory frame for the alleged murder was neither read nor even considered through the tediously common language of the “crime of passion,” endlessly invoked in mainstream domestic violence cases, demonstrates how Syed’s case figured him through the lens of Muslim ways of life—as collectively bound by honor and shame—and not through the lens of “our” (i.e., US) way of life, through which he would have had recourse to the individualized logic of the legal category of the crime of passion, which figures the murder as the regrettable action of an otherwise reasonable person overcome by a fleeting surge of emotion. In other words, the prosecution would have had to build a case about Syed as an individual who fits the profile of a domestic violence batterer rather than rely on the broader religio-cultural “commonsense” racist argument about Islam.

Despite a plethora of online blogs and articles defending Sarah Koenig’s and Serial’s framing of the case, the narrative momentum of the podcast (known for shifting the landscape of podcasts from single, contained stories to a serialized episodic format that keeps people tuning in week after week) depends on anti-Muslim racist assumptions. Despite revealing the dearth of evidence sufficient to convict in Syed’s case, Koenig maintains the possibility of Syed’s culpability through two mechanisms. The first is an alleged eyewitness account on which the prosecution’s case mostly relied (in the interest of space, I won’t go into this here), and the second is the power of the Islamophobic narrative that casts Adnan as a jilted lover who murdered his ex-girlfriend because she coerced him to date outside of his religion and then broke up with him—the assumption being that since he had betrayed his religious/cultural context, her leaving him pushed him into a murderous state.

The sheer improbability of such an interpretation is completely undercut by the “Islamophobic industry” that has cropped up all around to provide countless documents attesting to the criminalizing and murderous practices supposedly lurking inside of Islam. Though Syed was prosecuted before the advent of memes and podcasts (in the early 2000s), the circuits of apprehension in his case pass through what Colin Powell would come to describe as a “terror-industrial complex” and what James Risen describes as a “Homeland security-industrial complex” (Rana 2016: 113). A report produced by the Enehey Group (1999) on behalf of the state’s detectives offers the following analysis of Syed’s case:

Clearly Mr. Syed faced almost insurmountable odds to meet with this “infidel or devil” [his high school girlfriend] in secret. Ownership is not outside of his cultural belief system. After giving her a veil, literally covering her so that only he could have her, he set her apart from all others and for him alone. . . . Under Islamic law her murder was sanctioned. For many “ethnic” Pakistanis incidents like these are common-place and in Pakistan this would not have been a crime but probably a matter of honor.

That such a report would exist is not in itself surprising, but what does surprise Rabia Chaudry—a friend of Adnan Syed’s family who brought his case to Koenig’s attention—is the extent to which Koenig accepts the report as fact. For example, Koenig meets with Chaudry to ask if it is true that “if a Muslim man gives a woman a scarf, he owns her or it’s like some form of ownership” (Chaudry 2016). Indeed, over the course of the podcast, Koenig demonstrates both this kind of direct racism as well as “racist love” (Chin and Chan 1972; Lott 1993). As one critical online article notes: “If Koenig is a flawed, unreliable narrator, we should add ‘cultural tourist’ to the list of flaws” (Kang 2014).

The aforementioned report clearly parallels the language of the so-called Muslim ban executive order (Trump 2017), which sought to exclude “those who engage in acts of bigotry or hatred (including ‘honor’ killings [and] other forms of violence against women)” and which relied on the logic of Islam as intrinsically violent toward women, framing Islam as inherently backward with respect to gender, sexuality, and women’s rights. An additional detail—uncovered in a separate podcast, Undisclosed—further demonstrates how Syed’s case was always already figured through a monster-terrorist-fag framework (Puar and Rai 2002). A key alibi witness for Syed, Bilal, who could testify that he was with Syed at the alleged time of the murder (Syed was practicing a speech he was scheduled to deliver the following night at his mosque) failed to show up to Syed’s trial or respond to a subpoena. Following up on this detail, the Undisclosed podcast revealed that Bilal had been arrested by Baltimore County police on charges of sexual misconduct with a fourteen-year-old boy; they told him they would drop the charges if he left the country for a few years. Such direct interference in the types of evidence that could be presented suggests that Syed’s case was determined by a racializing-sexualizing assemblage that figures Islam and Muslims through the hazy associations illustrated by the Shakira law meme, themselves shored up by the heteropatriarchal racial infrastructure of anti-Muslim racism. Racializing and sexualizing assemblages invoke and create incorporeal transformations—Gilles Deleuze and Félix Guattari’s ([1980] 1987: 80) term for the process by which people and bodies can be materially impacted by the significations and “overcoding” that shape how others understand them. One example could be the figure of the “thug” (wearing a hoodie), which in the case of Trayvon Martin (to name only one high-profile case) led not only to his death but to the exoneration of his killer under stand-your-ground laws, since Martin’s hoodie effectively incorporeally transformed him into an always already dangerous criminal.8 Another example that particularly informs anti-Muslim racism is the category of the “terrorist,” which creates a literal framework for capturing Muslims and/or Arabs and questioning them in search of “actionable intelligence.” In these cases and others like them, the two meanings of apprehend slide into one another—a category for understanding and framing a group of people shifts into a category used to criminalize and literally capture and charge them. This melding, in fact, coalesces in the idea of framing a group of people. Playing out through “data assemblages,” the heteropatriarchal racial infrastructure here shadows the technological infrastructures, which can incorporeally transform people’s lives through manipulation of even purportedly innocuous metadata.

Circuits/Contagion v. Infrastructure/Radiation

As mentioned above, contagion is also built into the Shakira law meme; the fear specifically named is that shari’a law will spread in the United States, with the underlying idea that it would metastasize and eventually take over. As Sherene Razack (2008: 149) notes, the moral panics activated in relation to shari’a law—sometimes even in small towns, with no foreign-born or Muslim residents at all, that feel “compelled to announce [their] prohibition of the stoning of women”—succumb to a fear-logic that the “Muslims are coming” (see also Naber 2008; Kundnani 2015). One trajectory of this fear-contagion logic leads to a painful irony of Syed’s case: Syed was denied bail (a fact that the podcast Undisclosed credits with leading to his eventual conviction as it redirected resources that would have gone into his defense) based on the state’s argument that he was a “flight risk.” Despite all that Syed offered in exchange for bail (he agreed to waive extradition; his passport was expired but he offered to give it up anyway; his parents and others offered to put up their house for collateral as part of a forfeiture agreement; and he offered to submit to electronic monitoring under house arrest), the state deployed racist logic to argue that he should remain in jail. For example, the state responded that (1) Syed had a lot of resources that could help him escape; (2) because Muslims have similar names, it would be easy for him to get another passport; and (3) there is a record of jilted Pakistani men who have killed their lovers and escaped to Pakistan (which was untrue, but even if it were accurate, Syed is a US, not a Pakistani, citizen). The denial of bail to Syed demonstrates the tenacity of Islamophobic narratives (even before 9/11, of course), but it also reveals a particular framing of “terrorism as contractible” (Cacho 2012: 100).

We can draw several conclusions from this exploration of Adnan Syed’s case as popularized through Serial and subsequently through several podcasts and even an HBO special series. As a case study, NPR’s Serial is embedded in the liberal logics of the justice system and its redemptive possibilities despite its proven flaws. In this respect, it is an excellent example of the way that the will to knowledge—the meaning of apprehend that depends on the conceit of innocent knowledge—can operate in tandem with the impetus toward criminalization and capture. Because the case was argued in 1999, it demonstrates that the racialized/sexualized/gendered stereotype of the “terrorist” was not inaugurated with 9/11, and, indeed, has been in development and operation since well before. Finally, the podcast framework itself references the metaphor of contagion, and provokes a consideration of how information goes “viral”—whether in the form of memes, storytelling mechanisms like podcasts, or metadata—and what its modes of infection are.

To Apprehend, Take 2: Metadata

So far, these observations about the Syed case, as framed by the kinds of sexualizing and racializing assemblages that we saw in the Shakira law meme, tell a familiar story about the way Muslim lives are shaped by the discourses and representations through which they are figured and apprehended. Yet another vector through which to analyze the Shakira meme is through its relationship to data assemblages (Kitchin 2014: 25) and “speculative thought.” “Speculative thought, or ‘soft thought,’ is not a form of reasoning modeled after any form of human reasoning. It is a particular form of algorithmic cognition that is independent of human thought or intervention” (Dixon-Román 2016: 486). In the data sense, speculative thought therefore gestures to the way that machines and software can impact material realities. Kitchin notes that since the word data comes from the Latin root “to give,” the word is actually a misnomer: “Technically, then, what we understand as data are actually capta (derived from the Latin capere, meaning ‘to take’); those units of data that have been selected and harvested from the sum of all potential data” (Kitchin and Dodge 2011). As Johanna Drucker (2011) argues, “No ‘data’ pre-exist their parameterization. Data are capta, taken not given, constructed as an interpretation of the phenomenal world, not inherent in it.” The concept of data assemblages, then, combines the harvesting—to denote both the selective taking and the uses to which they are put—of data with their ability to materially shape a person’s life. Recalling Junaid Rana’s description of the racialization of Muslims as flexible, it is important to contextualize the processes of classifying and creating categories in relation to the process of collecting and managing endlessly culled and stored data, and the machine-learned ability to apply these data to existing categories, or create new ones, in whatever way matches the desired narrative. There is perhaps no better example of such a practice than the potential uses and deployments of metadata, and the idea that Syed’s case is potentially the first to use them in a court is significant precisely because of the way that anti-Muslim bias was hidden under the radar of this supposedly objective evidence.

Because, as Dixon-Román (2016: 483) argues, “Data are assemblages that are more-than-human ontologies that consist of the forces of sociopolitical relations,” these captured data also actively interact with existing racializing and sexualizing assemblages. The case of Adnan Syed likely represents the first time that cell phone data was introduced as evidence in a court case in the state of Maryland. Beyond its novelty, this interesting fact becomes all the more so once we realize that the cell phone data introduced in Syed’s case actually represent possibly the first time that metadata were introduced to literally frame a criminal. The state (prosecution) argued that Syed could be placed at the scene of the disposal and burial of Lee’s body through the use of evidence cataloging cell phone “pings.” It used cell phone records and expert witnesses to argue that incoming calls to Syed’s cell phone “pinged” the cell phone tower closest to the burial site (a park on the outskirts of town) at the approximate time that her body was placed there, according to the state’s own time line. Serial does not question this evidence and therefore mobilizes it to sustain and animate the question of whether Adnan did kill Hae. As the Undisclosed podcast uncovers, though, the assumptions about such metadata were inherently false—even according to experts who could have been called (but were never contacted) at the time. When cell phone towers are overloaded with a glut of incoming and outgoing calls, excess calls will automatically be redirected to another, less occupied tower, so the record of “pings” doesn’t actually provide evidence of Syed’s location. Nevertheless, it functioned to serve as evidence of his location in the eyes of the jury that convicted him. In this way, the Syed case foreshadows (or, indeed, establishes the precedent for) the capture of metadata in the current context—a practice that President Obama would later attempt to downplay in his comments following the Snowden revelations about the PRISM program, when he said: “No one is listening to your telephone calls” (Feldmann 2013). Though PRISM allowed the NSA to collect bulk data from private corporations like Google, Facebook, Apple, Yahoo, and Microsoft, Obama’s comments imply that direct surveillance of the content of personal communication would be a breach of privacy laws, whereas capturing the bulk data form of those same communications (the metadata of the time and place during which a phone call or text is sent and received, along with its duration) is not a breach of privacy. In the case of Adnan Syed, as is the implication for subsequent and future cases, his cell phone metadata provided material evidence for his capture and incarceration, despite the fact that it could not actually link him to the crime. If Adnan Syed’s case is the analog precursor to the kind of predictive analytics that currently operate in countless social media (and other) platforms, it is a chilling and foreboding case to say the least. Wielding the notion that data are actually capta—pieces of information that are taken—metadata can in fact be captured and harvested to serve the argument of the state.
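A rough sketch of what such call metadata look like, and of the inference the prosecution drew from them, may be useful here; the field names, tower IDs, coordinates, and timestamps below are invented for illustration and do not reproduce the actual exhibits.

```python
# A hypothetical sketch of call-detail metadata: no call content, only when a
# call happened, how long it lasted, and which tower handled it.
from dataclasses import dataclass

@dataclass
class CallRecord:
    timestamp: str    # when the call connected
    duration_s: int   # how long it lasted, in seconds
    tower_id: str     # the antenna that handled the call

TOWER_LOCATIONS = {"tower_07": (39.28, -76.71)}  # invented coordinates

def naive_locate(record: CallRecord) -> tuple:
    """The prosecution-style inference: assume the handling tower is always the
    one nearest the phone. Because busy towers hand calls off to less occupied
    neighbors, this assumption does not actually hold."""
    return TOWER_LOCATIONS[record.tower_id]

ping = CallRecord(timestamp="1999-02-01T19:08", duration_s=32, tower_id="tower_07")
print(naive_locate(ping))  # a location inferred from data that cannot prove it
```

Nothing in such a record says where the phone was; location enters only through the interpretive assumption bolted onto it, and that interpretive layer is precisely where the bias the podcasts uncovered could hide behind the appearance of objective data.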

Obama’s dismissive comment that the state is not listening to our phone calls aims to invisibilize the technological infrastructure that serves to apprehend Muslims through racist patterns. Using metadata to apprehend and criminalize Muslims, algorithmic manipulations of data create what John Cheney-Lippold (2017: 47) calls “measurable types,” which are “ultimately classifications, empirically observed and transcoded as data, that become discrete analytical models for use in profiling and/or knowledge abstraction.” Developed and honed by private companies like Google, the concept of measurable types also demonstrates how private capital interests can partner with the state, shoring up the terror-industrial and homeland security complexes mentioned above. The (viral) circulation of (toxic) memes tells a familiar story about the way Muslim lives are shaped by the discourses and representations through which they are apprehended. When people are profiled into highly fraught categories, like “terrorist,” through algorithmic interpretations of their online activity, they are subject to a big-data form of “soft biopolitics” (Cheney-Lippold 2017: 132–37).
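A deliberately crude sketch can convey how thin the basis of a “measurable type” can be. Everything below (the signals, the weights, the threshold, and the labels) is invented; the point is only the general shape of profiling by transcoded behavior, not any real company’s model.

```python
# An invented sketch of a "measurable type": a category assigned not by what a
# person says about themselves but by thresholds applied to logged behavior.

profile = {"searches_about_surveillance": 14, "late_night_activity": 9}

def measurable_type(signals: dict) -> str:
    """Transcode observed behavior into a discrete label used for profiling."""
    score = 2 * signals.get("searches_about_surveillance", 0) + signals.get("late_night_activity", 0)
    return "flagged" if score > 30 else "unremarkable"

print(measurable_type(profile))  # the label, not the person, is what gets acted on
```

The arbitrariness is the point: a researcher of surveillance and a target of surveillance can produce indistinguishable signals, yet the resulting label travels forward as if it described the person rather than the model.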

To Apprehend, Take 3: Data Industrial-Complexes/ Infrastructure

The form of criminal apprehension demonstrated in Syed’s case, now more popularly operating through cyber- or information technology, has roots in what Robert Scheer (2015: 103) calls the “military intelligence complex.” It is an apt description given that the Pentagon’s Defense Advanced Research Projects Agency (DARPA), one of the key drivers of the weaponization of cyber/data, was created under the Eisenhower administration (Scheer 2015: 106). Referring to the role of private corporations and of the profit imperative in perpetuating punishing institutions like the military and prisons, the idea of the “military intelligence complex” also allows us to consider how our everyday and leisure uses of technology can be weaponized. Shoshana Zuboff argues in The Age of Surveillance Capitalism (2019: 182) that “both the world and our lives are pervasively rendered as information.” Our lives-rendered-as-information are then transformed into a form of “behavioral surplus” (i.e., a catalog of behaviors, likes, and interests that can be monetized), which is then subject to extraction—through an “extraction architecture” (Zuboff 2019: 129)—by information-service companies that can sell our lives-rendered-as-information for profit.

With the “extraction architecture” already developed and in place, it is a relatively short path to the ability of the state to harness data toward capture, and even to think of data in relation to weapons of mass destruction (WMDs). As the example of convicting Syed based on metadata demonstrates, our “digital selves” (Cheney-Lippold 2017) can be weaponized, a move that we could also connect to the 2004 additions to the definition of “weapons of mass destruction.” Taking advantage of the US invasion of Iraq in 2003, the National Military Strategy of the United States, issued in 2004, extended the phrase “WMDs” to “Weapons of Mass Destruction or Effect (WMD/E),” which was designed to include “more asymmetrical” weapons such as “cyberattacks” (National Military Strategy of the United States 2004). In this way, the internet, and information technology more generally, were codified as weapons in relation to military strategy.

Though momentum had been building for decades, the events of 9/11 paved the way for the decimation of privacy in the service of security, which in turn enabled private corporations like Google and Facebook to play a crucial role in the weaponization of data. One of the prime examples of this is a governmental program called “Total Information Awareness” proposed by the admiral John Poindexter in 2002.9 This chilling and direct plan to create a “surveillance society,” described by the American Civil Liberties Union (ACLU) as a “virtual dragnet,” was defunded by Congress in 2003 (Diresta 2018; see also Scheer 2015: 107–13; Zuboff 2019: 116). Yet as Zuboff, Scheer, and others note, private sector companies like Google and Facebook were simultaneously busy developing technologies to gather, store, and mine capta for capital/ist purposes. As private companies, they were able to sidestep regulatory processes designed to guard against the sort of extraction their services render, and punitive state institutions have benefited from the secrecy with which they have been able to access and deploy such privately obtained data.

The weaponization of information is, of course, nothing new in the long, overlapping histories of policing and military warfare. In particular, we can locate these histories in relation to the keyword intelligence and its deployment in governmental institutions like the Federal Bureau of Investigation (FBI) and the Central Intelligence Agency (CIA) that frame their work through the rubrics of counterinsurgency. Significant here is the role that private companies play in the mining, capturing, and ultimately criminalizing of data, and the ability of governmental entities to hide behind the protections of private corporations to cull the data in the first place. Once they are culled, all that is needed is a security-based justification to commandeer them. That justification is easily fabricated and perpetuated through the prolific terror-industrial complex.

Going Viral—Circuits

Another means for apprehending in the criminalizing sense operates through the (“viral”) spread of fear. Looking more closely at the role of private corporations in enabling and disseminating racism and white supremacy, here I explore the virality of memes and the ways they shape anti-Muslim racist rhetoric. Though mainstream companies like Facebook may claim to be merely a venue for the exchange of ideas, Jane Lytvynenko (2019) notes that Islamophobia makes these companies a lot of money: “Researchers say Facebook is the primary mainstream platform where extremists organize and anti-Muslim content is deliberately spread.” They also reported that out of the top ten stories published on websites “publishing disinformation for profit . . . eight had the word ‘Muslim’ in the title.” Indeed, there is a whole industry working to actively proliferate (if not produce, since those associations were long active well before these technologies) such racist associations.

Hoax-based stories that go viral about Muslims often have long afterlives. Consider, for example, a hoax about a made-up group called “Public Purity” that “posted flyers asking people to ‘limit the presence of dogs in the public sphere’ out of sensitivity to the area’s ‘large Muslim community’” (Daro 2018). The story that went viral operated on the false assumption that Muslims are offended by dogs, and fabricated outrage that such a cultural-religious belief would curtail the public actions of non-Muslims and their dogs. Despite having been disproven in 2016, the hoax was still actively circulating in 2018. While we could certainly describe the circulation of this story in terms of its virality, the impact of its active afterlife also draws our attention to the metaphor of radioactivity. Like the depleted uranium used in US munitions because of its ability to penetrate armor and its relatively low radioactive status, which has nevertheless caused devastating long-term, chronic, and deadly health problems among the population in Iraq living with its remnants, pervasive and tenacious hoaxes about Muslims also have deleterious effects.

A similar thing could be said about the fabricated scandal that Ilhan Omar married her brother. Even an online article by Business Insider ostensibly disproving the claim opens with the statement that “Omar has still not explained some puzzling discrepancies and inconsistencies in her marriage history” (Panetta 2019). These “inconsistencies,” of course, can only be understood as such within a heteronormative frame of reference in relation to the institution of marriage and with sexualized Orientalist and Islamophobic stereotypes fueling the story (fig. 3). One of the several memes addressing the Omar hoax depicts Congress members Ilhan Omar and Alexandria Ocasio-Cortez sitting next to one another, with overlaid text putting the following words in AOC’s mouth: “Now that you divorced your husband, does that mean you are no longer sister and brother?” As such memes go viral, despite being completely divorced from reality, they demonstrate both how lucrative anti-Muslim racism is and how easily such hoaxes can materially impact lives—in the case of the so-called jihad squad, this impact is evident in the number of death threats they receive. In her analysis of the way that Trump exacerbated such death threats—for example, by amplifying a meme suggesting that Ilhan Omar downplayed the gravity of 9/11—Zeinab Farokhi (2021: 21) describes such activity as “state-sponsored cyber Islamophobia,” which results in “legitimizing and normalizing dehumanization and criminalization of the Muslim body, albeit in more subtle and invisible ways” in the digital realm (16).

Though we are used to thinking about these kinds of hoaxes and misinformation as “going viral,” Wendy Hui Kyong Chun (2016: 3) points out that virality is perhaps an incomplete or faulty metaphor, particularly given the way it is prone to fear-based assumptions. She suggests that “information spreads not like a powerful, overwhelming virus, but rather like a long, updated thin chain. Information is not Ebola but instead the common cold.” Brought into exaggerated relief during the current pandemic, this suggestion, as applied to the memes about Ilhan Omar and her brother, implies that such memes are pervasive and persistent, consistently spreading and renewing the underlying Orientalist assumptions about Islam as immoral and hyperpatriarchal, and that they can spread easily in unsensational ways. Indeed, the colloquial way of describing memes and other cultural products as going viral suggests that they are quite benign, and the economy that depends on their circulation also depends, to some extent, on this underlying assumption of benign intents and effects. This essay has sought to expose how the circuits of such memes shore up anti-Black, anti-Asian, and anti-Muslim infrastructures and how the anti-Muslim racism of memes like the Omar marriage hoax ultimately serves to sustain white supremacy (Razack, forthcoming).

Afterlives of Data

If the circuits of anti-Muslim memes and internet hoaxes can be understood as a permeating, long thin chain of toxic misinformation, the underlying heteropatriarchal racist infrastructure that sustains them can be theorized in terms of the metaphor of radiation. Here, I draw inspiration from Lars MacKenzie’s article “The Afterlife of Data,” which focuses on how trans people who obtain legal name changes can be “haunted by data.” Haunting is appropriate, and I begin with haunting and the idea of afterlives here because of the backdrop of violence and death that impacts these communities.

Bridging back to the idea of “going viral,” multiple online articles report that “Politicians have also used anti-Muslim rhetoric to bolster their popularity among voters, which then takes off on social media” (Lytvynenko 2019). And also, to Wendy Hui Kyong Chun’s point about letting go of an attachment to the origins of these memes and hoaxes, a clear strategy is to release a sensationalized meme/tweet/story and simply apologize later, as the Illinois Republican County Chairman Association did after sharing the meme that inaugurated the term “the jihad squad” (Chiu 2019). Apologizing after the fact, of course, does not prevent the afterlives that the meme/image will have and, in this sense, it may be more useful to think about it in terms of a half-life of radiation. What essentially makes matter radioactive is its instability; radiation is the energy released in particles or rays from this highly unstable matter, a decay that is measured by the unit of a half-life.

Though the concept of a “half-life” implies that radioactive matter is constantly trending toward stability, the process of decay inherent to a half-life means that radiation is consistently released. In other words, the metaphor also accounts for the decay built into the system, but the decay and the tendency toward decay are precisely the source of the potential illness. Yet radiation doesn’t necessarily have a precise target or host, nor does it need one. It simply radiates—indeed, the metaphor here correlatively maps onto the substance.
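For readers who want the literal referent of the metaphor in view, the textbook decay law is worth recalling: a half-life marks not an end point but a rate at which an unstable substance keeps emitting radiation.

```latex
% Standard radioactive decay law: N_0 unstable nuclei decay with constant \lambda;
% the half-life t_{1/2} is the time after which half of them remain.
N(t) = N_0\, e^{-\lambda t} = N_0 \left(\tfrac{1}{2}\right)^{t/t_{1/2}},
\qquad t_{1/2} = \frac{\ln 2}{\lambda}.
```

However many half-lives elapse, N(t) never reaches zero; the material keeps radiating at a diminishing but persistent rate, which is the quality the essay borrows for memes and hoaxes whose toxicity decays without disappearing.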

My case study meme here is the Shari’a Barbie meme. Much like the Shakira law meme, it draws on sexualized, racialized, Orientalist, anti-Muslim stereotypes. The Shari’a Barbie meme started circulating in November 2017, when Mattel released a muhajabah Barbie, inspired by Olympic fencer Ibtihaj Muhammad. Various versions of the meme depict Barbie with black eyes and bruises, and in one case with a melted face presumably due to an acid attack. They also include text that reads, “Comes with jihab [sic], bruises, and quran” and “stoning accessories available for additional purchase” (fig. 4). Captions accompanying the circulation of Shari’a Barbie memes say things like, “I will not celebrate the subjugation of women being symbolized in a child’s toy,” demonstrating the popular equation of Islam with brutal gender violence and oppression. Yet though the meme originated in relation to the hijab-wearing Barbie, it was revived by Daniel P. Leonard, a school board member in New Jersey, in the summer of 2019, when he posted a tweet directed toward Rashida Tlaib. In response to a Fox News report about Tlaib calling for a hunger strike to shut down Immigration and Customs Enforcement (ICE), Leonard tweeted: “My life would be complete if she/they die” (Bella 2019). He also connected Tlaib to the revived Shari’a Barbie meme. In this literal death wish, Leonard’s tweet demonstrates how the meme’s half-life (in the radiation sense) and afterlife (in the sense of being revived in circulation) are permeated with, and in turn continue to radiate, lethal anti-Muslim racism.

Tracing the paths, not to mention the interpretations, of ephemeral artifacts like the Shakira law and Shari’a Barbie memes is impossible, and yet that is precisely part of their power. The assemblages these memes synthesize and capture illuminate the sexualized and racialized assumptions attached to shari’a law and even Islam more generally, coalescing to perpetuate quotidian, but no less dangerous—and potentially deadly—forms of anti-Muslim racism. Thinking back to the “speculative thought” of autocorrect functions, particularly those that correct discussions of “racialized” Muslims to “radicalized” Muslims, one also has to wonder at the “data afterlives” (MacKenzie 2017) that such mistaken “corrections” will have.

Toward a Conclusion

In her book Updating to Remain the Same, Wendy Hui Kyong Chun (2016: 15) encourages us to move away from “dramatic chartings and maps of ‘viral spread’ toward questions of infrastructure and justice.” Describing information spread as having an “undead” quality, she also suggests that we shift “away from an epistemology of outing, in which we are obsessed with ‘discovering’ ‘Patient Zero,’ as though knowing the first case could solve all subsequent problems.”10 In other words, we would do well to ask in what ways the metaphors of virus and contagion feed into the logics of anti-Asian (including SWANA) and anti-Muslim racism. Thinking about Christina Sharpe’s (2016: 5) discussion of the “afterlives of slavery” as well as Lars MacKenzie’s consideration of how the “afterlife of data” impacts trans subjectivity and material life can also invite us to shift toward different metaphors for understanding the deadly impact of data/intelligence/weaponization in the service of institutional anti-Muslim racism, and foundational modes of anti-Blackness.

Building on the argument in the introduction of An Imperialist Love Story (Jarmakani 2015: 39) about the way the metaphor of radiation “reflects the structure of hegemony—the ways power can operate in seemingly invisible, unsuspecting ways while simultaneously having powerful material effects,” I suggest we think about the deployment of data through the metaphor of radiation, and particularly in relation to the question of its “undead” quality and its multiple “afterlives.” The metaphor accounts for the way the toxin can live on in silent and potentially deadly ways. Through the idea of the “half-life” of radiation, it also accounts for the decay built into the system, and the way in which the decay and the tendency toward decay (radioactive half-life) are precisely the potential source of illness. Yet it doesn’t necessarily have a precise target or host. It simply radiates. What essentially makes matter radioactive is instability. Far from an innocent mistake or a disinterested capture, the unstable circulation of data, whether through viral memes or silently culled metadata, plays a central role in the perpetuation of gendered anti-Muslim racism.

Notes

1. Adding to the humor, Snopes and other sites revealed the writing to actually be a made-up language from Lord of the Rings. www.snopes.com/fact-check/lord-rings-donut/.
2. For more on the “Shakira law” meme, see Jarmakani 2020.
3. Similarly, in an antiracist Critical Code Studies discussion group moderated by Mark Marino, Sarah Ciston, Zach Mann, and Jeremy Douglass, @Samya offers this explanation of an algorithm: it is how you conceptualize a process from start to end (and then describe that process to the computer); see Roy 2021.
4. The quote comes from the documentary Coded Bias (Kantayya 2020), which focuses on the analysis and advocacy of computer scientist Joy Buolamwini and features the work of Cathy O’Neil, Safiya Umoja Noble, and others who work on exposing the racial biases built into the infrastructure of artificial intelligence and facial recognition technologies.
5. Thanks to Layla Zbinden for bringing this to my attention.
6. One of the first times I remember this happening was in email communication with Andrea Miller in 2013. Another example of this sort of autocorrect slippage comes from my colleague Keith Feldman, who reports having the word coloniality autocorrected to collegiality, a humorous slip when considering how universities have sought to temper faculty critiques of colonialism and settler colonialism (particularly in relation to Palestinian advocacy) through the discourse of “civility” (pers. comm., November 16, 2020). For more on the weaponization of “civility” in the academy, see Salaita 2015.
7. While Koenig (2014) does consider the possibility that anti-Muslim bias played a role in Adnan Syed’s case, she presents it through the sweeping and unnuanced claim made by Syed’s mother that he was convicted because he is Muslim. Not bothering to put this claim into context, Koenig therefore quickly and easily dismisses it.
8. For more on theorizations of the “thug” and the “hoodie,” see Amar 2017 and Nguyen 2015.
9. Thanks to Lucas Power for first making me aware of TIA, and especially for formative and ongoing conversations about data, big and small.
10. Even our experience with an actual viral pathogen during the COVID-19 global pandemic, in which efforts at containment failed abysmally, suggests the wisdom of such an approach, which emphasizes shifts in habit rather than locating and rooting out the “original” pathogen.

Works Cited

Amar, Paul. 2017. “Thug Love: Populism, Policing, and Resistance from Egypt and Brazil to Trump’s America.” Bruce E. Porteous lecture, San Diego State University, San Diego, CA, April 10.
Bella, Timothy. 2019. “A School Board Member’s Facebook Post Suggested His ‘Life Would Be Complete’ if Rashida Tlaib Died.” Washington Post, July 25. www.washingtonpost.com/nation/2019/07/25/daniel-leonard-rashida-tlaib-death-new-jersey-school-board/.
Benjamin, Ruha. 2016. “Catching Our Breath: Critical Race STS and the Carceral Imagination.” Engaging Science, Technology, and Society 2: 145–56.
Benjamin, Ruha. 2019. Race after Technology: Abolitionist Tools for the New Jim Code. Cambridge, MA: Polity Press.
Bowker, Geoffrey C., and Susan Leigh Star. 1999. Sorting Things Out: Classification and Its Consequences. Cambridge, MA: MIT Press.
Cacho, Lisa Marie. 2012. Social Death: Racialized Rightlessness and the Criminalization of the Unprotected. New York: New York University Press.
cárdenas, micha. 2018. “The Android Goddess Declaration: After Man(ifestos).” In Bodies of Information: Intersectional Feminism and Digital Humanities, edited by Elizabeth Losh and Jacqueline Wernimont, 25–38. Minneapolis: University of Minnesota Press.
Chaudry, Rabia. 2016. “Islamophobia in the Trial of Adnan Syed: Rabia Chaudry on America’s Anti-Muslim Industry.” Lit Hub, August 24. lithub.com/islamophobia-in-the-trial-of-adnan-syed/.
Cheney-Lippold, John. 2017. We Are Data: Algorithms and the Making of Our Digital Selves. New York: New York University Press.
Chin, Frank, and Jeffery Paul Chan. 1972. “Racist Love.” In Seeing Through Shuck, edited by Richard Kostelanetz, 65–79. New York: Ballantine Books.
Chiu, Allyson. 2019. “A Meme Called Four Democrats ‘The Jihad Squad.’ A State GOP Group Is Sorry for Sharing It.” Washington Post, July 22. www.washingtonpost.com/nation/2019/07/22/meme-called-four-democrats-jihad-squad-state-gop-group-is-sorry-sharing-it/.
Chun, Wendy Hui Kyong. 2016. Updating to Remain the Same: Habitual New Media. Cambridge, MA: MIT Press.
Corredera, Vanessa. 2016. “‘Not a Moor Exactly’: Shakespeare, Serial, and Modern Constructions of Race.” Shakespeare Quarterly 67, no. 1: 30–50.
Daro, Ishmael N. 2018. “How a Hoax about Muslims Wanting to Ban Dogs in Public Keeps Going Viral.” BuzzFeed News, April 6. www.buzzfeednews.com/article/ishmaeldaro/hoax-muslims-ban-dogs-for-public-purity-manchester.
Deleuze, Gilles, and Félix Guattari. (1980) 1987. A Thousand Plateaus. Translated by Brian Massumi. Minneapolis: University of Minnesota Press.
Diresta, Renee. 2018. “How the Tech Giants Created What Darpa Couldn’t.” Wired, May 29. www.wired.com/story/darpa-total-informatio-awareness/.
Dixon-Román, Ezekiel. 2016. “Algo-Ritmo: More-Than-Human Performative Acts and the Racializing Assemblages of Algorithmic Architectures.” Cultural Studies, Critical Methodologies 16, no. 5: 482–90.
Drucker, Johanna. 2011. “Humanities Approaches to Graphical Display.” Digital Humanities Quarterly 5, no. 1: paragraph 8. www.digitalhumanities.org/dhq/vol/5/1/000091/000091.html.
Enehey Group. 1999. “Report on Islamic Thought and Culture with Emphasis on Pakistan: A Comparative Study Relevant to the Upcoming Trial of Adnan Syed.” undisclosed-podcast.com/docs/6/Consultant%27s%20Report%20on%20Islamic%20Thought%20and%20Culture.pdf.
F., Huda (@yesimhotinthis). 2018. “My iPhone just corrected hijab to hijacked.” Twitter, July 28, 8:01 p.m. twitter.com/yesimhotinthis/status/1023357836603080710.
Farokhi, Zeinab. 2021. “Cyber Homo Sacer: A Critical Analysis of Cyber Islamophobia in the Wake of the Muslim Ban.” Islamophobia Studies Journal 6, no. 1 (Spring): 14–32.
Feldmann, Linda. 2013. “Obama on NSA Data-mining: ‘Nobody is Listening to Your Telephone Calls.’” Christian Science Monitor, June 7. https://www.csmonitor.com/USA/Politics/DC-Decoder/2013/0607/Obama-on-NSA-data-mining-Nobody-is-listening-to-your-telephone-calls.
Gualtieri, Sarah M. A. 2020. Arab Routes: Pathways to Syrian California. Stanford, CA: Stanford University Press.
Jarmakani, Amira. 2015. An Imperialist Love Story: Desert Romances and the War on Terror. New York: New York University Press.
Jarmakani, Amira. 2020. “Shiny, Happy Imperialism: An Affective Exploration of Ways of Life in the War on Terror.” In Affect and Literature, edited by Alex Houen, 373–89. Cambridge: Cambridge University Press.
Kang, Jay Caspian. 2014. “White Reporter Privilege.” Awl, November. www.theawl.com/2014/11/white-reporter-privilege/.
Kantayya, Shalini, dir. 2020. Coded Bias. Coproduced by Sabine Hoffman. Released November 11. Brooklyn, NY: 7th Empire Media.
Kitchin, Rob. 2014. The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences. Los Angeles, CA: Sage.
Kitchin, Rob, and Martin Dodge. 2011. Code/Space: Software and Everyday Life. Cambridge, MA: MIT Press.
Koenig, Sarah. 2014. “The Best Defense is a Good Defense.” In Serial, podcast, December 4. serialpodcast.org/season-one/10/the-best-defense-is-a-good-defense.
Kundnani, Arun. 2015. The Muslims Are Coming! Islamophobia, Extremism, and the Domestic War on Terror. London: Verso.
Lott, Eric. 1993. Love and Theft: Blackface Minstrelsy and the American Working Class. Oxford: Oxford University Press.
Lytvynenko, Jane. 2019. “Anti-Muslim Hate Speech Is Absolutely Relentless on Social Media Even as Platforms Crack Down on Other Extremist Groups.” BuzzFeed News, March 18. www.buzzfeednews.com/article/janelytvynenko/islamophobia-absolutely-relentless-social-media.
MacKenzie, Lars Z. 2017. “The Afterlife of Data: Identity, Surveillance, and Capitalism in Trans Credit Reporting.” Transgender Studies Quarterly 4, no. 1: 45–60.
Masco, Joseph. 2021. The Future of Fallout, and Other Episodes in Radioactive World-Making. Durham, NC: Duke University Press.
Miller, Andrea. 2019. “Shadows of War, Traces of Policing: The Weaponization of Space and the Sensible in Preemption.” In Captivating Technology: Race, Carceral Technoscience, and Liberatory Imagination in Everyday Life, edited by Ruha Benjamin, 86–106. Durham, NC: Duke University Press.
Naber, Nadine. 2008. “‘Look, Mohammed the Terrorist is Coming!’: Cultural Racism, Nation-Based Racism, and the Intersectionality of Oppressions after 9/11.” In Race and Arab Americans before and after 9/11: From Invisible Citizens to Visible Subjects, edited by Amaney Jamal and Nadine Naber, 276–304. Syracuse, NY: Syracuse University Press.
The National Military Strategy of the United States of America: A Strategy for Today, a Vision for Tomorrow. 2004. Washington, DC: Joint Chiefs of Staff.
Nguyen, Mimi Thi. 2015. “Hoodie as Sign, Screen, Expectation, and Force.” Signs: Journal of Women in Culture and Society 40, no. 4: 791–816.
O’Neil, Cathy. 2016. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York: Crown.
Panetta, Grace. 2019. “Here’s Everything We Know about the Persistent but Unproven Rumors that Rep. Ilhan Omar Married Her Brother, which Trump Repeated at a Recent Rally.” Business Insider, October 11. www.businessinsider.com/unproven-allegations-ilhan-omar-married-her-brother-explained-2019-7.
Puar, Jasbir, and Amit Rai. 2002. “Monster, Terrorist, Fag: The War on Terrorism and the Production of Docile Patriots.” Social Text 20, no. 3: 117–48.
Rana, Junaid. 2016. “The Racial Infrastructure of the Terror-Industrial Complex.” Social Text 34, no. 4: 111–38.
Razack, Sherene H. 2008. Casting Out: The Eviction of Muslims from Western Law and Politics. Toronto: University of Toronto Press.
Razack, Sherene H. Forthcoming. Nothing Has to Make Sense: Anti-Muslim Racism, White Supremacy, and Law. Minneapolis: University of Minnesota Press.
Roy, Samya Brata. 2021. Comment posted in the #Coded-bias channel of the Anti-Racist CCS Reading Group. Discord, February 19.
Salaita, Steven. 2015. Uncivil Rites: Palestine and the Limits of Academic Freedom. Chicago: Haymarket Books.
Scheer, Robert. 2015. They Know Everything about You: How Data-Collecting Corporations and Snooping Government Agencies Are Destroying Democracy. New York: Nation Books.
Shah, Nayan. 2001. Contagious Divides: Epidemics and Race in San Francisco’s Chinatown. Berkeley: University of California Press.
Sharpe, Christina. 2016. In the Wake: On Blackness and Being. Durham, NC: Duke University Press.
Sheehi, Stephen. 2011. Islamophobia: The Ideological Campaign against Muslims. Atlanta: Clarity Press.
Stop LAPD Spying Coalition. 2018. “Before the Bullet Hits the Body: Dismantling Predictive Policing in Los Angeles.” May 8. stoplapdspying.org/wp-content/uploads/2018/05/Before-the-Bullet-Hits-the-Body-May-8-2018.pdf.
Stop LAPD Spying Coalition and Free Radicals. 2020. “The Algorithmic Ecology: An Abolitionist Tool for Organizing against Algorithms.” Medium, March 2. stoplapdspying.medium.com/the-algorithmic-ecology-an-abolitionist-tool-for-organizing-against-algorithms-14fcbd0e64d0.
Trump, Donald J. 2017. “Executive Order Protecting the Nation from Foreign Terrorist Entry into the United States.” Trump White House Archives, January 27. trumpwhitehouse.archives.gov/presidential-actions/executive-order-protecting-nation-foreign-terrorist-entry-united-states/.
Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. London: Profile Books.