
Read an Excerpt from “Data Grab” by Ulises A. Mejias and Nick Couldry

Large technology companies like Meta, Amazon, and Alphabet have unprecedented access to our daily lives, collecting information when we check our email, count our steps, shop online, and commute to and from work. In Data Grab: The New Colonialism of Big Tech and How to Fight Back, Ulises A. Mejias and Nick Couldry show that this vast accumulation of data is not the accidental stockpile of a fast-growing industry. Just as nations once stole territories for the minerals, crops, wealth, and dominance they yielded, tech companies now steal the personal data that matters most to our lives. It’s only within the framework of colonialism, Mejias and Couldry argue, that we can comprehend the full scope of this heist. The following excerpt from their introduction begins to outline their eye-opening argument.



From Landgrab to Data Grab

The advisors to King Lobengula were suspicious of the telegraph wires being stretched across their land by the British South Africa Company in the late nineteenth century. They believed the white men’s plan was to use the wires to tie and restrain their king, ruler of the Northern Ndebele people in Matabeleland. Even when the official purpose of the telegraph was explained to them, they were still dismissive. Why would such a thing be needed, they asked, when they already possessed effective means of long-distance communication such as drums and smoke signals?

To many, this might sound like a familiar story: the story of premodern people standing in the way of inevitable progress, or the story of misguided resistance to a technology that eventually paved the way to a better world.

But the advisors to King Lobengula were justified in their suspicions. The British South Africa Company, led by Cecil Rhodes, declared war against the Ndebele in 1893 and continued with the suppression of the Matabeleland and Mashonaland uprisings in 1896. One of the pretexts used to wage war was that the locals were stealing the copper wire to make ornaments and hunting tools. The telegraph was important for other reasons too. From a military perspective, it would prove to be crucial for orchestrating the colonization of southern Africa, including what would become Southern Rhodesia (now Zimbabwe). It would have been much more difficult to coordinate troop movements and send alerts without it. As a result of those wars, by 1930 about 50 percent of the country’s land—49 million acres—had been granted to European migrants, who represented only 5 percent of the population.

In other words, it was a landgrab. Colonialism may have proceeded by different methods at different times and different places in history, but in the end it always boiled down to the same thing: a seizing of land (and the riches and labor that went with it) perpetrated by force or deception.

Two things made colonialism distinct from other asset seizures in history. First, this landgrab was global, reaching truly planetary proportions. From 1800 to 1875, about 83,000 square miles from all over the world were acquired each year by European colonizers. From 1875 to 1914, that figure jumped to 240,000 square miles per year. By the end of that period, Britain had 55 colonies, France 29, Germany 10, Portugal and the Netherlands 8 each, Italy 4, and Belgium 1. Colonialism is a story not only about the Ndebele in Zimbabwe, but also about the Bororo in Brazil and the countless other peoples who witnessed the simultaneous arrival of the telegraph, the rifle, and the cross—or whatever specific combination of colonial technologies, weaponry, and beliefs they were colonized with. For none of them did these things bring peace and progress, only dispossession and injustice.

The second point is that colonial stories have long lives. We are not just talking about an isolated war here or the introduction of a technology there. Colonialism is a process that took centuries to unfold, and its repercussions continue to be felt. To put it differently: the historic landgrab may be over (obviously, southern Africa is no longer a colony of the British), but the impacts of the landgrab continue to reverberate. Compare present-day England and Zimbabwe, and you soon realize that, overall, the benefits have continued to accrue to the colonizer nation in the form of accumulated wealth, while the burdens have continued to accrue to the colonized in the form of poverty, violence, and lack of opportunity. We are increasingly sensing an urgent need to reinterpret our past and present in the light of that colonial landgrab.

But something else is amiss today, something that goes beyond this necessary reckoning with the past. Colonialism lives on in another way, through a new kind of landgrab. It is still new, but we can already sense how it could reshape our present and our future just as significantly as the old one.

This latest seizure entails not the grabbing of land, but the grabbing of data—data that is potentially as valuable as land, because it provides access to a priceless resource: the intimacy of our daily lives, as a new source of value. Is the exploitation of human life an entirely new phenomenon? Of course not. But this new resource grab should concern us because it exhibits some very colonial characteristics. It is global: nowhere is human life safe from this form of exploitation. It is very large-scale: the worldwide user bases of Facebook, YouTube, WhatsApp, and Instagram each exceed the populations of China and India, the world’s largest countries, with the Chinese platforms WeChat and TikTok not far behind. It is creating unprecedented wealth based on extraction: Big Tech companies are among the wealthiest in the world (for instance, with its stock market value of US $2.9 trillion, Apple is bigger than the entire stock market of any country in the world except the US and Japan). It is shaping the very structure of the world’s communications, with experts worried that the world’s two largest data powers, the US and China, are increasingly associated with exclusive networks of undersea communications cables. And most importantly, it continues the legacy of dispossession and injustice started by colonialism.

This book is the story of this data grab, and why it represents a reshaping of the world’s resources that is worthy of comparison to historical colonialism’s landgrab. It is the story, in other words, of a data colonialism that superimposes a data grab over the historical landgrab. We already know how the colonial story develops. To get a preview of the kinds of long-term impacts data colonialism will likely have, we don’t need to engage in hypotheticals. We need only to look at the historical record. Our present and not just our past is irredeemably colonial, and the new data colonialism is a core part of that.

The Four X’s of Colonialism

Today, you are the King of England. But just as easily you could have ruled over colonial Spain, France, or the Netherlands. Regardless of your choice, the task ahead of you is essentially the same: there are territories to be settled, resources to be traded, cities to be built, and native populations to be pacified. A fair amount of ambition and greed seems to be a requirement for the job.

Explore, expand, exploit, and exterminate—the tools of your trade. With a few clicks, you apply these strategies in succession as you establish your empire. Then you apply them again. And again. And if your empire should fall, throttled by the competition or vanquished in a war, that’s not a problem. You can simply start anew, because this is only a videogame: Sid Meier’s Colonization, a turn-based strategy game released in 1994 (reissued in 2008).

Explore. Expand. Exploit. Exterminate. This is the time-tested “Four-X” formula for playing strategy video games. But it is also a fair summary of the formula applied by European colonizers to create vast fortunes for themselves, vast misery for everyone else, and, in the process, completely reshape the organization of the world’s resources.

Colonialism was a complicated project that required complicated enterprises. We’ve mentioned the British South Africa Company already, and there was of course the East India Company as well. The Spanish had the Casa de la Contratación de las Indias, while the Portuguese founded the Companhia do Commércio da Índia. The Dutch had their own Verenigde Oostindische Compagnie, which during the seventeenth and eighteenth centuries employed over a million Europeans to work in Asia, exporting 2.5 million tons of goods, and which was legally sanctioned to declare war, engage in piracy, establish colonies, and coin money. All of these companies had close links with their respective nation’s rulers, and complex bureaucracies emerged around them.

In their operations, they neatly followed the Four-X model. They explored the world by launching missions to “discover” new places they could control through military and technological means; they expanded their dominions by establishing colonies where native labor and resources could be appropriated by force; they exploited those colonies by setting up a global system of trade where those resources could be converted into wealth, always to the advantage of the colonizer; and they exterminated any opposition by the colonized, in the process eliminating their ways of being in the world. From 1492 to about the middle of the twentieth century, that’s the story of colonialism in a nutshell. By applying the Four-X model, European colonizers managed to control over 84 percent of the globe, even though Europe represents only 8 percent of the planet’s landmass.

Let’s see how this matches the actions of Big Tech corporations.

Today, Big Tech’s efforts to explore and expand don’t involve continental land, but the virtual territories of our datafied lives: our shopping habits, for sure, but also our interactions with family, friends, lovers, and coworkers, the space of our homes, the space of our towns, our hobbies and entertainment, our workouts, our political discussions, our health records, our commutes, our studies, and on and on. There is hardly a territory or activity that is beyond this kind of colonization, and there is hardly a corner of the world that remains untouched by its technologies and platforms.

But, as with historical colonialism, that territorial capture is just the start. Once colonies were established, a system was put in place for the continuous extraction of resources from these territories, and for the transformation of these resources into riches. Big Tech has achieved a similar feat of exploitation by setting up business models that convert “our” data—that is, data resulting from tracking our lives and those of others—into wealth and power for them (but not for us). At the micro level, this means that our data is used to target us individually through advertising or profiling. At the macro level, this means that our data is aggregated and used to make decisions or predictions impacting large groups of people, such as the training of an algorithm to discriminate based on race, gender, economic status, or medical condition. This is possible thanks to a rearrangement of many aspects of our daily lives in ways that ensure we are continuously generating data.

Which brings us to the fourth “X,” where the picture is more complex. In history, colonial extermination took many forms. Principally, there were deaths caused by war, mass suicide, disease, starvation, and other forms of violence: 175 million Indigenous people in the Americas at the hands of the Spanish, Portuguese, British, and Americans; 100 million in India at the hands of the British; 36 million Africans who perished in transit during the transatlantic slave trade (in addition to those who perished as slaves once they arrived); one million in Algeria killed by the French; hundreds of thousands in Indonesia killed by the Dutch; and millions more who cannot be easily counted.

But brutal physical violence was not the only option. Early on, colonizers realized they needed to be able to deploy other forms of extermination that eliminated not just individual lives, but also the economic and social alternatives to colonialism (which in itself entailed the extermination of life, but at a slower rate). One strategy was the imposition of agricultural monocultures that were highly profitable for the colonizer but destroyed the ability of the colonized to feed themselves. Think of the Dutch investment in coffee production in the East Indies, which went from a harvest of one hundred pounds (45 kilograms) in 1711 to twelve million pounds (5.4 million kilograms) in 1723. Or think of the colonial sugar trade, which created great poverty and misery in the Caribbean while contributing a significant 5 percent to the British gross domestic product at its peak during the eighteenth century (without slavery, sugar would just have been too expensive for most British people to consume).

Another strategy of (economic) extermination was the throttling of business opportunities through the flooding of markets with cheap goods that eliminated homegrown industries. An example of this is the British cotton trade, which inundated global markets with cheap machine-made textiles that destroyed the lifestyles and livelihoods of domestic cultivators, spinners, and weavers in colonies such as India, not to mention the devastating human cost paid by plantation slaves in America. Throughout the colonial world, instructions like the following (sent from London to the governor of Quebec in 1763) were issued with the goal of retarding local industry: “it is Our Express Will and Pleasure, that you do not, upon any Pretence whatever . . . give your Assent to any Law or Laws for setting up any Manufactures . . . which are hurtful and prejudicial to this Kingdom.”

The monopolistic and anti-competitive practices of Big Tech are also having disruptive effects. The scale on which they operate cannot be ignored: if as late as 1945 one in three people on the planet was living under colonial rule, today around one in three people on the planet has a Facebook account, and almost everyone uses a search engine of some sort. The contexts and impacts are obviously different, but this resemblance in scale means that companies like Meta—which now owns Facebook, Instagram, and WhatsApp—or OpenAI have a lot of power over the lives of a lot of people. Meta’s power, many have argued, has contributed to the spread of misinformation and hate amidst genocidal violence (as in Myanmar), health crises (anti-vaccine disinformation), and political interference (the Cambridge Analytica scandal). Meanwhile, Sam Altman, the CEO of OpenAI, believes that the opportunity to solve humanity’s problems with Artificial Intelligence is so appealing that it is worth the risk of destroying the world as we know it. In other words, if AI ends up massively disrupting social values and institutions, as many experts claim could happen, Altman thinks it will be worth paying this price because of the problems AI will solve in the process. But others are not so sure this is a good bargain, which is why they are asking questions about where Big Tech’s new power to determine what is relevant, normal, acceptable, or true is heading.

Forms of economic and cultural extermination will, necessarily, take time to unfold, but they are the potential consequences of a change we can already see: a major shift in power relations that flows from the capture of virtual territories. Meanwhile, a very different story is being told about data, told with a much more positive twist. And here too there is a historical parallel. Colonialism has always required a strong civilizing mission, an imposed worldview that dismissed all alternatives and rendered invisible all contributions emanating from the colonized. This worldview allowed the colonizers to control not just bodies, but hearts and minds as well. In the past, Christianity and Western science were the cornerstones of this civilizing mission. They delineated the path towards the salvation of colonized souls and promised them a share in humanity’s scientific progress, provided they remained within their assigned roles.

Big Tech too has a civilizing mission that is mixed up with its technologies and business goals. Part of this civilizing mission continues to revolve around Western science: network science, data science, computer science, and so on. The other part no longer revolves around Christianity, but around parallel sublime notions like the convenience that will supposedly make all our lives easier, the connectivity that apparently will bring new forms of community, and the new forms of science and Artificial Intelligence associated with machines that purportedly can solve problems better than humans. It’s not as if some of these dreams are not becoming real for a select few; it’s just that they risk becoming nightmares for everyone else in the form of lost livelihoods, new forms of exploited labor, and the loss of control over vital personal data.

Civilizing missions, economic motives, the exercise of power, and the introduction of specific technologies have been deeply intermingled throughout the history of colonialism, but always with an uneven impact that favors some but not others. Take the introduction of the electrical grid across the Madras Presidency in India in the early twentieth century. Electricity was considered a triumph of Western science over the “devil of darkness,” and while it was initially used exclusively to improve the lives of white people as a display of cultural superiority, its application was eventually extended to the rest of the population as a kind of advertisement for the supposed benefits of colonialism. It powered cinemas, illuminated public spaces, propelled tramcars, and provided energy to places like hospitals—all while generating income for British companies. But beyond these comforts, amusements, and public services, electricity also ran the lighthouses that guided ships carrying colonial goods, powered weapon factories, electrified the barbed-wire fences that kept prison populations in check, extended the operating hours of offices and printing presses carrying out the colonizer’s administrative work, increased revenue by accelerating industrial and agricultural production, and provided the backbone for the communication and transportation networks that guaranteed the smooth functioning of the empire.

Replace “electricity” with “data” and, while the specifics are different, some elements of the story remain eerily similar. Ways of processing data are also heralded as scientific achievements, a gift that promises convenience, connectivity, and new forms of intelligence. But look under the surface of this civilizing gift, and you will find that it also brings new forms of surveillance (through facial recognition or workplace monitoring), discrimination (when algorithms deny or control access to services based on people’s profiles), and exploitation (when gig workers’ wages are continuously adjusted downwards, for instance).

A discussion of the colonial legacy of Western science will be a recurring theme throughout the book, and this is a touchy subject. To point out the ways in which Western science has been used to justify social and environmental harms might come across as a wholesale dismissal of the benefits and contributions of science, which are many (not least to monitor and model the harms that humanity is currently doing to our environment and, if we can find them, to monitor potential solutions). In no way is our argument anti-science, nor do we wish to fan the flames of science denialism. But that doesn’t exempt us from facing head-on the important critiques that colonized peoples have made of the ways in which Western science was used during and after colonialism to control and exploit the natural and social realms. In fact, it is only by looking at contemporary science through this colonial lens that we see these continuities, which go back to the origins of modernity in general and of modern science in particular. That is all the more vital when this problematic legacy continues to shape developments like data science and AI, which have huge impacts on our present and our future. It is exactly as the Cameroonian philosopher Achille Mbembe has said: “Our era is attempting to bring back into fashion the old myth that the West alone has a monopoly on the future.”

Raw Materials

The dispossession of our data proceeds largely unabated. While it is the Big Tech companies that get the critical headlines, the data grab is not just down to a few rogue companies: it is happening at every scale, sometimes in dark corners and sometimes in plain sight.

Take, for example, Lasso, a leading marketer in the US health sector that you have almost certainly never heard of. Personal health data is widely assumed to be legally protected, but Lasso has found a way to offer a number of products for marketers who want to reach customers interested in healthcare, including Blueprint™, Connect™, and Triggers™. While Lasso, as one would expect, says it is compliant with US health data regulations, its ambitions for Blueprint™ are striking. To quote from its webpages:

Lasso Blueprint enables marketers to create high-value audiences composed of health providers and consumers based on diagnoses, medications, procedures, insurance data, demographic information, and much more. The product provides audience counts in real-time . . . audiences can also be dynamically refreshed on a weekly basis with the latest real-world data to ensure you [the marketer] never miss an opportunity to engage with your targets.

So, even if your health data remains strictly anonymized, it is collected in bundles that are then used to uniquely target you as a consumer with a particular medical history. It hardly matters whether your name is on the data package. All this relies on sophisticated data processing by Xandr, a company that was acquired by Microsoft for US $1 billion in 2022 and that operates far from the news headlines.

Data capture through surveillance for the purpose of marketing or algorithmic profiling was the biggest problem we had to worry about until recently. But the vertiginous evolution of AI in the last few years has proven that data colonialism is unleashing effects that may transform how we think or create—or, more specifically, how we allow machines to do thinking or creating for us. Consider what is happening in the art, culture, and media sectors with the arrival of natural language generation and generative AI tools. These programs are mimicking our creative endeavors using Artificial Intelligence algorithms, with increasing degrees of authenticity. Given a few text instructions, AI tools like ChatGPT, DALL-E, DeepDream, or DeepMind (the first two owned by OpenAI, the other two by Google) can generate text, images, sound, or human speech that not only seem like they were generated by a human, but can imitate a specific author like Jane Austen, a painter like Salvador Dalí, or a musician like Fela Kuti.

For all this to be possible—for the AI to copy what Austen, Dalí, or Kuti read, paint, or sound like—it needs to learn what other artists, or indeed all of us, read, paint, or sound like. In other words, it needs to analyze not only Austen’s novels, but other novelists’ works too, as well as what you have written; it needs to analyze not only Kuti’s voice, but your voice too. It can fetch them from repositories that may contain the videos we upload to social media platforms or the voice messages we leave on our friends’ phones.

Some people may not think that AI-generated derivatives mimicking someone, famous or not, are that big a deal. They can be seen as an entertaining gimmick, or perhaps a useful work tool (imagine being able to edit a voice or video message without having to re-record it). But questions about authenticity, about the value of original work, about our ability to recognize altered records, and about who controls the power to perform these feats are worthy of our consideration in these early days of generative AI.

And there’s a bigger point too: that our collective cultural and social products now serve as the extracted raw material on which AI relies. For example, Google’s MusicLM software, which can generate melodies based on instructions like “Meditative song, calming and soothing, with flutes and guitars,” was trained on 280,000 hours of music. Did Google pay to license all of the music it used for this purpose? Not likely. This is probably the reason why it decided to delay the release of this tool for a while. If the generated music were to sound too much like the source material from which it derived, this would open the door to potential lawsuits. But eventually Google released the program free of charge, like most of its products. In its final form, the AI will not comply with requests to copy specific artists or vocals, allowing Google to avoid potential charges of copyright infringement. But there is still a corporation expropriating our cultural production as source material to train a machine to do the work of humans, because machines will be able to do the work more quickly and cheaply.

From internet searches to cloud services to generative AI, Google doesn’t need to charge us to use its products, because we are the source material for its products. Allowing the public to use its tools for free (as a way to “empower the creative process,” in the case of MusicLM) cannot conceal the nature and scale of this data “heist,” as Naomi Klein has called it.

Whether hidden in dark corners or not, these are extensive acts of appropriation. We are talking about the capturing and monetization, through data, of our collective activities, our interactions with each other across time and space, our shared resources. The “cool” factor of “generative AI” is basically a device to distract from this.

Welcome to data colonialism. It is happening everywhere. It is an appropriation of resources on a truly colonial scale. A data grab that will change the course of history, just as the original colonial landgrab did five centuries ago.


Adapted and excerpted from Data Grab by Ulises A. Mejias and Nick Couldry. © 2024 by Ulises A. Mejias and Nick Couldry. All rights reserved.

Ulises A. Mejias is professor of communication studies at the State University of New York at Oswego. Nick Couldry is professor of media, communications, and social theory at the London School of Economics and Political Science and faculty associate at Harvard University’s Berkman Klein Center for Internet and Society. Together, they are the authors of The Costs of Connection: How Data Is Colonizing Human Life and Appropriating It for Capitalism.

Data Grab is available now from our website or your favorite bookseller.