This article analyzes gradual disempowerment through the lens of exploitation. By exploitation I mean circumstances in which actors formally possess certain powers yet cannot exercise them in ways that serve their interests. A subsequent article extends this historical overview by examining potential remedies.
Societies are complex systems whose political, economic, and cultural norms are shaped by their laws and power dynamics. Technology is a further force that can alter these arrangements. Transportation and communication technologies define the paths of progress: infrastructural innovations such as steam, electricity, and semiconductors fundamentally alter the frameworks within which further innovation occurs. The two transitions most often cited across disciplines are the shift from agricultural to industrial society and the more gradual shift from industry to services. Power dynamics can also change during these revolutions. After the industrial revolution, land ownership and feudal political and economic structures did not translate into capitalist power, which allowed new elites to rise. [1]
Another power shift came with globalization. Manuel Castells identified the Thatcher reforms as the primary catalyst of a new orthodoxy: widespread deregulation and liberalization of markets, both national and international, coupled with tax reductions for wealthy corporations. The most powerful players benefited from global flows of investment shaped by liberalized financial markets and the World Trade Organization. The result was a new dynamism of increased profits, investment, and economic growth, at least in the core countries, in the midst of a sea of poverty and marginalization. [2]
Policies associated with “trickle-down” economics (lower corporate taxes and lighter regulation) were expected to spur investment, increase employment, and accelerate growth, thereby benefiting society at large. Recent empirical studies indicate that these policies have not delivered the promised outcomes; instead, they are chiefly associated with widening inequality. [3] As Walter Scheidel argues in The Great Leveler, [4] large reductions in inequality have historically been driven only by major catastrophes – total war, violent revolution, state collapse, and pandemics. This implies that, in periods of political and economic stability, advancing social and economic mobility through collective action is substantially more difficult than previously assumed. Yet the future I prefer is one in which most large-scale catastrophes are prevented. Accordingly, mechanisms that foster mobility must be embedded in ordinary institutional design rather than triggered by crisis. Current forms of exploitation orient such efforts by exposing, and thereby specifying, the moral shortcomings [5] of today’s economic elites.
Variations in power relations have produced distinct forms of exploitation. The exploitation of commoners in the feudal era is well documented. Principal mechanisms included bondage and corvée labor. Forms of rent extraction, such as dues, also persisted, in modified forms, into the present. The depth of this asymmetry is reflected in notorious practices that lingered in local memory, including the so-called “right of the first night.”
In the industrial era, overt forms of forced labor largely receded; instead, workers were compelled to sell their labor power—treated as a (so-called) “fictitious commodity”—on the market. With traditional support systems absent, survival required wage income; the alternative was destitution. Compulsion thus shifted from political, customary, or legal coercion to material necessity, grounded in hunger and other biological needs. The market order reoriented incentives, replacing the pre-industrial motive of subsistence [6] with the pursuit of monetary gain.
While traditional critiques emphasize child labor and intolerable working conditions, Karl Polanyi argues that the most profound exploitation under a market system is not primarily economic but social and cultural. The factory regime generated social dislocation on a vast scale, undermining social status, neighborhood ties, family structures, and customary standards of life. Coupled with the commodification of labor, the erosion of professional standards, and extreme volatility of earnings, these disruptions produced a broader degradation of human life. [7]
Society responded to these threats with what Polanyi calls a “protective countermovement.” Its aims were to constrain wage flexibility, strengthen the security of status and income, and limit the mobility of labor. Instruments included social insurance, trade unions, and factory legislation. Although such measures were essential to mitigate cultural collapse, other forms of exploitation expanded simultaneously.
Product adulteration has been common since the Middle Ages; profits were also increased by promoting addictive goods while downplaying their harms. Recent scholarship has documented this strategy of leveraging consumer vulnerabilities:
A Czech epidemiologist and public health specialist at the National Institute of Public Health observed that stigmatizing people with obesity is unjust, noting that contemporary environments make weight gain easy. [8] This observation is instructive: although many unsupported marketing claims have been curtailed and cigarette packages now carry mandatory health warnings, processed sugary foods remain largely unregulated. These products are still widely available—often more accessible than healthier alternatives.
Digital media provide firms with new avenues of exploitation that do not rely on physiological vulnerabilities or inherently harmful product features. The locus of exploitation is increasingly cognitive. It rests less on intrinsic product attributes and more on scalable sales techniques and interaction architectures enabled by the digital economy.
These newer practices include deceptive advertising, fabricated reviews, dark-pattern design, algorithmic manipulation, hidden fees and subscription traps, and novel forms of quality misrepresentation. Although these phenomena remain globally understudied, many cases have already been documented:
In a widely cited study, Michal Kosinski and colleagues showed that sensitive personal attributes—such as sexual orientation, ethnicity, religion, political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age, and gender—can be accurately inferred from readily available Facebook “Likes” data. [9] Within three years, Cambridge Analytica operationalized precisely the danger the authors had flagged. During the 2016 Trump campaign, the firm misused the personal data of tens of millions of U.S. voters to enable targeted political messaging. [10]
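To make the inference mechanism concrete, the following is a minimal sketch of how a Likes-to-traits model can work in principle. Everything here is an illustrative assumption, not the authors' actual pipeline: the data are synthetic, the number of users and pages is invented, and a plain logistic regression stands in for whatever models were actually used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 1000 users x 50 pages; entry 1 means the user "Liked" the page.
# Hypothetical setup: the first 5 pages correlate with a hidden binary trait.
n_users, n_pages = 1000, 50
trait = rng.integers(0, 2, n_users)             # sensitive attribute to infer
base = rng.random((n_users, n_pages)) < 0.1     # background Like probability
signal = np.zeros((n_users, n_pages), dtype=bool)
signal[:, :5] = (rng.random((n_users, 5)) < 0.5) & (trait[:, None] == 1)
likes = (base | signal).astype(float)

# Plain logistic regression trained by gradient descent (no external libraries).
w = np.zeros(n_pages)
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(likes @ w + b)))  # predicted probability of trait
    w -= 0.5 * (likes.T @ (p - trait)) / n_users
    b -= 0.5 * np.mean(p - trait)

pred = (1.0 / (1.0 + np.exp(-(likes @ w + b)))) > 0.5
accuracy = np.mean(pred == trait)
print(f"accuracy: {accuracy:.2f}")  # well above the 50% chance level
```

The point of the sketch is only that weak, individually innocuous signals (a handful of Likes) become jointly predictive of a sensitive attribute once aggregated across many users.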
Targeting based on sensitive personal data is only one of several novel online dynamics. Many scholars have linked political polarization and radicalization to social media, yet the causal mechanism has often been misidentified. Early accounts emphasized “filter bubbles”: homophilous grouping and similarity-based recommendation systems that purportedly reinforce prior beliefs. [11] Subsequent work points to a different mechanism: exposure to dissimilar or antagonistic views can intensify radicalization. In everyday offline contexts, people are typically embedded among the like-minded; online, abrupt encounters with conflicting opinions are more frequent and can be incendiary. [12] Unlike deliberate, targeted exploitation, this pattern emerged as a byproduct of platforms’ optimization for time spent. Because advertising revenues scale with attention, platforms calibrate content to maximize scrolling and session length. High-arousal emotions—especially anger and moral outrage—are particularly engaging and therefore become privileged in ranking. [13] Anger and polarization are not the only consequences of time-on-site optimization. The prominence of short-form, highly engaging content (e.g., reels and quick videos) is associated with diminished capacity for sustained attention and with lapses in prospective memory (the ability to remember intended tasks). [14]
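The ranking incentive described above can be caricatured in a few lines. This is a toy model, not any platform's actual algorithm: the class names, fields, and weights are all invented for illustration; the only assumption carried over from the text is that predicted engagement, boosted by emotional arousal, drives ordering.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_watch_seconds: float  # model's estimate of dwell time
    outrage_score: float            # 0..1, estimated emotional arousal

def feed_score(post: Post, outrage_weight: float = 30.0) -> float:
    # Engagement-optimized ranking: expected dwell time, boosted by
    # high-arousal emotional content (the hypothesized ranking bias).
    return post.predicted_watch_seconds + outrage_weight * post.outrage_score

posts = [
    Post("Calm in-depth explainer", predicted_watch_seconds=40, outrage_score=0.05),
    Post("Outrage-bait clip",       predicted_watch_seconds=25, outrage_score=0.9),
]
ranked = sorted(posts, key=feed_score, reverse=True)
print(ranked[0].title)  # the outrage-bait clip outranks the longer explainer
```

Even with a modest arousal weight, the short outrage item wins: no editor chose anger; the objective function did.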
The information asymmetry is evident in both scenarios. In the first, third parties purchase clickstream data, infer sensitive personal attributes, and deliver content designed to shift behavior along predictable lines. In the second, so-called “behavioral surplus” [15] (fine-grained signals such as brief pauses while scrolling) reveals what sustains attention, and this knowledge is used to keep users in front of screens indefinitely.
Given recent U.S. trends toward deregulating large technology firms and industry more generally, transparency about how platforms and their partners use data is likely to diminish. Reduced transparency enables the construction of more powerful predictive systems and choice architectures that steer individuals toward longer engagement or specific voting preferences. Targeting, microtargeting, and behavioral targeting may converge with AI-driven hyper-personalization, opening new dimensions of online marketing addictiveness and manipulation.
Beyond the business models of platforms and their clients—who purchase predictions about users’ shopping and voting behavior—a third set of actors complicates today’s online environment: content creators.
Their motives and tactics vary, but several strategies are frequently deployed for political purposes:
Identity deception (bots, sockpuppets, troll farms): actors conceal or falsify their identity or affiliation.
Example: Russia’s Internet Research Agency (IRA), a troll farm with a reported monthly budget of roughly US$1.25 million by September 2016 [16], operated at scale in 2018: about 3,841 accounts on Twitter/X [17], 70 Facebook accounts, 138 Facebook pages, and 65 Instagram accounts. [18]
Coordination (Coordinated Inauthentic Behaviour (CIB) networks, hashtag brigades): organizers hire real people, often via Telegram, to translate and propagate content through their own accounts.
Example: In Moldova, paid brigades recruited on Telegram posted fabricated news to TikTok, Facebook, and Instagram in an effort to influence an election; the network was later infiltrated and documented. [19]
Amplification markets (paid engagement, follower buys, page/group resale, paid influencers and meme pages, PR firms): some creators pursue virality regardless of informational value, amassing large follower pools that can be sold. Real user identities are sometimes stolen to create fake accounts that provide inauthentic engagement, which many public figures purchase to increase their reach.
Example: Devumi, exposed by The New York Times in 2018, sold large quantities of fake influence using automated accounts. [20]
Content deception (domain spoofing, cloned outlets, manipulated media): networks clone reputable outlets by registering look-alike domains and copying their design to publish fabricated articles.
Example: EU DisinfoLab documented a Russia-based operation active in Europe since at least May 2022 that spoofed sites such as Bild, 20 Minutes, ANSA, and The Guardian. [21]
Feeds are saturated with a continuous stream of news, faux DIY videos that exploit the desire for quick instruction, and AI-generated clips ranging from catastrophes to cute animals. What effects does this abundance of content produce? Gleick describes the phenomenon as an “information flood”. [22] Once deployed as a tactic of hybrid warfare, [23] a similar logic now permeates everyday media ecosystems: vast volumes of low-quality, often AI-generated material created for divergent purposes. The consequences are consistent – heightened anger, diminished capacity for sustained attention, and in some cases a reduced ability to identify reliable information. To counter these forms of emotional, attentional, and cognitive exploitation, the subsequent article outlines several potential remedies.
[1] “The government of the Crown gave place to government by a class-the class which led in industrial and commercial progress. The great principle of constitutionalism became wedded to the political revolution that dispossessed the Crown, which by that time had shed almost all its creative faculties, while its protective function was no longer vital to a country that had weathered the storm of transition” - Polanyi, K. The Great Transformation: Economic and Political Origins of Our Time. New York: Rinehart. 1944. p.41
[2] Castells, M. Informationalism, Networks, and the Network Society: A Theoretical Blueprint. THE NETWORK SOCIETY. 10.4337/9781845421663.00010. 2004.
[3] Hope, D., & Limberg, J. The economic consequences of major tax cuts for the rich. 2022. https://eprints.lse.ac.uk/107919/1/Hope_economic_consequences_of_major_tax_cuts_published.pdf
[4] Scheidel, W. The Great Leveler. Princeton University Press. 2018.
[5] I primarily refer to practices that securitize elite power. These include tax evasion, the acquisition or control of media outlets, and a broad array of anti-competitive strategies. Studying these practices through the lens of exploitation is useful because it reveals the institutional sites where elites exercise unwarranted power, closer to the places where it is actually taken from ordinary citizens.
[6] "In almost all known societies prior to the modern market system, the economic system was submerged in social relationships. The motive was often reciprocity or redistribution, and the orderly production of goods was secured through a great variety of individual motives disciplined by custom, law, magic, and religion, where gain was not prominent" - Polanyi, K. The Great Transformation: Economic and Political Origins of Our Time. New York: Rinehart. 1944.
[7] Polanyi, K. The Great Transformation: Economic and Political Origins of Our Time. New York: Rinehart. 1944.
[9] Kosinski, M., Stillwell, D., & Graepel, T. Private traits and attributes are predictable from digital records of human behavior. PNAS. 2013.
[11] Milano, S., Taddeo, M., & Floridi, L. Recommender systems and their ethical challenges. AI & SOCIETY, 35. https://doi.org/10.1007/s00146-020-00950-y, 2020. Chapter 4.6 Social effects, p. 964.
[12] Törnberg, P. How digital media drive affective polarization through partisan sorting, Proc. Natl. Acad. Sci. U.S.A. 119 (42) e2207159119, https://doi.org/10.1073/pnas.2207159119, 2022.
[13] Rosen, Z. P., & Walther, J. B. Social Processes in the Intensification of Online Hate: The Effects of Verbal Replies to Anti-Muslim and Anti-Jewish Posts Following 7 October 2023. Social Media + Society, 11(4), 2025. https://doi.org/10.1177/20563051251383635
[14] Haliti-Sylaj, T., & Sadiku, A. Impact of short reels on attention span and academic performance of undergraduate students. Eurasian Journal of Applied Linguistics. 2024.
[15] Zuboff, Sh. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York: PublicAffairs, 2019.
[18] https://about.fb.com/news/2018/04/authenticity-matters/
[19] Rigged: Undercover in a fake news network - BBC World Service Documentaries [https://www.youtube.com/watch?v=pf8arQ03-lc]
[20] https://www.nytimes.com/interactive/2018/01/27/technology/social-media-bots.html
[21] https://www.disinfo.eu/doppelganger/
[22] Gleick, J. The information: A history, a theory, a flood. Pantheon/Random House, 2011.
[23] Paul, Ch. & Matthews, M. The Russian “Firehose of Falsehood” Propaganda Model - Why It Might Work and Options to Counter It. Perspective. https://www.rand.org/content/dam/rand/pubs/perspectives/PE100/PE198/RAND_PE198.pdf