Introduction
In the digital era, algorithms are integral to platform operations as they structure online interactions and govern information flow by curating search results, personalising advertisements and predicting behaviours, primarily through automated user data extraction and decision-making. Their capacity to operate autonomously is often framed as emblematic of a new form of power, defined by surveillance, profit-driven motives and opacity, as their detailed inner workings remain largely impenetrable (Pasquale, 2015; Thrift, 2005). These narratives frequently depict users as passive data sources, mere nodes in a network whose online presence serves only as raw material for algorithmic processes and value extraction (Zuboff, 2019). On the other hand, an emerging camp of debates offers more moderate views. While acknowledging the power asymmetries inherent in algorithmic systems, this line of research advocates for a more nuanced engagement with forms of digital user agency and algorithmic resistance. Scholars in this camp often frame user resistance as an integral component of digital ecosystems along with structures of digital power (Ettlinger, 2018), focusing on how users make sense of algorithms and what strategies they develop to navigate, and even resist, their determinations (Bishop, 2019; Karizat et al., 2021; Velkova and Kaun, 2021). Recent scholarship has further developed these views by situating algorithmic resistance within a continuum of manifestations of algorithmic agency, intended as “the reflexive ability of humans to exercise power over the outcome of an algorithm” (Bonini and Treré, 2024: 19), encompassing user acts ranging from overt contestation of data practices to small-scale actions that lack an explicit challenge to power.
In line with this growing body of more tempered perspectives, this article analyses how individuals resist algorithmic determinations on e-commerce platforms. Drawing on 27 interviews and 3 focus groups with online buyers, the paper explores tactics that users devise to resist pricing mechanisms, avoid undesired personalised recommendations, and circumvent visibility constraints. E-commerce platforms represent a key site for examining user–algorithm interactions, given their widespread use and their direct impact on users’ financial decisions, creating strong incentives for resistance. Yet, these platforms remain largely underexplored in current debates on algorithmic resistance, which have predominantly focused on gig economy platforms (Bonini and Treré, 2024; Hao and Freischlad, 2022; Nguyen-Thu and Munn, 2025), streaming sites (Siles et al., 2024), social media apps (Bishop, 2019; Duffy and Meisner, 2023; Karizat et al., 2021) and activist practices seeking to reclaim data sovereignty or to correct algorithmic biases (Beraldo and Milan, 2019; Velkova and Kaun, 2021). The article addresses this empirical gap, spotlighting e-commerce platforms as a valuable empirical site to examine how resistance unfolds in deeply routine, yet economically consequential, digital environments.
By integrating insights from theories of everyday resistance (De Certeau, 1984; Scott, 1985) and contemporary debates on algorithms (Bonini and Treré, 2024; Bucher, 2018; Ettlinger, 2018; Seaver, 2017), the article foregrounds a type of algorithmic resistance materialised through nonpolitical, creative tactics motivated by self-interest and aimed at securing an immediate economic advantage. A focus on these often-overlooked dimensions reveals agency expressed through individualised, pragmatic acts rather than formal political or collective action, effectively focussing on the most subtle, nonconfrontational end of the algorithmic agency spectrum (Bonini and Treré, 2024). Mirroring the pervasive nature of algorithmic power, such resistance is equally embedded in the everyday, often constituting the only viable option for users lacking the resources (or intention) for larger-scale contestation. The article also identifies algorithmic flexibility as a structural enabler of such resistance, allowing users to attain outcomes that benefit them or, at least, minimise their negative consequences. While algorithmic flexibility is a key aspect of algorithms’ controlling power that makes them adaptable, it also creates openings for negotiations. Variations in input data or user interactions generate a range of outcomes that create opportunities for users to strategically seize favourable possibilities. These features of flexibility have been highlighted by recent debates (Bucher, 2018; Roberge and Seyfert, 2016; Seaver, 2019), but they are often framed as inefficiencies or technical limitations, falling short of explicitly recognising them as strategic spaces for user resistance. In contrast, I argue that the flexibility embedded in algorithms makes resistance an inherent structural possibility, ontologically inseparable from algorithmic power.
Algorithmic systems, therefore, do not necessarily and deterministically exert unilateral power; rather, they inherently create spaces for potential resistance to unfold, enabling users to challenge (at least some of) their outcomes while catering to personal needs. By spotlighting the creativity, initiative and perspicacity of e-commerce platform users, a relevant yet overlooked empirical domain, the article contributes to ongoing critical reassessments of overly negative, at times dystopian, narratives on algorithmic governance, while consolidating bridges between platform studies and resistance theories.
Algorithmic decision-making power
The increasing penetration of digital platforms in everyday life has prompted a significant wave of scholarly debates about their impacts on individuals and societies (Beer, 2019; Zuboff, 2015). With the recent developments in big data analytics and machine learning, algorithms come to play an increasingly prominent role in determining the functioning of said platforms by efficiently processing large volumes of user data, and perhaps more importantly, by being at the heart of decision-making on who sees what and when, determining the distribution of visibility, resources and opportunities (Beer, 2019). Although a unified definition of algorithms in the social sciences is missing, most would describe them as “complex sociotechnical systems” that have “social implications” (Gillespie, 2016, 22). Algorithms are more than just lines of code, as they are deeply woven into the sociocultural fabric that shapes them and within which they operate. In fact, most scholars agree on considering algorithms embedded in social practices and outcomes. As Beer (2019) aptly put it: ‘As well as being produced from a social context, the algorithms are lived with, they are an integral part of that social world’ (p. 4), also echoing Zuboff's stance that sees electronic texts not as ‘things in themselves’, but as ‘inherently embedded in the social’ (Zuboff, 2015, 77).
Another point of broad consensus is the opaqueness of data extraction, of algorithmic workings, and of the ways in which decisions are taken. This becomes even more concerning given the ubiquity of algorithms and their extensive impact on everyday experiences. The phenomenon has been described with varying degrees of alarmist and dystopian tones, including the ‘technological unconscious’ (Thrift, 2005) or ‘a new kind of invisible hand’ (Zuboff, 2015). Particularly significant, both in scholarly discourse and popular language, is the metaphor of the ‘black box’ (Pasquale, 2015), which portrays algorithms as ‘systems whose workings are mysterious; we can observe its inputs and outputs, but we cannot tell how one becomes the other’ (Pasquale, 2015, 3). Such inscrutability, combined with the ability to make decisions with minimal or no human involvement, has been repeatedly framed as a manifestation of a new form of power that has the ability ‘to scrutinize others while avoiding scrutiny’ (Pasquale, 2015, 3). This is often operationalised as part of the capitalist logic of accumulation, aiming to dominate networked spaces for profit. In this sense, this power is also instrumentarian, namely operating through the ‘instrumentalization of behaviour for the purposes of modification, prediction, monetization, and control’ (Zuboff, 2019, 331).
Although users may benefit from the implementation of algorithms by enjoying personalised experiences, convenient online shopping, and greater connectivity, they also become subjected to opaque algorithmic decisions with little understanding of how or why these are made. This creates an ‘incredibly asymmetrical power relationship’ (Bonini and Treré, 2024, 3), where data collection and decision-making may come at the expense of users’ privacy, autonomy and creativity. Across domains like social media, streaming sites and e-commerce platforms, algorithms subtly steer users towards predefined outcomes with the overall goal of turning behavioural data into financial profit. These mechanisms not only shape what users see, buy and engage with, but may also constrain their ability to make independent choices, reinforcing a system in which algorithmically defined priorities dictate digital experiences while remaining largely unaccountable and elusive.
Algorithmic flexibility: Enabling power while inviting resistance
Despite platforms’ efforts to govern and influence them, users can challenge platform control through tactical and strategic acts of resistance. Given the perceived pervasiveness and inscrutability of algorithms, resistance may easily be overlooked, as it may seem beyond the reach of the average user. Yet, reality is far more complex. Emerging research shows that users are not mere recipients of algorithmic outcomes, but active agents who probe, navigate and strategically work to tilt outcomes in their favour. This is particularly evident on gig economy platforms, where algorithms structure labour and pay. For instance, gig workers in Hanoi strategically toggle their app availability and optimise routes to train algorithms (Nguyen-Thu and Munn, 2025), while couriers in Italy and China seek better pay via coordinated order refusals and generating fake orders (Bonini and Treré, 2024). Such resistance is frequently communal, as seen among Gojek drivers in Jakarta (Hao and Freischlad, 2022), who leveraged collective action to successfully pressure platform management for improved working conditions. In contrast, the collaborative dimension is less overt among female beauty vloggers, who engage in ‘algorithmic gossip’ to share strategies to navigate YouTube's recommender system (Bishop, 2019). On TikTok, producers strategically alter content to resist algorithmic bias against marginalised communities (Duffy and Meisner, 2023), while consumers curate feeds to match their “algorithmic selves” with their identities (Karizat et al., 2021). Elsewhere, the Swedish activist artist Johanna Burai initiated the World White Web campaign to “repair” Google's PageRank algorithm with the aim of diversifying racially biased image search results (Velkova and Kaun, 2021).
Bonini and Treré (2024) offer a comprehensive theoretical framework to encompass all these acts based on the concept of algorithmic agency, defined as ‘the reflexive ability of humans to exercise power over the outcome of an algorithm’ (p. 20). They conceptualise agency and resistance along a continuum, ranging from explicit challenges to platform power to mundane practices that interfere with algorithms without necessarily confronting structural domination.
Here, I focus on the latter category of mundane and subtle user actions, and I contribute by highlighting the role of algorithmic flexibility as a key enabler of micro-acts of resistance, particularly on e-commerce platforms. Recognising this flexibility is key to unpacking such resistance. Unlike traditional algorithms with fixed input–output relationships, machine learning algorithms, the most common type on today's online platforms, adapt by continuously ‘learning’ from input data and user interactions (Policarpo et al., 2021). Throughout their calculations, they apply human-defined thresholds and hypotheses to process data, meaning that they may not produce a consistent, deterministic output for a given input. It may even be challenging to know what exactly caused a certain outcome, or what the outcome itself might be, even under similar sets of initial conditions. This unpredictability, though a key capability that allows algorithms to adapt to changing conditions and improve over time, is a double-edged sword, as it also makes algorithms sensitive to user interactions. This means that small changes in input can shift outputs significantly, to the point that even ‘micro-failures’ can lead to large-scale distributed algorithmic dysfunctionalities (Miyazaki, 2016). This idea of complexity and mutability is well captured, among others, by Seaver's (2019) image of algorithms not as ‘standalone little boxes, but massive, networked ones with hundreds of hands reaching into them, tweaking and tuning, swapping out parts and experimenting with new arrangements’ (p. 419), aptly highlighting the inherent dynamism and imprecision of algorithmic systems.
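To make this sensitivity concrete, the toy sketch below models an adaptive ‘interest score’ updated from user interactions via an exponential moving average, a common building block of online learning. The function names, the update rule and the weighting are entirely hypothetical and not drawn from any real platform; the point is only that changing a single interaction in an otherwise identical history shifts the final output noticeably.

```python
# Illustrative sketch only: a toy adaptive score that "learns" from a
# stream of user interactions. All names and parameters are invented.

def update_interest(score: float, clicked: bool, alpha: float = 0.3) -> float:
    """Move the interest score towards 1.0 on a click, towards 0.0 otherwise."""
    target = 1.0 if clicked else 0.0
    return (1 - alpha) * score + alpha * target

def run(interactions: list[bool], start: float = 0.5) -> float:
    """Fold a whole interaction history into a final score."""
    score = start
    for clicked in interactions:
        score = update_interest(score, clicked)
    return score

# Two nearly identical histories diverge in outcome: flipping only the
# last interaction moves the score from roughly 0.88 down to 0.58.
a = run([True, True, True, True])
b = run([True, True, True, False])
print(round(a, 3), round(b, 3))
```

In this toy model the last interactions weigh most, so a user who deliberately alters a few recent inputs can move the outcome disproportionately, which is precisely the kind of sensitivity the tactics discussed below exploit.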
This is the key to this article's conceptualisation of algorithms as instruments of resistance, as users can exploit these sensitivities to influence or disrupt outcomes, strategically altering the algorithm's behaviour in their favour, or, in less fortunate circumstances, minimising unwanted negative consequences. In fact, the exploitability of algorithmic flexibility is not uniform across systems, and its potential is mediated by factors such as goal clarity and outcome observability. Systems like e-commerce or gig work algorithms (Bonini and Treré, 2024; Nguyen-Thu and Munn, 2025) exemplify a positive scenario, whereby user goals are unambiguous (saving money and maximising earnings) and success is readily quantifiable (lower price and higher pay). In these cases, flexibility can be experimented with and leveraged proactively to gain an advantage. In contrast, systems like recommender or moderation algorithms involve ambiguous goals and delayed, unmeasurable or subjectively evaluated outcomes (Duffy and Meisner, 2023; Karizat et al., 2021; Siles et al., 2024). In these cases, there may not be an easily identifiable advantage that users can seize, so that flexibility is best exploited for reactive harm reduction. For instance, marginalised TikTok creators devise strategies ‘in hopes of evading future punishment’ (Duffy and Meisner, 2023, 298), such as shadow banning or content removal. Despite the variability in potential outcomes, algorithmic flexibility remains a structurally embedded enabler of user negotiation. It is the system's inherent responsiveness to inputs and operational contingency that enables users to intervene, whether to secure tangible advantages or for more modest harm reduction.
These features of flexibility are acknowledged in the literature, which describes algorithms as ‘inherently chaotic, vivid and dynamic’ (Roberge and Seyfert, 2016).
Here, instead of viewing this flexibility as a by-product of algorithmic design or as the result of improvisational user engagement, I highlight how users can deliberately exploit it as a strategic resource for everyday resistance.
The nonpolitical, self-interested nature of everyday algorithmic resistance
Having established algorithmic flexibility as a structural enabler, this section dissects the nature and scope of the resistance it may enable. Specifically, the article documents how such flexibility can enable small-scale acts of resistance, not meant to openly challenge algorithmic power, but to quietly redirect outcomes for personal gain or to mitigate harm.
To capture these dynamics on e-commerce platforms, I conceptualise them as acts of nonpolitical, everyday resistance, and adopt a practice-oriented lens to frame them. According to this view, resistance is understood as ‘a practice, not as a particular consciousness, intent or consequence’ (Johansson and Vinthagen, 2019), directing attention to what people actually do rather than to the motives or outcomes they articulate.
Following this line of thought, I adopt Scott's (1985) and De Certeau's (1984) everyday resistance as a valuable theoretical lens. Everyday resistance equips the weak with ‘weapons’, whether it be pilfering, desertion or foot-dragging for Scott's peasants fighting landowners, or sociocultural re-appropriation, misdirection or misuse for De Certeau's urban dwellers navigating daily life. Acts of everyday resistance are seemingly minor and apparently inconsequential, yet they offer potentially effective forms of opposition to individuals who do not have the opportunity (or the interest) to participate in broader, more visible forms of resistance. These acts are invisible, as individuals intentionally refrain from drawing attention to their actions, either as a deliberate choice to shield themselves from potential repercussions of confronting power, or because their goals are primarily self-serving rather than aimed at dismantling broader systems of oppression. Reflecting these dynamics, user resistance to algorithms is, almost out of necessity, small-scale, due to the widespread reach and opacity of algorithms, leaving users with limited options for planning general strategies. It thus remains localised and opportunistic, enacted within, rather than against, the systems it seeks to bend.
As a non-political practice, this form of everyday resistance stands in sharp contrast to digital activism, such as Burai's intervention against Google's politics of representation (Velkova and Kaun, 2021). While Burai's effort was a ‘symbolic act of correction’ and an inspirational ‘political project […] serving political ends’ (Velkova and Kaun, 2021, 524), acts of everyday resistance involve ‘immediate self-interested behavior, and are not principled’ (Johansson and Vinthagen, 2019, 35). Both cases spotlight mundane interactions with algorithms and aim at redirecting their outcomes, but their underlying motivations and manifestations are fundamentally different. For instance, Burai leveraged coordinated media strategies to call attention to the issue, including press releases, blog posts and national television coverage. In stark contrast, everyday resistance ‘rarely make[s] headlines’ (Scott, 1989, 49), as resisters are more interested in immediate, tangible gains than in public recognition of their defiance.
Another defining feature is its individualised nature, geared towards securing personal economic advantages, unlike gig workers’ more coordinated forms of algorithmic resistance, who ‘organize themselves into online groups […] to orchestrate collective actions aimed at affecting algorithms’ (Bonini and Treré, 2024, 11; Hao and Freischlad, 2022). While both contexts involve financial transactions, the user's position determines the form resistance takes. Gig workers resist as producers, whose livelihoods depend on the platform and who therefore have strong incentives to coordinate; e-commerce users resist as consumers, acting individually to secure better purchasing conditions for themselves.
Overall, although these actions may not reach the threshold of explicit political contention and may not always be (visibly) successful, they become testimony to the potential for agency of digital users in resisting algorithmic control and outcomes. Against deterministic, dystopian narratives (Lash, 2007; Pasquale, 2015; Zuboff, 2019), this paper juxtaposes users’ ‘intellectual creativity as persistent as it is subtle, tireless, ready for every opportunity’ (De Certeau, 1984, 38), equating their cunning acts, however small and apparently inconsequential, to ‘stunning expressions of agency’ (Ettlinger, 2018, 7). While the efficacy and degree of control that users can reclaim is certainly up for debate, a more nuanced discussion seems necessary, as voiced by Roberge and Seyfert (2016), who cautioned against an excessive focus on ‘the creepiness and suspicious nature of algorithms’ (p. 17). These concerns will underpin the discussion in this article and should similarly inform broader academic discourses.
Algorithmic price regulations on e-commerce platforms
E-commerce platforms offer a compelling case for examining algorithmic interventions, as they mediate financial transactions and directly impact pricing and purchasing conditions. Unlike social media or entertainment platforms, where algorithms influence visibility and engagement, e-commerce algorithms determine who pays what price and when, making algorithmic decisions particularly visible and consequential. E-commerce platforms are nowadays deeply embedded in daily life. They have evolved from niche services into key infrastructures for consumption, used by billions globally due to their convenience, accessibility and the vast range of choices available from home. In 2024, the global e-commerce market soared to a staggering $6.33 trillion, driven by nearly 2.8 billion online shoppers, about one-third of the world's population. Of these, approximately one billion people make online purchases on a weekly basis (SellersCommerce, 2025). This deep integration into everyday life makes encounters with algorithmic mechanisms both commonplace and economically significant, positioning e-commerce platforms as a valuable site for examining how algorithmic power is lived with and contested in daily economic life.
One of the most prominent ways this algorithmic control is manifested is through algorithmic dynamic pricing, with algorithms automatically generating dynamic and customer-specific prices, either at the individual or aggregate level, in real-time, based on data collected about the customers (Seele et al., 2021). This may include users’ location, device type, browsing and purchase history, social media posts and interactions, leading to tactics like personalised advertisement and discounts, surge pricing, or price adjustments based on user behaviour (Gautier et al., 2020). Given that such practices are not explicitly prohibited in many jurisdictions (Sears, 2021), online retailers can be expected to employ them to increase profits, making their implementation likely and widespread. In fact, although the full scale and scope of dynamic pricing remain largely unknown, ghost shopping experiments have identified instances of price variations across various platforms, including Amazon, Lufthansa, Opodo, Booking.com and many others, with evidence ranging from price differences based on device type and location to more limited findings in specific markets and price ranges (Gautier et al., 2020; Seele et al., 2021).
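The pricing logic described above can be sketched schematically. The rules, signal names and magnitudes in the following snippet are invented for illustration and do not reflect any specific retailer's system; they merely show how a base price might be adjusted from user signals such as device type, repeat visits and location, producing different prices for the same item.

```python
# Hypothetical illustration of signal-based dynamic pricing; the rules
# and multipliers below are invented, not taken from any real platform.

def personalised_price(base: float, device: str, visits: int, location_tier: int) -> float:
    price = base
    if device == "ios":
        price *= 1.05   # assumed proxy for higher willingness to pay
    if visits >= 3:
        price *= 1.10   # repeat views read as strong purchase intent
    price *= 1 + 0.02 * location_tier  # coarse regional adjustment
    return round(price, 2)

# Two users, the same item, different prices:
print(personalised_price(100.0, "ios", visits=4, location_tier=2))
print(personalised_price(100.0, "android", visits=1, location_tier=0))
```

Even this crude sketch makes visible why the tactics discussed later (switching devices, clearing histories, masking location) can plausibly change the price a user is shown: each tactic removes or falsifies one of the input signals the pricing rule depends on.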
In these systems, algorithms efficiently segment customers, grouping buyers based on collected data to curate product visibility and tailor prices. By assessing willingness to pay, these mechanisms increase purchase likelihood, maximising platform revenue over time (Seele et al., 2021). While such practices have long been part of the capitalist logic of accumulation, algorithms intensify this process, making segmentation more granular and dynamic. This fuels complex forms of dynamic pricing that may reinforce disparities in access and opportunities, transforming individuals into commodified data sources for profit extraction, often without consent. As digital subjects become objects of algorithmic control, their spending power is thus shaped by ‘a new kind of invisible hand’ (Zuboff, 2015, 82), reinforcing power and knowledge asymmetries. Algorithmic categorisation is, in fact, a political intervention, as it involves ranking, grouping, sorting and predicting, making ‘the world appear in certain ways rather than others’ (Bucher, 2017, 3). The act of categorising determines who belongs where, who has access to what, and who is excluded. This is worrying not only because of its potential to create new social, economic or political inequalities, but also because of its ability to mask and even amplify existing ones (Hoffmann, 2019).
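A minimal sketch of the segmentation logic just described might bucket buyers by past spend and attach a price multiplier to each bucket. The thresholds, segment names and multipliers below are entirely hypothetical; real systems use far more granular, continuously updated models, but the sorting-and-pricing principle is the same.

```python
# Toy illustration of willingness-to-pay segmentation; thresholds and
# multipliers are invented for the sake of the example.

def segment(avg_spend: float) -> str:
    """Assign a buyer to a segment from their average past spend."""
    if avg_spend >= 200:
        return "premium"
    if avg_spend >= 50:
        return "regular"
    return "bargain"

MULTIPLIER = {"premium": 1.15, "regular": 1.0, "bargain": 0.9}

buyers = {"u1": 320.0, "u2": 75.0, "u3": 12.0}
for user, spend in buyers.items():
    seg = segment(spend)
    print(user, seg, MULTIPLIER[seg])
```

The snippet also illustrates the political dimension noted above: the choice of thresholds quietly determines who is offered a discount and who pays a premium, yet none of this is visible to the buyers being sorted.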
In addition, research in psychology and marketing consistently finds that perceptions of dynamic pricing are largely negative, fostering mistrust and even feelings of betrayal (Hufnagel et al., 2022). Yet, little work has examined how users actively resist algorithmic pricing. These negative attitudes are not merely passive reactions but may work as catalysts for strategic engagement. Users develop an array of tactics (switching devices, using incognito mode, faking interest in items, delaying purchases, employing price-scraping tools, or creating multiple accounts), not just to evade pricing algorithms but to exploit their very logic to their advantage. As on social media, entertainment or gig economy platforms, and perhaps even more visibly, e-commerce sites become exemplary arenas for power–resistance struggles, where the politics of platform profit collide with users’ self-interest, offering a compelling case study to explore algorithmic control and resistance.
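One of the tactics listed above, delaying a purchase while tracking prices, can be sketched as a simple decision rule: record observed prices over time and buy only when the latest price dips a set margin below its recent average. The function and threshold below are illustrative assumptions, not a description of any participant's actual tool.

```python
# Sketch of a "delay and track" tactic: buy only on a noticeable dip.
# The 10% margin is an invented, illustrative threshold.

def should_buy(history: list[float], margin: float = 0.10) -> bool:
    """True when the latest price is at least `margin` below the prior average."""
    if len(history) < 2:
        return False  # not enough observations to judge
    *earlier, latest = history
    avg = sum(earlier) / len(earlier)
    return latest <= avg * (1 - margin)

print(should_buy([120, 118, 122, 95]))   # clear dip below the average
print(should_buy([120, 118, 122, 115]))  # ordinary fluctuation
```

The rule mirrors the foot-dragging logic discussed later: by refusing to transact at the algorithm's preferred moment, the user turns the platform's own price volatility into a source of advantage.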
Methods
This study employs a qualitative research design to explore how users navigate and respond to algorithmic pricing systems. Data was collected through 27 semi-structured online interviews and 3 focus groups with e-commerce platform users. Participants were recruited from online discussion forums and social media groups using a purposive sampling strategy to identify individuals who actively engage with algorithmic pricing and recognise its impact on their purchasing decisions, making them well-positioned to provide insights. The participant pool consists mostly of young, digitally literate users (aged 20–36), with balanced representation of gender (13 men and 14 women) and occupation, spanning services, technology, education and the creative industries. All participants are well-acquainted with a variety of online shopping platforms, which they use almost on a daily basis. These include travel and hospitality platforms like Booking.com, Airbnb, and Trip.com, as well as popular marketplaces such as Amazon, Temu, Shein and AliExpress. Interviews lasted 30 minutes on average and relied on open-ended questions about participants’ experiences with algorithmic pricing, decision-making processes, and strategies for influencing them. Each of the three focus groups included six participants, who were selected based on shared platform usage among the most tech-savvy and active interviewees to stimulate deeper insights and collective sense-making based on shared (or divergent) experiences. Interviews and focus groups were conducted remotely on Zoom, recorded with participants’ consent, and transcribed for thematic analysis.
Methodologically, this study follows a technographic approach that emphasises the situated practices and entanglements of human and non-human actors in algorithmic environments (Bucher, 2018). Rather than attempting to decipher the inner workings of pricing algorithms, this approach focuses on how users perceive, interpret and engage with these systems. The goal is also not to demystify algorithmic processes, but, in line with Seaver's (2017) recommendations, to critically examine how people interact with algorithms through everyday practices. To represent the multifaceted nature of these interactions, the article adopts a crystallisation methodological framework (Ellingson, 2009) for the combined analysis of interview and focus group data. Crystallisation leverages data sources to construct ‘deep, thickly described, complexly rendered interpretations of meanings’ (Ellingson, 2009, 17), thus generating a rich account of users’ practices and understanding of online purchasing experiences and algorithmic resistance. This approach recognises that knowledge is inherently partial and situated and uses the interplay between individual and collective narratives not to find a single truth, but to illuminate the spectrum of user agency within algorithmic systems.
Interview and focus group transcripts were analysed using thematic analysis. After familiarisation with the transcripts, manual, iterative and interpretive coding identified a set of inductive codes emerging directly from participants’ actions and concerns, such as ‘price tracking’, ‘switching devices’, ‘using VPNs’, ‘frustration’, ‘awareness’, and ‘ingenuity’. These codes were then interpreted and organised through deductive concepts from the literature, primarily everyday resistance, tactics and strategies (De Certeau, 1984; Scott, 1985), flexibility (Bucher, 2018; Seaver, 2019) and affordances (Ettlinger, 2018). This synthesis led to the interpretation of users’ actions as acts of everyday resistance and to their categorisation as resistance through deception and resistance through evasion, as detailed in the findings section. While survey-based research, which is often employed in the study of user perceptions of dynamic pricing (Poort and Borgesius, 2019), can ensure statistical significance, it often overlooks how users react and resist. In contrast, this study employs the qualitative approach outlined above to uncover the situated tactics, meanings and motivations behind users’ responses to algorithmic pricing.
Throwing the algorithm off: Resistance through deception
The following sections lay out exemplary empirical data collected from interviews and focus groups. The resistance strategies are classified into two types: resistance through deception and resistance through evasion. The former involves users actively ‘misleading’ or disrupting algorithmic systems to manipulate pricing outcomes in their favour, while the latter focuses on users deploying automated tools or strategies to evade profiling and price determinations. This distinction emerged from the interview data as a pragmatic arrangement rather than a conceptually defining feature for subsequent arguments. As such, these categories should not be seen as rigid or exhaustive, but as a heuristic device that reflects how users themselves describe and enact resistance in practice.
The exchange below illustrates the experiences narrated during one focus group, when the discussion turned to buying flight tickets.
The two quotes above illustrate how users feign interest in the item they intend to buy to deceive the pricing algorithm into ‘thinking’ they will not complete the purchase. Their tactic of delaying purchases aligns with Scott's (1985) concept of foot-dragging, a subtle form of challenge that, despite appearing minor, can ultimately contest determinations made by algorithmic pricing systems, much like how everyday resistance can ‘deny claims made by superordinate classes’ (p. 32). This approach also reflects De Certeau's example of ‘time theft’ (De Certeau, 1984, 25), as users deliberately disrupt the profit-driven logic of e-commerce platforms, which prioritise rapid transactions, by strategically manipulating time in their favour. Other similar strategies were repeatedly voiced in the focus groups, such as in the exchange below about buying on Amazon:
Taken together, these experiences highlight the ongoing tension between user agency and algorithmic processes. While resistance can be creative and strategic, it remains limited by the fundamental inscrutability of these systems, which ‘cannot be fully revealed, but only unpacked to a certain extent’ (Roberge and Seyfert, 2016, 2). At the same time, it is crucial to acknowledge that there is no inherent obligation, or even desire, to fully disclose algorithms. Users may neither have nor seek direct access to the underlying code, which would remain largely unintelligible to most. What ultimately matters is how users perceive and navigate algorithms. As Bucher (2017, 32) notes, algorithms can be ‘accessed’ through experience and how they make people feel, inviting us to appreciate that users’ goal may not be to challenge algorithms at a structural level, but to unpack their workings only as far as is necessary to leverage them for personal advantage: a pragmatic, tactical engagement that is purely self-motivated. After all, everyday resisters are primarily driven by securing self-oriented benefits rather than aspiring to formal recognition or ideological opposition. It is precisely the convergence of self-interest and resistance that is the vital force animating user resistance in these digital spaces. I would go as far as to say that this entanglement of pragmatism and defiance not only defines resistance but also mediates how it unfolds, determining the forms it can take and the conditions under which it can emerge. Yet this pragmatism also underscores its inherent limitations in catalysing meaningful broader changes. These acts are not transformative, as they mould algorithmic determinations to personal ends while leaving the underlying systems of power intact.
How can the algorithm know me so well? Resistance through evasion
This section explores how users resist algorithms by manipulating their digital identities and automating evasive tactics via private browsing, cookie deletion, and automated tools that track price fluctuations. These strategies reveal users’ awareness of algorithmic constraints and their attempts to exploit system loopholes, as the exchange below begins to show.
However, algorithmic power is responsive, as demonstrated by Tom's remarks about bots eventually getting detected. In fact, while such bots can be designed to be quite stealthy, websites and algorithms constantly evolve their detection methods to identify and block them. This is why many platforms turn to anti-scraping measures, including CAPTCHA, IP blocking, and rate limiting, to try to block bot activity. It is a cat-and-mouse dynamic: businesses deploy anti-bot mechanisms, but users constantly find workarounds, illustrating how digital resistance is as relentless as the power it seeks to counteract and exemplifying how digital power and resistance co-evolve in an entangled, mutually influencing relationship. This dynamic also highlights the inherent limitations of user actions, which remain doubly constrained by the platform's entrenched infrastructural and economic logic as well as by its reactive capacity to detect, adapt and counteract such acts of resistance.
Discussion and conclusion
Operating in algorithmic environments is akin to paddling in a river allegedly engineered to flow in a given direction. While algorithms aim to channel users towards certain outcomes, the stream's very fluidity creates small eddies, pockets of instability that diverge from the main flow and that savvy users can harness, at least momentarily, to push against the stream. By identifying and leveraging these pockets of unpredictability, everyday acts of resistance, however subtle, disrupt the intended flow, transforming the very forces meant to guide them into opportunities for negotiation and strategic advantage, perhaps to catch a fish within an eddy. Crucially, just as the paddler does not need to understand the complex physics of fluid mechanics to harness the currents, digital users do not need to open the ‘black box’ to exploit algorithmic flexibility and harness it to their own advantage. By exploring user experiences on e-commerce platforms, this article likens digital users to paddlers in the stream of pricing algorithms, illustrating how users actively engage in strategies of everyday resistance by leveraging algorithmic flexibility. While these actions may appear minor or inconsequential, they offer valuable insights into the dynamics of human–algorithm interactions.
First, these acts reveal a nuanced form of digital agency, one that operates not through outright rejection but via the strategic use of available resources. Unlike projects of contentious politics that openly seek to reconfigure power relations and data sovereignty (Beraldo and Milan, 2019), the acts in this article fall within the realm of non-activist digital practices, placing them closer to the mundane and subtle end of Bonini and Treré's (2024) continuum of algorithmic agency, where users attempt to redirect algorithmic outcomes without openly confronting structural domination. This highlights users’ ‘exceptional agency, creativity, and skill’ (O’Brien and Li, 2006, 17), especially given that they confront a power whose extent and workings are not fully available to them. It is precisely because everyday resistance ‘lack[s] its own place, lack[s] a view of the whole, limited by the blindness’ (De Certeau, 1984, 38) that it demands perspicacity, ingenuity and persistence. By analysing these everyday tactics, the article nuances dominant narratives that portray users as passive subjects, instead emphasising their capacity to negotiate algorithmic outcomes in ways that benefit them or, in the least fortunate cases, minimise harm. This suggests a deep symbiotic relationship between humans and algorithms. De Certeau's (1984) writings continue to be illuminating, especially in describing how everyday resistance ‘insinuates itself into the other's place, fragmentarily, without taking it over in its entirety, without being able to keep it at a distance’ (p. xix). This notion of insinuation aptly captures the entanglement of user resistance and algorithmic power, to the extent that algorithms themselves can be seen as instruments of resistance, as the next point discusses.
Second, in line with emerging discourse on algorithmic resistance (Bonini and Treré, 2024; Ettlinger, 2018; Velkova and Kaun, 2021), the article contributes to extending the conceptualisation of algorithms as instruments of both power and resistance into the realm of everyday, non-political and individualised encounters with algorithms. They are instruments of power in the hands of platforms, but also instruments of potential resistance once users engage with them. Specifically, the article foregrounds algorithmic flexibility as a key enabler of both algorithmic power and resistance, most effectively in user–algorithm interactions characterised by clear goals and observable outcomes. Algorithms continuously adapt to new contexts, evolving their decision-making processes. They are a constant ‘work in progress’ (Bucher, 2018, 28). This capability is a key characteristic of their power, but it also comes with ‘slippage, fragility, and a proneness to failure’ (Roberge and Seyfert, 2016, 14). By engaging with this flexibility, exploiting its inconsistencies and limitations, users can find creative ways to redirect algorithmic outcomes for their own benefit. While the extent to which users benefit varies in practice, flexibility is an intrinsic property of algorithmic systems that enables users to interfere with their functioning. Whether this leads to a concrete advantage or merely allows users to minimise harm, flexibility renders e-commerce platforms rich arenas for dynamic contestation rather than spaces where algorithms unilaterally exert power.
Overall, by combining the two insights above, the article calls for de-emphasising focus on the attributes of secrecy and inscrutability of algorithms, inviting instead a more nuanced and holistic approach that intrinsically accounts for resistance. Just as it is widely acknowledged that algorithms are part of sociotechnical structures that include ‘complex assemblage of people, machines, and procedures’ (Gillespie, 2016, 26), so should resistance be considered an inseparable component of what makes algorithms tick, echoing Ettlinger's call for resistance to be considered ‘as part of the apparatus of digital governance rather than peripheral or discrete’ (Ettlinger, 2018, 1). Rather than existing outside or in ideological opposition, resistance actively shapes the evolution of algorithms, influencing their logic and functionality in ways often overlooked. Resistance is one of the many hands that Seaver (2019) imagines moulding algorithms by ‘tweaking and tuning, swapping out parts and experimenting with new arrangements’ (p. 419). If we also accept that algorithms are ‘
Further, contrary to the widespread analogy of algorithms as black boxes, this article calls attention to the ‘act itself’ (Weitz, 2001, 670), that is, what users actually do when interacting with algorithms. This focus is often emphasised in studies of everyday resistance (Bayat, 2013; Johansson and Vinthagen, 2019; Wang, 2024; Weitz, 2001) but frequently neglected in the study of algorithms. Resistance is not necessarily a way to open the box, but to rewire it from the outside. Users may have neither the means nor the desire to open it. What users do is nudge the box through strategic, subtle and ingenious acts, probing which of its properties can be turned to their advantage. Algorithms are not ‘simple, deterministic black boxes that need only to be opened’ (Seaver, 2019, 419), but complex systems whose effects emerge through interaction. It thus seems appropriate to foreground the performative dimension of these tactics, marking both user agency and its generative role in shaping algorithms. As Bucher (2018) further notes, framing algorithms primarily through their opacity amounts to a ‘red herring’ (p. 44), drawing attention away from more meaningful issues of how algorithms function in practice. The black-box metaphor evokes an unknowability and elusiveness that too often absolve us from critically interrogating the ways in which user resistance actively co-produces the algorithmic environments in which it unfolds. Instead of reinforcing narratives of opacity, we should focus on the fluid, evolving nature of algorithms and how they are constantly reconfigured through interaction and contestation.
These considerations are intended neither to romanticise acts of everyday resistance nor to downplay the potential harms posed by algorithms. While everyday resistance strategies can be effective, they remain situational and inconsistent, mirroring the very algorithmic flexibility they leverage. The core limitation is that users’ actions remain structurally circumscribed by the boundaries of possible action ultimately dictated by platforms. Although algorithms are flexible and their input–output relationship is contingent, the range of possible outputs users may leverage remains confined by algorithms’ own predetermined parameters. Even the savviest paddler cannot escape or redefine the river's banks. Users can only negotiate outcomes from within the system's interstices, and even De Certeau (1984) admits that everyday resistant practices ‘elude discipline without being outside the field in which [they are] exercised’ (p. 96). Ettlinger (2018, 10) also cautions about ‘a lack of sustainability’ of resistant practices, limiting their capacity to bring about systemic change. Thus, these tactics may result in fleeting, momentary advantages for users, but they do not fundamentally transform platforms; they merely enable continued engagement with them. In addition, not all users may be as astute and digitally literate as the interviewees of this study. Some may simply accept algorithmic outcomes because they are normalised in their everyday lives, or they might lack awareness of how algorithms influence their experiences in the first place.
Future research should thus explore how resistance strategies differ across demographic groups, particularly among users with varying levels of digital literacy or from marginalised communities. Such studies should focus on disparities in users’ capacity, or willingness, to resist algorithmic control and identify barriers faced by less privileged users in navigating algorithmic environments. Longitudinal research would also help understand how user tactics evolve alongside algorithmic adaptations, while comparative cross-platform analyses could reveal how resistance strategies vary depending on platform design, regulatory frameworks, and cultural contexts. In the present study, resistance strategies emerge as circumstantial, moves in a constant cat-and-mouse game. While many discussions of everyday resistance acknowledge the cumulative potential of these small acts, which ‘may add up almost surreptitiously to a large event’ (Scott, 1989, 35), it is equally crucial to recognise that meaningful solutions and stronger pushback should ideally come from policymakers, who must address the emerging challenges posed by data harvesting and automated decision-making. Various government bodies are taking measures towards regulating the ethical use of new technologies, all emphasising the need for safe, predictable and transparent digital practices. In the meantime, it appears that users are left to fend for themselves within an uneven playing field. It is indeed everyday actions, however small and atomised, that are symptomatic of discontent among the digital populace. Whether they can coalesce into meaningful change or remain fleeting gestures of frustration remains to be seen.
Acknowledgements
The author would like to thank the anonymous interviewees who participated in this research. The author is also extremely grateful to the anonymous reviewers and the editor for their constructive feedback during peer review.
Funding
The author received no financial support for the research, authorship, and/or publication of this article.
Declaration of conflicting interests
The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
