Introduction
Following the launch of a built-in AI drawing feature, Lofter, China's largest fan art platform, has faced significant backlash from its creator community. In response to Lofter's unexpected AI integration, numerous creators deleted their accounts and left the platform, with the hashtag #AntiAI rapidly amassing approximately 90 million views in less than six hours, marking the first large-scale artist AI protest in China (Zhao, 2023). This resistance emerges amid broader contestations surrounding generative AI in creative industries, exemplified by the 2023 Hollywood writers’ strike (Stein and Armitage, 2023). Scholarly discussions have begun to address the challenges posed by generative AI to creative work, focusing on issues such as artistic agency and identity (Lee, 2022), deskilling and job displacement (Walkowiak and Potts, 2024) and concerns related to attribution, originality and intellectual property law (Bonadio and McDonagh, 2020).
Social media platforms’ proactive integration of AI features has catalyzed these tensions. With the slogan ‘Focus on interests, share creativity’, Lofter has become a prominent platform for niche artistic groups and sub-culture communities in China, especially fandom communities. Offering a unique blend of blogging and social networking features, Lofter has become a hub for fandoms to connect and share fan works, such as fan drawings, fiction and animation. As of 2020, Lofter hosted a community of 12 million content creators (Huang et al., 2023). In March 2023, Lofter announced the launch of an AI drawing feature called the ‘Old Pigeon Drawing Machine’, which enables users to generate pictures automatically from a few keywords. This was not the first online AI boycott raised by a creative community on a social media platform: artists on ArtStation protested against the platform's juxtaposition of AI-generated images with their own work in 2022 (Zhao, 2023). The case of Lofter nonetheless stands out as striking because, as a creator-driven platform, Lofter built an AI drawing feature into its own system, putting its own creators at risk. Shortly after the protest broke out, Lofter published an announcement clarifying that the feature was designed to ‘assist users who lack artistic skills in creating pictures’. The platform also stated that the AI feature was developed using open-source materials rather than illustrations from Lofter's creators and explicitly prohibited commercial use (Li, 2023). However, the protest persisted.
Platform-dependent creators already navigate precarious conditions due to opaque system changes (Duffy et al., 2021), while also facing power imbalances in the ownership and attribution of their user-generated content on social media platforms (Meese and Hagedorn, 2019; Perkel, 2016), leaving them with limited avenues for recourse (Fiesler and Dym, 2020; Hallinan et al., 2024). The power imbalance between users and corporations can intensify with the creative industry's rapid uptake of generative AI, which has become increasingly commercialized and lucrative, particularly through the non-consensual extraction of creators’ content for model training (Sokler et al., 2024). Therefore, further investigation is needed into how platform-based creators negotiate changing power dynamics when platforms integrate AI systems – sometimes without opt-out options.
Within the creator economy, the unique status of fan art creators – operating as both fans and platform-based creators – positions them as a significant case for examining the challenges marginalized creative groups face in response to social media platforms’ AI integration. The development of AI tools requires large amounts of ‘publicly available data’ from the Internet for model training, exposing these creators to heightened vulnerabilities over the content they produce and share freely with online communities. Historically entangled in accusations of ‘art theft’ (Perkel, 2016), fan art creators now face the risk of having their work extracted and fed into AI training datasets. However, in discussions about the challenges AI poses to creative workers, little attention has been paid to fan art creators, who occupy a sub-cultural and underground position within the broader creator economy. Given that fans have often been early adopters and pioneers of new technologies (Fiesler, 2007), examining fan art creators provides significant insights into the challenges that disruptive AI technologies pose to creators.
In this study, we collect posts relating to anti-AI topics on Lofter and conduct content analysis. Our analysis addresses two questions: what specific AI-related concerns are involved in fan art creators’ protests and, more importantly, what brings social media platforms’ integration of AI tools into conflict with fan art creators. We then outline our findings, which reveal that the platform's embrace of AI disrupts creators’ control over their freely shared work, their reputation and representation as original creators, and the community culture and norms that foster their artistic identities. Together, this study complements the emerging scholarship on the employment of generative AI in creative work by providing an alternative grassroots and sub-culture perspective (Jenkins, 1992). It provides evidence that fan art creators have established legal, ethical and aesthetic justifications with regard to both their own content and AI-generated content, grounded in community norms in a similar way to other creative communities (Meese and Hagedorn, 2019; Perkel, 2016). By analyzing small, micro-level instances of creators’ communities, larger, macro-level issues regarding generative AI in contemporary creative industries are brought to light. The conclusion reflects on how these emerging conflicts inform broader discussions of AI in creative work.
Literature review
To begin with, fan art creators produce derivative works based on existing intellectual property, often in forms such as fanfiction, fan vidding and illustrations (Freund, 2016). Since Lofter is primarily an image-sharing platform, this study focuses on visual fan art, such as illustrations and cartoons based on existing characters. Within fan art communities, there is reciprocity among members: fans express appreciation and support for fan art creators, who devote their creative labour to producing and sharing work for the fandom (Turk, 2014). In this section, we examine the literature on how social media platforms’ integration of AI systems creates tensions with fan art creators from economic, legal and cultural perspectives.
Economic perspectives: The profit problem
The precarious economic status of fan art creators has been described as ‘operating from a position of cultural marginality and social weakness… and lack[ing] direct access to the means of commercial cultural production’ (Jenkins, 1992: 26). Fandom has long been considered a productive practice, and the media industries have started to recognize the economic value of fan creation and to ‘take fandom seriously’ (Fiske, 2002: 41). While fan-produced content generates significant value for media producers and marketers (Stanfill and Condis, 2014), fandom communities continue to experience systematic ‘appropriation and exploitation for value as data’ (Couldry and Mejias, 2019: 338). In the Chinese entertainment industry, for instance, media corporations acquire popular online fanfiction for commercial adaptation into web series, a genre that has become highly popular and profitable in the mainstream entertainment market (Wang and Wang, 2023). However, fandom – the infrastructure for content recreation and circulation – still largely remains uncompensated (Stanfill and Condis, 2014; Wang and Wang, 2023).
The global creator economy is fundamentally shaped by the asymmetrical power relations between individual creators and platform companies (Cunningham and Craig, 2019). Creators are generally aware that the potential commercial viability of their online content can be exploited by platforms. Like many creators, fan art creators seek to establish clear boundaries between non-commercial content sharing and commercial exploitation of copyrighted material (Meese and Hagedorn, 2019). Fan art creators often regard their reuse of source material as fair use, ‘at least not harmful and may help sales of authorized works by increasing loyalty to the source’ (Tushnet, 2007: 144). However, as Tushnet (2017) observed, non-commercial fandom cultural production now faces unprecedented challenges in preventing freely shared content from being appropriated by platforms for commercial use. In particular, Chinese social media platforms efficiently datafy fandom content and convert fan engagement and creative labour into monetizable data traffic (Yin and Xie, 2021). Yet despite ample scholarly discussion of how social media platforms capitalize on fandom communities, fan art creators – who are not only fans but also platform-based creators striving to gain visibility and rewards for their work – have received limited attention (Pokrovskaya and Gronic, 2024). With social media platforms integrating AI systems that can extract creators’ work for lucrative use, further discussion is needed to examine the impact of these changes on fan art creators.
In addition, although fan art creators mostly create content for non-commercial purposes, they sometimes try to monetize their content as a source of income (Pokrovskaya and Gronic, 2024). Besides in-app tips from fans, fan art creators often monetize their work by posting contact details on the platform to attract business opportunities outside it, such as taking customized illustration commissions and selling self-made merchandise (posters, zines and accessories featuring their derivative work) (Pokrovskaya and Gronic, 2024; Sun, 2020). Therefore, gaining recognition and visibility for their work on the platform becomes key to fan art creators’ monetization potential (Huang et al., 2023).
Though growing in popularity, fan art still very much exists in a sub-cultural and legal grey area (Fauchart and von Hippel, 2008). Since fan art involves unauthorized use of original works, fan art creators often hide from the mainstream marketplace to avoid the notice of copyright holders and seek to monetize through personal connections or community-based activities, such as fan art markets and small exhibitions (Pokrovskaya and Gronic, 2024). In discussions of the creative economy, small-scale informal art production functioning beyond formal, large-scale structures can offer valuable insights into how marginalized creators negotiate tensions in the broader creative industries (Lobato and Thomas, 2015). The challenges creators face in maintaining control over their content amid platforms’ AI integration are further exacerbated by the ambiguous legal status of their transformative works.
Legal perspectives: Originality and the attribution problem
Fan creators occupy a precarious legal position that mirrors their social status: marginal and tolerated rather than celebrated as part of the universe of creators (Tushnet, 2017). While there have been efforts to strengthen copyright protection for fan art in China, fan art creators remain in a powerless position within the current legal system (Pokrovskaya and Gronic, 2024). Subject to pressure from corporations and legal restrictions, individual fan art creators often operate underground and remain out of sight (Freund, 2016). Given the remixing nature of their content creation, fan art creators represent a different type of author from the long-standing notion of the ‘romantic author’, who creates original works through individual genius (Pappalardo and Aufderheide, 2020). In digital environments, issues surrounding remix-based fan art creation – particularly concerning ownership, authorship and copyright – become increasingly prominent (Freund, 2016).
The authorship status of fan-created works exemplifies what legal scholars describe as the Internet's capacity for ‘assaulting the distinction between mine and thine’ (Woodmansee and Jaszi, 1994: 26). While online platforms make content publication and circulation easy (Meese and Hagedorn, 2019), the challenge of achieving content visibility makes attribution central to creators’ rights. Although social media platforms thrive on user-generated content, scholars have discussed platforms’ inadequate attribution systems and their role in perpetuating copyright infringement (Kaye et al., 2021; Meese and Hagedorn, 2019; Tan, 2018). Monroy-Hernandez et al.'s (2011) study of Scratch, a social media platform for remixing creators, illustrates how creators uphold the value of originality and develop manual credit-giving practices to deal with the platform's insufficient automated attribution systems. This also stresses the need for separate analyses of how creators encounter the attribution challenges posed by a specific platform with its distinctive characteristics and features (Tan, 2018).
Fandom operates on implicit social norms held in common by community members – what Fauchart and von Hippel (2008) termed a ‘norms-based IP system’. Within these close-knit communities, members establish shared norms regarding copyright, such as encouraged creative practices, anti-copying rules and proper attribution practices (Darling and Perzanowski, 2017). Fandom community members thus establish informal rules that they impose on each other to self-regulate issues such as non-commercial use and proper attribution (Fauchart and von Hippel, 2008; Fiesler, 2007). Among fan art creators, an expectation of proper attribution – crediting the contributions of others – is nearly universal. Plagiarism – unattributed copying with the expectation of receiving credit – remains one of the most serious offenses in fan art communities and typically results in public censure (Tushnet, 2017). However, AI model training requires scraping data from various sources on the internet, often including copyrighted works (Sokler et al., 2024). As platforms integrate AI systems that may act as automated plagiarists, it becomes essential to explore critical questions about the rights creators should hold over their content.
What also makes proper attribution central to fan art creators’ rights is the challenge they face in asserting the originality of their work. Fan art creations are transformative and can demonstrate remarkable originality in revealing new possibilities within existing intellectual property (Fiesler, 2007); typical examples include illustrations or fiction that depict pre-existing characters through different perspectives, interpretations or artistic representations. Nonetheless, their work remains obscure and unpublishable, as ‘fanfiction is often so deeply embedded within a specific community that it is practically incomprehensible to those who don’t share exactly the same set of references’ (Tosenberger, 2014: 5). Fan art creators seek recognition from their peers and distinguish themselves from pirates by adding new aesthetics and twists to the source materials (Tushnet, 2007). The subtle and subjective artistic styles of creators face significant uncertainty when confronted with generative AI, which allows anyone to produce replicas that mimic an artist's style and can be executed by interchangeable workers (Chia, 2022). It is therefore important to investigate the cultural implications of the anti-AI movement, particularly regarding the values and artistic identities that creators uphold.
Cultural perspectives: Community norms and creator identity
Community is at the heart of fandom culture. Fandom communities cohere around shared appreciation for certain cultural products (Wang and Wang, 2023), with fan art creators producing work to share with community members based on their collective understanding of those products (Sarikakis et al., 2017). As Fiesler (2007) pointed out, rather than being seen in isolation, fan art creators should be studied as part of pre-existing communities with their own traditions of participatory culture. Fandom communities often establish norms and foster an exclusive notion of ‘member status’ (Sarikakis et al., 2017: 12). Alongside consensus, the controversies and tensions that members contest and work through are also part of the collective experience of shaping shared norms (Tang, 2023).
The proliferation of new technologies has given rise to a new form of participation in which machines play a significant role as actors within communities – what Li and Pang (2024) termed ‘human–community–machine interactions’. In fandom members’ networked socio-technical interactions with AI, the ongoing negotiation has an impact on the community as a whole, such as shaping or reinforcing certain norms (Li and Pang, 2024). Recently, there has been a growing body of literature on AI and fans’ creativity in fandom studies. For instance, AI technologies like chatbots have enabled humans and non-humans to co-create, which introduces new forms of participatory fandom (Lamerichs, 2018). However, while AI may democratize fandom content creation, it also raises concerns about how its introduction could deepen existing inequalities, posing ethical questions about the value of fan labour, and the exploitation of fan-made content (Lamerichs, 2018; Mussies, 2023).
The adoption of AI technologies in creative work has sparked debates about human creators’ identity. Heated debates have emerged regarding the creative agency, subjective affect and experience of AI compared with human creators in cultural production (Chia, 2022; Messingschlager and Appel, 2023), echoing historical tensions between artisanal identity and industrial production during the Industrial Revolution (Hesmondhalgh and Baker, 2011). Recently, Ashton and Patel's (2024) study of the artistic identity construction of Ai-Da, the world's first AI robot artist, illustrates how its development team seeks to construct its creative worker identity through strategies such as showcasing artistic processes and cultivating audience relationships. However, further research is needed on how human creators receive and respond to these artificial creative identities. As Tang (2023) suggests, the ambiguities of fan identity negotiation are intertwined with how norms are violated, making it crucial to investigate how fan artists (re)negotiate their creative identities in response to the platform's AI integration.
Research methods and data collection
To collect protest posts, we conducted targeted keyword searches on Lofter and identified #ANTI-AI as the dominant hashtag creators used in the protest movement. Lofter users also used the hashtags #Against AI (反AI) and #Protest AI (抵制AI) alongside #ANTI-AI in their protest posts. However, because the prefix ‘ANTI’ in Chinese can refer to either ‘go against’ (反对) or ‘protest’ (抵制), we focused on the #ANTI-AI hashtag with the specific Chinese word ‘抵制’ (protest) to minimize posts that merely link multiple hashtags to boost exposure.
The anti-AI protest went viral after Lofter announced the launch of its AI drawing tool on 6 March 2023 (Li, 2023). We therefore collected all posts under the #ANTI-AI hashtag from March 2023 to October 2023 to capture timely responses from Lofter users. Due to platform restrictions, the posts were manually exported to Excel for coding. To protect the privacy of Lofter users, identifiable personal information, such as usernames and contact details, was deleted. Posts not directly related to Lofter's launch of the AI drawing tool, such as posts that only added the #ANTI-AI hashtag to boost content exposure, were excluded from the dataset. In total, 648 posts were included in the analysis.
We began coding independently, starting from the most liked and commented posts. The authors reviewed coding results every 10 posts and held regular moderation meetings to maintain consistency and minimize potential bias. By 50 posts, no new themes had emerged. We coded another 50 posts and reviewed the remaining posts together to confirm that data saturation had been reached. The coding process was iterative, employing both deductive concept-driven and inductive data-driven qualitative content analysis approaches (Schreier, 2012). The two authors first developed three categories for the codebook – Fandom Community and Culture (Jenkins, 1992; Li and Pang, 2024), Author Economic Precarity (Fiske, 2002; Tushnet, 2017) and Attribution and Authorship (Kaye et al., 2021; Meese and Hagedorn, 2019) – based on existing research and literature. During the coding process, we refined the codebook and expanded the categories into six to provide a fuller depiction of creators’ concerns. The refined categories and a summary of the coding statistics are shown in Table 1. The units for coding within posts varied. For example, some posts contained content that linked to multiple categories, while others contained only an image with the hashtag. Each post was coded based on its content, meaning a single post could receive multiple codes. As a result, the total count of codes presented in Table 1 exceeds the total number of analyzed posts.
Table 1. Statistics and key concerns raised by Lofter users.
Findings and discussion: When fan art meets generative AI
As we discuss in this section, we identify three key themes in the anti-AI protest, revealing creators’ concerns about how the integration of AI on Lofter threatens their economic interests, the reputation of their original art creation and their sub-cultural community norms.
AI as a capitalist tool
The first finding illustrates creators’ concerns that generative AI reaps unparalleled benefits for the platform while leaving the creators of original content behind. Creators view AI integration on fan creation-based platforms as an extension of platform capitalism that exploits their labour. A primary concern centres on the non-consensual use of their freely shared content as training data for commercial use. The phrase ‘generating energy out of love’ (为爱发电), which frequently appears in protest posts, embodies the ideal of not-for-profit content production motivated by a deep emotional connection to the community and a commitment to original work (Stanfill, 2019). Creators emphasize that their work emerges from genuine passion and commitment: ‘We create the fan art out of love, for the fandom, and for free’. Such idealized creativity, anchored in community, is sparked by non-commercial motivations and remains distinct from the commercial sphere. In sharp contrast, Lofter creators often use the phrase ‘eating steamed buns soaked in human blood’ (吃人血馒头) to describe the platform taking advantage of their work and misfortune, and to express their indignation. This highlights the tension between creators’ ideals of a free-sharing community and the profit-driven goals of the platform.
While fan art creators often post their work for non-commercial sharing, many maintain a fluid status between hobby and career (Pokrovskaya and Gronic, 2024). Positioned outside mainstream cultural markets and functioning somewhat underground (Freund, 2016), fan art creators often rely on Lofter and its private message function to accept customized artwork commissions as a source of income. Among the posts in which creators announced leaving Lofter, 26% included alternative contact information, such as the messaging app QQ, in an attempt to retain customers. However, many creators complained that there were no real alternatives for maintaining visibility outside Lofter, as they rely on the platform to connect and interact with niche community members, which is essential for their small community-based businesses. For fan art creators whose economic gains depend on community networking, the loss of social connections can significantly diminish their potential income opportunities (Fiesler and Dym, 2020). In addition, our findings show that sub-cultural creators, who lack mainstream sites for engaging with audiences and gaining visibility, are particularly vulnerable to the precarity caused by platform changes.
Shortly after the protests broke out, Lofter posted an announcement stating that the AI drawing function was developed using open-source materials and did not use work published by its creators. Later, the platform removed the feature. However, the protest persisted. Creators expressed distrust of the platform's constant changes and the lack of transparency about what Lofter does with their content. Many creators expressed their disappointment with the ‘resurgence’ of Lofter's AI, given that Lofter's first attempt to introduce an AI fanfiction generator had failed after receiving users’ discontent in 2021 (Zhao, 2023). Opaque changes in a platform's features amplify the precarity of platform-based creators (Duffy et al., 2021), and the impenetrability of the platform's AI system further intensifies creators’ frustration. This highlights the misconception that the platform can solve the problem simply by removing the AI feature: even with the feature removed, the lack of transparency in platform changes has bred distrust between creators and the platform. Creators experience Lofter's new AI feature not as ‘empowering’, as the platform advertised, but as bullying by the platform company that they support by producing user-generated content. As one creator contends, ‘[our loss] is not inevitable but shaped by violence and power’. The platform's integration of AI is often described as ‘top-down’ repression by the platform company, linked to a loss of control over creators’ representation and creative autonomy.
AI as a plagiarist
In our data, fan art creators see AI-generated content as a violation and degradation of the values of originality, creativity and aesthetic taste in human art creation. Creators criticized AI-generated content as ‘just a work of stitching’, not a creative artwork. They complained that AI lacks an understanding of the subtle aesthetics that convey abstract emotions, resulting in AI-generated images that fail to capture subjective senses. For fan art creators, it is their personal expression and artistic contributions to the pre-existing work that distinguish their work from pirated products, and these lie at the heart of both their moral and economic rights (Tushnet, 2007). For instance, a creator who made an illustration featuring Harry Potter described her work as ‘trying to fill in the gaps between what J.K. Rowling described and the untold stories that may have happened in the magical world’. Fan art creators are concerned that their personal and creative expressions – which legal scholars argue should be protected (Bonadio and McDonagh, 2020) – are being relegated to ‘publicly available content’ for AI model training. Additionally, some creators noted that while they can recognize when AI appropriates their work, the regenerated images often alter the original character settings, which they perceive as an even greater insult and disrespect. Fan art creators consider their personal aesthetics and subjective artistic styles highly valuable to the community, yet not a form of innovation effectively protected under the current operation of AI systems and platform policy.
Creators expressed disappointment that little control and autonomy remain for them to supervise where and how their content is used. Lofter creators are vehement that appropriating pieces of their work is deeply wrong. They often use phrases such as ‘dismembering’ (分尸) and ‘body parts’ (尸块) to refer to generative AI's practice of disassembling different artworks in the dataset and reassembling them into a new AI-generated image. The way creators describe their content in bodily terms reflects their dissatisfaction with the cruelty of platforms appropriating their work for AI training, and their deep desperation over the limited control they have over how their online content is handled. In some posts, original creators revealed that they had discovered AI-generated images that appeared to contain elements of their previous works. To demonstrate this, they placed AI-generated content alongside their original creations, highlighting the similarities. In the ‘spreadable’ social media environment (Jenkins et al., 2012), the (re)distribution of user-generated content is often difficult to trace, and original creators lack proper attribution (Kaye et al., 2021). On digital platforms, once-underground sub-cultural communities that hid from outsiders have become visible to search engines (Freund, 2016) – such unexpected public exposure could be further intensified by the introduction of AI training data extraction systems. In addition, although they do not lock their posts, fan art creators are highly concerned about privacy. For instance, they often state that their work should only circulate within the niche community and that reposting or self-printing without permission is not allowed. Many creators complain that even when they add anti-AI training statements to their posts, this still does not prevent the platform from extracting their content.
Given that AI systems do not provide attribution to the original authors of the data used in their training datasets (Longpre et al., 2024), fan-produced content can be treated as free data to use without explicit consent.
What also emerges from the data is creators’ concern that AI can be used to camouflage creators’ identities. Like other types of creators, popular fan art creators attract a fanbase through their distinctive artistic styles. Acknowledging and valuing a creator's unique style is a norm that community members are expected to follow, and those who create paintings closely imitating another's style are considered dishonorable. Some creators mentioned that while they used to post on other general-purpose social media platforms, they moved to Lofter, believing it to be an ideal space that respects their creative spirit and artistic output. However, they are now disappointed that even this niche community has been ‘stained’ by AI. For creators, Lofter is regarded as a space for independent artists and other serious ‘art lovers’ who share an artistic spirit and adhere to a set of self-regulated norms regarding copyright-related issues (Fauchart and von Hippel, 2008). The potential invasion of AI users – outsiders unaware of the community's norms and anti-copying rules – is an unusual situation that creators therefore find unbearable.
A few posts mentioned the current Chinese copyright system, in particular the first administrative policy on AI-generated content, which requires platforms to label AI-generated work (Cyberspace Administration of China, 2023). However, the consensus among most posts was that the existing legal system is ineffective in protecting their work, and several creators expressed concern that it is difficult for individual creators to bear the burden of litigating such cases. As a result, many creators called on the whole community to reduce content sharing or delete their accounts to reduce the risk of their content being infringed. While such practices may allow creators to retain some control over their work when platforms fail to protect them, they also indicate the limited opportunities for individual creators to seek recourse, such as monetary compensation. This frustration is further exacerbated by the centralized design of platforms, where users are ‘subject to a power structure that is apparently absolute and unalterable by those who lack such power’ (Schneider, 2022: 1966).
AI as a rule breaker
The impact of AI on the fan art community was at the heart of creators’ concerns. For fan art creators, Lofter's introduction of the AI tool is particularly unacceptable because, as many creators mentioned, the platform ‘betrays its original aspiration’ and ‘doesn’t respect its core audience’. This conflict reflects what Perkel (2016) described as ‘the tensions between corporation and community’. Lofter distinguishes itself from general-purpose social media platforms like Weibo by emphasizing artistic sensibilities and celebrating community-driven creativity, describing itself as a ‘home’ for young sub-culture artists. As reflected in the posts, fan art creators believe Lofter should be a place solely for attracting like-minded people – a hub built by fans and for fans – rather than one pursuing broader commercial interests.
Creators experienced Lofter's introduction of AI features as a threat to their control over artistic identity. In response to Lofter's announcement that its AI drawing tool is intended to ‘assist users who lack artistic skills in creating images’ (Li, 2023), many creators argued that this rationale merely provides a shortcut for people to ‘cheat’ and undermines authentic artistic practice. In the posts, creators often demonstrate consensus, such as expressing genuine interest in human-created original work and despising AI-generated work, which fosters an exclusive notion of themselves as part of the community (Sarikakis et al., 2015). In addition, ‘imperfect’ community members, such as users whose personal pages contain AI-generated content, are regarded as hypocritical and are criticized in some posts. In demonstrating what is (un)acceptable artistic practice, creators form a ‘communal solidarity’ (Kumar, 2019) with distinctive community norms, and these norms are reaffirmed through the protest process.
The data demonstrate how creators construct boundaries between authentic community members and AI users, whom they characterize as ‘outsiders’ unable to ‘understand the spirit of fan art creation’. Some creators invoke the Luddite movement in their protest posts, drawing parallels between their struggles and workers’ historical resistance to automation in the workplace (Perrigo, 2023). Such narratives embody a quest for a romantic, alternative identity amid the prevalent automation of the digital environment (Merchant, 2023). In sharp contrast, using AI to create content is often framed as inclusive; yet the automatable process can be replicated and outsourced, rendering workers replaceable (Chia, 2022). This tension illuminates the fundamental conflict between community-based artistic practice and automated content generation, revealing how AI integration disrupts established community norms and rules.
In the protest, many creators’ understandings of ‘art’ came into direct conflict with ‘AI-generated content’. In their posts, creators emphasize the value of dedication and hardship in content creation, which stands opposed to the utilitarian logic of AI-generated content – quick, cheap and able to manufacture vast amounts of content for efficient use (Chia, 2022). Creators romanticize the obstacles and struggles of the creative process – including difficulties in gaining recognition and fair compensation – as essential elements of an authentic artistic identity. In contrast, the AI drawing tool – seemingly coming from nowhere – is presented as a successful artist from the outset, without facing any hardship (Ashton and Patel, 2024). This stands in stark contrast to the celebrated notion of the ‘tortured creator’ (Alacovska and Karreman, 2022). As some creators complained, it is easier for a platform-endorsed AI account to amass a large following than for a human creator, as many audiences cannot distinguish what is true art and what is ‘just technology’.
Notably, of the 75 creators in the dataset who announced leaving Lofter, 57 fan art creators eventually rejoined the platform. The most frequently mentioned reason for their return is the solid fandom community nurtured by Lofter, which other platforms cannot match. For example, many creators expressed similar feelings – ‘I miss the community vibe and my fans’ or ‘there is no one to talk to on other platforms’ – highlighting how emotional and social bonds anchor them to specific platforms. For creators, connecting with the community is not just about fostering monetization opportunities but also about engaging in supportive long-term connections with fans (Baym, 2015). The relationship and emotional connection between a fan art creator and their followers are hard to migrate or recreate on other scattered platforms without a well-cultivated fandom culture. As such, creators often mention they ‘have no choice’, as it is difficult for users to leave Lofter or switch to other platforms despite being upset and disappointed by Lofter's adoption of AI. This aligns with Shapiro et al.'s (2024) finding that few creators have successfully transitioned their careers from one platform to another, as monopolistic platform markets and network effects make platform migration difficult (Fiesler and Dym, 2020). Our findings suggest that creators from sub-culture groups face even greater limitations, as their specific cultural practices and social connections are deeply embedded within a particular platform.
Implications and conclusion
In conclusion, this study investigates the multifaceted challenges faced by fan art creators by examining the grassroots anti-AI protest movement; more importantly, it foregrounds the underlying tensions that arise from platform-based AI integration in relation to this creative community. Our findings suggest that fan art creators represent a salient case for examining the asymmetrical power relations between individual creators and platforms in AI integration. Fan art creators feel they have more at stake in the ongoing debate about AI's takeover of human artists, as platforms’ introduction of AI technologies has placed them in a dual tension: on the one hand, creators have historically struggled to assert rights over their work, as control over cultural products has largely remained in the hands of major media corporations (Sarikakis et al., 2015); on the other hand, platforms’ top-down integration of AI leaves creators with little choice over their online content, exposing it to the risk of being appropriated as AI training data. Fan art creators regard generative AI as an extension of platform capitalism, which reaps unparalleled benefits while diminishing their creative autonomy. In addition, fan art creators have expressed concerns about AI technologies appropriating their unique artistic expressions without proper attribution, which poses significant challenges to creators’ recognition, public reputation and the integrity of creative communities. Moreover, the unwelcome trespass of AI tools on creators’ communities undermines the shared values and norms that fan art creators cherish, which are fundamental to their artistic identity. Creators’ dependence on platform-specific networks for visibility and audience engagement further constrains their autonomy to migrate to alternative platforms.
This precarious situation underscores the vulnerability of sub-cultural creators embedded in niche communities that lack broader support outside these platforms. We suggest that our empirical findings from fan art creators therefore have implications for understanding the perils posed by generative AI to individual and grassroots creators, and for understanding what rights creators do, and should, have over the content they generate.
Beyond the industry hype around the positive potential of generative AI in creative work, this study reveals the need for a more consensual environment for AI integration. The platform's apology and removal of the AI feature did not address creators’ dissatisfaction, which indicates that platforms, as both intermediaries for and proprietors of AI tools, should build transparency and consent into their systems by design. At a time when comprehensive guidance on AI from copyright law or platforms remains lacking, this study demonstrates that creators are beginning to develop norms and practices around the use of their online content with regard to AI. In addition, we suggest that the nuanced norms and controversies emerging within specific groups around AI can inform scholarly research and debates on copyright systems (Fauchart and von Hippel, 2008), which should change to better account for the everyday media practices of ordinary creators alongside large-scale industrialized creative production (Meese and Hagedorn, 2019).
In the coming years, concerns around creators’ rights will be exacerbated by social media platforms’ rapid embrace of generative AI tools. Shortly after Lofter's anti-AI protest, China introduced the first administrative policy regulating platforms’ incorporation of AI, specifying the requirement for platforms to attribute and watermark AI-generated work (Cyberspace Administration of China, 2023). In December 2024, the UK government initiated a consultation aimed at clarifying how creators’ copyrighted materials can be used in AI model training, seeking to ensure that creators maintain control over their works while fostering AI innovation (UK Government, 2024). While the creators in our dataset agreed that AI poses multiple challenges to their rights, what they want from the platform and how they negotiate the tensions within the industry remain open questions. How to build a consensual environment, as well as trust between the creative and tech sectors, merits further investigation.
