Monetizing the Loss of Innocence: A Stakeholder Approach to Technological Harm in the Age of “Slow Feet Culture”

Tianyu Li, Beijing No One Zero One High School

Winning Essays

Introduction

In 2023, a disturbing trend emerged on the Chinese social media platform Kuaishou (literally "Fast Hand" in Chinese): minors, especially those in rural areas, filming themselves consuming menstrual blood mixed with food for online attention. This extreme behavior is just one manifestation of what Chinese internet users have dubbed "Slow Feet Culture", a derogatory term mocking Kuaishou's reputation as a platform for cringe-worthy, lowbrow content, including minors creating bizarre and self-degrading videos for monetary rewards.

When confronted with such disturbing phenomena, the immediate impulse is to assign blame: to developers who created the problematic algorithm, companies that profit from monetizing vulnerable groups, regulators who failed to prevent harm. Yet this simplistic allocation of responsibility obscures the complex ecosystem from which the harm emerges. This essay argues that technology itself does not cause harm; rather, harm emerges from the dynamic relations of power and conflicting interests among technology's various stakeholders: developers, companies, regulators, content producers, parents, educators, advertisers, and others. A stakeholder approach to accountability examines these relationships to identify how harmful patterns emerge and persist despite the nominal desire of all parties to prevent exploitation.

Formation of "Slow Feet Culture": A Biopsychosocial Approach

Adolescents' Social Connection and Shared Understanding

The emergence of "Slow Feet Culture" reflects both universal adolescent vulnerabilities and specific technological architectures that exploit them. Biologically, adolescents' "brain regions associated with the desire for attention, feedback, and reinforcement from peers become more sensitive" (Weir, 2023). A still-maturing prefrontal cortex, responsible for decision-making, combined with a highly active limbic system involved in emotional response and reward processing, makes adolescents especially attuned to social feedback and peer approval. Social media platforms like Kuaishou exploit these traits through variable-ratio reinforcement schedules, offering creators unpredictable monetary and attention rewards that foster behavioral dependency.

Psychologically, consuming "cringe comedy" provides viewers relief through downward social comparison. Social comparison theory explains how individuals evaluate themselves relative to others, affecting self-image and subjective well-being (APA, 2018). Urban viewers consuming rural creators' content often engage in this downward comparison, deriving positive psychological feedback from a sense of superiority. G. Stanley Hall's "storm and stress" theory of adolescent development suggests that adolescents naturally experience mood fluctuations and engage in risky behaviors. The interdependence model, however, indicates that successful adolescent development requires balancing freedom and responsibility (Denise S. & Longmire, 2025). Digital platforms often maximize freedom while minimizing responsibility, creating environments where developmental vulnerabilities translate into harmful behaviors.

These individual vulnerabilities interact to create problematic digital subcultures, or shared realities. The homophily principle, "birds of a feather flock together," explains how like-minded individuals cluster in digital spaces (Baek & Parkinson, 2022). Evolutionarily, human beings display behaviors consistent with the similarity-attraction hypothesis, and the similarity in how individuals construe the world is known as shared reality (Baek & Parkinson, 2022). Within these clusters, behaviors unacceptable in mainstream contexts become normalized inside specific digital communities.

Echo Chambers and Attention Economy

Social media creates echo chambers, in which algorithms and personalization reinforce and amplify selective exposure to like-minded information; individuals may perceive the information circulating inside these chambers as reliable simply because it recurs frequently and is rarely met with counterarguments. The underlying operational logic of these platforms is the attention economy, in which users' attention is converted into profit. Complex algorithms infer users' preferences from watch duration and the number of "likes" and "shares" they register, and platforms then apply persuasive techniques, surfacing similar content to hold users' attention, as sketched below. These algorithms drive the formation of information cocoons and confirmation bias among users.
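To make this mechanism concrete, the toy simulation below sketches engagement-weighted ranking under stated assumptions. It is an illustrative model, not a description of Kuaishou's actual recommendation system; the topic names and engagement probabilities (TOPICS, USER_BIAS) are invented for the example.

```python
# Toy sketch of an engagement-weighted feed (illustrative assumptions only).
# Each time a user engages with a topic, its score rises, so it is recommended
# more often: the feedback loop behind an "information cocoon".
import random
from collections import Counter

random.seed(0)

TOPICS = ["cringe", "music", "news", "sports"]  # hypothetical content topics
USER_BIAS = {"cringe": 0.9, "music": 0.4, "news": 0.2, "sports": 0.2}  # assumed engagement odds

scores = {t: 1.0 for t in TOPICS}  # platform's accumulated engagement score per topic
feed = []

for _ in range(500):
    # Recommend topics in proportion to their accumulated engagement scores.
    topic = random.choices(TOPICS, weights=[scores[t] for t in TOPICS])[0]
    feed.append(topic)
    # A "like" or long watch raises the topic's score, making it more likely next time.
    if random.random() < USER_BIAS[topic]:
        scores[topic] += 1.0

print("first 100 items:", Counter(feed[:100]))   # early feed: still relatively mixed
print("last 100 items: ", Counter(feed[-100:]))  # late feed: dominated by the most-engaging topic
```

Even without any explicit intent to narrow the feed, the loop between recommendation and engagement concentrates exposure on whatever the user already responds to, which is the essay's point: the cocoon is a property of the system, not of any single actor.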

Beyond Individual Choices

Only within specific social contexts could "Slow Feet Culture" exist and thrive. The digital generation gap leaves many parents ill-equipped to guide adolescents' online activities, and parental intervention often takes the form of device confiscation rather than substantive guidance about digital values and boundaries. This regulatory vacuum allows platform incentives to become the primary behavioral shapers for young users, with algorithms and other users' comments effectively "parenting" digital behavior.

The user base of platforms like Kuaishou skews toward China's lower-tier cities and rural areas, reflecting an urban-rural digital divide. Economic precarity makes monetization opportunities particularly attractive to rural creators, producing a greater willingness to engage in boundary-pushing behavior for financial reward. At the same time, these creators often lack access to the digital literacy education that might help them understand the potential long-term consequences of their content.

Given social media's tendency toward triviality and frivolity, the resulting dynamic further perpetuates and exaggerates existing stereotypes. Urban viewers consume such content as pure entertainment rather than recognizing it as a manifestation of digital inequality. Media coverage often sidesteps discussions of rural marginalization, treating "Slow Feet Culture" as merely bizarre rather than examining the socioeconomic conditions behind it. Meanwhile, algorithms amplify polarized content, locking both rural creators and urban mockers into feedback loops that intensify problematic behaviors.

From Local Phenomenon to Global Pattern

While "Slow Feet Culture" emerges from China's specific technological and social context, it is one manifestation of a much broader global trend. Harmful digital youth subcultures display remarkably consistent patterns across cultural contexts: escalation of behaviors to maintain attention, exploitation of developmental vulnerabilities, platform algorithms that amplify extreme content, and monetization structures that reward boundary-pushing.

For example, the TikTok "Blackout Challenge", which encouraged users to choke themselves until losing consciousness, led to the company being sued by parents of UK teens after alleged challenge deaths (McMahon & Fraser, 2025). In Brazil, camming platforms have created economies in which young people monetize intimate content, blurring the boundary between social media performance and sex work (Veiverberg, 2023). The "Blue Whale Challenge", which spread across Russia and other countries, created game-like structures that started innocuously but progressively led children to self-harm and, ultimately, suicidal behavior.

Even though the specific expressions vary, the underlying dynamics remain disturbingly similar. This raises critical questions about where accountability lies when technology facilitates self-exploitation among those least equipped to understand its consequences. Within this ecosystem, each stakeholder operates within constraints and incentive structures that make isolated intervention ineffective; addressing these harmful digital subcultures requires recognizing their emergence from complex sociotechnical systems rather than isolated failures of responsibility.

Beyond Simple Individual Blame: A Transdisciplinary Perspective on Systemic Accountability

Power Dynamics of Harmful Subcultures on Social Media

Varoufakis claims that capitalism has been superseded by techno-feudalism, the idea that extraordinary power is now concentrated in a handful of corporations that control online access and personal data (Gilbert, 2024).

The "slow feet" phenomenon and similar harmful subcultures reveal profound power imbalances and asymmetries in the technology ecosystem. Platforms possess disproportionate power through algorithmic control and data access (Caplan & Boyd, 2016), while content creators (especially vulnerable young users) and regulators operate with significant information deficits. Factors behind this imbalance include the opacity of technical systems, the convergence of media entities, and limited recourse for accountability (Caplan & Boyd, 2016). As Pasquale (2015) argues in "The Black Box Society," algorithms represent a form of power that operates beneath conscious awareness. The technical complexity of these systems creates nearly insurmountable comprehension barriers for most users, who are therefore often unaware of algorithms' manipulative power. In addition, as machine learning becomes the dominant paradigm shaping data-driven technologies, user feedback can reinforce the existing biases and values that shape the spread of information. When algorithms fail to function properly as "gatekeepers", feedback loops emerge that build bias into the design of information processes (Caplan & Boyd, 2016). The immediate remedy, then, is to decentralize this locus of power and control toward public discourse.

Beyond technological knowledge, platforms and users operate with vastly different levels of information access, creating a secondary power asymmetry. Digital redlining refers to the systematic process by which specific groups (often those already marginalized as minorities) are deprived of equal access to digital tools (McCall et al., 2022), creating new digital divides that perpetuate existing inequalities. Platforms possess behavioral data that can be transformed into prediction products, while young rural creators have no equivalent visibility into audience behavior or platform operations; platforms have research teams analyzing potential harms, while creators lack a basic understanding of self-objectification, psychological impacts, or exploitation risks.

Furthermore, this phenomenon reveals how the digital divide operates not merely as inequality of access but as a profound disparity in technological literacy. Research demonstrates that marginalized users often develop distinct usage patterns focused on entertainment and social connection rather than information access or critical evaluation (Yan & Schroeder, 2019). Educational disparities further widen this gap: complex literacy skills are not innate but require systematic development, which rural schools often cannot provide.

Stakeholder Interest Alignment and Conflict

"Digital media are cultural tools that at once reflect the cultural values and biases of the creators, and whose use is shaped by the cultural values of the users" (Manago & McKenzie, 2022). Platform interests align partially with content creators' in maximizing engagement, but diverge significantly regarding harm and exploitation: platforms benefit from attention-generating content regardless of its impact on creators' wellbeing or dignity. Advertiser and platform interests align in targeting engaged audiences, which is why platforms nominally prohibit exploitative content while algorithmically promoting it for its engagement metrics. Viewer and creator interests partially align in the provision of entertainment but conflict over exploitation. One way this phenomenon becomes ingrained in social structures and perpetuates social hierarchy is through symbolic violence: the subtle and often unnoticed forms of domination and control exerted through social norms, values, and cultural practices. It is a form of power that operates at the level of culture and is often internalized by individuals, leading them to accept and reproduce existing social hierarchies and inequalities (Easy Sociology, 2024). Perhaps most troublingly, short-term and long-term interests conflict within individual stakeholders: young creators' immediate interest in attention and monetization conflicts with their long-term interest in dignity and healthy development.

Reconfiguring Stakeholder Relationships

Instead of platforms continuing to treat teen creators as expendable resources, prioritizing short-term profit over protection from long-term harm, a legal mandate could require that 50% of ad revenue from related content be reinvested in rural digital education. It is also important to ensure algorithmic transparency through the public release of algorithms or code, making data more publicly accessible and enabling greater oversight (Caplan & Boyd, 2016). Relying solely on algorithmic accountability is not sufficient, however: algorithmic transparency conflicts with other values such as privacy, accessibility, and legibility; "algorithms" may not be the right level of analysis, since we cannot change the "content" simply by reconstructing the "code"; and only very few people have the technical capacity to understand and untangle complex algorithms (Caplan & Boyd, 2016). Furthermore, companies should introduce ethics-based education for algorithm designers. This approach, called values-in-design or value-sensitive design, requires developers to critically examine how their personal values might become embedded in their products, and advocates for actively building "positive" values into technologies (Caplan & Boyd, 2016). Nevertheless, as a virtue-based approach it raises a major concern: it relies on developers to determine which "positive" and "morally correct" values are built into technologies, granting them significant power and responsibility, and exposing them to criticism.

Conclusion

The case of “Slow Feet Culture” ultimately teaches us that technological accountability cannot be reduced to a question of individual blame but must be understood as a collective responsibility to create digital systems that enhance rather than exploit human development. Looking forward, we must move beyond reactive approaches to technological harm toward proactive ethical frameworks that anticipate vulnerabilities before exploitation occurs.

References

APA. (2018, April 19). Social comparison theory. https://dictionary.apa.org/social-comparison-theory

Baek, E. C., & Parkinson, C. (2022, October 17). Shared understanding and social connection: Integrating approaches from social psychology, social network analysis, and neuroscience. https://pmc.ncbi.nlm.nih.gov/articles/PMC9786704/

Caplan, R., & Boyd, D. (2016, May 13). Who Controls the Public Sphere in an Era of Algorithms? Mediation, Automation, Power. https://datasociety.net/wp-content/uploads/2016/05/MediationAutomationPower_2016-1.pdf

Denise S., S. C., & Longmire, A. B. (2025). Cross-cultural patterns in adolescents. https://www.ebsco.com/research-starters/ethnic-and-cultural-studies/cross-cultural-patterns-adolescents

Easy Sociology. (2024, January 17). Pierre Bourdieu’s Symbolic Violence: An Outline and Explanation. https://easysociology.com/sociology-of-violence-conflict/pierre-bourdieus-symbolic-violence-an-outline-and-explanation/

Gilbert, J. (2024). Techno-feudalism or Platform Capitalism? Conceptualising the Digital Society. https://repository.uel.ac.uk/download/955df62ab1dd99f6c452c021721b841b989b6d3d444b3025c1402c56632306e8/464310/Techno-feudalism%20or%20Platform%20Capitalism_%20July%2025th%202024%20version.pdf

Manago, A. M., & McKenzie, J. (2022, June 30). 7 - Culture and Digital Media in Adolescent Development from Part II - Digital Media in the Adolescent Developmental Context.

McCall, T., Asuzu, K., Oladele, C. R., Leung, T. I., & Wang, K. H. (2022, July 18). A Socio-Ecological Approach to Addressing Digital Redlining in the United States: A Call to Action for Health Equity. https://www.frontiersin.org/journals/digital-health/articles/10.3389/fdgth.2022.897250/full

McMahon, L., & Fraser, G. (2025, February 7). TikTok sued by parents of UK teens after alleged challenge deaths. BBC. Retrieved April 2, 2025, from https://www.bbc.com/news/articles/c0lz2x60w46o

Veiverberg, F. (2023). Brazilian Camming: The Monetization of Intimacy in Online Sex Work.

Weir, K. (2023, September 1). Social media brings benefits and risks to teens. Psychology can help identify a path forward. https://www.apa.org/monitor/2023/09/protecting-teens-on-social-media

Yan, P., & Schroeder, R. (2019, November). Variations in the adoption and use of mobile social apps in everyday lives in urban and rural China. https://www.researchgate.net/publication/337512577_Variations_in_the_adoption_and_use_of_mobile_social_apps_in_everyday_lives_in_urban_and_rural_China
