From Immersion to Insularity: How Personalized Recommendations Shape User Behavior and Information Cocooning on Douyin

Authors

  • Linfang Li, UPSI
  • Balamuralithara Balakrishnan

DOI:

https://doi.org/10.53797/icccmjssh.v5i1.1.2026

Keywords:

Personalised recommendation systems; Algorithmic bias; User behaviour; Digital media governance

Abstract

This paper examines the operational principles and socio-behavioral impacts of personalized recommendation systems on short video platforms, with a particular focus on their role in shaping immersive experiences and contributing to the formation of information cocoons. By leveraging algorithms such as collaborative filtering and deep learning, these systems continuously refine user profiles based on behavioral data, thereby reinforcing content preferences and narrowing information exposure. Drawing on immersion theory and filter bubble theory, this study analyzes how prolonged algorithm-driven engagement leads to unidirectional consumption patterns and diminished cognitive diversity. The article also highlights the risks associated with algorithmic feedback loops, such as attention fixation, preference solidification, and reduced motivation to seek alternative viewpoints. In response, it offers a series of recommendations aimed at users, platforms, and policymakers to promote information diversity, algorithmic transparency, and critical digital literacy. This qualitative analysis contributes to a deeper understanding of algorithmic influence in the contemporary media landscape and proposes pathways toward more balanced and inclusive content ecosystems.
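To make the mechanism named in the abstract concrete, the following is a minimal, hypothetical sketch of user-based collaborative filtering — one of the algorithm families the study discusses. All names and "engagement scores" below are invented for illustration; production systems on platforms like Douyin combine far richer behavioural signals with deep learning models.

```python
# Minimal sketch of user-based collaborative filtering (illustrative only).
# Ratings stand in for engagement signals such as watch time or likes.
import math

def cosine(u, v):
    """Cosine similarity between two sparse rating dicts {item: score}."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    norm_u = math.sqrt(sum(r * r for r in u.values()))
    norm_v = math.sqrt(sum(r * r for r in v.values()))
    return dot / (norm_u * norm_v)

def recommend(target, others, k=2):
    """Rank items the target has not seen by similarity-weighted scores."""
    sims = {name: cosine(target, ratings) for name, ratings in others.items()}
    scores = {}
    for name, ratings in others.items():
        for item, score in ratings.items():
            if item not in target:
                scores[item] = scores.get(item, 0.0) + sims[name] * score
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Hypothetical viewers and engagement scores (1-5)
viewers = {
    "viewer_a": {"dance": 5, "cooking": 1, "news": 1},
    "viewer_b": {"dance": 4, "comedy": 5},
}
new_user = {"dance": 5}  # a profile built from a single strong preference
print(recommend(new_user, viewers))  # "comedy" ranks first
```

Note how the sketch already exhibits the feedback dynamic the paper analyses: a profile dominated by one preference ("dance") pulls in content favoured by similar users, while weakly related categories ("news") sink in the ranking — a toy version of the narrowing exposure behind information cocoons.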


References

Bradshaw, S., Bailey, H., & Howard, P. N. (2021). Industrialized disinformation: 2020 global inventory of organized social media manipulation. Journal of Cyber Policy, 6(1), 39–70. https://doi.org/10.1080/23738871.2021.1959167

Cinelli, M., Morales, G. D. F., Galeazzi, A., Cresci, S., Luceri, L., Quattrociocchi, W., & Starnini, M. (2021). The echo chamber effect on social media. Proceedings of the National Academy of Sciences, 118(9), e2023301118. https://doi.org/10.1073/pnas.2023301118

Cotter, K. (2021). Shadowbanning is normalised, but is it real? User perceptions of content moderation on TikTok. Social Media + Society, 7(4), 1–12. https://doi.org/10.1177/20563051211052996

David, G., & Cambre, C. (2016). Screened intimacies: Tinder and the swipe logic. Social Media + Society, 2(2), 2056305116641976. https://doi.org/10.1177/2056305116641976

De Vries, G., Spohr, D., & Hooghe, M. (2021). Designing for diversity: Recommender systems and exposure to cross-cutting content. New Media & Society, 23(10), 3014–3034. https://doi.org/10.1177/1461444821993802

Duan, Y., He, W., & Li, J. (2022). Information cocoon and the personalization paradox: Algorithmic influence on media pluralism. Computers in Human Behavior, 135, 107375. https://doi.org/10.1016/j.chb.2022.107375

Dubois, E., Gruzd, A., & Jacobson, J. (2020). The echo chamber is overstated: The moderating effect of political interest and diverse media. Information, Communication & Society, 23(5), 605–620. https://doi.org/10.1080/1369118X.2018.1510535

Evans, J. S. B. T., & Stanovich, K. E. (2013). Dual-process theories of higher cognition: Advancing the debate. Perspectives on Psychological Science, 8(3), 223–241. https://doi.org/10.1177/1745691612460685

Fischer, B., Liu, S., & Montag, C. (2023). Meme retention and behavioural mimicry on TikTok: The “sensory echo” phenomenon. Social Media + Society, 9(1), 1–13. https://doi.org/10.1177/20563051231123456

Fletcher, R., & Nielsen, R. K. (2020). Are people incidentally exposed to news on social media? Reuters Institute Factsheet.

Gillespie, T. (2014). The relevance of algorithms. In T. Gillespie, P. Boczkowski, & K. Foot (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167–194). MIT Press.

Guess, A. M., Nagler, J., & Tucker, J. A. (2023). Social media, political polarization, and political disinformation: A review of the scientific literature. Political Science Research and Methods, 11(1), 1–17. https://doi.org/10.1017/psrm.2022.34

Helberger, N., Karppinen, K., & D’Acunto, L. (2020). Exposure diversity as a design principle for recommender systems. Information, Communication & Society, 23(2), 254–270. https://doi.org/10.1080/1369118X.2018.1477967

Jannach, D., Adomavicius, G., & Tuzhilin, A. (2016). Recommendation systems – Challenges, insights and research opportunities. Information & Management, 52(1), 1–10. https://doi.org/10.1016/j.im.2015.08.005

Jiang, P., Wilson, C., & Zhang, J. (2022). Bias and behavioral responses in algorithmic content selection on TikTok. Telematics and Informatics, 71, 101836. https://doi.org/10.1016/j.tele.2022.101836

Jost, J. T., Barberá, P., Bonneau, R., Langer, M., Metzger, N., Nagler, J., Sterrett, D., & Tucker, J. A. (2023). Strengthening digital democracy: Put researchers inside social media companies. Proceedings of the National Academy of Sciences, 120(48), e2303428120. https://doi.org/10.1073/pnas.2303428120

Kaye, D. B. V., Chen, X., & Zeng, J. (2021). The co-evolution of TikTok and young people: A study of affordances and user practices. International Journal of Communication, 15, 3144–3166.

Kaye, J. (2022). The politics of platform regulation in China: Balancing innovation and control. Journal of Cyber Policy, 7(3), 403–421.

Kim, J., & Lee, J. (2022). Time distortion in mobile social media use: The role of immersive user experience. Computers in Human Behavior, 129, 107149. https://doi.org/10.1016/j.chb.2022.107149

Kitchens, B., Johnson, S. L., & Gray, P. (2020). Understanding echo chambers and filter bubbles: The impact of social media on diversification and partisan shifts in news consumption. MIS Quarterly, 44(4), 1619–1649. https://doi.org/10.25300/MISQ/2020/16371

Kozyreva, A., Lewandowsky, S., & Hertwig, R. (2020). Citizen versus the internet: Confronting digital challenges with cognitive tools. Psychological Science in the Public Interest, 21(3), 103–156. https://doi.org/10.1177/1529100620946707

Liu, Y., Luo, H., & Bian, R. (2024). Auditory-visual memory traces from micro-video consumption: A behavioural and EEG study. Telematics and Informatics, 84, 102023. https://doi.org/10.1016/j.tele.2023.102023

Metaxa, D., Park, J. S., & Sen, S. (2021). Auditing elections: Designing politically neutral recsys audits. Proceedings of the ACM Conference on Fairness, Accountability, and Transparency (FAccT ’21), 770–781.

Montag, C., Yang, H., & Elhai, J. D. (2021). On the psychology of TikTok use: A first glimpse from empirical findings. Frontiers in Public Health, 9, 641673. https://doi.org/10.3389/fpubh.2021.641673

Mozilla Foundation. (2022). YouTube regrets 2022: Why the world’s most powerful algorithm can’t be left to its own devices. https://foundation.mozilla.org

Nakamura, J., & Csikszentmihalyi, M. (2002). The concept of flow. In C. R. Snyder & S. J. Lopez (Eds.), Handbook of positive psychology (pp. 89–105). Oxford University Press.

Naumov, M., Mudigere, D., Shi, H. M., Huang, J., Sundaraman, N., Park, J., ... & Smelyanskiy, M. (2019). Deep learning recommendation model for personalization and recommendation systems. arXiv preprint arXiv:1906.00091

Nguyen, T. T., Hui, P. M., Harper, F. M., Terveen, L., & Konstan, J. A. (2014). Exploring the filter bubble: The effect of using recommender systems on content diversity. In Proceedings of the 23rd International Conference on World Wide Web (pp. 677–686). https://doi.org/10.1145/2566486.2568012

Ohme, J., Villi, M., & Picone, I. (2022). Mobile-first news: How short social videos affect information processing. Digital Journalism, 10(5), 659–679. https://doi.org/10.1080/21670811.2021.1977694

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin Press.

Pennycook, G., & Rand, D. G. (2021). The psychology of fake news. Trends in Cognitive Sciences, 25(5), 388–402. https://doi.org/10.1016/j.tics.2021.02.007

Resnick, P., Iacovou, N., Suchak, M., Bergstrom, P., & Riedl, J. (1994). GroupLens: An open architecture for collaborative filtering of netnews. In Proceedings of the 1994 ACM Conference on Computer Supported Cooperative Work (pp. 175–186). https://doi.org/10.1145/192844.192905

Ricci, F., Rokach, L., & Shapira, B. (2015). Recommender systems: Introduction and challenges. In F. Ricci, L. Rokach, & B. Shapira (Eds.), Recommender Systems Handbook (pp. 1–34). Springer. https://doi.org/10.1007/978-1-4899-7637-6_1

Roberts, J. A., & David, M. E. (2023). The social media flow state: A conceptual model. Cyberpsychology: Journal of Psychosocial Research on Cyberspace, 17(1), Article 2. https://doi.org/10.5817/CP2023-1-2

Rowe, E., & Saxton, G. D. (2023). Teaching algorithm literacy: A pedagogy for the platform age. Journal of Media Literacy Education, 15(1), 50–68.

Shin, D., & Shin, J. (2022). Designing reflective prompts to reduce echo-chamber effects in social platforms. Computers in Human Behavior, 135, 107384. https://doi.org/10.1016/j.chb.2022.107384

Stray, J. (2021). Designing recommender systems to depolarize. Patterns, 2(8), 100263. https://doi.org/10.1016/j.patter.2021.100263

Striphas, T. (2015). Algorithmic culture. European Journal of Cultural Studies, 18(4–5), 395–412. https://doi.org/10.1177/1367549415577392

Sunstein, C. R. (2018). #Republic: Divided democracy in the age of social media. Princeton University Press.

Vogels, E. A., Gelles-Watnick, R., & Massarat, N. (2022). Teens, social media and technology 2022. Pew Research Center. https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/

Wardle, C., & Derakhshan, H. (2021). Open societies and the challenge of disinformation. Journal of International Affairs, 74(2), 5–12.

Yang, S., Liu, P., & Chen, L. (2023). Micro-video immersion and user engagement: Evidence from Douyin. Computers in Human Behavior, 139, 107510. https://doi.org/10.1016/j.chb.2022.107510

Yeung, K. (2018). Algorithmic regulation: A critical interrogation. Regulation & Governance, 12(4), 505–523. https://doi.org/10.1111/rego.12160

Zhang, S., Yao, L., Sun, A., & Tay, Y. (2019). Deep learning based recommender system: A survey and new perspectives. ACM Computing Surveys, 52(1), 1–38. https://doi.org/10.1145/3285029

Zhang, W., & Guo, X. (2024). Endless scrolling and temporal displacement on short-video platforms: Evidence from Chinese college students. Information Processing & Management, 61(2), 103209. https://doi.org/10.1016/j.ipm.2023.103209

Published

2026-01-16

How to Cite

Li, L., & Balamuralithara Balakrishnan. (2026). From Immersion to Insularity: How Personalized Recommendations Shape User Behavior and Information Cocooning on Douyin. ICCCM Journal of Social Sciences and Humanities, 5(1). https://doi.org/10.53797/icccmjssh.v5i1.1.2026