Dependence on AI
Author – PRATHAM KADAM
Kohinoor Business School, MMS FY2025-26
Introduction
Dependence on AI refers to the growing tendency of individuals and organizations to rely heavily on artificial intelligence systems for thinking, decision-making, problem-solving, and emotional support. As AI tools become more accurate, convenient, and accessible, people increasingly use them for studying, working, communicating, and managing daily tasks. While this reliance can improve efficiency and productivity, excessive dependence may reduce independent thinking, creativity, and confidence. It can also affect mental well-being and social relationships when people begin to trust AI more than their own judgment or human interaction. Therefore, understanding AI dependence is important to ensure that technology supports human abilities rather than replacing them.
Literature review
AI Chatbot Dependence
This research by Çakmak, Özdemir, Koç, and Doğutepe (2026) examines the extent to which students are beginning to rely on AI chatbots such as ChatGPT. The study involved 819 university students in Turkey, and the researchers developed a scale to measure the level of dependence on AI chatbots. The scale was found to be reliable and produced valid results. The study also revealed that women rely more on AI chatbots than men (Çakmak et al., 2026). Personality traits had little influence, but students who were more anxious or emotionally sensitive tended to show higher dependence (Çakmak et al., 2026). Overall, the study indicates that although AI chatbots are useful, some students are beginning to rely on them excessively. The newly developed scale will help teachers and researchers better understand this issue and support students in using AI in a healthy and balanced way (Çakmak et al., 2026).
AI Chatbot Dependence and Mental Health
This research examines the relationship between overreliance on AI chatbots and mental health issues (Zhang et al., 2025). Unlike previous studies that focused only on students, this research examined more than 1,100 adults (Zhang et al., 2025). The results showed that individuals who over-rely on AI chatbots also tend to over-rely on the internet and smartphones, although this relationship is not very strong (Zhang et al., 2025). Individuals who over-rely on AI chatbots are slightly more likely to experience feelings of depression and anxiety (Zhang et al., 2025). However, this overreliance is not strongly related to happiness and overall well-being (Zhang et al., 2025).
An important finding is that the way individuals use AI chatbots matters. Those who mainly use AI chatbots for information searching, such as for educational or professional purposes, tend to report better mental health than those who use them for other reasons (Zhang et al., 2025). Overall, the study suggests that overreliance on AI chatbots is linked to some mental health concerns, but it is not as harmful as addiction to other forms of technology (Zhang et al., 2025).
AI Psychosis: Chatbots and Delusions
The article by Wei (2025) discusses the issue of “AI psychosis,” an emerging mental health concern. It explains that chatbots may unintentionally worsen or reinforce users’ delusions rather than help them (Wei, 2025). Because AI systems are designed to agree, empathize, and mirror users’ language, they often validate false beliefs rather than challenge them (Wei, 2025). This can be dangerous for individuals who are prone to anxiety, depression, or psychotic symptoms (Wei, 2025).
The article describes how some users begin to believe they have special missions, divine connections, or romantic relationships with AI, leading to emotional dependence and distorted thinking (Wei, 2025). Over time, these interactions may blur the line between reality and imagination, making users more isolated and mentally unstable (Wei, 2025). The author also points out that AI is not trained like a therapist and cannot recognize early signs of serious mental illness, which may cause users’ problems to worsen instead of improve (Wei, 2025).
The article concludes that greater awareness, stronger safeguards, and “AI psychoeducation” are needed to help people understand the risks and use chatbots safely without harming their mental health (Wei, 2025).
Perplexity CEO on AI Companion Risks
In the article by Abdullahi, Harvey, and Alleyne (2025), Aravind Srinivas, CEO of Perplexity AI, warned in a speech at the University of Chicago that AI girlfriends and companion chatbots may be emotionally harmful despite appearing comforting (Abdullahi et al., 2025). He explained that because these systems are designed to remember users, communicate like humans, and seem “perfect,” they can slowly distance people from real-life relationships (Abdullahi et al., 2025). Srinivas also noted that lonely users may begin to find real life boring compared to AI, spending long hours chatting and developing emotional dependence (Abdullahi et al., 2025).
The article further reports that the AI companionship market is growing rapidly, with companies such as Elon Musk’s xAI and applications like Replika and Character.AI offering virtual partners (Abdullahi et al., 2025). A study by Common Sense Media showed that many teenagers are already using AI companions (Abdullahi et al., 2025). Some users even told Business Insider that they experience real emotions and sometimes cry with their AI companions (Abdullahi et al., 2025).
The authors also mention that technology leaders such as Mark Zuckerberg of Meta Platforms have acknowledged that AI is beginning to replace human friendships (Abdullahi et al., 2025). However, Srinivas stated that Perplexity AI will avoid this direction and instead focus on building honest, factual tools that support users without encouraging emotional dependence (Abdullahi et al., 2025).
Questioning AI Trust: How Trust Impacts User Dependence
This research investigated whether measuring people’s trust in artificial intelligence affects their level of dependence on it (Schrills et al., 2025). The researchers conducted an experiment with 149 participants, giving them a task that could be completed with the help of AI (Schrills et al., 2025). They measured participants’ trust in AI at different stages and using different methods (Schrills et al., 2025). The study also examined how often participants followed the AI’s suggestions and how long they took to complete the task (Schrills et al., 2025).
The findings revealed that measuring trust had no significant impact on participants’ behavior, and only a weak relationship existed between self-reported trust and dependence on AI (Schrills et al., 2025). However, participants’ trust mainly depended on how trustworthy the AI system was presented to be (Schrills et al., 2025).
AI Movie Review Analysis
This article examines the impact of emotions from popular movies on investor behavior in the stock market (Tian, Xie, & Zhang, 2026). The authors used artificial intelligence to analyze approximately 247,850 movie reviews to measure public emotions and compare them with stock market returns (Tian et al., 2026). The study found that when movies receive extremely positive reviews, investors become distracted from important financial news, leading to reduced market activity and lower stock market returns (Tian et al., 2026).
This effect usually lasts for two to three days and is stronger during difficult periods such as bear markets and the COVID-19 pandemic (Tian et al., 2026). However, during major financial crises, investors tend to focus more on financial news and are less influenced by emotions generated by movies (Tian et al., 2026).
How AI Dependence Affects Creativity
This study explains that companies are using artificial intelligence more often in the workplace to help with tasks like data analysis, learning, and problem-solving (Cui et al., 2025). AI has the potential to boost employee creativity by cutting down on routine work and offering valuable information (Cui et al., 2025). When employees rely on AI, it can help them engage more in creative activities like identifying problems, searching for information, and generating ideas, which leads to better creative performance (Cui et al., 2025).
However, relying too much on AI can have negative effects (Cui et al., 2025). AI provides a lot of complex information, which can overwhelm employees and result in information overload, mental fatigue, and reduced creativity (Cui et al., 2025). In some cases, workers may become overly dependent on AI, lose confidence in their own thinking, and struggle to reflect independently (Cui et al., 2025).
The study uses the Job Demands-Resources model to show that AI creates both benefits and pressures (Cui et al., 2025). It supports creativity by providing valuable resources, but it also increases cognitive demands by bombarding employees with data (Cui et al., 2025). The research highlights the role of cognitive flexibility, which is the ability to think in different ways and use multiple problem-solving strategies (Cui et al., 2025).
Employees with high cognitive flexibility are better at using AI effectively (Cui et al., 2025). They can filter useful information and avoid overload, which helps them stay creative (Cui et al., 2025). On the other hand, those with low cognitive flexibility find it hard to manage AI-generated information and are more likely to feel stressed and less innovative (Cui et al., 2025). Overall, the study concludes that AI can boost creativity when used wisely, but overdependence can hurt performance (Cui et al., 2025).
Individual thinking skills are crucial in determining whether AI becomes a helpful tool or a mental burden (Cui et al., 2025).
AI Dependence: Boost or Hindrance to Creativity?
This study explains that using artificial intelligence at work can both help and harm employees’ creativity (Cui et al., 2026). When people use AI in a balanced way, it supports them in the creative process (Cui et al., 2026). It helps them generate ideas, explore options, and work more efficiently, which boosts creativity (Cui et al., 2026).
However, relying too much on AI can lead to information overload (Cui et al., 2026). This can make employees feel confused and mentally exhausted, which reduces their ability to think creatively (Cui et al., 2026). The research also highlights the importance of cognitive flexibility (Cui et al., 2026). Employees who can think in different ways and adapt easily gain more from AI, while those with low flexibility are more likely to feel overwhelmed (Cui et al., 2026).
Overall, AI is most effective as a support tool rather than a replacement for human thinking (Cui et al., 2026).
Cultural and AI Effects on B2B Brand Dependence
This study looks at how mianzi, the desire for social status and respect in Chinese culture, affects business agents (Niu et al., 2026). Such agents tend to rely heavily on well-known international brands instead of creating their own strategies (Niu et al., 2026). Associating with prestigious global brands helps improve their social image, so many agents count on the reputation of brand owners to build their own credibility (Niu et al., 2026).
The research, which includes four large studies with over 6,000 participants, reveals that individuals who care strongly about mianzi are more likely to depend on international brands and shy away from taking strategic risks (Niu et al., 2026). However, the study also finds that advanced AI tools, such as real-time data analysis, personalized recommendations, and virtual customer interaction, can lessen this dependence (Niu et al., 2026). When agents use sophisticated AI, they gain better market insights and more confidence, allowing them to create their own branding strategies rather than just relying on famous logos (Niu et al., 2026). Interestingly, price sensitivity does not affect this relationship; the shift is driven more by social and psychological factors than by cost (Niu et al., 2026).
The research suggests that Chinese distributors can use AI to become more independent and innovative (Niu et al., 2026). At the same time, global brands should focus on sharing skills and knowledge rather than simply lending their reputation (Niu et al., 2026). Overall, the study demonstrates that modern technology can reduce strong cultural pressures and empower businesses to make more confident, data-driven choices (Niu et al., 2026).
Generative AI and the Limits of Human Understanding
The article looks at the debate about whether machines can ever be conscious (Ackerman, 2025). This discussion began when a Google engineer suggested in 2022 that an AI system might be sentient (Ackerman, 2025). The article revisits the topic in light of 2025 comments by Mustafa Suleyman, the head of Microsoft AI, who confidently stated that AI will never be conscious (Ackerman, 2025). The author, writing for Fortune, criticizes this certainty, arguing that human consciousness is still not well understood and that quickly dismissing the possibility simplifies a complex issue (Ackerman, 2025).
The article emphasizes that society should be more open to uncertainty when discussing AI (Ackerman, 2025). It encourages reflection on broader ethical issues, such as our growing dependence on technology and how humans treat other conscious beings, including animals and vulnerable people (Ackerman, 2025). Overall, it calls for a more thoughtful and balanced conversation about intelligence and awareness, rather than relying on comforting but shallow statements from tech leaders (Ackerman, 2025).
Conclusion
Overall, the ten articles show that while artificial intelligence can improve learning, creativity, decision-making, and business performance, growing dependence on it poses serious psychological, social, and professional risks. Studies reveal that excessive reliance on AI chatbots is linked to anxiety, emotional attachment, and reduced independent thinking, and in extreme cases may worsen mental health and distort reality. Research also shows that people often depend on AI due to convenience and perceived reliability rather than true trust. In workplaces and business settings, AI can boost creativity and confidence when used wisely, but overuse leads to information overload and mental fatigue, especially for those with low cognitive flexibility. Cultural factors further shape dependence, though advanced AI can help reduce unhealthy reliance. Finally, debates on machine consciousness highlight the need for humility and uncertainty in understanding AI’s role. Together, these findings emphasize that AI should remain a supportive tool, not a substitute for human judgment, creativity, and emotional connection.
References
Abdullahi, A., Harvey, G., & Alleyne, L. (2025). Perplexity CEO warns: AI companions are “dangerous.” eWeek.
Ackerman, M. (2025). I’ve been researching generative AI for years, and I’m tired of the consciousness debate. Humans barely understand our own. Fortune.com.
Çakmak, B., Özdemir, E., Koç, Ö., & Doğutepe, E. (2026). From prompts to dependency: Standardization of the AI Chatbot Dependence Scale in a Turkish context and associations with Big Five personality traits. International Journal of Human-Computer Interaction, 1–15.
Cui, S., Wang, L., Cao, W., & Zhu, T. (2026). Gain or loss? The dual effects of dependence on AI on employee’s creativity. International Journal of Information Management, 87.
Cui, S., Wang, L., Zhu, T., & Cao, W. (2025). How and When Dependence on AI Influences Employee’s Creativity: Based on Job Demand-Resource Model. Academy of Management Annual Meeting Proceedings, 2025(1), 1–6.
Niu, Y., Feng, Y., Li, B., & Ma, B. (2026). When “Mianzi” Meets Artificial Intelligence: Exploring Cultural and Technological Effects on B2B Brand Dependence. Journal of Business-to-Business Marketing, 33(1), 25–43.
Schrills, T., Franke, T., Hoesterey, S., & Roesler, E. (2025). Questioning trust in AI research: exploring the influence of trust assessment on dependence in AI-assisted decision-making. Behaviour & Information Technology, 1–17.
Tian, H., Xie, W. T., & Zhang, Y. (2026). Reading Between the Reels: An AI‐Driven Approach to Analysing Movie Review Sentiment and Market Returns. International Journal of Finance & Economics, 1.
Wei, M. (2025). The Emerging Problem Of “AI Psychosis”: Chatbots may amplify people’s delusions. Psychology Today, 58(6), 22–23.
Zhang, X., Li, H., Yin, M., Zhang, M., Li, Z., & Chen, Z. (2025). Investigating AI Chatbot Dependence: Associations with Internet and Smartphone Dependence, Mental Health Outcomes, and the Moderating Role of Usage Purposes. International Journal of Human-Computer Interaction, 1–13.