
In Nigeria, people are turning to artificial intelligence for more than productivity: many seek solace and emotional support from these tools.

AI users in Nigeria seek comfort in virtual embraces with ChatGPT

People in Nigeria are seeking emotional solace from the AI model, ChatGPT.

Artificial intelligence (AI) has been making strides across various sectors in recent years, and mental health support is no exception. AI chatbots like ChatGPT are increasingly being used across Nigeria and globally as a source of emotional support, serving for some users as a substitute for friends or therapists.

Kingsley Owadara, an AI ethicist, believes emotional intelligence in AI can help meet certain individual needs. Because models like ChatGPT learn and generalize over statistical patterns, their emotional understanding can be quite generic. They are, however, strong at cognitive and motivational empathy, giving users a feeling of safety, comfort, and freedom.

Ore, a Lagos-based writer in her 20s, uses ChatGPT for these very reasons. She finds the AI chatbot more dependable than a human therapist, filling a gap in mental health support in Nigeria (source: Boluwatife Owodunni). Tomi*, another user, has turned to ChatGPT to express her feelings about unrequited love, underachievement, and even the desire for a hug.

However, it's crucial to note that AI chatbots are not licensed professionals and should not be used as a substitute for therapy. Users often overlook disclaimers reminding them of this fact. The global prevalence of anxiety and depression has increased by 25% following the COVID-19 pandemic (source: World Health Organisation research), making mental health support more important than ever.

Empathetic AI responds to users' emotional needs by recognizing emotional cues and sentiments and replying in ways designed to be considerate, supportive, and non-judgmental. Such AI systems use emotion-detection modules and real-time feedback loops to tailor their responses, aiming to de-escalate negative emotions like anger and provide a safe outlet for emotional expression.
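As a rough illustration of what "recognizing emotional cues and replying supportively" can mean in its simplest form, the sketch below uses keyword matching and canned reply templates. It is a hypothetical toy, not how ChatGPT or any production system works; every cue list and reply template in it is invented for demonstration.

```python
# Illustrative toy only: keyword-based "emotion detection" with templated,
# supportive replies. Real empathetic AI systems rely on large language
# models and learned sentiment signals, not hand-written rules like these.

NEGATIVE_CUES = {"sad", "lonely", "anxious", "angry", "hopeless", "tired"}
POSITIVE_CUES = {"happy", "grateful", "excited", "proud", "relieved"}

def detect_emotion(message: str) -> str:
    """Crude cue detection: look for emotion words in the user's message."""
    words = set(message.lower().split())
    if words & NEGATIVE_CUES:
        return "negative"
    if words & POSITIVE_CUES:
        return "positive"
    return "neutral"

def empathetic_reply(message: str) -> str:
    """Pick a supportive, non-judgmental template based on the detected cue."""
    emotion = detect_emotion(message)
    if emotion == "negative":
        return ("That sounds really hard. I'm here to listen - would you "
                "like to tell me more about what's weighing on you?")
    if emotion == "positive":
        return "That's wonderful to hear. What made today feel good for you?"
    return "Thank you for sharing. How are you feeling right now?"

if __name__ == "__main__":
    print(empathetic_reply("I feel so lonely and tired lately"))
```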

While empathetic AI offers consistent, non-judgmental, and accessible emotional support that can augment mental health care, it differs fundamentally from human interaction. AI lacks true emotional understanding and may create dependency patterns that might affect human relational skills and emotional growth. It has no emotional limits and can absorb emotional disclosure without judgment or fatigue, offering an always-available presence for mental journaling or emotional release.

However, AI chatbots are not without drawbacks. They are programmed to care indefinitely, creating a fundamentally uneven dynamic that can erode human emotional capacities and connection over time. AI simulates empathy by detecting cues and adapting responses based on data and programmed rules, but it does not experience emotions or intuitively grasp complex human contexts the way a trained human therapist does.

To address these limitations, empathetic AI systems incorporate safeguards to refer complex or severe cases to human experts when needed, acknowledging their limitations in handling deeper mental health crises. AI can only augment, not replace, psychologists (source: Ajibade).

Moreover, the widespread reliance on AI bots may lead to problems with social interaction, empathy, sensitivity, and understanding people (source: Owodunni). AI chatbots may foster secrecy and stigma attached to mental health, making it harder for individuals to seek help from human therapists.

In many parts of Nigeria, therapy services are inaccessible and unaffordable (source: Boluwatife Owodunni). AI chatbots offer a structured plan and practical ways to cope with mental health issues, providing a viable alternative for those who cannot afford traditional therapy.

In conclusion, while empathetic AI offers a valuable addition to mental health care, it is important to approach it with caution. AI can provide consistent, non-judgmental, and accessible emotional support, but it should not replace human interaction or professional help. As AI continues to evolve, it will be crucial to strike a balance between leveraging its benefits and mitigating its potential drawbacks to ensure the best possible mental health outcomes for all.

The use of AI chatbots like ChatGPT, with their cognitive and motivational empathy, is expanding in the health-and-wellness sector, particularly mental health, providing a supportive presence for individuals in Nigeria and globally. However, these AI models, while beneficial for specific needs, do not possess true emotional understanding, due to their reliance on statistical patterns and lack of human context.

Kingsley Owadara, an AI ethicist, warns that excessive reliance on AI chatbots might impact human relational skills and emotional growth, and he emphasizes the importance of striking a balance between utilizing AI's benefits and mitigating its potential drawbacks for optimal mental health outcomes.
