• I write about AI for a living: what people confessed to me about using ChatGPT surprised me

    From TechnologyDaily@1337:1/100 to All on Thursday, March 26, 2026 13:15:26
    I write about AI for a living: what people confessed to me about using
    ChatGPT surprised me

    Date:
    Thu, 26 Mar 2026 13:04:46 +0000

    Description:
    I asked a researcher why using AI can sometimes make us feel weird, and what she said surprised me.

    FULL STORY ======================================================================

    I write about AI a lot. Over the past year, I've covered everything from AI relationships to coaching bots to people using ChatGPT every day at work, which is probably why people confess things to me.

    They tell me their company has rolled out AI and no one really knows how to use it. That they relied on it to understand a pregnancy before telling their family. That they've used it to decode ambiguous dating texts or to stay calm during arguments. But what fascinates me isn't just what they're using AI for. It's how they feel about it afterward. Some feel conflicted because of the environmental cost. Others worry they're being lazy. A few are unsettled by the emotional attachments they've developed and are trying to untangle. And some feel sharper, calmer and more capable than ever with a chatbot at their fingertips 24/7.

    To understand these reactions better, I spoke to Danielle Hass, a PhD candidate in the Department of Marketing at West Virginia University, who has studied the emotional consequences of using generative AI. In a 2025 study, her team researched what people are feeling, why those emotions arise, and what might reduce the discomfort. If AI is becoming part of everyday life, we need to understand the psychology of using it, not just the productivity gains.

    When AI gives you the ick

    People feel uneasy about using AI for all sorts of reasons. Environmental concerns come up often. So do worries about cheating at work. But Hass says the strongest emotional reactions tend to happen when people use AI in a specific way.

    "It's the emotionally laden interpersonal messages, or heartfelt messages," she explains. Things like birthday wishes, love letters, wedding vows, and notes of appreciation.

    Drafting a shopping list with AI is unlikely to keep you up at night. Asking it to help with a work email may feel like common sense. But when a message
    is meant to signal care, effort and emotional investment, that's when things get difficult.

    "What we find is that it's not just using AI that creates discomfort," Hass says. "The guilt comes when messages are sent where the recipient expects genuine personal investment."

    In other words, context matters. When you use AI to write a birthday card for your best friend, you're in a situation where honesty, authenticity and effort are core to what the message is supposed to signal. That's where the negative feelings really kick in.

    Hass's research suggests that the closer the relationship and the more meaningful the occasion, the worse people tend to feel if they rely on AI. (Image credit: Getty Images)

    Defining the emotional hangover

    We know people feel bad, but what's the emotion, specifically?

    "The primary emotion we identify is guilt," Hass says. She explains that the distinction really matters here, because guilt is different from embarrassment or shame. It's not just about how others might see you, it's about doing something that feels wrong to you.

    "Using GenAI to write a heartfelt message and presenting it as your own creates a sense that you've misrepresented yourself to someone you care about," she explains. "That violation of your own ethical standards is precisely what triggers guilt."

    If you send a love letter, your partner reasonably assumes you sat down and chose those words. That the phrasing reflects your thought and emotional effort. "But then the actual source of those words is an algorithm," Hass says.

    She describes this as a source-credit discrepancy: a mismatch between who appears to have authored the message and who actually did. That discrepancy is what makes the act feel so dishonest.

    "It's not just that the message feels abstractly inauthentic," she adds. "It's that you're creating a false impression of authorship in the mind of someone who trusts you. That's where the emotional hangover comes from."

    Should you come clean?

    When Hass explained this to me, I couldn't stop thinking about what someone should do if they're already stewing in this feeling. Should they admit it? Maybe.

    Hass says that transparency would likely reduce guilt because it removes the dishonesty at the core of the discomfort. "If the recipient knows the message came from AI, there's no false impression of authorship, no source-credit discrepancy," she says. But disclosure doesn't magically make the situation simple.

    Now the dynamic shifts. How does the other person respond? Are they amused? Indifferent? Hurt?

    "If they're fine with it, that acceptance might facilitate a kind of self-forgiveness," Hass says. But if they feel let down that their special occasion didn't merit your personal effort, that reaction could intensify your guilt, which now comes from hurting someone you care about.

    If you wrote your mum's birthday card or your wedding vows with AI and aren't sure what to do, honesty might be the answer. Explaining why you used it, and that you genuinely cared, may help. But this is new emotional territory, and you can't control how someone else will react. (Image credit: Shutterstock)

    How to avoid the emotional hangover

    One obvious option is to draw a hard boundary and avoid using AI for emotionally meaningful communication altogether.

    Hass suggests a more practical solution may be to reframe AI's role in your life. "A more appropriate role for GenAI might be as a thinking partner rather than a ghostwriter," she says. In my time reporting on AI, that's broadly the view I hear from people who take a measured approach to AI tools.

    "Using it to brainstorm, overcome writer's block, or refine a draft you've already written yourself preserves your genuine voice and investment in the message," she says. It's also what she tells her students: "Give it your best, authentic shot first, and then consider whether GenAI can help you sharpen it."

    AI can help us articulate things we struggle to say. It can nudge, structure and polish. But when it starts standing in for effort in moments meant to signal care, something shifts internally. (Which seems related to our exploration into what happens when you let AI do your thinking for you at work.)

    If you've felt that strange emotional hangover after using AI, this research suggests you're not imagining it. Understanding that mechanism is a positive step toward using these tools in ways that support us, rather than leaving us unsettled about our closest and most important relationships.




    ======================================================================
    Link to news story: https://www.techradar.com/ai-platforms-assistants/i-write-about-ai-for-a-living-what-people-confessed-to-me-about-using-chatgpt-surprised-me


    --- Mystic BBS v1.12 A49 (Linux/64)
    * Origin: tqwNet Technology News (1337:1/100)