There’s a person in my life who has told me she thinks of me as a mentor. From time to time we’ll talk about her career decisions and goals. Sometimes she initiates a conversation, sometimes I check in with her. I admire her and see a lot of her potential, and I feel good when I can give guidance or practical help.
So a week or so ago I forwarded a job I thought might be interesting to her, with some ideas of how she might think about whether or how it could work in her life and where she might go next if the idea captured her interest. A couple of days later I got an email back, thanking me for sending it along.
All pretty normal. Good feelings, the kind of thing we do for people we care about, all part of the reciprocal “I saw this and thought of you” networking, sharing, and helping that forms the backdrop of strong relationships.
Except that I think the email she sent me was written by AI. And the feeling that it might have been has changed the interaction for me. I’ve been thinking a lot about why.
- As a “mentor”, the good feelings I have about helping come from a belief that I’m helping someone through the messy obstacles of life. I’m seeing a real person, and empathizing with what it felt like when I was in her shoes.
- AI for writing might make writing more polished, more business-appropriate, or more grammatically correct. Emphasis on the “might”. But it also makes it more generic, less human, and less “real” — unless you do a lot of back-and-forth prompting. It’s not YOU. People who don’t like or trust their own voice reach for it for exactly this reason.
- In a mentor relationship, both people get something back. The mentee gets wisdom, advice, a translation of the invisible rules or norms of an industry, practical information. And the mentor gets something, too: the feeling that comes from knowing someone, and seeing the impact your help has on their life.
- An AI email that takes your true voice away and replaces it with something more corporate, polished, and “normal” also runs the risk of taking your genuine relationship and reducing it to something transactional and bland.
In this case, I would have preferred any number of emails to the one I got, which looked to me like what GPT would say. She might have sent a simple “Thanks, I’ll take a look,” or a “Hmm, I don’t know if this is right for me, but I always appreciate you thinking of me,” or “Wow — this is a direction I haven’t thought about.” Any of those — and maybe even silence — would keep me believing we have a relationship that’s candid and authentic. That’s a big motivator for me.
Instead, if she decided that a “better” email mattered more than her true voice, and trusted GPT to provide one, I am left wondering if we think of our relationship the same way at all.
Now, this is a lot to take from a single email that didn’t sound authentic. This relationship is important enough to me that I’m not going to assume, or take offense, or let an email sour my genuine admiration and desire to help. But there is so much marketing of AI as the solution, from so many directions, that I can see this happening to many people, in many situations, where they think the most important output is “a well-written email.”
When what matters most is “an authentic relationship.” That none of us can have with AI.