I’ve noticed people using ChatGPT to seek advice on their interpersonal relationships. The same people have suggested I use it too, when I’ve asked them for advice on an interpersonal conflict of my own.
I’m not sure how I feel about that.
I have used ChatGPT numerous times myself for relationship advice. However, I’ve come to notice a pattern across those chats.
For one, ChatGPT generates text by predicting likely word sequences from a prompt. That’s it. It is then fine-tuned so that its responses align with human preferences. It is not a listener, and it barely even understands the context of the situation.
It also made me realize that seeking validation from a generative AI is… lazy. I wonder whether this habit breeds confirmation bias, turning the chat into an echo chamber for your own side of the story…
It makes me wonder: has it become this hard for some people to pick up the phone and call a friend? Aren’t humans the best at evaluating complex behavior? Isn’t it even better to judge the situation yourself, so you become better at handling challenging situations and at trusting your own judgment?
When did AI become better than humans at giving advice?
Credit to Henrik Kniberg and Kyle Hill on YouTube for explaining how ChatGPT works.