The critical flaw: a lack of emotion

As an artificial intelligence, ChatGPT has an inevitable and significant limitation: it has no emotions. GPT may be able to infer, from the dialogue and the user's wording, what kind of language is most suitable for communicating with that user, but as it stands, GPT cannot empathize with humans. As a result, when users who need in-depth, personal communication come to the library for help, GPT can only give an objective, unemotional reply. The following is a conversation between a web developer and ChatGPT:

"Chat, I want to invite my friend to play games, but he refused, I am so sad right now."

"I'm sorry to hear that your friend turned down your invitation and it made you feel bad. Being rejected can be a frustrating feeling, but remember that everyone has their own reasons and priorities. Maybe your friend has other plans or isn't in the mood to accept your invitation. In this case, I suggest you try to understand and respect your friend's decision. Try not to take this rejection as a personal denial of yourself, but try to see it from his perspective. Maybe there will be other opportunities for you to play games together later, or you can ask him if he has other convenient times for activities together."

The web developer used a single short sentence to express his disappointment at being rejected. GPT's answer is very detailed and proposes a solution for him. But is this really what a user seeking support wants?

In this conversation, the developer may simply want some sympathy and acknowledgment from another person, and the reply he hopes for is more likely a short sentence such as "It's okay, don't take it to heart; maybe your friend is just busy." Although GPT's reply covers part of this, it sounds more like a lecture from an elder. At the same time, although GPT proposes solutions to the problem, its answer is too wordy for such a simple matter (even if, on a rational level, those solutions are not bad). Judging from the dialogue above, ChatGPT cannot empathize when communicating with people. In this regard, the librarian's position is irreplaceable: librarians listen carefully to patrons' requests and troubles and respond with genuine emotion, so that people can feel the warmth of another human being. Will an AI like ChatGPT ever be able to empathize with humans? Let's wait and see.