So there is supposed to be a computer chip that can empathise. At least, that's the message from the Consumer Electronics Show (CES) in Las Vegas as reported in the media.

EPU chip: Emotional Processing Unit

That makes me bright-eyed and curious, and I read on. What I read makes me want to write this blog post, because my thoughts while reading are:

- What do we mean by empathy?

- Can a machine do that?

- Why should a machine be able to do that and why is that an advantage?

In the article, empathy is described as the recognition of feelings. From my understanding and from the perspective of Nonviolent Communication, this fits so far. But empathy is much more than that!

Empathy means not only recognising feelings but also reflecting them back: demonstrating that I take the other person's feelings seriously, without being able to be sure whether my impression corresponds to the facts. Beneath anger there is usually a whole range of other feelings of which the angry person is not even aware. This is true for many feelings, not only anger: on the surface there is a feeling, usually intense and yet unclear; deeper down, there are often other feelings, none of them quite conscious. Having these unconscious feelings reflected back is part of empathy. It is recognising oneself through the empathiser that makes the empathy received so powerful.

The article also says that a "machine" could then react appropriately, for example with comfort. At this point, at the latest, this no longer has anything to do with empathy for me. Quite the opposite! Comfort usually serves to bring the other person back into socially acceptable behaviour, often because we ourselves cannot stand the other person's feelings. So when we give comfort, we are "making things go away". Empathy that really gets through, and is even able to heal in the long term, reflects back not only the feelings but also the needs: the needs whose lack of fulfilment makes itself felt through unpleasant feelings. Receiving this double pack of feelings and needs, and thus being seen, recognised and clarified in one's own confusion, is consistently experienced as relaxing, clarifying and resolving.

This is much more sustainable than any good tip on how to deal with my mess. Because it not only lets me recognise myself (no one tells me "how" or "what" I am); I also remain in self-responsibility and self-empowerment, because I can decide for myself whether and what I do now to fulfil my needs.

As a trainer for Nonviolent Communication and a facilitator of people who want to be able to use NVC for themselves and others, I experience again and again how much deep understanding and practice it takes not to fall back on good tips and simple consolation, and how demanding it can be to recognise feelings and needs and to reflect them back without reproach.

Of course, this can be taught to a machine. My question here is: does the developer know what he is doing? Does he know about the subtle but relevant differences between sympathy and empathy, between compassion and good advice? I have my doubts...

And why should a machine be able to do that? Of course I am interpreting this... Based on my experience with people (including myself) who are learning to give real empathy, I see it like this: often our counterpart's feelings trigger something in us, we fall out of our centre and can no longer be empathetic (this is neurobiologically explainable and proven). So it's great when a machine can do that.

And that brings us to the consumer society, which prefers to treat symptoms quickly and easily rather than address the root of the evil. Instead of learning how and why I react when my counterpart shows emotions, I prefer to let a machine do half the job.

The knowledge of NVC, and NVC trainings, make this possible. For example with us.
