Simulation Theory of Empathy

Empathy is the ability to understand others' emotions and sensations. The simulation theory of empathy is an explanation of how humans achieve this understanding. The theoretical rationale of the theory is that observing others' experiences automatically activates shared neural networks in the observer, so that the observer experiences the same emotions and sensations as the person observed. For example, people feel disgust when they see others smell a bad odor, feel pain when they see others being pierced by a needle or receiving an electric shock, sense touch when they see others being brushed, and feel sad when they see others' sad facial expressions.

The Origin of the Theory
The simulation theory of empathy is based to a large extent on the discovery of mirror neurons in macaque monkeys and a similar mirror neuron system in the human brain.

Since the discovery of the mirror neuron system, many studies have been carried out to examine the role of this system in action understanding, emotion and other social functions.

Action Understanding
Mirror neurons are activated both when actions are executed and when actions are observed. This unique property of mirror neurons may explain how people recognize and understand the states of others: the observed action is mirrored in the brain as if the observer had performed it themselves.

Two sets of evidence suggest that mirror neurons in the monkey play a role in action understanding. First, the activation of mirror neurons requires biological effectors such as the hand or mouth. Mirror neurons do not respond to actions performed with tools such as pliers, nor do they respond to the sight of an object alone or to an action without an object (an intransitive action). Umilta and colleagues demonstrated that a subset of mirror neurons fired even when the final, critical part of the action was not visible to the observer. The experimenter showed his hand moving toward a cube and grasping it, and later showed the same action with the final grasping phase hidden by placing the cube behind an occluder. Mirror neurons fired in both the visible and the hidden conditions. On the other hand, mirror neurons did not discharge when the observer knew that there was no cube behind the occluder.

Second, responses of mirror neurons to the same action differ depending on the context of the action. A single-cell recording experiment with monkeys demonstrated different levels of activation of mouth mirror neurons when a monkey observed mouth movements in different contexts (ingestive actions such as sucking juice vs. communicative actions such as lip-smacking or tongue protrusion). An fMRI study likewise showed that mirror neurons respond differently to the action of grasping a cup depending on context (grasping to drink a cup of coffee vs. grasping to clean the table on which the cup was placed).

Emotion Understanding
The shared neural representation for a motor behavior and its observation has been extended into the domains of feelings and emotions. Not only movements but also facial expressions activate the same brain regions that are activated by direct experience. In an fMRI study, the same brain regions involved in action representation were found to be activated both when people imitated and when they observed emotional facial expressions such as happiness, sadness, anger, surprise, disgust, and fear.

Observing video clips that displayed facial expressions of disgust activated the neural networks typical of the direct experience of disgust. Similar results have been found in the case of touch: watching videos of someone's legs or face being touched activated the somatosensory cortex involved in directly feeling that touch. A similar mirror system exists in the perception of pain. When people see others in pain, they experience the pain not only affectively but also sensorially.

These results suggest that understanding others' feelings and emotions is driven not by cognitive deduction of what the stimuli mean but by automatic activation of somatosensory neurons. A recent study on pupil size directly demonstrated that emotion perception is an automatic process modulated by mirror systems. When people viewed sad faces, the pupil size of the face influenced how viewers perceived and judged its emotional state, without any explicit awareness of differences in pupil size. When pupil size was 180% of the original size, people perceived a sad face as less negative and less intense than when the pupil was smaller than or equal to the original size. This effect was correlated with activity in brain regions implicated in emotion processing, notably the amygdala. Furthermore, viewers' own pupil sizes mimicked those of the sad faces they watched. Considering that pupil size is beyond voluntary control, the change in pupil size during emotion judgment is a good indication that understanding emotions is an automatic process. However, the study did not find that faces expressing other emotions, such as happiness and anger, influenced pupil size as sad faces did.

Epistemological Role of Empathy
Understanding others' actions and emotions is believed to facilitate efficient human communication. Based on findings from neuroimaging studies, de Vignemont and Singer proposed empathy as a crucial factor in human communication, arguing for its epistemological role: "Empathy might enable us to make faster and more accurate predictions of other people's needs and actions and discover salient aspects of our environment." Mental mirroring of actions and emotions may enable humans to understand others' actions and their surrounding environment quickly, and thus help humans communicate efficiently.

In an fMRI study, a mirror system was proposed as a common neural substrate mediating the experience of basic emotions. Participants watched video clips of happy, sad, angry, and disgusted facial expressions, and their Empathy Quotient (EQ) was measured. Specific brain regions relevant to each of the four emotions were found to be correlated with the EQ, while the mirror system (i.e., the left dorsal inferior frontal gyrus/premotor cortex) was correlated with the EQ across all emotions. The authors interpreted this result as evidence that action perception mediates the link between face perception and emotion perception.