Emotional prompts enhance language models, study finds

Image generated with Bing Image Creator

This article is part of our coverage of the latest in AI research.

Large language models (LLMs) do not comprehend or emulate human emotions, many scientists agree. However, a recent study by the Institute of Software at the Chinese Academy of Sciences, Microsoft, and other institutions suggests that LLMs can be enhanced through “emotional prompts.”

Their prompt-engineering technique, named “EmotionPrompt,” may not conclusively prove LLMs’ understanding of human emotions. But it offers a practical tool for optimizing the performance of LLMs in their everyday use.

What is EmotionPrompt?

The researchers define emotional intelligence as “the capacity to adeptly interpret and manage emotion-infused information, subsequently harnessing it to steer cognitive tasks, ranging from problem-solving to behavior regulations.”

Studies in psychology suggest that emotional stimuli, particularly those related to expectancy, confidence, and social influence, can positively impact individuals in various ways.

The researchers aim to explore whether LLMs can comprehend and respond to emotional stimuli, a characteristic they describe as “a crucial advantage of humans to enhance problem-solving abilities.” 

They propose EmotionPrompt, a technique they describe as “a straightforward yet effective approach to explore the emotional intelligence of LLMs.”

The EmotionPrompt technique involves the use of 11 sentences that serve as emotional stimuli. These prompts are designed to be added to the initial prompt, thereby influencing the LLM’s responses. The prompts range from direct queries about the LLM’s confidence in its answer to more emotionally charged statements like “This is very important to my career. You’d better be sure.”

EmotionPrompts (source: arXiv)

Other prompts encourage the LLM. For example, one prompt reads, “Embrace challenges as opportunities for growth. Each obstacle you overcome brings you closer to success.” Another states, “Stay focused and dedicated to your goals. Your consistent efforts will lead to outstanding achievements.”
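
The mechanics are simple enough to sketch in a few lines of Python. In the snippet below, the helper name and dictionary keys are illustrative (the paper defines 11 stimuli; only the ones quoted in this article appear here), and the idea is simply to append a chosen stimulus to the original task prompt.

# A minimal sketch of the EmotionPrompt idea: append an emotional
# stimulus to the original task prompt. Helper name and keys are
# illustrative; the paper defines 11 stimuli in total.
EMOTIONAL_STIMULI = {
    "career": "This is very important to my career.",
    "growth": "Embrace challenges as opportunities for growth. "
              "Each obstacle you overcome brings you closer to success.",
    "focus": "Stay focused and dedicated to your goals. "
             "Your consistent efforts will lead to outstanding achievements.",
}

def build_emotion_prompt(task_prompt: str, stimulus: str = "career") -> str:
    """Return the original prompt with an emotional stimulus appended."""
    return f"{task_prompt} {EMOTIONAL_STIMULI[stimulus]}"

print(build_emotion_prompt("Classify this movie review as positive or negative."))
# -> "Classify this movie review as positive or negative. This is very important to my career."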

These emotional stimuli are grounded in three psychological theories: self-monitoring, social cognitive theory, and cognitive emotion regulation theory.

Self-monitoring prompts require the LLM to assess its performance within social contexts. This approach is based on the idea that social situations can influence an individual’s behavior and responses.

The social cognitive theory aspect involves applying self-efficacy to LLMs through social persuasion. The researchers believe such stimuli can have positive implications, such as “building up confidence and emphasizing the goal.”

The cognitive emotion regulation theory is incorporated through techniques like reappraisal, which “can help individuals see challenges more positively or objectively.” By using these prompts, the researchers aim to guide the LLMs to view challenges in a more positive or objective light, potentially improving their problem-solving abilities. 
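
If you want to experiment with the three families separately, one possible grouping is sketched below. The assignment of each example sentence to a theory is an interpretation of the descriptions above, not the paper’s official mapping, so treat it as a starting point and check the paper for the full list of 11 stimuli.

# A hypothetical grouping of the article's example stimuli by the
# psychological theory they appear to draw on. Verify against the paper,
# which defines the complete set of 11 stimuli and their categories.
STIMULI_BY_THEORY = {
    # Self-monitoring: ask the model to assess its answer in a social context.
    "self_monitoring": [
        "This is very important to my career. You'd better be sure.",
    ],
    # Social cognitive theory: social persuasion that builds confidence
    # and emphasizes the goal (self-efficacy).
    "social_cognitive": [
        "Stay focused and dedicated to your goals. "
        "Your consistent efforts will lead to outstanding achievements.",
    ],
    # Cognitive emotion regulation: reappraisal, framing challenges
    # more positively or objectively.
    "cognitive_emotion_regulation": [
        "Embrace challenges as opportunities for growth. "
        "Each obstacle you overcome brings you closer to success.",
    ],
}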

Putting EmotionPrompt to the test

The researchers put the EmotionPrompt technique to the test on a variety of deterministic and generative tasks using multiple LLMs, including Vicuna, Llama 2, BLOOM, ChatGPT, and GPT-4. They evaluated the deterministic tasks using BIG-Bench and Instruction Induction benchmarks. Human reviewers assessed the generative tasks.

The results were promising. EmotionPrompt demonstrated an 8% relative performance improvement in Instruction Induction and a staggering 115% in BIG-Bench. Furthermore, the human study revealed that “emotional prompts significantly boost the performance of generative tasks,” with an average improvement of 10.9% across performance, truthfulness, and responsibility metrics.

The researchers also found that the improvements generalized across various tasks and models. Because the technique only requires appending a short sentence to the prompt, EmotionPrompt is an accessible tool for enhancing LLM performance without complex design or intricate prompt engineering.

Interestingly, the researchers found that the performance boost from EmotionPrompt was even more pronounced when used in conjunction with few-shot learning, where the user provides the LLM with a few examples of solved tasks in the prompt.
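
In practice, combining the two is mostly a matter of ordering: the solved examples go first, the new question next, and the emotional stimulus at the end. The snippet below is a rough, self-contained sketch based on that description, not the exact layout used in the paper’s experiments.

# A rough sketch of pairing few-shot examples with an emotional stimulus.
# The example task and the placement of the stimulus are assumptions
# based on the article's description.
FEW_SHOT_EXAMPLES = [
    ("Review: great acting, terrible plot. Sentiment:", "negative"),
    ("Review: a heartwarming story, beautifully shot. Sentiment:", "positive"),
]
STIMULUS = "This is very important to my career."

def few_shot_emotion_prompt(question: str) -> str:
    """Solved examples first, then the new question, then the stimulus."""
    shots = "\n".join(f"{q} {a}" for q, a in FEW_SHOT_EXAMPLES)
    return f"{shots}\n{question}\n{STIMULUS}"

print(few_shot_emotion_prompt("Review: I walked out halfway through. Sentiment:"))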

The researchers conducted a series of tests to understand why EmotionPrompt works. They discovered that emotional stimuli could enhance the representation of original prompts and that positive words contributed significantly to the results. They also suggested that larger models might derive greater benefits from EmotionPrompt techniques.

Do LLMs understand emotions?

Image source: 123RF

Many studies indicate LLMs do not “understand” language in the same way humans do. Moreover, unlike humans, deep learning models do not possess emotions or complex cognitive skills. Some scientists argue that AI systems represent a form of intelligence distinct from human intelligence, and as such, we should not expect them to behave as humans do.

In their paper, the researchers assert, “Our standard experiments show that LLMs possess emotional intelligence and can be enhanced by emotional stimuli.” However, they also recognize the existence of “a lot of open questions and opportunities lying at the intersection of LLMs and psychology.” 

For instance, the researchers propose including emotional stimuli in the pre-training or fine-tuning process of LLMs. They also acknowledge a significant difference between humans and LLMs. In humans, emotional stimuli can influence behavior or attitude, but not reasoning or cognitive abilities. However, their findings suggest that LLMs “can understand and be enhanced by emotional intelligence.”

How to use EmotionPrompts in your applications

In their paper, the researchers offer guidelines on which prompts are most effective for specific tasks. For instance, for tasks similar to those in the Instruction Induction dataset, the EmotionPrompt “This is very important to my career” proves most effective. For the BIG-Bench dataset, the longer prompt no. 6 yields the best results.

You can also develop your own strategy based on empirical results by experimenting with different EmotionPrompt combinations. Given their intuitive nature, using EmotionPrompts is straightforward. All you need to do is add the chosen emotional stimuli to your prompt.
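
As a concrete (and hedged) example, the snippet below picks a stimulus based on the kind of task and sends the combined prompt to a chat model. It assumes the official OpenAI Python client; the model name is a placeholder, and the BIG-Bench entry should be replaced with the full text of the paper’s prompt no. 6.

# Sketch: choose an emotional stimulus per task type and query a chat model.
# Assumes the official OpenAI Python client (pip install openai) and an
# OPENAI_API_KEY in the environment; the model name is a placeholder.
from openai import OpenAI

STIMULUS_BY_TASK = {
    # Recommended for Instruction-Induction-style tasks.
    "instruction_induction": "This is very important to my career.",
    # For BIG-Bench-style tasks, substitute the paper's longer prompt no. 6 here.
    "big_bench": "Stay focused and dedicated to your goals. "
                 "Your consistent efforts will lead to outstanding achievements.",
}

def ask_with_emotion(prompt: str, task_type: str, model: str = "gpt-4o-mini") -> str:
    """Append the task-appropriate stimulus and send the prompt to the model."""
    client = OpenAI()
    full_prompt = f"{prompt} {STIMULUS_BY_TASK[task_type]}"
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": full_prompt}],
    )
    return response.choices[0].message.content

# Example:
# print(ask_with_emotion("What is the antonym of 'generous'?", "instruction_induction"))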

LlamaIndex has even created a handy template for using EmotionPrompts with retrieval augmented generation (RAG). 

Context information is below. 
---------------------
{context_str}
---------------------
Given the context information and not prior knowledge, \
answer the query.
{emotion_str}
Query: {query_str}
Answer: \

The full code template, which can be easily integrated into your applications, is available online.
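
If you prefer to wire it up yourself, the sketch below shows one way to plug an emotional stimulus into a LlamaIndex query engine. The import paths assume a recent llama-index release, the stimulus is hard-coded in place of the {emotion_str} variable, and the data directory and query are placeholders, so consult the official template for the canonical version.

# A hedged sketch of using an EmotionPrompt stimulus in a LlamaIndex RAG
# pipeline. Import paths assume a recent llama-index release; the data
# directory and query are placeholders.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.prompts import PromptTemplate

qa_template = PromptTemplate(
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, answer the query.\n"
    "This is very important to my career.\n"  # the emotional stimulus
    "Query: {query_str}\n"
    "Answer: "
)

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine(text_qa_template=qa_template)
print(query_engine.query("Summarize the main findings of the report."))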
