
Estimate Emotion Probability Vectors Using LLMs: Abstract and Introduction

Too Long; Didn't Read

This paper shows how LLMs (Large Language Models) [5, 2] may be used to estimate a summary of the emotional state associated with a piece of text.

This paper is available on arXiv under a CC 4.0 license.

Authors:

(1) D. Sinclair, Imense Ltd ([email protected]);

(2) W. T. Pye, Warwick University ([email protected]).

Abstract

This paper shows how LLMs (Large Language Models) [5, 2] may be used to estimate a summary of the emotional state associated with a piece of text. The summary of emotional state is a dictionary of words used to describe emotion, together with the probability of each word appearing after a prompt comprising the original text and an emotion-eliciting tail. Through emotion analysis of Amazon product reviews we demonstrate that emotion descriptors can be mapped into a PCA-type space. It was hoped that text descriptions of actions to improve the state currently described in the text could also be elicited through a tail prompt; experiments indicated that this is not straightforward to make work. This failure puts our hoped-for selection of actions, by comparing predicted emotional outcomes and choosing the best one, out of reach for the moment.


Keywords: synthetic consciousness, emotion vector, emotion dictionary, emotion probability vector

1. Introduction

Human behaviour is necessarily governed by emotion [3]. Sensed information about the world around us has to be reconciled with our internal state, and any action to be taken is chosen so as to lead to a future state that seems preferable to our current state [4], where preferable means 'my feeling is that I would like to try the new state, or the action possibly leading to a new state'. If we are hungry we will often choose to eat. If we are very hungry we will take greater risks to acquire food. If we are cold we will try to get warm, and so on. Advertising aims to convince us that a course of action will lead to more happiness. Sugary carbonated drinks do not objectively lead to long-term happiness, but the known short-term emotional response to consuming sugar is desirable. Sensed data about the world is tremendously diverse, often inaccurate and incomplete, and the required responses have varying degrees of urgency. The arbitration engine that processes these inputs needs to cope naturally with vagueness while appearing to provide certainty internally. Emotions are the term we use to describe our experience of using this apparatus to make decisions. The phrase 'computers do not have emotions' is often wrongly used to assert that interactive computer software running on a machine cannot ever exhibit or experience emotion. Large Language Models (LLMs) [5, 1, 2] offer a ready means of linking a chunk of text with an estimated emotional state, bridging the gap between the world of text and the realm of human emotion. LLMs have been used in focused sentiment analysis and are reported to perform adequately [6], but at the time of writing we are unaware of other researchers using probabilistic emotion dictionaries.


This paper explores the intersection of LLMs and emotions, demonstrating how these models can be harnessed to estimate the emotional content of a piece of text. We present a novel approach to summarizing emotional states by constructing a dictionary of emotion-related words and calculating the probabilities of these words appearing following a prompt that includes both the original text and an emotion-eliciting tail. This methodology allows us to quantitatively assess the emotional landscape of text.
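To make the procedure concrete, the sketch below shows one way such probabilities could be computed with an off-the-shelf causal language model. It is not the authors' implementation: the model name, the wording of the emotion-eliciting tail, and the tiny emotion word list are illustrative assumptions, and each word's score is approximated by the next-token probability of its first sub-token.

```python
# Minimal sketch (not the paper's code): score a small emotion dictionary
# against a text by reading next-token probabilities after text + tail prompt.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "gpt2"  # placeholder; any causal LM would do
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
model.eval()

# Tiny illustrative subset of an emotion dictionary (the paper uses 271 words).
EMOTION_WORDS = ["happy", "angry", "disappointed", "satisfied"]

def emotion_probabilities(text: str, tail: str = " Reading this makes me feel") -> dict:
    """Return P(word | text + tail) for each emotion word, renormalised over the dictionary."""
    prompt = text + tail
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0, -1]  # logits for the next token
    probs = torch.softmax(logits, dim=-1)
    scores = {}
    for word in EMOTION_WORDS:
        # Use the first sub-token of " word" as a crude proxy for the word's probability.
        token_id = tokenizer.encode(" " + word)[0]
        scores[word] = probs[token_id].item()
    total = sum(scores.values())
    return {w: p / total for w, p in scores.items()}

if __name__ == "__main__":
    review = "The charger stopped working after two days and support never replied."
    print(emotion_probabilities(review))
```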


To demonstrate our approach we choose a dictionary of 271 emotion-describing words and estimate the probability of each being associated with a selection of Amazon product reviews. Limited computational resources and time mean we are only in a position to publish a cursory study. It is likely that many emotions are correlated, and an estimate of the dimensionality of emotion space may be derivable via PCA analysis on a large sample of emotion vectors, as sketched below.
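As a rough illustration of the kind of PCA analysis mentioned above, the sketch below estimates an effective dimensionality from a matrix of emotion probability vectors. The data here is randomly generated and purely illustrative; the paper's actual vectors are not reproduced in this section.

```python
# Minimal sketch: estimate the effective dimension of "emotion space" from a
# matrix of emotion probability vectors (one row per review, one column per
# dictionary word). The random Dirichlet data stands in for real vectors.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
emotion_vectors = rng.dirichlet(np.ones(271), size=500)  # 500 reviews x 271 emotion words

pca = PCA()
pca.fit(emotion_vectors)

# How many principal components are needed to explain, say, 95% of the variance?
cumulative = np.cumsum(pca.explained_variance_ratio_)
effective_dim = int(np.searchsorted(cumulative, 0.95) + 1)
print(f"Components needed for 95% of variance: {effective_dim}")
```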


We discuss some of the limitations we encountered during our experiments and some of the obstacles to producing and regulating the behaviour of an emotion-based synthetic consciousness.


This paper is laid out as follows: Section 2 details the LLM and the hardware used to run it; Section 2.1 details our choice of words to make up the emotion dictionary; Section 2.1.1 covers estimating emotion probabilities from an LLM using a tail prompt; and Section 2.1.2 shows results on Amazon reviews. A hint at the PCA structure within emotion vectors is given in Section 3. Finally, future directions are considered and conclusions are given.