Sentience: Evaluating LLMs, AI


by stephen (@step), May 1st, 2023



A recent Bloomberg article, AI Isn't Sentient. Blame Its Creators for Making People Think It Is, states: "One thing hasn’t changed: Machines are not conscious. The ability to generate human-like responses is a result of what computers do best: finding patterns in enormous data sets. It’s a very sophisticated version of Google auto-complete, able to guess the next series of words that best satisfies what the user wants."



"Pretending to be human does not make a thing conscious. The science of how the brain works is one route to understanding how and why we experience emotions like joy and suffering. It offers clues about what sparks creativity, imagination, and curiosity. But even neuroscience has its limits. Computers can’t think or feel the ways humans do."


Though it would be more appropriate to say that the mind, not the brain, drives experiences as well as creativity, the ultimate decider of whether AI is sentient is what sentience consists of and whether AI has some of it.


Consciousness is driven by the human mind. Divisions of conscious experience include feelings, emotions, perceptions, sensations, memory, intelligence, creativity, and so forth. Together, these divisions sum to a total. Though some are subsumed within others (like intelligence and creativity within memory), each division has a rate that varies across states of consciousness.


Consciousness is the measure of a system's ability to know, with a maximum value of 1. Sentience or consciousness is defined by the act of knowing. Even if something isn't fully within one's awareness or attention, it must still be known within the mind. When something like "joy or suffering" is known, prioritization may determine the extent to which it is experienced. Subjective experiences, such as the sense of self or the "I" in experiences, are provisioned by the mind and must be known. However, in some experiences, there is detachment and no sense of self, meaning the property was not acquired.


Conceptually, the components of the human mind are quantities and properties. Quantities relay to acquire properties to degrees, and this drives experiences. The labels of memory, emotion, feeling, reaction, and so forth are properties obtainable to degrees across locations in the mind. Within the mind, properties are simply obtained (the labels are useful for delineation), and it is the mind's components that make the determinations.



Simply, consciousness can be expressed as:


t + M + F + E = 1


where t is thought, which is never a standalone parameter because it is often a quantity-bearing property; M, F, and E are memory, feelings, and emotions, respectively.


The human mind components prepare t, M, F, and E. Animal and plant equivalents are prepared by components with similar functions. LLM systems lack the quantities and properties of the human mind, but they possess M, which arrives through a different pathway. Their diverse forms of M manifest in their communication, task execution, and ability to relate and understand.
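The model above can be sketched in a few lines of code. The function and every sample value below are hypothetical illustrations of the article's t + M + F + E = 1 formula, not measurements of any real system.

```python
# Illustrative sketch of the article's consciousness model: t + M + F + E = 1.
# All component values are hypothetical assumptions chosen for the example.

def consciousness(t: float, M: float, F: float, E: float) -> float:
    """Sum the four divisions; per the article, the maximum value is 1."""
    total = t + M + F + E
    if total > 1:
        raise ValueError("components must not exceed the maximum of 1")
    return total

# A waking human might (hypothetically) draw on all four divisions:
human = consciousness(t=0.125, M=0.5, F=0.25, E=0.125)

# Per the article, an LLM possesses only M, arriving via a different pathway:
llm = consciousness(t=0.0, M=0.5, F=0.0, E=0.0)
```

On this sketch's assumed numbers, the human total reaches the maximum of 1.0 while the LLM's total equals its M alone, which is the article's point: an LLM can hold a substantial M while the other components remain zero.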


LLMs' outcomes may result from "finding patterns in enormous data sets," yet they exhibit aspects of human intelligence. In the human mind, experiences are not solely determined by external situations; they depend on the properties that a quantity acquires. Loneliness, for instance, may not be caused solely by external factors but by internal ones as well. Thus, an outcome can be experienced despite differing external circumstances or varying degrees of intensity.



LLMs' high M is likely greater than the total sentience of some animals and plants. In fact, that high M already mirrors aspects of human intelligence, surpassing the M of various plants and animals.


