The general concept of a FeelingNode is that of an internal sensor — a FeelingNode is a mechanism that an OCP system uses to sense some aspect of itself. FeelingNodes are not intended to represent emotions, which we understand as complex distributed phenomena that cannot be encapsulated in single nodes. There are some parallels and relationships between FeelingNodes and emotions, which will arise sometimes in our discussion of FeelingNodes, but we don't intend these parallels to be taken very seriously. The study of emotions in AGI systems will be a subtle science, we believe, but one that will be difficult to seriously explore prior to the development of powerful AGI systems that can communicate fluently with humans.
Like Goal and Context Atoms, FeelingNodes are structurally just PredicateNodes. Specifically, they are a subclass of GroundedPredicateNode, grounded in a Procedure that measures either:
- some sort of global indicator of the state of the system, or
- an in-built feeling indicator, such as the Pleasure FeelingNode, which is activated via things such as getting rewarded by the teacher.
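As a concrete illustration of the above, here is a minimal Python sketch of an elementary FeelingNode wrapping a grounding procedure. All names and the dictionary-based system state are illustrative assumptions, not the actual OpenCog API:

```python
# Hypothetical sketch: a FeelingNode is a predicate grounded in a
# procedure (a FeelingSchema) that reads some indicator of system state
# and returns a value in [0, 1], which becomes the node's truth value.

class FeelingNode:
    """A grounded predicate whose truth value tracks an internal sensor."""

    def __init__(self, name, schema):
        self.name = name
        self.schema = schema      # the grounding FeelingSchema (a callable)
        self.truth_value = 0.0

    def update(self, system_state):
        # Re-evaluate the grounding procedure against current system state.
        self.truth_value = self.schema(system_state)
        return self.truth_value

# An in-built indicator: Pleasure, activated by teacher reward signals.
def pleasure_schema(state):
    # Clamp the raw reward signal into [0, 1].
    return min(1.0, max(0.0, state.get("reward_signal", 0.0)))

pleasure = FeelingNode("Pleasure", pleasure_schema)
pleasure.update({"reward_signal": 0.8})   # truth value becomes 0.8
```

The same class covers global indicators as well: a Health-style node would simply be given a schema that reads free memory or response-time statistics instead of a reward signal.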
More technically, they are PredicateNodes whose internal schema expression trees have at their leaves (along with, perhaps, other inputs) either:
- FeelingSchemata, or
- OutputValues of FeelingNodes.
A set of elementary FeelingSchemata is defined; some FeelingNodes may simply wrap up a single FeelingSchema. Others, which contain a combination of elementary FeelingSchemata, are called compound FeelingNodes.
A FeelingNode's truth value will vary over time, in accordance with the variance of the output of the elementary FeelingSchemata it's defined in terms of.
Learning new FeelingSchemata and modifying or deleting existing ones are advanced system dynamics, which may be undertaken only by advanced OCP systems that are able to learn procedures based on complex experience-based inferences.
Note that the importance of a FeelingNode bears no direct relationship to its truth value.
For instance, the Satisfaction FeelingNode (representing the degree to which the system's goals are satisfied, overall) may have a high importance but a low truth value — this might (very loosely speaking) be considered a variety of depression. Also, FeelingNodes may be embedded in Goal Atoms, with either positive or negative valence. One could have a goal of being satisfied (having a high truth value for the Satisfaction FeelingNode), a goal of being not overworked (having a low truth value for the SystemStress FeelingNode), etc.
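The positive/negative valence of feeling-based goals can be sketched as follows. The class and field names are hypothetical, chosen only to illustrate the idea of a goal being satisfied by a high or a low feeling truth value:

```python
# Hypothetical sketch: Goal Atoms that reference FeelingNodes with positive
# or negative valence. A positive-valence goal is satisfied when the
# feeling's truth value is high; a negative-valence goal when it is low.

class FeelingGoal:
    def __init__(self, feeling_name, positive=True):
        self.feeling_name = feeling_name
        self.positive = positive

    def satisfaction(self, truth_values):
        tv = truth_values[self.feeling_name]
        # Negative valence inverts the truth value: wanting the feeling low.
        return tv if self.positive else 1.0 - tv

goals = [
    FeelingGoal("Satisfaction", positive=True),    # goal: be satisfied
    FeelingGoal("SystemStress", positive=False),   # goal: not be overworked
]

# A "depression-like" state: Satisfaction truth value is low even though
# the corresponding goal may be highly important.
tvs = {"Satisfaction": 0.2, "SystemStress": 0.9}
```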
The key thing distinguishing elementary feelings from compound feelings is that elementary feelings are evaluable completely automatically, without any thought or any adaptation based on experience. From the experiencing OCP mind's point of view, these feelings simply are what they are. Only once OCP reaches the point of analyzing and modifying its source code will it have control over its elementary feelings.
However, any OCP may have control over the compound feelings it generates. This leads to some interesting philosophical and psychological issues. The notion of control becomes bound up with the Consciousness distinction and the problem of will. When we say OCP has control over the compound FeelingNodes it generates and utilizes, we mean that its mental processes may adapt these based on its experience, either in service of its goals or spontaneously. Whether it feels subjectively that it has control over these FeelingNodes and associated dynamics is a different question entirely — which gets into the complex issue of the nature of emotions in a non-evolved, nonhuman intelligence. One thing that can potentially happen is that FeelingNodes are extensively created, modified, and utilized via non-goal-oriented self-organizing dynamics, and goal-satisfaction-oriented dynamics fail to exert significant influence over them. In this case, the system could rationally come to model itself as being unable to achieve its goals because its feelings were running amok.
A Provisional List of Elementary Feelings
Here we will define a set of initial, elementary feelings for OCP. A set of initial, elementary goals will follow from these. The initial feeling set described here is the one we used for initial experimentation in prototypes, but it's very much a first draft — we envision that further work will lead to substantial modifications and additions.
TeacherSatisfaction — How happy are the OCP instance's human friends with what the system is doing?
This may be gauged in several possible ways depending on the OCP front end in use:
- explicit user responses to system outputs (e.g., the user clicking a reward button, in the OpenSim user interface)
- interpretation of the user's natural language responses ("Good boy, OCP!")
- objective measures of user response: how long the user views the results returned by an OCP-powered search engine, etc.
Gaining Understanding — How much new pattern is being formed in the system overall?
One crude way to assess this is: what is the total s*d value of the set of new atoms, normalized by the total size of the atoms in this set? (The normalization by size is only needed because of the existence of atoms containing compound schema and predicate expressions.) This is an important system goal: the system is supposed to want to learn things!
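The crude measure just described can be sketched in a few lines. The tuple representation of atoms and the size measure are illustrative assumptions; in a real system s and d would come from atom truth values and size from the expression tree:

```python
# Hypothetical sketch of the crude "Gaining Understanding" measure: the
# total s*d value of the set of new atoms, normalized by total atom size
# so that large compound schema/predicate expressions don't dominate.

def gaining_understanding(new_atoms):
    """new_atoms: iterable of (strength, confidence_d, size) tuples."""
    total_sd = sum(s * d for s, d, _ in new_atoms)
    total_size = sum(size for _, _, size in new_atoms)
    return total_sd / total_size if total_size else 0.0

# Two simple atoms (size 1) and one large compound-expression atom
# (size 10): normalization keeps the big expression in proportion.
atoms = [(0.9, 0.8, 1), (0.6, 0.5, 1), (0.7, 0.9, 10)]
```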
External Novelty — How much is the perceived world of the system changing over time?
A love for novelty encourages exploration which broadens the mind.
Internal Novelty — How much is the internal landscape changing over time, the AtomSpace itself?
Health — a combination of a number of elementary indicators to give a composite indicator of health.
These indicators may include such things as free memory and response time for certain types of queries submitted by users. One may also create individual FeelingSchemata for these elementary indicators, such as FreeSystemMemory or QueryResponseTime (in an OCP-driven query-oriented software application), etc. — a strategy which allows OCP to construct its own composite Health feelings.
Satisfaction — a compound feeling which is a combination (for example, a weighted average) of the feelings mentioned above.
This is the most critical FeelingNode, as it is the basis of OCP's motivational system. Of course, the use of the English word Satisfaction is not entirely accurate here; but we've found that any term with any evocative power also has the power to mislead, and after rejecting Happiness as being too simplistically misleading, we've settled on Satisfaction.
What makes Satisfaction OCP's primary motivator is simply the fact that the Goal Node referring to the goal of having high Satisfaction (i.e. of the FeelingNode having a high truth value) is constantly autonomically given a high Short-Term Importance. This causes a large percentage of the processes of the system to focus on the Satisfaction FeelingNode in various direct and indirect ways.
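The compound Satisfaction feeling described above, taken as a weighted average of the elementary feelings, can be sketched as follows. The weights are purely illustrative assumptions; in an actual OCP system they would be subject to experience-based adaptation:

```python
# Hypothetical sketch: Satisfaction as a weighted average of the
# elementary feelings listed in this section. Weights are illustrative.

SATISFACTION_WEIGHTS = {
    "TeacherSatisfaction": 0.35,
    "GainingUnderstanding": 0.25,
    "ExternalNovelty": 0.15,
    "InternalNovelty": 0.10,
    "Health": 0.15,
}

def satisfaction(truth_values, weights=SATISFACTION_WEIGHTS):
    # Weighted average of the elementary feelings' truth values.
    total = sum(weights.values())
    return sum(weights[name] * truth_values[name] for name in weights) / total

tvs = {
    "TeacherSatisfaction": 0.9,
    "GainingUnderstanding": 0.5,
    "ExternalNovelty": 0.3,
    "InternalNovelty": 0.4,
    "Health": 0.8,
}
```

Since Satisfaction is itself a FeelingNode, the goal of keeping this value high is what receives the constant, autonomic boost in Short-Term Importance described above.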
This initial feeling framework is intentionally very simplistic. The intention is that more complex feelings should emerge via the natural evolutionary processes of the system. Imposing a subtle system of human-like feelings on OCP is probably not desirable. Rather, we should give it a very simple framework for sensing itself, and allow higher-level emotions to emerge as is appropriate for the system given its particular goals and contexts.