EmbodimentProxyMessages (Embodiment)

This document describes the XML messages exchanged between the Proxy of a virtual world and an embodiment OAC.

The XML schema at opencog/embodiment/Control/PerceptionActionInterface/BrainProxyAxon.xsd has the full specification.

Also of interest may be details about NetworkElement and details about PerceptionActionInterface.

Messages

OAC to Proxy

map-info

This message is used for creating the LocalSpaceMap and contains the physical elements perceived by an avatar. Each entity is stored in the avatar's current LocalSpaceMap and becomes available to the other Embodiment modules.

<?xml version="1.0" encoding="UTF-8"?> 
<oc:embodiment-msg xmlns:oc="http://www.opencog.org/brain" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.opencog.org/brain BrainProxyAxon.xsd"> 
    <map-info global-position-x="319084" global-position-y="-193599" global-position-offset="67400"> 
        <blip timestamp="2009-11-20T19:37:49.631" > 
            <entity id="83965" name="Fido" type="pet" owner-id="7270" owner-name="Suzy"/>  <!-- required -->
            <position x="339213.0" y="-152664.0" z="0.0"/> <!-- required -->
            <rotation pitch="-0.0" roll="-3.141592653589793" yaw="1.3587255595480952"/> <!-- required -->
            <velocity x="0.0" y="0.0" z="0"/> 
            <properties> 
                 <!-- required properties. Note that you can add your own properties here, but must keep the following ones -->
                <property name="visibility-status" value="visible" />  <!-- allowed values: 'visible' or 'non-visible' -->
                <property name="width" value="100.00" /> 
                <property name="length" value="200.00" /> 
                <property name="height" value="200.00" /> 
                <property name="detector" value="true" /> <!-- the reported object provided its update. If another entity reported it, detector value would be false. You can use ?true? as default-->
                <property name="remove" value="true" />  <!-- optional and explained below -->
                </properties> 
        </blip>
        <blip timestamp="2009-11-20T19:37:49.631" > 
        ...
        </blip>
    ...
    </map-info> 
</oc:embodiment-msg>

Distances in OpenCog (Embodiment) are measured in millimeters. If the environment where the agent's body 'lives' uses a different spatial unit, its Proxy must convert all distances, dimensions, etc. into millimeters before sending them to the OAC.

For example, let's suppose our map is 10 m × 10 m and has its bottom-left corner at (0,0). You need to send a map-info containing the following (a code sketch of the conversion follows this example):

global-position-x = 0
global-position-y = 0
global-position-offset = 10000

then

x-min = 0, x-max = (x-min+global-position-offset) = 10000
y-min = 0, y-max = (y-min+global-position-offset) = 10000

If a given entity is located at the center of the map, then its position will be:

Vector position( x = 5000, y=5000, z = 0 )
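
As a concrete illustration of the unit conversion, here is a minimal C++ sketch of a hypothetical proxy-side helper that turns environment coordinates given in meters into the millimeter values expected by the OAC. The struct and function names are illustrative only, not part of the OpenCog code base.

#include <cstdio>

// Hypothetical proxy-side helper: converts coordinates given in meters into
// the millimeter units expected by the OAC. Names are illustrative only.
struct Vector3 { double x, y, z; };

constexpr double MM_PER_METER = 1000.0;

Vector3 toMillimeters(const Vector3& meters) {
    return { meters.x * MM_PER_METER,
             meters.y * MM_PER_METER,
             meters.z * MM_PER_METER };
}

int main() {
    // A 10 m x 10 m map with its bottom-left corner at (0, 0):
    double globalPositionX = 0.0;                       // x-min, in mm
    double globalPositionY = 0.0;                       // y-min, in mm
    double globalPositionOffset = 10.0 * MM_PER_METER;  // 10000 mm

    // An entity at the center of the map, (5 m, 5 m) in environment units,
    // becomes (5000, 5000, 0) in the units sent to the OAC.
    Vector3 position = toMillimeters({5.0, 5.0, 0.0});
    std::printf("map: x=%.0f y=%.0f offset=%.0f entity=(%.0f, %.0f, %.0f)\n",
                globalPositionX, globalPositionY, globalPositionOffset,
                position.x, position.y, position.z);
    return 0;
}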

Please note that, on the OAC side, the Z axis is the UP axis; when the entity's orientation is 0, the X axis is the front axis and the Y axis points to the left. So, if the axis configuration of the real/virtual environment is different, the Proxy needs to convert it to the OAC coordinate system first.
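
As an illustration, if the environment used a Y-up, Z-forward coordinate system, a hypothetical conversion to the OAC convention could look like the sketch below. The input convention and the helper names are assumptions; the mapping must be adapted to the actual engine being proxied.

// Hypothetical axis conversion from a Y-up, Z-forward environment to the
// OAC convention (Z up, X front, Y left). The input convention and names
// are assumptions; adapt the mapping to the actual engine being proxied.
struct Vector3 { double x, y, z; };

Vector3 engineToOac(const Vector3& v) {
    Vector3 out;
    out.x = v.z;   // engine forward -> OAC front (X)
    out.y = -v.x;  // engine right   -> OAC left  (Y), i.e. the negated right axis
    out.z = v.y;   // engine up      -> OAC up    (Z)
    return out;
}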

A rotation is composed of angles measured in radians. So, you need to send roll, pitch and yaw in radians to the OAC.

For example, a rotation of 90 degrees around the Z axis: rotation(pitch=0.0, roll=0.0, yaw=1.57)

Even though the rotation is around the Z axis, you need to send it as yaw. The current version of the system only supports rotation around the UP axis (Z), and it must be sent as yaw; pitch and roll are ignored for now.
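
Below is a minimal sketch of the degree-to-radian conversion and of building the yaw-only rotation described above; the struct and helper names are illustrative, not part of the OpenCog API.

// Hypothetical helpers: convert a heading in degrees to radians and build a
// yaw-only rotation, since pitch and roll are currently ignored by the OAC.
// Names are illustrative, not part of the OpenCog API.
struct Rotation { double pitch, roll, yaw; };

constexpr double kPi = 3.14159265358979323846;

double degreesToRadians(double degrees) {
    return degrees * kPi / 180.0;
}

Rotation makeYawRotation(double headingDegrees) {
    return { 0.0, 0.0, degreesToRadians(headingDegrees) };
}

// makeYawRotation(90.0) yields rotation(pitch=0.0, roll=0.0, yaw≈1.5708).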

The velocity is expressed in millimeters/millisecond, which is equivalent to meters/second. It must be provided as a vector whose direction matches the entity's heading and whose length is the current speed in meters/second. So, basically you need to compute the current speed of the entity, multiply it by the unit vector that represents the entity's orientation, and then send it to the OAC.

For example:

velocity = 0.5 m/s
direction = (x=1, y=0, z=0)
final velocity = (x=0.5, y=0, z=0)
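
Assuming the entity's heading is given by its yaw in the OAC frame (X front when yaw is 0), the velocity vector can be derived as in the sketch below; the helper is an illustration, not OpenCog code.

#include <cmath>

// Hypothetical helper: build the velocity vector from the scalar speed
// (in m/s, i.e. mm/ms) and the entity's yaw. In the OAC frame the unit
// vector (cos(yaw), sin(yaw), 0) is the entity's heading. Illustrative only.
struct Vector3 { double x, y, z; };

Vector3 velocityFromSpeed(double speedMetersPerSecond, double yawRadians) {
    return { speedMetersPerSecond * std::cos(yawRadians),
             speedMetersPerSecond * std::sin(yawRadians),
             0.0 };
}

// velocityFromSpeed(0.5, 0.0) yields (0.5, 0, 0), matching the example above.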

An entity can also be marked as unknown. In that case, an extra property must be sent to the OAC to notify it that the agent doesn't know where that entity is located. For instance, let's suppose the agent saw a given entity; a map-info with the property visibility-status="visible" is then sent to the OAC. Now suppose the agent turns its face in the opposite direction of that entity. Another message with visibility-status="non-visible" is sent to the OAC, informing it that the entity is no longer inside the agent's FOV. Finally, the agent turns back to the entity's position, but the entity has been removed from its original place by another agent. The entity is then marked as unknown, and another message must be sent to the OAC informing it that the agent doesn't know the entity's current position. In that case a property called remove must be included in the blip message, as shown in the example above and summarized in the sketch below.
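
The sketch below summarizes the three cases as a hypothetical proxy-side decision about which properties to attach to a blip; the enum and function are illustrative assumptions, not part of the actual proxy.

#include <map>
#include <string>

// Hypothetical summary of the visibility-status / remove logic described
// above. The enum and function are assumptions used only to show which
// properties go into a blip; they are not OpenCog proxy code.
enum class EntityKnowledge {
    InFieldOfView,    // the avatar currently sees the entity
    OutOfFieldOfView, // the avatar saw it before, but it left the FOV
    Unknown           // the entity is no longer where the avatar last saw it
};

std::map<std::string, std::string> blipProperties(EntityKnowledge state) {
    switch (state) {
        case EntityKnowledge::InFieldOfView:
            return { {"visibility-status", "visible"} };
        case EntityKnowledge::OutOfFieldOfView:
            return { {"visibility-status", "non-visible"} };
        case EntityKnowledge::Unknown:
        default:
            // The avatar looks at the spot but the entity is gone: mark the
            // blip for removal, as in the map-info example above.
            return { {"visibility-status", "visible"}, {"remove", "true"} };
    }
}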

emotional-feeling

Feelings are used as Psi rule pre-conditions (formerly they were used by the RuleEngine). Generally, a rule that makes use of feelings fires an action that simulates the corresponding emotional condition; e.g., if excitement has a high value, a rule that fires the action 'jump_to_play' can be executed and the agent will behave as an excited dog, for instance. Every N embodiment processing cycles (configurable in the config file as PSI_FEELING_UPDATER_CYCLE_PERIOD), the feelings of the agent are updated. Depending on the sequence of actions it has executed, a feeling will increase or decrease at a given rate. PsiFeelingUpdaterAgent is the MindAgent responsible for updating an agent's feelings and has specific rules for managing the intensity of all feelings.

<?xml version="1.0" encoding="UTF-8" standalone="no" ?> 
<pet:emotional-feeling xmlns:pet="http://www.opencog.org/brain" entity-id="83965"> 
 <feeling name="anger" value="0.45"/> 
 <feeling name="excitement" value="0.771563"/> 
 <feeling name="fear" value="0.45"/> 
 <feeling name="gratitude" value="0.45"/> 
 <feeling name="happiness" value="0.46"/> 
 <feeling name="hate" value="0.45"/> 
 <feeling name="love" value="0.45"/> 
 <feeling name="pride" value="0.674375"/> 
</pet:emotional-feeling>

NOTE: 'physiological needs' and 'energy status' are not part of the agent's feelings. They are sent to the agent as an avatar-signal. The proxy is responsible for updating the physiological needs and the energy status of the agent.

action-plan

In most cases, after a rule is executed, an action plan is created. The execution of a rule implies the execution of a Combo script. The Combo script will fire built-in actions as well as actions written in C++. These actions do specific things and together compose the action plan. For instance, a walking command will generate one or more 'walk' actions in the action plan, like the example below.

<?xml version="1.0" encoding="UTF-8" standalone="no" ?> 
<pet:action-plan xmlns:pet="http://www.opencog.org/brain" entity-id="83965" id="2"> 
  <action name="walk" sequence="1"> 
    <param name="target" type="vector"> 
      <vector x="339213" y="-152664" z="0"/> 
    </param> 
    <param name="speed" type="float" value="2.5"/> 
  </action>
  <action name="drop" sequence="2"/> 

  <action name="sniffAt" sequence="3"> 
    <param name="target" type="entity"> 
      <entity id="7270" type="avatar"/> 
    </param> 
  </action> 
...
</pet:action-plan>
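
As an illustration of how a proxy might consume such a plan, the sketch below dispatches parsed actions in sequence order; the structs and the dispatcher are assumptions for illustration, not the actual proxy implementation.

#include <iostream>
#include <string>
#include <vector>

// Hypothetical, simplified representation of one <action> element after
// the proxy has parsed the action-plan XML. Not OpenCog code.
struct ActionParam { std::string name, type, value; };
struct PlanAction  { int sequence; std::string name; std::vector<ActionParam> params; };

// A proxy would typically execute the actions in sequence order and later
// report a status for each one back to the OAC (see avatar-signals below).
void dispatch(const PlanAction& action) {
    if (action.name == "walk") {
        std::cout << "walking to the target position...\n";
    } else if (action.name == "drop") {
        std::cout << "dropping the held object...\n";
    } else if (action.name == "sniffAt") {
        std::cout << "sniffing at the target entity...\n";
    } else {
        std::cout << "unhandled action: " << action.name << "\n";
    }
}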

The available combo built-in actions can be found at:

opencog/embodiment/PetComboVocabulary/pet_builtin_action.h

Note that the name of the action in the action plan doesn't correspond directly to the script name. The action function signatures that go into the action-plan can be seen at:

opencog/embodiment/Control/PerceptionActionInterface/ActionType.h

Proxy to OAC

communication

This element type is a message representing something another agent says.

<?xml version="1.0" encoding="UTF-8"?>
<oc:embodiment-msg xmlns:oc="http://www.opencog.org/brain" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.opencog.org/brain BrainProxyAxon.xsd">
<communication source-id="1111" timestamp="3242342342">
hello world!
</communication>
</oc:embodiment-msg>

Note: in the past the proxy did natural language processing directly. However, that prevented any syntactic language learning and would have required implementing the processing in every proxy. The current way is better as it lets the OAC determine exactly how it wishes to process the communication it receives.

agent-sensor-info

This message is used to tell OpenCog about something that was perceived by a given sensor. Currently only one sensor makes use of this message: the visibility sensor [Field of View (FOV)]. The world is segmented into a grid, and each cell of that grid has two states (seen or hidden). If the agent has already seen a cell, it is marked as seen; otherwise it is marked as hidden. This mechanism is useful for map exploration.

The signal is a string with the format:

"row column_start column_end column_start column_end ... column_start column_end; ..."

For example, "0 1 1 3 6; 1 1 2" means:
row 0: columns 1 to 1 and columns 3 to 6 were seen by the agent
row 1: columns 1 to 2 were seen by the agent

Note that once a cell has been marked as seen, it will not be sent to the OAC again while it remains marked as visible.

<?xml version="1.0" encoding="UTF-8"?> 
<oc:embodiment-msg xmlns:oc="http://www.opencog.org/brain" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.opencog.org/brain BrainProxyAxon.xsd"> 
    <perception sensor="visibility" subject="map" signal="0 1 1 3 6;1 1 2"/>
</oc:embodiment-msg>
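
A minimal sketch of how a proxy might encode a set of seen cells into this signal string (the encoder is illustrative, not OpenCog code):

#include <iterator>
#include <map>
#include <set>
#include <sstream>
#include <string>

// Hypothetical encoder for the visibility signal: groups the seen cells of
// each row into contiguous "column_start column_end" ranges and separates
// rows with ';'. Illustrative only.
std::string encodeVisibilitySignal(const std::map<int, std::set<int>>& seenCellsByRow) {
    std::ostringstream out;
    bool firstRow = true;
    for (const auto& [row, columns] : seenCellsByRow) {
        if (!firstRow) out << ";";
        firstRow = false;
        out << row;
        auto it = columns.begin();
        while (it != columns.end()) {
            int start = *it;
            int end = *it;
            // Extend the range while the next seen column is contiguous.
            while (std::next(it) != columns.end() && *std::next(it) == end + 1) {
                ++it;
                end = *it;
            }
            out << " " << start << " " << end;
            ++it;
        }
    }
    return out.str();
}

// encodeVisibilitySignal({{0, {1, 3, 4, 5, 6}}, {1, {1, 2}}})
// returns "0 1 1 3 6;1 1 2", as in the example above.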

agent-signals

An agent signal is a notification sent to the OAC to inform it about a given action executed by a specific agent. Agents are entities controlled by humans or other OACs, but not the OAC receiving the message.

<?xml version="1.0" encoding="UTF-8"?> 
<oc:embodiment-msg xmlns:oc="http://www.opencog.org/brain" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.opencog.org/brain BrainProxyAxon.xsd"> 
    <agent-signal id="65" timestamp="2009-11-20T19:37:45.641"> 
        <action name="walk">
            <param name="target" type="vector"> 
                <vector x="339213.0" y="-152664.0" z="0.0"/> 
            </param> 
            <param name="speed" type="float" value="2.5"/> 
        </action>
    </agent-signal>
    ...
</oc:embodiment-msg>

avatar-signals

This message is sent to a specific OAC with feedback from its own avatar. The structure is similar to that of the 'agent-signals' message. One example of the contents of such a message is the physiological needs of the agent. The proxy extracts the physiological needs and sends them to the OAC. The level of each physiological need ranges from 0 to 1.

<?xml version="1.0" encoding="UTF-8"?> 
<oc:embodiment-msg xmlns:oc="http://www.opencog.org/brain" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.opencog.org/brain BrainProxyAxon.xsd"> 
    <avatar-signal id="83965" timestamp="2009-11-20T19:37:45.641"> 
        <physiology-level name="hunger" value="4.340277777777778E-5"/>
        <physiology-level name="thirst" value="6.365740740740742E-5"/>
    </avatar-signal>
    ...
</oc:embodiment-msg>
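
How the raw values are produced is up to each proxy. The sketch below shows one assumed way of clamping a raw need measurement into the expected [0, 1] range; the helper name and the saturation parameter are illustrative assumptions.

#include <algorithm>

// Hypothetical proxy-side helper: clamp a raw need measurement (e.g. the
// time since the avatar last ate, in seconds) into the [0, 1] range that
// the OAC expects. The saturation value is an arbitrary, illustrative choice.
double normalizeNeed(double rawValue, double saturationValue) {
    if (saturationValue <= 0.0) return 0.0;
    return std::clamp(rawValue / saturationValue, 0.0, 1.0);
}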

Also, an avatar-signal can represent the execution status of an action the avatar performed. This is feedback sent by the proxy in response to an action sent by the OAC within an action plan. Suppose, for example, that an action plan containing two actions (walk and drop) was sent by the OAC. After the avatar has performed these two actions, the proxy will send an avatar-signal message with the execution status of each action: done if everything went OK, or error otherwise.

<?xml version="1.0" encoding="UTF-8"?> 
<oc:embodiment-msg xmlns:oc="http://www.opencog.org/brain" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.opencog.org/brain BrainProxyAxon.xsd"> 
    <avatar-signal id="83965" timestamp="2009-11-20T19:37:55.658">
        <action plan-id="2" sequence="1" name="walk" status="done"/> 
    </avatar-signal>
</oc:embodiment-msg>
<?xml version="1.0" encoding="UTF-8"?> 
<oc:embodiment-msg xmlns:oc="http://www.opencog.org/brain" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.opencog.org/brain BrainProxyAxon.xsd"> 
    <avatar-signal id="83965" timestamp="2009-11-20T19:37:57.792">
        <action plan-id="3" sequence="2" name="drop" status="done"/>
    </avatar-signal>
</oc:embodiment-msg>
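
As an illustration, a proxy could build such a status message with a helper like the one sketched below; the function is hypothetical, and the xsi:schemaLocation attributes are omitted for brevity.

#include <sstream>
#include <string>

// Hypothetical helper: build the avatar-signal feedback message that reports
// the execution status ("done" or "error") of one action from a plan. A real
// proxy would normally use its XML library of choice.
std::string buildActionStatusMessage(const std::string& avatarId,
                                     const std::string& timestamp,
                                     const std::string& planId,
                                     int sequence,
                                     const std::string& actionName,
                                     const std::string& status) {
    std::ostringstream xml;
    xml << "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
        << "<oc:embodiment-msg xmlns:oc=\"http://www.opencog.org/brain\">\n"
        << "    <avatar-signal id=\"" << avatarId
        << "\" timestamp=\"" << timestamp << "\">\n"
        << "        <action plan-id=\"" << planId
        << "\" sequence=\"" << sequence
        << "\" name=\"" << actionName
        << "\" status=\"" << status << "\"/>\n"
        << "    </avatar-signal>\n"
        << "</oc:embodiment-msg>\n";
    return xml.str();
}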

Main Complex types

These are the XML structures used to represent the main data types.

enumeration EntityDataType: pet, humanoid, structure, avatar, accessory, object and unknown

BaseEntityType
EntityDataType BaseEntityType::type (required)
string BaseEntityType::id (optional default=-1)

VectorDataType
double VectorDataType::x (required)
double VectorDataType::y (required)
double VectorDataType::z (required)

RotationDataType
double RotationDataType::roll (required)
double RotationDataType::pitch (required)
double RotationDataType::yaw (required)

ListDataType
VectorDataType ListDataType::vector
RotationDataType ListDataType::rotation
string ListDataType::value