OpenSim for OpenCog - GSoC 2008


Basic Details

Abstract

The goal of this project is to develop an interface between OpenCog and OpenSim, via AGISim, RealXTend, and/or LibSL. In the past I have worked on learning to control agents in Second Life using LibSL, and have developed agent-like objects in both Second Life and OpenSim. RealXTend provides both server-side and client-side scripting, offering a potentially easier way to implement an interface between AGISim and OpenSim using RealXTend as a proxy. In addition, RealXTend supports Python, which is one output form for Yield Prolog (a system I am integrating as an OpenSim scripting language). Such a combination would allow use of a Prolog specification to perform task/action/behavior decomposition within the proxy.

The initial plan includes:

  • Set up, compile and run the existing AGISim or other OpenCog-related sim interfaces
  • Set up, compile and run the existing RealXTend (client and server already running)
  • Catalog the functions supported by AGISim and produce a map between AGISim concepts and OpenSim concepts
  • Specify and implement a RealXTend script for each functionality requiring mapping
  • Specify and implement an AGISim agent proxy that uses RealXTend instead of the local simulation
  • Develop an environment in OpenSim and/or the RealXTend server equivalent to AGISim's
  • Verify the perceptual equivalence between the two simulations and note differences and/or possible extra information (higher/lower fidelity or resolution)
  • Write up and release

Other possible approaches include direct usage of OpenSim via LibSL or equivalents, or modification of OpenSim to make the interface easier. The inclusion of an agent like OpenCog as an NPC option is highly desired by the OpenSim community and has long-term potential for both projects. A possible side effect may be the ability to use OpenCog on all RealXTend-supported grid types (OpenSim, Second Life and RealXTend servers).

Contributors

Mentors

List of Suggestions

Work Log

Prior to 2008-05-29

  • Set up a private 3-region OpenSim mini-grid
  • Acquired SecondInventory software to import full-permission items from Second Life into the mini-grid
  • Set up a multi-building "playpen"
  • Discovered the [Eyepatch project], a trainable vision recognition framework. Acquired a desktop webcam product and verified that Eyepatch can use the virtual camera to recognize Second Life scenes.
  • Examined Triplify and its possible use with the databases supporting OpenSim. Triplify provides an RDF view of SQL databases, and thus provides a standard view of the underlying model. However, it does require an ontology to be defined somewhere for the relations output.
  • Examined RESTBot - LibSecondLife accessed via a REST interface
  • Downloaded, compiled and ran TestClient for LibSL
  • Started testing a socket-accessible version of TestClient: a TestClient command line is submitted over the socket and an XML-ish response comes back. Requires providing the XML-ish definitions for all the commands (see the sketch after this list).
  • Examined Player/Stage/Gazebo as a standard robotics interface and simulator. Considered using OpenSim as a Stage/Gazebo replacement.
  • Asked the OpenSim and RealXTend communities about the feasibility of implementing a servo model for animation. Such a mode of operation would be possible with OpenSim/RealXtend but would be cost-prohibitive for Second Life. Learned that for basic physics the avatar is modeled as an egg-ish ellipsoid, and can be thought of as a polar-coordinate robot which can play animations.
  • Submitted the Yield Prolog patch to OpenSim. http://opensimulator.org/mantis/view.php?id=1314
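
As a rough illustration, here is a minimal sketch of such a socket wrapper. The processCommand delegate stands in for the TestClient command dispatcher, and the <response> wrapping is an assumed XML-ish format, not the actual protocol:

using System;
using System.IO;
using System.Net;
using System.Net.Sockets;

class CommandSocketServer
{
    // Listens on a local port, reads one TestClient command per line,
    // and returns an XML-ish wrapped result.
    public static void Serve(int port, Func<string, string> processCommand)
    {
        TcpListener listener = new TcpListener(IPAddress.Loopback, port);
        listener.Start();
        while (true)
        {
            using (TcpClient client = listener.AcceptTcpClient())
            {
                NetworkStream stream = client.GetStream();
                StreamReader reader = new StreamReader(stream);
                StreamWriter writer = new StreamWriter(stream);
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    // Run the command and wrap the result XML-ishly.
                    string result = processCommand(line);
                    writer.WriteLine("<response cmd=\"" + line + "\">" + result + "</response>");
                    writer.Flush();
                }
            }
        }
    }
}
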
TODO: 2008-05-29
  • Set up BZR/SVN access (currently using SVN to access OpenSim)
  • Get OpenCog related code working locally
  • Learn everything OpenCog ...
  • Acquire any and all previous work related to OpenCog and sims.
  • Complete the Avataron socket TestClient; complete the XML-ish definitions
  • Examine dynamically creating animations.
  • Examine creating a Player-based interface through TestClient.
  • Examine how to get video or visual classifications back in the different frameworks.
Comments: 2008-05-29

Currently looking at LibSL. Most open Second Life projects use it, and it defines the protocol and capabilities of the avatars. TestClient provides basic functions and examples of the possible avatar actions and how they are implemented. At a minimum it will provide a base set of functions. Now the question is how to map the Second Life avatar functions into something OpenCog can understand and use.

Week ending 2008-06-07

  • Updated the Yield Prolog patch for OpenSim to match SourceForge version 669
  • Tested in-prim Prolog code. It can do basic operations and search. One possibility is "given a goal of showing X, which set of animation modules will achieve it", where X is some final position or communication. Developing a basic library to interface the Prolog system with the LSL/OS universe.
  • Imported some animations and learned about their execution. Animations have a priority associated with them, with joint control given to the animation with the highest priority. Thus multiple animations can be blended together.
  • Implemented basic vector following in TestClient (see the sketch after this list). It does not do collision avoidance, but does a simple turn-towards and advance, using fly, run or walk based on the distance. Maybe a potential-fields system later. Or simply wait until OpenCog is ready to drive.
  • Released a basic script for an in-world terminal to a command shell. Should be useful to interact with TestClient and the shell. More complex shells are possible, including VNC connections to a VM-based computer. Virtual world interfaces to virtual computers ...
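
For illustration, a minimal sketch of the turn-towards-and-advance step. The distance thresholds are invented, and the agent calls follow OpenMetaverse naming conventions but are assumptions rather than the actual TestClient code:

using OpenMetaverse;

class VectorFollower
{
    const float FlyDistance = 30.0f;   // beyond this, fly toward the target
    const float RunDistance = 10.0f;   // beyond this, run; otherwise walk
    const float StopDistance = 2.0f;   // close enough; stop advancing

    // One control step: face the target and advance, picking a gait by distance.
    public static void StepToward(GridClient client, Vector3 target)
    {
        float distance = Vector3.Distance(client.Self.SimPosition, target);
        if (distance < StopDistance) return;

        client.Self.Movement.TurnToward(target);          // simple turn-towards
        client.Self.Fly(distance > FlyDistance);          // fly when far away
        client.Self.Movement.AlwaysRun = distance > RunDistance;
        client.Self.Movement.AtPos = true;                // hold "move forward"
        client.Self.Movement.SendUpdate();
    }
}

Calling StepToward on a timer would give the simple pursuit behavior described, with no collision avoidance.
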
TODO: 2008-06-07
  • Get new processor, install OS, install the SL client and OpenCog
  • Learn more about BVH and start experimentation
  • Ultra-intelligent pose balls for animation???
Comments: 2008-06-07

The theme this period was building infrastructure. Over the next week I should receive a new quad-core machine to run both the client and OpenCog on, so I am postponing getting OpenCog running until then; that machine should be its home.

Week ending 2008-06-13

  • Built OpenCog using VirtualBox. Should port easily to the new machine.
  • All parts of the new system are in. Requires building.
  • Added socket communications to YP, and made the first OpenCyc query from OpenSim prims.
  • Some change in OpenSim seemed to increase the performance requirements. Hopefully this is only temporary. One option is to host additional regions on other processors. Also, some of the projects pointed to by OpenCog are relevant to OpenSim applications (n2n, memcached db ...)
TODO: 2008-06-13
  • Construct and test new user machine (OpenCog's home)
  • Port collected materials over
  • Midweek I will be visiting Semantic Web Austin, Cycorp and other Austin-area contacts.
  • Collect existing code (just added #opencog to my IRC client)
Comments: 2008-06-13

The theme of the week has been collection and prep for the new processor. I should spend the weekend moving things over. The big time sink was a performance issue with OpenSim which took two days to work around.

Period 2008-06-13 to 2008-07-05

  • Experienced and corrected a denial-of-service attack
  • Experienced and corrected a primary OS disk crash
  • Moved data and environment to the new processor
  • Stabilized operation of the new processor (intermittent crashes)
  • Examined CUDA for parallel processing
  • Acquired the existing specification for the Petaverse interface
  • Began breaking the Petaverse spec into a protocol matching OpenSim requirements and capabilities

Comments: 2008-06-13 to 2008-07-05

Recovering from the DDoS attack and hard disk crash took far longer than expected. However, the end result is that all processes have been moved to the new processor. The new processor appears to handle the load well, loading each process onto a different core. Where the previous system was thrashing, the new one rarely breaks 30% utilization. This is partially due to the improved graphics processor, which also supports CUDA parallel processing (a C-programmed SIMD model). I am looking at a spreading-activation / PageRank implementation in CUDA for another project. Intermittent crashes affected usability until a means of forcing the GPU cooling fan to 100% was found; once done, the system became very stable.

Currently running an instance of ResearchCyc on the system, and examining how OpenSim information would be ontologized. An example would be wearing clothing.

  • (wearsClothing OBJ ITEM) means that OBJ wears ITEM.
  • (wornOn ITEM BODYPART) means that ITEM is being worn on the body part BODYPART.
  • (wearer WEARING OBJ) means that OBJ is the wearer during WEARING, where WEARING is an event of type WearingSomething.
  • (itemWorn WEARING ITEM) means that ITEM is worn during WEARING.
  • SomethingToWear is a collection of things that can be worn including clothing.

So one representation would be:

(ThereExists ?AWearing 
 (and (isa ?AWearing WearingSomething)
      (isa MyBot OpenSimBot)
      (wearer ?AWearing MyBot)
      (itemWorn ?AWearing MyShirt)
      (isa MyShirt SomethingToWear)))
(wearsClothing MyBot MyShirt)
(wornOn MyShirt MyBot-Torso)

Or something like that, assuming the other elements like MyBot, OpenSimBot, MyShirt, etc. are properly entered. At least enough information should be transferred to allow a controlling bot to generate something similar. Enough exists to generate a logical form like this as a report; it just requires proper encoding.

MidTerm: 2008-07-14

System active and ready to learn. Postcard from the Sim ... a bumpy ride, but all survive. And all those fuzzy green obstacles to avoid.

[Midterm status of OpenSim for OpenCog (pdf)]

Work: 2008-07-14 to 2008-07-24

Processing Begins

Given the Petaverse examples, I modified the Avataron bot client to start accepting external commands given via XML. The commands are stored in a web-accessible file, and the Avataron is given the URL. The one problem is matching the nested XML format with the TestClient flat-text format. Using the C# System.Xml library I was able to parse out the information and construct the final command at the close of each 'action' tag (see the sketch below). Thus for each action-plan the system executes each sub-action sequentially. This works for many situations, except those actions of long duration, like 'follow <avatar>'. Also, the way follow is executed in the example client results in eventual stack overflow due to recursion (though it should be tail-optimized...). So to operate properly the system requires a multi-threaded queue system, with the ability to do more complex processing beyond immediate execution.
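
A minimal sketch of that parse, assuming an action-plan shape like <action name="say">Hello</action>; the real Petaverse element names are not shown here, so these are placeholders:

using System.Collections.Generic;
using System.Xml;

class ActionPlanReader
{
    // Reads an action-plan XML file and emits one flat TestClient-style
    // command per closed 'action' element.
    public static List<string> FlattenActions(string url)
    {
        List<string> commands = new List<string>();
        string name = null;
        List<string> args = new List<string>();
        using (XmlReader reader = XmlReader.Create(url))
        {
            while (reader.Read())
            {
                if (reader.NodeType == XmlNodeType.Element && reader.Name == "action")
                {
                    name = reader.GetAttribute("name");
                    args.Clear();
                }
                else if (reader.NodeType == XmlNodeType.Text)
                {
                    args.Add(reader.Value.Trim());
                }
                else if (reader.NodeType == XmlNodeType.EndElement && reader.Name == "action")
                {
                    // Close of the 'action' tag: construct the final flat command.
                    commands.Add(name + " " + string.Join(" ", args.ToArray()));
                }
            }
        }
        return commands;
    }
}
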

The current idea under consideration is to use a multi-threaded task processor with an embedded interpreter. The actions get translated into an intermediate task language and put on the processing queue. As they are completed they either just perform their internal operation/update and quit, or they write their messages back onto the message queue to inform the client AI program.

This leads to two steps:

  • Define a task manager
  • Define an intermediate language

After a search I found DotLisp, a smallish C# Lisp/Scheme interpreter that reuses existing C# types and connects with code without an FFI. Given a definition of a task in Lisp, the task manager evaluates each task, calling the required bot-control code, returning the proper response and dequeuing when done. Not only does it allow direct commands to be implemented, it also allows the system to have a predefined library. The tasks submitted could also be more complex.

One minor area that came to mind was translating between XML and Lisp, and a very simple format presents itself:

<op name="opcode"> {inner code} </op>  -->  (opcode {inner code})
<arg> {inner code} </arg>  -->  {inner code}

So

(def (superset? s1 s2) (subset? s2 s1))

in XML would be represented as:

<op name="def">
 <op name="superset?"> <arg>s1</arg> <arg>s2</arg> </op>
 <op name="subset?"> <arg>s2</arg> <arg>s1</arg> </op>
</op>

Or

<arg> (def (superset? s1 s2) (subset? s2 s1)) </arg>

This would be fairly easy to transform the Petaverse-based input into, or to allow other systems to write directly. Actions can be defined as functions, and these functions can be updated over time. Once the s-expression is handled by the reader, the evaluatable object is added to the task queue. After evaluating each task, the task manager checks whether the task signaled that it needs to be reinserted in the queue. A sketch of the transform follows.
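
Here is a rough sketch of that transform using System.Xml.Linq; it implements only the two rules above and is not the actual Cogbot translation code:

using System.Text;
using System.Xml.Linq;

class Xml2Lisp
{
    // Applies the two rewrite rules above: <op name="f">...</op> -> (f ...)
    // and <arg>...</arg> -> the inner code unchanged.
    public static string Translate(XNode node)
    {
        XElement e = node as XElement;
        if (e == null)
            return node.ToString().Trim();     // raw text passes through

        StringBuilder sb = new StringBuilder();
        if (e.Name == "op")
        {
            sb.Append("(").Append((string)e.Attribute("name"));
            foreach (XNode child in e.Nodes())
                sb.Append(" ").Append(Translate(child));
            sb.Append(")");
        }
        else // <arg>: just emit the inner code
        {
            foreach (XNode child in e.Nodes())
                sb.Append(Translate(child));
        }
        return sb.ToString();
    }
}

Translate(XElement.Load(url)) would then yield the s-expression string handed to the reader.
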

Basic Integration

Connecting DotLisp to the client went fairly well, requiring only a few mods to help the interpreter disambiguate the preferred classes to use for references to IEnumerable and IEnumerator and the like.

The current client object is passed into the system allowing all of the existing methods to be called.

interpreter.Intern("thisClient", this);

To have the bot say something in-world would then be:

(thisClient.EvaluateCommand "Say Hello World with a L I S P ...")

Or from a file:

<?xml version="1.0" encoding="utf-8" ?>
<op name="thisClient.EvaluateCommand">
 "Say Hello with a L I S P ..."
</op>

Created a simple class called subtask, an instance of which is passed in:

subtask thisTask = new subtask();
thisTask.continueTask = false;
interpreter.Intern("thisTask", thisTask);

so by default the tasks are one-shot. However, the code can manipulate the flag and any other property of thisTask, and thus signal that it wants to be repeated.

<?xml version="1.0" encoding="utf-8" ?>
<op name="block">
  <op name="thisClient.EvaluateCommand">
    "Say Hello with a L I S P ..."
  </op>
  <op name="set">
    <arg name="1">thisTask.continueTask</arg>
    <arg name="2">true</arg>
  </op>
</op>

If the actual code is part of the subtask record, then the system can manipulate what is executed from timeslice to timeslice. And if a handle to the task queue is passed in, it could create new entries and effectively "spawn" subtasks. When done it should make a nice nano-OS pattern...

SubTask Manager

The subtask object currently contains the original lisp code string, the codeTree from converting the code into an evaluatable object, the results of evaluation, and the requeue flag. The manager implements simple round-robin processing and sleeps between each task evaluation. It reuses a common task interpreter that is preloaded with the library of support functions. After each evaluation the system checks whether the code string has changed; if it has, and the task is marked for requeuing, it "recompiles" it. A sketch of the record and loop follows.
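
A hedged sketch of the record and loop just described; the ILispInterpreter interface stands in for the shared DotLisp interpreter, whose actual reader/eval API is assumed here:

using System.Collections.Generic;
using System.Threading;

// Stand-in for the shared DotLisp interpreter; the real API is assumed.
interface ILispInterpreter
{
    object Read(string sourceName, string code);   // "compile" to a code tree
    object Eval(object codeTree);
}

class Subtask
{
    public string code;         // original lisp source string
    public object codeTree;     // evaluatable object from the reader
    public object results;      // result of the last evaluation
    public bool continueTask;   // requeue flag; false = one-shot
    public string lastCode;     // code as of the last "compile"
}

class SubtaskManager
{
    readonly Queue<Subtask> queue = new Queue<Subtask>();
    readonly ILispInterpreter interpreter;   // shared, preloaded with the library

    public SubtaskManager(ILispInterpreter shared) { interpreter = shared; }

    public void Enqueue(Subtask t) { lock (queue) queue.Enqueue(t); }

    public void Run()
    {
        while (true)
        {
            Subtask t = null;
            lock (queue) { if (queue.Count > 0) t = queue.Dequeue(); }
            if (t != null)
            {
                if (t.codeTree == null || t.code != t.lastCode)
                {
                    t.codeTree = interpreter.Read("task", t.code);  // "recompile"
                    t.lastCode = t.code;
                }
                t.results = interpreter.Eval(t.codeTree);
                if (t.continueTask) Enqueue(t);  // round robin: back of the queue
            }
            Thread.Sleep(100);   // sleep between task evaluations
        }
    }
}
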

A few questions come to mind:

  • Will state persistence be important? One simple solution is to just save the interpreter for each task, but loading the library adds time, and each interpreter takes space. The other option is to add a variable space to thisTask which would be accessible and non-volatile; that may be enough for this lightweight work.
  • Adding minimum sleep interval and priority
  • Maybe using threads directly
Current Status
  • The system can accept a URL to an XML-encoded Lisp file and execute the operation. It can perform the endless repeating-message test, where it says something in-world and resubmits the task to the queue. Also provided the "Client" object to the interpreter so it can access all the avatar-driving functions.
TODO
  • Develop better sensing in the TestClient, and make it available to the task manager. One option is having a scanner process submit tracker tasks for each new object in a scene. Each tracker task's job would be to keep the client AI informed. (A sketch follows.)
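
One way the scanner/tracker pattern might look, reusing the Subtask and SubtaskManager shapes sketched earlier; the OnNewPrim signature and the tracker's lisp fragment are illustrative assumptions:

using OpenMetaverse;

class SceneScanner
{
    // Hook the new-prim event and enqueue a repeating tracker task for each
    // object that appears in the scene.
    public static void Hook(GridClient client, SubtaskManager manager)
    {
        client.Objects.OnNewPrim +=
            delegate(Simulator sim, Primitive prim, ulong regionHandle, ushort timeDilation)
            {
                Subtask tracker = new Subtask();
                // Each tick, report this object to the client AI.
                tracker.code = "(thisClient.msgClient \"(prim " + prim.LocalID + ")\")";
                tracker.continueTask = true;   // stays in the round-robin queue
                manager.Enqueue(tracker);
            };
    }
}
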

Work: 2008-07-25 to 2008-08-07

  • Moved away from TestClient to using TextSL, a Second Life client for blind users. It translates the visual scene information into a Zork-like interaction. The key is that it handles event notification from the sim.
  • Key problem: TextSL needed to be updated to the latest libsecondlife interface, OpenMetaverse. So after porting the Lisp/task processor over, I spent time updating TextSL to use the new OpenMetaverse.
  • Problem: OpenSim did not support the current OpenMetaverse LLSD-XML login method. I wrote a patch to OpenSim to support the new login method, which should help all libsecondlife-based bots connect to the sims in the future.
  • Problem: OpenMetaverse had recent changes that broke interaction with the inventory system. The other functions operate (follow avatar, describe object, etc.), but operations that use the inventory are currently broken. This applies to ALL bots using the latest OpenMetaverse code (or so it seems). Once this is fixed, things like animations will work.
  • Verified that the nVidia card is not the problem, but rather the interaction between Vista 64 and the various Second Life clients. The current plan is to resurrect the previous computer as primarily a viewer/client and have the current system be the server (the exact opposite of the original configuration). OpenSim / libsecondlife appear to like being on separate systems. TextSL started showing inventory when moved to the laptop, even though the server system never exceeded 10% utilization. Currently the packet manager is being modified in OpenSim, and this may improve things.
  • On the suggestion of moving the system to RealXtend: the overall framework of a task manager with interpreter could be implemented in, say, client-side Python. However, the current system is in C#, and porting it below the client level on RealXtend may invalidate my ability to work on the OpenSim codebase (different licenses). It may be possible to make it modular enough so it can be "just connected" without seeing anything of import.
TODO
  • Update OpenSim and the TextSL-based client, and work on getting the inventory code to work. Basically similar to what I did for the LLSD login code.
  • Work on having the TextSL objects properly exposed in the LISP engine.

Work: 2008-08-08 to 2008-08-22

  • LibOpenMetaverse changed the inventory system yet again. Updated the client to match both TextSL and libomv, then modified the system to work with the revised inventory. This system seems to work (it finds all the inventory).
  • Providing final cleanup and documentation on the system.
  • One service for now. Configured using the botconfig.xml file:
  <tcpPort>5555</tcpPort>
  <tcpIPAddress>127.0.0.1</tcpIPAddress>
  <firstName>My</firstName>
  <lastName>Bot</lastName>
  <password>MyBotPassword</password>
  <simURL>http://localhost:8002/</simURL>

You would make the appropriate changes for your bot. simURL is the sim the system will log in to, and the TCP port and address define where the system listens for commands.


CONFIGURATION

To configure the system you will need to provide or modify the botconfig.xml file. In particular you will want to have an avatar already defined on the sim you will be using. The "tcp" parameters define the IP address and port of the Cogbot service. The other parameters define the login for the virtual viewer used by the system to connect to the avatar in the sim. You can find the botconfig.xml file in one of the following locations,

.\cogbot\bin\Debug\botconfig.xml
.\cogbot\bin\Release\botconfig.xml
.\cogbot\bin\botconfig.xml

depending on the build or run options you use.

The key parameters to change are

 <tcpPort>5555</tcpPort>
 <tcpIPAddress>127.0.0.1</tcpIPAddress>
 
 <firstName>My</firstName>
 <lastName>Bot</lastName>
 <password>MyBotPassword</password>
 <simURL>http://localhost:8002/</simURL>
 
 <startupLisp>(thisClient.ExecuteCommand "login")</startupLisp>

Once started you can fill in the login parameters using the login form. Any changes there will be saved to the config file (which will be created if it does not already exist).

New: <startupLisp> allows an initial Lisp expression to be evaluated after the task manager starts executing and has loaded its boot files.


MANUAL OPERATION

Since Cogbot is based on TextSL, you can drive things manually. You can start the system by executing:

\cogbot\bin\cogbot.exe
which uses \cogbot\bin\botconfig.xml

The menu system is fairly simple for now

File -> Exit
Client->Login
Client->Logout

Under "Client->Login" is a form for modifying the username and password. Information on the status of commands and messages are provided in the central text box. User input is in the white text entry box and executed by pressing the submit button.

An example of the report of logging in :

About to initialize port.
Listening for a connection... port=5555
LoginForm Start Attempt 0
TextForm Network_OnLogin : [ConnectingToSim] Connecting to simulator...
LoginForm NOT logged in for reason:0 Timed out
TextForm Network_OnLogin : [ConnectingToSim] Connecting to simulator...
You see 0 people.
TextForm client_OnLogMessage: Info <Kotoko Irata>: Connecting to (127.0.0.1:9011)
TextForm Objects_OnNewAvatar: 
TextForm Objects_OnNewAvatar: 
TextForm client_OnLogMessage: Info <Kotoko Irata>: Received a region handshake for Citadel (127.0.0.1:9011)
TextForm Network_OnSimConnected: Citadel (127.0.0.1:9011)
TextForm Network_OnLogin : [Success] Welcome to OGS
TextForm Objects_OnNewAvatar: 
TextForm Avatars_OnLookAt: 7dbd61c6-90cf-49df-bf77-94f5a7223c19 to 7dbd61c6-90cf-49df-bf77-94f5a7223c19 at 7dbd61c6-90cf-49df-bf77-94f5a7223c19 with type FreeLook duration 2
You see the objects 1: DIR, 2: Deck, 3: Desk, 4: StucoBeachHouse, 5: WallSectionSolid, 6: FW_Steps, 7: Landscape, 8: HellBox, 9: Blue, 10: marble, 11: six, 12: one, 13: DaxSymbol, 14: 2_Walls, 15: ML866, and 16: Window02_Tall,4-Pane.
TextForm Avatars_OnLookAt: 7dbd61c6-90cf-49df-bf77-94f5a7223c19 to 7dbd61c6-90cf-49df-bf77-94f5a7223c19 at 7dbd61c6-90cf-49df-bf77-94f5a7223c19 with type   FreeLook duration 2
Logged in successfully.

The results of typing help:

-----------------------------------------------
login: Login to Secondlife
logout: Logout from Secondlife
stop: Cancels a particular action
teleport: Teleport to a location.
describe: Describe location, people, objects, or buildings.
say: Say a message for everyone to hear.
whisper: Whisper a message to a user.
help: Print this help message.
sit: Sit on the ground or on an object.
stand: Stand up.
jump: Jump.
crouch: Crouch.
mute: Toggle Mute or unmute a user
move: Move to a person or object, or in a direction.
use: Use an item from inventory.
fly: You start flying.
stop-flying: You stop flying.
where: Finds out in which direction an object or a building or a person is.
locate: Gives the coordinates of where you are.
follow: Start or stop following a user.
stop following: Start or stop following a user.
stop-following: Start or stop following a user.
tutorial1: Teaches you how to navigate using basic commands move, sit, stand
--------------------------------------------------
describe
------------------
You are in Citadel.
You see 2 people.
You see the objects 1: CEMA, 2: Sit, 3: Clever, 4: 5_Flooring, 5: Boardman Bedroom, 6: Wood, 7: Keyboard, 8: Medical, 9: House03_PostHickory, 10: Low, 11: CLEAR, 12: marble end, 13: Banana, 14: Banana Plant, 15: Clay, and 16: Imperial.
You see 2 buildings.
You see 2 people.
------------------
describe people
------------------
You see one person: 1: Daxxon Kinoc.
------------------
describe Daxxon
------------------
Daxxon Kinoc is standing in Citadel.
Daxxon Kinoc is 2.112267 distant.
------------------
describe Clever
------------------
Clever Zebra Small Office (left): http://www.cleverzebra.com/
This object is for sale for L10
------------------
----------------------------------------------------------

and so on. To get the inventory you can issue "describe inventory". The system also accepts

 "use <inventory-item-name> to wear"
 "use <inventory-item-name> to animation-start"
 "use <inventory-item-name> to animation-stop"

The system receives events from the sim, such as what is being looked at.

TextForm Avatars_OnLookAt: 7dbd61c6-90cf-49df-bf77-94f5a7223c19 
             to da717612-e98f-469b-b6c3-f9145ca84e64 
             at da717612-e98f-469b-b6c3-f9145ca84e64 
             with type Focus duration 1.701412E+38
 (TARGET IS SELF)

meaning that the user is looking at the bot.


External Access

The whole point is to provide something that an AI running as an external process can use. To do this, the system accepts commands via the socket it is listening on.

For testing, we set up a file on a server containing the commands for the Lisp interpreter, XML-encoded.

The construct

(thisClient.ExecuteCommand "<command-string>")

will execute any command-string you could execute manually from the client.

Another useful fragment to know is:

(set thisTask.requeue (to-bool false))

which means not to requeue this code fragment for later execution.

If you did want this fragment to be constantly requeued you would use

(set thisTask.requeue (to-bool true))

So the DotLisp equivalent of "Hello World" would be:

(block
 (thisClient.ExecuteCommand "say Hello World with a L I S P 2 ...")
 (thisClient.msgClient "(knows world (exists me))" )
 (set thisTask.requeue (to-bool false))
)

We then translate this into XML and put it in a URL-accessible file.

--------------------------------------------------
testlisp2.xlsp
--------------------------------------------------
<?xml version="1.0" encoding="utf-8" ?>
<op name="block">
  <op name="thisClient.ExecuteCommand">
  "say Hello With a L I S P 2..."
</op>
  <op name="thisClient.msgClient">
   "(knows world (exists me))"
 </op>
<op name="set">
 <arg name="1">thisTask.requeue</arg> 
 <arg name="2">(to-bool false)</arg> 
 </op>
</op>
--------------------------------------------------


Using PuTTY, we connect via a raw socket to the Cogbot server (in our case localhost:5555).

Assuming we have a command stored as an XML file on a server, we can simply type in the URL:

http://pandor6/temp/testlisp2.xlsp

The system will return

'(enqueued)(knows world (exists me))

then execute the command, and the bot will say "Hello With a L I S P 2..." in-world, as the log below shows. (A minimal programmatic client is sketched after the log.)

----------------------------
SockClient:http://pandor6/temp/testlisp2.xlsp
EvaluateXmlCommand :http://pandor6/temp/testlisp2.xlsp
XML2Lisp =>'(block(thisClient.ExecuteCommand 
  "say Hello With a L I S P 2..."
 )(thisClient.msgClient 
   "(knows world (exists me))"
  ) )'
taskTick Results>nil
taskTick continueTask=False
Kotoko Irata says, "Hello With a L I S P 2...".
------------------------------
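
For an external AI that would rather not use PuTTY, a minimal programmatic client might look like the following sketch. Only the host, port and file URL come from the examples above; the rest is illustrative:

using System;
using System.IO;
using System.Net.Sockets;

class CogbotDriver
{
    static void Main()
    {
        // Connect to the Cogbot command socket configured in botconfig.xml.
        using (TcpClient sock = new TcpClient("127.0.0.1", 5555))
        {
            StreamWriter w = new StreamWriter(sock.GetStream());
            StreamReader r = new StreamReader(sock.GetStream());
            w.WriteLine("http://pandor6/temp/testlisp2.xlsp");  // enqueue the task
            w.Flush();
            string reply;
            while ((reply = r.ReadLine()) != null)
                Console.WriteLine(reply);  // e.g. '(enqueued)(knows world (exists me))
        }
    }
}
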

To find out more about the Lisp system, see \cogbot\dotlisp\dotlisp.html. The system should automatically load:

\cogbot\bin\boot.lisp
\cogbot\bin\extra.lisp
TODO: 2008-08-24

At this time all the subprojects "aligned" to the point of allowing Cogbot to function. OpenSim and OpenMetaverse are currently moving targets, and OpenCog is being written. So there are many things needed besides just keeping up with new releases.

  • (DONE) Provide feedback to the client. This is as simple as adding a "msgClient" method to the "thisClient" object.
  • (INWORK) Patch the event hooks through using the "msgClient" function. Each event would simply post a lisp code fragment to the task queue.

Working on the "heard" listener first by adding to Chat.cs:

   parent.enqueueLispTask("(thisClient.msgClient \"(heard (" + fromName + ") '" + message + "' )\" )");
   

When I say 'hi there' to the bot in-world, Cogbot returns to the TCP client:

(heard (Daxxon Kinoc) 'hi there' )

This could of course be changed into calls like "(on_chat (fromName) message)", where "on_chat" is a Lisp function that could be redefined by the AI using Cogbot. Such a function could also translate into something like XML, or use system-provided methods to do so.

In general: define all the events that occur, map them to function calls, then have a file for each type of message that each type of client AI expects. So OpenCog would have one, OpenCyc its own, Soar its own, etc.

  • (SEMI-DONE) Set the system up for auto-login. The command line does have login and logout, but setting the other parameters would be nice. Also, the sim may report that the system is already logged on, which may require a second attempt.
  • (DONE) Lisp initialization string in the config file, to be executed on start up.

Adding

<startupLisp>(thisClient.ExecuteCommand "login")</startupLisp>

to the config file causes the system to automatically log in as its first action after booting. This addresses the previous TODO item but requires more Lisp.

Something like:

<startupLisp>
(block
 (set thisClient.config.simURL "http://myothersim.org:8002/")
 (set thisClient.config.firstName "EvilTwin")
 (thisClient.ExecuteCommand "login")
 )
</startupLisp>

or load additional config or operational files.

  • Multiple bots. Currently the system provides single-bot access, with each socket serving one bot. Being able to run multiple bots would be nice.
  • Time-based requeuing. The system can requeue a task, but scheduling is a simple round robin; timing is not controlled on an individual-task basis.
  • Document the objects the Lisp system has access to. These are basically the same as the client object in Cogbot, since "lisp thisClient" == "cogbot client" in the code. However, the method set is still evolving.
  • More graceful socket shutdown.
  • RealXtend functions. The system uses OpenMetaverse and thus should work with the systems it works with. However, some RealXtend functions may go beyond the set supported by Second Life/OpenSim. This is more of a wish for an OpenMetaverse extension.
  • Intelligently support Lisp over the socket. Ideally: Lisp, XML-encoded Lisp, and the current pointer to XML files.
  • Port DotLisp to OpenSim. I already ported Yield Prolog, and DotLisp is "safely dead", meaning it works but is not being rapidly extended (like everything else is). So it would provide an AI-ish scripting language which can access sim methods, being simple, dynamic and complete. It might provide a method to let AIs script objects...
  • Connecting up OpenCog. In general, create the set of "mapping files" in Lisp described above for any set of external AI programs.
WORK: 2008-08-27
Cogbot.lisp and events

The simulator sends update messages to clients like TextSL or the Second Life viewer. Cogbot hooks reception of these messages and updates the appropriate structures. The system can also add Lisp code fragments to the queue for processing. The definitions of these functions/methods are in the cogbot.lisp file. The initial version simply maps the parameters into an appropriate message sent to the client AI.

(on-chat agent message)
(on-instantmessage agent message)
(on-avatar-dist agent dist)
(on-avatar-pos agent vector)
(on-avatar-posture agent sitstand)
(on-meanCollision perp victim)
(on-prim-description  obj description)

Additional sensory inputs simply require adding more hooks and defining the functions; a sketch follows. Note that reflexes, as well as filtering, could be implemented at this level.
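
For example, a hypothetical extra hook for mean collisions, mirroring the Chat.cs pattern above; the OnMeanCollision callback signature follows OpenMetaverse conventions but is an assumption here, as is the exact lisp fragment:

// Inside the client setup code, alongside the "heard" hook in Chat.cs.
client.Self.OnMeanCollision +=
    delegate(MeanCollisionType type, UUID perp, UUID victim,
             float magnitude, DateTime time)
    {
        // Post a lisp fragment to the task queue; cogbot.lisp's
        // on-meanCollision then forwards it to the client AI socket.
        parent.enqueueLispTask(
            "(on-meanCollision (@\"" + perp + "\") (@\"" + victim + "\"))");
    };
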

Event log of Cogbot seeing "Daxxon Kinoc" logging in:

TextForm Objects_OnNewAvatar: 
taskcode =(on-avatar-dist (@"Daxxon Kinoc") 2.217624 )
taskTick Results>nil
taskTick continueTask=False
taskcode =(on-avatar-pos (@"Daxxon Kinoc") (@"<127.2048, 129.4689, 21.47487>") )
taskTick Results>nil
taskTick continueTask=False
taskcode =(on-avatar-posture (@"Daxxon Kinoc") (@"standing") )
taskTick Results>nil
taskTick continueTask=False
TextForm Avatars_OnLookAt: 7dbd61c6-90cf-49df-bf77-94f5a7223c19 to 7dbd61c6-90cf-49df-bf77-94f5a7223c19 at 7dbd61c6-90cf-49df-bf77-94f5a7223c19 with type FreeLook duration 2
Daxxon Kinoc says, "How are you ?".
taskcode =(on-chat (@"Daxxon Kinoc") (@"How are you ?") )
taskTick Results>nil
taskTick continueTask=False

What was sent to the AI socket:

(distance-from ("Daxxon Kinoc") 2.217624)
(position ("Daxxon Kinoc") '"<127.2048, 129.4689, 21.47487>"')
(posture ("Daxxon Kinoc") '"standing"')
(heard ("Daxxon Kinoc") '"How are you ?"')

The Avatars_OnLookAt events still need to be captured and transformed. Then the system would be able to track objects being pointed at, including itself.