Pipeline (Embodiment)

From OpenCog

The OpenCog Language Comprehension module is intended to let pets in the Multiverse (MV) world understand natural language spoken by an avatar. The pets will also be able to answer simple questions. The pipeline through which spoken sentences are interpreted, which involves the MV Proxy and the OpenCog Embodiment modules, is as follows:

  • The avatar types one or more sentences in the MV client;
  • If no pet was selected by the avatar, the sentences are broadcast to all pets in the world; otherwise, only the selected pet will hear them;
  • The MV Proxy receives the sentences and does the following for each listener pet:
    • Gets the history of sentences recently heard by the pet (used later for anaphora resolution);
    • Invokes RelEx to parse the sentences. RelEx returns to the proxy an OpenCog-like output that represents the parsed sentences as atoms (nodes and links) in the Scheme language;
    • The proxy then sends a message encapsulating the RelEx output from the previous step to each listener pet;
    • The Embodiment PAI receives this message and processes its content, running the Scheme evaluator to add the atoms to the AtomSpace;
    • Reference Resolution determines which entities are involved in the sentence;
    • Command Resolution determines which commands must be executed according to the sentences, and whether each sentence is a question or not;
    • If a command was selected, it is sent to the pet, which then executes it;
    • If a question was identified, it is processed and the answer is sent back to the MV Proxy.
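The proxy-side portion of the pipeline above (listener selection, per-pet history for anaphora resolution, RelEx parsing, and message delivery) can be sketched as follows. This is a minimal illustrative sketch, not the actual Embodiment code: the `Proxy` class, the `relex_parse` callable, and the `deliver` method are all hypothetical names assumed for this example.

```python
# Hypothetical sketch of the MV Proxy dispatch loop described above.
# All class, method, and parameter names are illustrative assumptions;
# the real proxy in the Embodiment codebase differs in detail.

class Proxy:
    def __init__(self, pets, relex_parse):
        self.pets = pets                # pet_id -> pet object (hypothetical)
        self.relex_parse = relex_parse  # callable: sentences -> Scheme atom text
        self.history = {}               # pet_id -> recently heard sentences

    def on_avatar_says(self, sentences, selected_pet=None):
        # Broadcast when no pet is selected; otherwise only the
        # selected pet listens.
        listeners = [selected_pet] if selected_pet else list(self.pets)
        for pet_id in listeners:
            # History of recently heard sentences, kept per pet so that
            # anaphora resolution can look back at prior context.
            recent = self.history.setdefault(pet_id, [])
            # RelEx parse: OpenCog-like Scheme text describing the
            # sentences as atoms (nodes and links).
            scheme_atoms = self.relex_parse(sentences)
            recent.extend(sentences)
            # Send the encapsulated RelEx output to the listener pet;
            # on the pet side, the PAI would evaluate it into the AtomSpace.
            self.pets[pet_id].deliver(scheme_atoms)
```

The per-pet history is kept even for broadcast sentences, so a pet selected later can still resolve pronouns against what it has already heard.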