GSoC OpenCog 2009 Brief Summaries



THIS PAGE IS OBSOLETE

The purpose of this project was to extend MOSES, which typically evolves program trees, to evolve recurrent neural networks instead. The challenge was to find a representation and reduction rules that would let MOSES efficiently evolve solutions to challenging benchmark tasks. Although much thought went into choosing appropriate representation and reduction rules, which were then implemented, the results at the end of the project were no better than those of existing methods.
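
As a rough illustration of what "evolving a recurrent neural network" entails, here is a minimal Python sketch, not the project's actual representation or reduction rules, of a fixed-topology RNN whose flat weight vector could serve as the evolvable genome:

    import math, random

    class TinyRNN:
        """Fully recurrent hidden layer; the flat weight list is the genome.
        Illustrative stand-in only, not the MOSES-RNN representation."""
        def __init__(self, n_in, n_hid, n_out, w):
            self.n_in, self.n_hid, self.n_out, self.w = n_in, n_hid, n_out, w
            self.state = [0.0] * n_hid            # recurrent activations

        def step(self, x):
            h, w = self.n_hid, self.w
            ih = self.n_in * h                    # end of input->hidden block
            hh = ih + h * h                       # end of hidden->hidden block
            new = [math.tanh(sum(x[i] * w[i*h + j] for i in range(self.n_in))
                             + sum(self.state[k] * w[ih + k*h + j] for k in range(h)))
                   for j in range(h)]
            self.state = new
            return [sum(new[k] * w[hh + k*self.n_out + o] for k in range(h))
                    for o in range(self.n_out)]

    def mutate(genome, rate=0.1, sigma=0.3):      # one toy variation operator
        return [g + random.gauss(0, sigma) if random.random() < rate else g
                for g in genome]

    # Genome length: n_in*n_hid + n_hid*n_hid + n_hid*n_out
    net = TinyRNN(2, 3, 1, [random.gauss(0, 1) for _ in range(2*3 + 3*3 + 3*1)])
    print(net.step([1.0, 0.0]))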

Preliminary documentation: Media:Moses_rnn_doc.pdf

Code is located at: https://code.launchpad.net/~lehman-154-gmail/opencog/moses-rnn

My goal was to write Python language bindings for the OpenCog Framework's API, so that users can write MindAgents (and other applications) in the Python programming language. We had a choice between using a binding generator and writing the bindings by hand with a library; we chose the latter, using Boost.Python. As of today, many of the important classes have been exposed and are ready for experimental use, although we still need tests, better documentation, and a more Pythonic interface to the classes.
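
Purely as a hypothetical illustration of what agent code might look like through such bindings (the module layout, class names, and accessors below are assumptions for the example, not the actual exposed API):

    # Hypothetical sketch; module, class, and method names are assumed
    # for illustration and may not match the actual bindings.
    from opencog import AtomSpace, MindAgent, types

    class DecayAgent(MindAgent):
        """Toy agent: decay the short-term importance of every atom each cycle."""
        def run(self, atomspace):
            for atom in atomspace.get_atoms_by_type(types.Atom):
                sti = atomspace.get_sti(atom)          # assumed accessor
                atomspace.set_sti(atom, int(sti * 0.9))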

Code: https://code.launchpad.net/~kizzobot/opencog/python-bindings

  • Ruiting Lian - Natural Language Generation using RelEx and the Link Parser

The objective of this project was to add support for language comprehension to the virtual agents controlled by OpenCog. With language comprehension, a virtual agent can "listen to" sentences written in English (received as text) and, using its knowledge of the physical environment in which it is situated, identify the elements mentioned in the given sentence. If the sentence is an action request, the agent can understand what the request means (for a limited set of requests) and execute it. The project was completely implemented and integrated into OpenCog.
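
The implementation lives in the Embodiment code linked below; purely to illustrate the two steps described above (grounding the mentioned elements in the environment, then recognizing and executing an action request), here is a toy Python sketch with invented object names and commands:

    # Toy illustration only; the environment contents, command set, and
    # matching strategy are invented for this sketch.
    ENVIRONMENT = {"red ball": (3, 1), "stick": (0, 5), "blue ball": (2, 2)}
    COMMANDS = {"grab": "GRAB", "fetch": "GRAB", "drop": "DROP"}

    def comprehend(sentence):
        words = sentence.lower().rstrip(".!").split()
        # Step 1: ground noun phrases against known objects in the environment.
        mentioned = [obj for obj in ENVIRONMENT
                     if all(w in words for w in obj.split())]
        # Step 2: if the sentence is an action request, map it to an action.
        verb = next((COMMANDS[w] for w in words if w in COMMANDS), None)
        return verb, mentioned

    print(comprehend("Grab the red ball!"))   # -> ('GRAB', ['red ball'])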

Code

OpenCog: https://code.launchpad.net/~opencog-dev/opencog/trunk
Revisions: 3236, 3237, 3238, 3248 and 3251
Multiverse Proxy: https://code.launchpad.net/~opencog-dev/opencog/embodiment_MV1.5-Proxy
Revisions: 7, 8, 49, 50

Final Documentation: http://www.opencog.org/wiki/Embodiment#Language_Comprehension

The aim of this project was to find classes of words that behave similarly syntactically. The Link Parser is used to describe the syntactic features of a word; the challenge is that a word can have thousands of features. We succeeded in forming classes by combining dimensionality reduction with clustering techniques, backed by careful analysis. Another goal of this project was to increase the coverage of the Link Parser; my mentor used these classes and integrated the code with the Link Parser to achieve this.
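
As a minimal sketch of that pipeline (dimensionality reduction followed by clustering), using scikit-learn as a stand-in for the project's own analysis code and random placeholder data in place of real Link Parser features:

    import numpy as np
    from sklearn.decomposition import TruncatedSVD
    from sklearn.cluster import KMeans

    # Placeholder data: rows are words, columns are binary syntactic features
    # (e.g. which links/disjuncts a word can take in the Link Parser).
    rng = np.random.default_rng(0)
    features = (rng.random((500, 2000)) < 0.01).astype(float)

    # Reduce thousands of features to a tractable dimension, then cluster.
    reduced = TruncatedSVD(n_components=50, random_state=0).fit_transform(features)
    labels = KMeans(n_clusters=20, n_init=10, random_state=0).fit_predict(reduced)
    print(labels[:10])   # cluster id assigned to each of the first ten words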

Code: https://code.launchpad.net/~sivareddy/relex-statistical/testbranch

My project was to make MOSES smarter by integrating BBHC and implementing simulated annealing (SA) in MOSES's optimization step. Unfortunately, I over-estimated what could be done in the scope of this project: I implemented SA for MOSES, but the BBHC method is still not integrated. Another task I worked on was making the current hill-climbing support the contin type; this is partly done (it can sample the neighborhood, but cannot yet generate the whole neighborhood of a given center instance). I will make it generate the whole neighborhood correctly in the coming days, so that the contin type is supported by the many optimization algorithms MOSES uses. When that is done, I am confident MOSES will be smarter.
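
Purely to illustrate what simulated annealing in the optimization step means, here is a generic SA loop in Python over boolean knob settings; the scoring function and neighborhood below are placeholders, not MOSES code:

    import math, random

    def simulated_annealing(score, init, neighbor, t0=2.0, cooling=0.95, steps=500):
        # Accept worse neighbors with probability exp(delta / T), lowering
        # the temperature T each step; a sketch, not the MOSES implementation.
        current = best = init
        temp = t0
        for _ in range(steps):
            cand = neighbor(current)
            delta = score(cand) - score(current)
            if delta >= 0 or random.random() < math.exp(delta / temp):
                current = cand
                if score(current) > score(best):
                    best = current
            temp *= cooling
        return best

    def flip_one(bits):                 # neighborhood: flip a single knob
        i = random.randrange(len(bits))
        return [b ^ (j == i) for j, b in enumerate(bits)]

    # Toy fitness: maximize the number of knobs set to 1.
    print(sum(simulated_annealing(sum, [0] * 20, flip_one)))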

You can find the code on Launchpad: https://launchpad.net/~xiaohui/opencog/moses