PLN Implementation Guide, 2013


This page breaks down the process of implementing the new PLN (2013) into a series of relatively small steps, in a commonsensical order.


By the way, I was listening to the following music while writing this page: http://www.youtube.com/watch?v=l0thnR045wk

USE OF CONCEPTNET AS TEST DATA

My initial suggestion is to use ConceptNet as test data for much of the PLN development. This resource can be downloaded from:

http://conceptnet5.media.mit.edu

For initial experimentation we can use just the ConceptNet core, which is smaller. Later on we will want to try the whole ConceptNet.

Translating ConceptNet relations into PLN relations may involve some subtle cases. But for starters,

  • We can map ConceptNet IsA relationships into PLN InheritanceLinks
  • We can map other ConceptNet relationships into PLN EvaluationLinks; e.g., if we have Cat ---Eat---> Mouse, that becomes:
EvaluationLink 
     PredicateNode: eat
     ListLink
          ConceptNode: Cat
          ConceptNode: Mouse

Thus there is an implementation step before the steps outlined below, which is to write code that translates ConceptNet into Scheme files suitable for import into OpenCog.
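As a rough illustration of that translation step, here is a minimal, standalone sketch in C++ (deliberately not using any OpenCog API). It assumes a simplified, tab-separated dump of ConceptNet triples of the form relation<TAB>concept1<TAB>concept2 and writes Scheme expressions of the two kinds described above; the input format, the command-line interface and the default (stv 0.9 0.5) truth values are all assumptions made for the sketch, not properties of the real ConceptNet dump.

 // conceptnet2scm.cpp -- hedged sketch: translate a simplified, tab-separated
 // dump of ConceptNet triples into Scheme suitable for loading into OpenCog.
 // Input lines are assumed to look like:   IsA<TAB>cat<TAB>animal
 #include <fstream>
 #include <iostream>
 #include <sstream>
 #include <string>

 int main(int argc, char** argv)
 {
     if (argc != 3) {
         std::cerr << "usage: conceptnet2scm <triples.tsv> <out.scm>\n";
         return 1;
     }
     std::ifstream in(argv[1]);
     std::ofstream out(argv[2]);
     std::string line;
     while (std::getline(in, line)) {
         std::istringstream fields(line);
         std::string rel, c1, c2;
         if (!std::getline(fields, rel, '\t')) continue;
         if (!std::getline(fields, c1, '\t')) continue;
         if (!std::getline(fields, c2, '\t')) continue;

         if (rel == "IsA") {
             // IsA maps directly onto an InheritanceLink.
             out << "(InheritanceLink (stv 0.9 0.5)\n"
                 << "    (ConceptNode \"" << c1 << "\")\n"
                 << "    (ConceptNode \"" << c2 << "\"))\n";
         } else {
             // Every other relation becomes an EvaluationLink over a ListLink,
             // as in the Cat ---Eat---> Mouse example above.
             out << "(EvaluationLink (stv 0.9 0.5)\n"
                 << "    (PredicateNode \"" << rel << "\")\n"
                 << "    (ListLink\n"
                 << "        (ConceptNode \"" << c1 << "\")\n"
                 << "        (ConceptNode \"" << c2 << "\")))\n";
         }
     }
     return 0;
 }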

STEP 1: BASIC FIRST-ORDER PLN

For this STEP, we will initially deal only with five inference rules: Deduction, Inversion, Induction, Abduction and Revision. We will ignore VariableNodes and quantifiers. We will deal only with InheritanceLinks.
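For concreteness, here is a minimal sketch of what porting the strength formulas amounts to (see item 1 below), using a stripped-down stand-in for SimpleTruthValue. The deduction and inversion strength formulas are the standard independence-based ones from the PLN book, and revision is the usual count-weighted average; the confidence/count handling here is deliberately naive and only illustrative, and induction and abduction can be obtained by composing inversion with deduction.

 // pln_formulas_sketch.cpp -- hedged sketch of first-order PLN strength formulas.
 // The STV struct below is a stand-in for SimpleTruthValue; real code would use
 // the OpenCog TruthValue classes and proper count/confidence conversion.
 #include <algorithm>
 #include <iostream>

 struct STV {
     double strength;  // probability estimate
     double count;     // amount of evidence behind the estimate
 };

 // Deduction: from A->B and B->C (plus term probabilities sB, sC) infer A->C.
 double deductionStrength(double sAB, double sBC, double sB, double sC)
 {
     if (sB >= 1.0) return sC;  // degenerate case: B covers everything
     return sAB * sBC + (1.0 - sAB) * (sC - sB * sBC) / (1.0 - sB);
 }

 // Inversion: from A->B (plus term probabilities sA, sB) infer B->A via Bayes' rule.
 double inversionStrength(double sAB, double sA, double sB)
 {
     if (sB <= 0.0) return 0.0;
     return std::min(1.0, sAB * sA / sB);
 }

 // Revision: merge two estimates of the *same* link, weighting by evidence counts.
 STV revision(const STV& a, const STV& b)
 {
     double n = a.count + b.count;
     double s = (n > 0.0) ? (a.count * a.strength + b.count * b.strength) / n : 0.5;
     return {s, n};
 }

 int main()
 {
     // Toy example: cat->mammal (0.95) and mammal->animal (0.98),
     // with term probabilities P(mammal) = 0.1 and P(animal) = 0.2.
     std::cout << "deduced cat->animal strength: "
               << deductionStrength(0.95, 0.98, 0.1, 0.2) << "\n";

     STV merged = revision({0.8, 10.0}, {0.6, 30.0});
     std::cout << "revised strength: " << merged.strength
               << " (count " << merged.count << ")\n";
     return 0;
 }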

1) Port the five rules and formulas to C++, using SimpleTruthValues for the formulas.

2) Wrap the five rules in SchemaNodes, so that each one is accessible as a GroundedSchemaNode.

3) Implement forward chaining using the five rules, on the Atomspace. Implement a simple ForwardInference Agent. Create a forward chaining Unit Test. (A minimal sketch of one forward-chaining cycle appears at the end of this step.)

4) Test Attention Allocation and the ForwardInference Agent together. The goal is to have the system continually doing ForwardInference on InheritanceLinks that are in the AttentionalFocus. The ForgettingAgent should get rid of Atoms produced that are not useful. Create a Unit Test for integrated functionality of Attention Allocation and ForwardInference. (Note: the parts of AttentionAllocation we need here are: importance updating of STI and LTI, and spreading of STI and LTI along generic links. We don't need HebbianLink formation yet.)

5) Connect the Atomspace Viewer to the CogServer, in such a way that one can visualize in real time the Atoms being formed via ForwardInference.

6) Write code that records all inference steps taken as Atoms, and saves these in the Atomspace. Write a Unit Test.

7) Create a separate InferenceHistory AtomSpace for storing the record of all inference steps taken. Make it so that, when an inference step is taken, the Atoms from that inference are stored in the InferenceHistory AtomSpace.
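To make items 3 and 4 above a bit more concrete, here is a heavily simplified, standalone sketch of a single forward-chaining cycle restricted to an attentional focus. It keeps a toy table of InheritanceLinks with STI values, applies deduction to every composable pair of links above an STI threshold, and merges a re-derived link into an existing one by revision. The Link struct, the STI values, the threshold and the count heuristic are all stand-ins invented for the sketch; the real ForwardInference Agent would operate on the AtomSpace, read the attention-allocation importance values, and be scheduled by the CogServer.

 // forward_chain_sketch.cpp -- hedged, standalone sketch of one forward-chaining
 // cycle over InheritanceLinks restricted to an attentional focus.
 // Everything here (the Link struct, STI values, thresholds) is a toy stand-in
 // for the corresponding AtomSpace / attention-allocation machinery.
 #include <algorithm>
 #include <iostream>
 #include <map>
 #include <string>
 #include <utility>
 #include <vector>

 struct Link {
     std::string a, c;   // source and target concept names
     double strength;
     double count;
     double sti;         // short-term importance
 };

 // Same independence-based deduction strength formula as sketched in STEP 1,
 // with crudely defaulted term probabilities.
 static double deduce(double sAB, double sBC, double sB = 0.1, double sC = 0.1)
 {
     if (sB >= 1.0) return sC;
     return sAB * sBC + (1.0 - sAB) * (sC - sB * sBC) / (1.0 - sB);
 }

 // One forward-chaining cycle: apply deduction to every composable pair of links
 // in the attentional focus, revising into an existing link where one is present.
 // (A fuller version would also merge duplicates created within the same cycle.)
 void forwardChainCycle(std::vector<Link>& kb, double stiThreshold)
 {
     std::map<std::pair<std::string, std::string>, Link*> index;
     for (auto& l : kb) index[{l.a, l.c}] = &l;

     std::vector<Link> fresh;
     for (const auto& ab : kb) {
         if (ab.sti < stiThreshold) continue;           // outside the attentional focus
         for (const auto& bc : kb) {
             if (bc.sti < stiThreshold || ab.c != bc.a || ab.a == bc.c) continue;
             double s = deduce(ab.strength, bc.strength);
             double n = std::min(ab.count, bc.count);   // crude evidence-count heuristic
             auto it = index.find({ab.a, bc.c});
             if (it == index.end()) {
                 fresh.push_back({ab.a, bc.c, s, n, std::min(ab.sti, bc.sti)});
             } else {                                   // revise the existing link
                 Link& old = *it->second;
                 double total = old.count + n;
                 old.strength = (old.count * old.strength + n * s) / total;
                 old.count = total;
             }
         }
     }
     kb.insert(kb.end(), fresh.begin(), fresh.end());
 }

 int main()
 {
     std::vector<Link> kb = {
         {"cat", "mammal", 0.95, 10, 50},
         {"mammal", "animal", 0.98, 20, 40},
         {"rock", "mineral", 0.90, 5, 1},    // low STI: ignored this cycle
     };
     forwardChainCycle(kb, 10.0);
     for (const auto& l : kb)
         std::cout << l.a << " -> " << l.c << "  <" << l.strength << ">\n";
     return 0;
 }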

STEP 2: ATTENTION-GUIDED ITERATIVE QUERY PROCESS

1) Make it so that, when a pattern-matcher query is submitted, the Atoms involved in the query get a boost to their STI and LTI

2) Test a process via which the same query (regarding basic first-order links) is repeatedly submitted, and the answers differ over time as more forward-chaining inference is done. Write a Unit Test to test this functionality.


STEP 3: ADDITIONAL FIRST ORDER INFERENCE RULES

NOTE: STEP 3 may be done concurrently with STEP 2

For now we are still working with SimpleTruthValues, and with forward chaining.

1) Add SimilarityLinks and associated rules/formulas. Add associated Unit Tests. (A sketch of the corresponding similarity formula appears just after this list.)

2) Add IntensionalInheritance, IntensionalSimilarity, Subset and ExtensionalSimilarity, and associated rules/formulas. Add associated Unit Tests.

3) Add PredictiveImplication, SimultaneousImplication and associated rules/formulas. Add associated Unit Tests.

4) Add the temporal inference stuff that a GSoC student wrote some time ago (Jade knows where it is). Add associated Unit Tests.
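As an illustration of the kind of formula item 1 of this list involves: read extensionally, SimilarityLink A B measures |A ∩ B| / |A ∪ B|, so its strength can be derived from the InheritanceLink strengths in the two directions. The function name and the zero-guard below are mine, and the sketch covers only the purely extensional, fully confident case.

 // similarity_sketch.cpp -- hedged sketch of deriving a SimilarityLink strength
 // from the two directed InheritanceLink strengths.
 // Extensionally, sAB = |A∩B|/|A| and sBA = |A∩B|/|B|, so
 // sim(A,B) = |A∩B|/|A∪B| = 1 / (1/sAB + 1/sBA - 1).
 #include <iostream>

 double similarityFromInheritance(double sAB, double sBA)
 {
     if (sAB <= 0.0 || sBA <= 0.0) return 0.0;   // no evidence of overlap
     return (sAB * sBA) / (sAB + sBA - sAB * sBA);
 }

 int main()
 {
     // Example: if 60% of As are Bs and 30% of Bs are As, the similarity is 0.25.
     std::cout << similarityFromInheritance(0.6, 0.3) << "\n";
     return 0;
 }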

STEP 4: HEURISTIC QUERY RESOLUTION

NOTE: STEP 4.1 may be done concurrently with STEP 2 or 3. STEP 4.2 may be done concurrently with STEP 2.

1) Implement a simple "multistart hill climbing" based heuristic search process that, when given a set of Atoms involving multiple VariableNodes, tries to find Atoms to assign to the variables so as to make the Atoms in the set all true (insofar as possible). (A standalone sketch of such a search appears just after this list.)

Test this initially using the basic first-order inference used in STEP 1 (five inference rules, only InheritanceLinks). Write a Unit Test of this nature.

2) Experiment with the heuristic search process on the additional link types/rules added in STEP 3. Write some Unit Tests of this nature.
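Here is the standalone sketch referred to in item 1 above. Clauses are inheritance patterns whose terms are either constants or "$"-prefixed variables over a toy knowledge base; an assignment maps variables to candidate concepts; the score is the summed strength of the clauses that the assignment makes match a known link; and the search does random restarts followed by greedy single-variable moves. The Fact/Clause types, the scoring function and the restart/step counts are all assumptions made for the sketch.

 // hillclimb_sketch.cpp -- hedged sketch of multistart hill climbing for
 // grounding VariableNodes in a small set of inheritance clauses.
 #include <algorithm>
 #include <iostream>
 #include <map>
 #include <random>
 #include <string>
 #include <vector>

 struct Fact { std::string a, c; double strength; };   // known: Inheritance a c
 struct Clause { std::string a, c; };                  // terms starting with '$' are variables
 using Assignment = std::map<std::string, std::string>;

 static bool isVar(const std::string& s) { return !s.empty() && s[0] == '$'; }

 // Score = summed strength of the clauses whose grounded form appears in the KB.
 static double score(const std::vector<Clause>& clauses, const Assignment& asg,
                     const std::vector<Fact>& kb)
 {
     double total = 0.0;
     for (const auto& cl : clauses) {
         std::string a = isVar(cl.a) ? asg.at(cl.a) : cl.a;
         std::string c = isVar(cl.c) ? asg.at(cl.c) : cl.c;
         for (const auto& f : kb)
             if (f.a == a && f.c == c) { total += f.strength; break; }
     }
     return total;
 }

 static Assignment multistartHillClimb(const std::vector<Clause>& clauses,
                                       const std::vector<std::string>& candidates,
                                       const std::vector<Fact>& kb,
                                       int restarts = 20, int steps = 50)
 {
     // Collect the variables appearing in the clauses.
     std::vector<std::string> vars;
     for (const auto& cl : clauses)
         for (const auto& t : {cl.a, cl.c})
             if (isVar(t) && std::find(vars.begin(), vars.end(), t) == vars.end())
                 vars.push_back(t);
     if (vars.empty()) return {};

     std::mt19937 rng(42);
     std::uniform_int_distribution<size_t> pickCand(0, candidates.size() - 1);
     std::uniform_int_distribution<size_t> pickVar(0, vars.size() - 1);

     Assignment best;
     double bestScore = -1.0;
     for (int r = 0; r < restarts; ++r) {
         Assignment cur;                                // random initial assignment
         for (const auto& v : vars) cur[v] = candidates[pickCand(rng)];
         double curScore = score(clauses, cur, kb);
         for (int s = 0; s < steps; ++s) {              // greedy single-variable moves
             Assignment next = cur;
             next[vars[pickVar(rng)]] = candidates[pickCand(rng)];
             double nextScore = score(clauses, next, kb);
             if (nextScore > curScore) { cur = next; curScore = nextScore; }
         }
         if (curScore > bestScore) { best = cur; bestScore = curScore; }
     }
     return best;
 }

 int main()
 {
     std::vector<Fact> kb = {{"cat", "mammal", 0.95}, {"mammal", "animal", 0.98},
                             {"dog", "mammal", 0.95}, {"rock", "mineral", 0.90}};
     // Query: find $X such that (Inheritance $X mammal) and (Inheritance mammal animal).
     std::vector<Clause> query = {{"$X", "mammal"}, {"mammal", "animal"}};
     Assignment result = multistartHillClimb(query, {"cat", "dog", "rock", "animal"}, kb);
     std::cout << "$X = " << result["$X"] << "\n";      // cat or dog
     return 0;
 }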

STEP 5: BACKWARD INFERENCE

NOTE: STEP 5 can be done any time after STEP 1.2

1) Implement a BackwardInference Agent that does a single backward inference step. (A toy sketch of a single backward deduction step appears just after this list.)

2) Test the BackwardInferenceAgent initially on the basic first-order inference (FOI) link types and rules from STEP 1. Write a Unit Test of this nature.

3) Test the BackwardInferenceAgent on the additional link types and rules introduced in STEP 3. Write some Unit Tests of this nature.
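The toy sketch referred to in item 1 above: given a target Inheritance A C that is not yet (confidently) known, look for a middle term B such that Inheritance A B and Inheritance B C are already present, and propose the deduction that would produce the target. The Link type, the "first usable middle term" policy and the defaulted term probabilities are assumptions made for the sketch.

 // backward_step_sketch.cpp -- hedged sketch of one backward deduction step:
 // to establish a target Inheritance A C, search for premises Inheritance A B
 // and Inheritance B C that are already in the (toy) knowledge base.
 #include <iostream>
 #include <string>
 #include <vector>

 struct Link { std::string a, c; double strength; };

 // Same independence-based deduction strength formula as in STEP 1,
 // with crudely defaulted term probabilities.
 static double deduce(double sAB, double sBC, double sB = 0.1, double sC = 0.1)
 {
     if (sB >= 1.0) return sC;
     return sAB * sBC + (1.0 - sAB) * (sC - sB * sBC) / (1.0 - sB);
 }

 // Attempt a single backward deduction step toward the target a->c.
 // Returns true and fills 'result' if some middle term makes the step possible.
 static bool backwardDeductionStep(const std::string& a, const std::string& c,
                                   const std::vector<Link>& kb, Link& result)
 {
     for (const auto& ab : kb) {
         if (ab.a != a) continue;
         for (const auto& bc : kb) {
             if (bc.a != ab.c || bc.c != c) continue;
             result = {a, c, deduce(ab.strength, bc.strength)};
             return true;               // first usable middle term wins (naive policy)
         }
     }
     return false;                      // no premises found; a chainer would recurse on subgoals
 }

 int main()
 {
     std::vector<Link> kb = {{"cat", "mammal", 0.95}, {"mammal", "animal", 0.98}};
     Link out;
     if (backwardDeductionStep("cat", "animal", kb, out))
         std::cout << out.a << " -> " << out.c << "  <" << out.strength << ">\n";
     return 0;
 }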

STEP 6: BACKWARD CHAINING

1) Implement a simple backward chaining process, initially using only the basic FOI link types and Rules from STEP 1. This process should, at each step in the chaining process, make a choice regarding whether to invoke another backward inference step, or to invoke heuristic query resolution. (Note that, unlike the previous PLN backward chainers, this backward chaining process uses the Atomspace as its working memory.) Write a Unit Test. (A minimal sketch of the recursive control structure appears just after this list.)

2) Test the backward chaining process using the other non-temporal link types introduced above. Write some Unit Tests.

3) Test the backward chaining process using the temporal link types introduced above. Write some Unit Tests.
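The minimal sketch referred to in item 1 above. The choice between doing another backward step and falling back on (heuristic) query resolution is reduced here to a depth limit, and the resolution branch is just a direct lookup; in the real chainer both choices would consult importance values and the inference history, and the working memory would be the AtomSpace itself rather than a local vector. Everything below is a stand-in invented for the sketch.

 // backward_chain_sketch.cpp -- hedged sketch of a recursive backward chainer
 // over InheritanceLinks, using only the deduction rule from STEP 1.
 #include <iostream>
 #include <string>
 #include <vector>

 struct Link { std::string a, c; double strength; };

 static double deduce(double sAB, double sBC, double sB = 0.1, double sC = 0.1)
 {
     if (sB >= 1.0) return sC;
     return sAB * sBC + (1.0 - sAB) * (sC - sB * sBC) / (1.0 - sB);
 }

 // Stand-in for "heuristic query resolution": here, just a direct lookup.
 static bool lookup(const std::string& a, const std::string& c,
                    const std::vector<Link>& kb, double& strength)
 {
     for (const auto& l : kb)
         if (l.a == a && l.c == c) { strength = l.strength; return true; }
     return false;
 }

 // Try to establish Inheritance a c, recursing through middle terms.
 // The depth limit plays the role of the control-policy choice: when it runs
 // out, we stop taking further backward steps and rely on lookup alone.
 static bool backwardChain(const std::string& a, const std::string& c,
                           std::vector<Link>& kb, int depth, double& strength)
 {
     if (lookup(a, c, kb, strength)) return true;      // query-resolution branch
     if (depth == 0) return false;

     for (size_t i = 0; i < kb.size(); ++i) {          // backward-step branch
         Link ab = kb[i];                              // copy: kb may grow during recursion
         if (ab.a != a || ab.c == c) continue;
         double sBC;
         if (backwardChain(ab.c, c, kb, depth - 1, sBC)) {
             strength = deduce(ab.strength, sBC);
             kb.push_back({a, c, strength});           // record the new conclusion
             return true;
         }
     }
     return false;
 }

 int main()
 {
     std::vector<Link> kb = {{"cat", "mammal", 0.95},
                             {"mammal", "vertebrate", 0.99},
                             {"vertebrate", "animal", 0.99}};
     double s;
     if (backwardChain("cat", "animal", kb, 3, s))
         std::cout << "cat -> animal  <" << s << ">\n";
     return 0;
 }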

STEP 7: INTEGRATED BACKWARD CHAINING

1) Test backward chaining and attention allocation working together. Measure whether this makes backward chaining arrive at better/faster answers. Write a Unit test for this integrated functionality.

2) Test backward chaining, forward chaining, attention allocation and heuristic search all working together on the same Atomspace. Measure whether this results in backward chaining arriving at better/faster answers. Write a Unit test for this integrated functionality (wow!).

STEP 8: INFERENCE WITH QUANTIFIERS AND LOGICAL OPERATORS

1) Introduce Rules regarding logical operators such as AND, OR, NOT. Write Unit Tests. (The simplest versions of the corresponding strength formulas are sketched just after this list.)

2) Introduce Rules regarding ForAll and ThereExists (including variable dependencies for ThereExists). Write Unit Tests.
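As a reminder of the degenerate, independence-assuming versions of the formulas behind item 1 of this list: the strengths then combine as ordinary probabilities. The real PLN operator rules have to deal with dependence between the arguments and with confidence, so the sketch below is only the simplest special case.

 // logical_ops_sketch.cpp -- hedged sketch of AND/OR/NOT strength formulas
 // under a naive independence assumption (the actual PLN formulas are subtler).
 #include <iostream>

 double andStrength(double sA, double sB) { return sA * sB; }            // P(A)P(B)
 double orStrength(double sA, double sB)  { return sA + sB - sA * sB; }  // inclusion-exclusion
 double notStrength(double sA)            { return 1.0 - sA; }

 int main()
 {
     std::cout << andStrength(0.8, 0.5) << " "     // 0.4
               << orStrength(0.8, 0.5)  << " "     // 0.9
               << notStrength(0.8)      << "\n";   // 0.2
     return 0;
 }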

STEP 9: CONTEXTUAL, CAUSAL AND HYPOTHETICAL INFERENCE

Add rules for ContextLink, HypotheticalLink and CausalImplicationLink. Write Unit Tests.

STEP 10: SPATIAL INFERENCE

Integrate Region Connection Calculus (RCC) based inference rules, similar in nature to the temporal inference rules in 3.4 above. Write Unit Tests.

STEP 11: INFERENCE HISTORY STORAGE

Create a process that periodically saves the InferenceHistory Atomspace to a long-term data-store, for future mining purposes.

STEP 13: INDEFINITE PROBABILITIES

Add an option for IndefiniteTruthValue calculations, alongside SimpleTruthValue calculations. Write Unit Tests.

STEP 14: BAYES NET INTEGRATION

Add functionality that builds and maintains a Bayes net model of the Atoms in the AttentionalFocus. PLN or the pattern matcher can then query the Bayes Net to rapidly get answers to simple truth value queries. This requires a Bayes net with dynamically updatable structure. Write Unit Tests.

STEP 15: WHEW!!!

Relax and have a beer…. Then move on to PLN/MOSES integration 8-D …