# Einstein's Puzzle - Fact Definitions

This is the third page in the Einstein's Puzzle Tutorial series; it describes how to encode the puzzle facts as OpenCog EvaluationLinks.

These facts (such as 'The Swede keeps dogs') need to be represented as OpenCog hypergraphs, in order for reasoning to be able to take place. By general convention, such facts are represented as predicate triples, of the form "For object X, the Predicate P expresses attribute value V". In OpenCog, such triples are represented using EvaluationLinks, that is, as the hierarchical structure

```
EvaluationLink
    PredicateNode P
    ListLink
        ConceptNode X
        ConceptNode V
```

The common-sense rules that will act on these facts are developed on the next page (XXX TODO).

## Triples

Facts in OpenCog are defined using triples. Consider the first fact: "The Englishman lives in the red house." Working this out, it means that "There is a person who is an Englishman, and this person lives in a red house". We make the concept of personhood explicit here, because the second fact, "The Swede keeps dogs.", refers to a different person; we take care not to accidentally think that the Swede may be English.

Continuing, a more abstract representation of the first fact is "nationality_of(person1, English) AND lives_in(person1, red house)". This can then be transcribed into EvaluationLinks in a straightforward manner:

```
EvaluationLink
    PredicateNode "Nationality"
    ListLink
        AvatarNode "person 1"
        ConceptNode "Englishman"

EvaluationLink
    PredicateNode "LivesIn"
    ListLink
        AvatarNode "person 1"
        ConceptNode "red house"
```

The above is easily expressed in Scheme, making it straightforward to enter into the cogserver's Scheme shell:

```
(EvaluationLink
    (PredicateNode "Nationality")
    (ListLink
        (AvatarNode "person 1")
        (ConceptNode "Englishman")
    )
)

(EvaluationLink
    (PredicateNode "LivesIn")
    (ListLink
        (AvatarNode "person 1")
        (ConceptNode "red house")
    )
)
```

Notice that the only difference between the two is the use of parentheses: the Scheme programming language makes heavy use of parentheses to form groupings. The indentation used here is simply to make the expressions easier to read.

Continuing likewise, the second fact, "The Swede keeps dogs", can be expressed as

```
(EvaluationLink
    (PredicateNode "Nationality")
    (ListLink
        (AvatarNode "person 2")
        (ConceptNode "Swede")
    )
)

(EvaluationLink
    (PredicateNode "Keeps")
    (ListLink
        (AvatarNode "person 2")
        (ConceptNode "dogs")
    )
)
```

The full set of these relations, fully encoding the entire puzzle, can be found in the file `/tests/query/deduct-einstein.scm` in the opencog source-tree.

The main point to observe here is that all of these facts are "linguistically natural": they can be extracted, with relative ease, from the equivalent English-language sentences. One could, without much difficulty, write some simple scripts that would parse these sentences (say, with RelEx), and generate these logical forms. Doing so is not terribly useful, as the English language is more complex than that; but, for this limited problem domain, this is a plausible form in which natural language facts could be represented as graphical information triples.
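As a toy illustration of that last point, here is a minimal Python sketch showing that the mapping from triples to the Scheme form used on this page is mechanical. The fact tuples and the `to_scheme` helper are made up for this example; a real pipeline would obtain the triples by parsing English (say, with RelEx).

```python
# Toy fact-to-Atomese generator: render (predicate, person, value)
# triples in the Scheme EvaluationLink form shown above.
# The fact list below is a hypothetical parser output, not real RelEx output.

def to_scheme(predicate, person, value):
    """Render one (predicate, person, value) triple as an EvaluationLink."""
    return ('(EvaluationLink\n'
            f'    (PredicateNode "{predicate}")\n'
            '    (ListLink\n'
            f'        (AvatarNode "{person}")\n'
            f'        (ConceptNode "{value}")\n'
            '    )\n'
            ')')

facts = [
    ("Nationality", "person 1", "Englishman"),
    ("LivesIn",     "person 1", "red house"),
    ("Nationality", "person 2", "Swede"),
    ("Keeps",       "person 2", "dogs"),
]

print("\n\n".join(to_scheme(*f) for f in facts))
```

The output can be pasted directly into the cogserver Scheme shell.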

## Background Facts

One can immediately begin reasoning on the above facts, but one does not get very far. For example, from fact 9, "The Norwegian lives in the first house.", and fact 14, "The Norwegian lives next to the blue house.", we can almost immediately deduce that the person referred to in fact 9 is the same as the person referred to in fact 14: they are both Norwegian, and we explicitly assume that if two people have the same nationality, they must in fact be the same person. This is called coreference resolution; see the Wikipedia article on coreference for more.
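As a sketch of that assumption, the following Python fragment merges person labels whenever they carry the same Nationality value. The labels and data are hypothetical, chosen only to mirror facts 9 and 14; this is not how the rule engine does it, just the shape of the inference.

```python
# Toy coreference resolution: two person-labels with the same Nationality
# are assumed to denote the same person, so rewrite all facts to use one
# canonical label. Facts are hypothetical (predicate, person, value) tuples.

def resolve_coreferences(facts):
    canonical = {}   # nationality -> first person-label seen with it
    rename = {}      # person-label -> canonical label
    for pred, person, value in facts:
        if pred == "Nationality":
            canonical.setdefault(value, person)
            rename[person] = canonical[value]
    # Rewrite every fact to use the canonical label.
    return [(p, rename.get(who, who), v) for p, who, v in facts]

facts = [
    ("Nationality", "person 9",  "Norwegian"),    # from fact 9
    ("Address",     "person 9",  "first house"),
    ("Nationality", "person 14", "Norwegian"),    # from fact 14
    ("NextTo",      "person 14", "blue house"),
]

for fact in resolve_coreferences(facts):
    print(fact)
# After merging, every fact refers to "person 9".
```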

To proceed further with reasoning, one must encode a considerable amount of additional background "common-sense" information. One important part of this is to encode the exclusivity constraints of the problem: there are five houses, five nationalities, five pets, five tobacco brands, five kinds of drinks, and five street addresses. A given person can have one and only one nationality, live in one and only one house, smoke one and only one brand of tobacco, and so on. The expression of these mutually-exclusive choices or constraints will be done in two steps. The first step is to provide an ontology for each kind; that is, to create classes that hold the kinds. So, for example, the tobacco kind can be PallMall, Blend, Prince, etc. The second step is to enforce the exclusive distribution of these kinds or choices.
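To make the exclusivity constraint concrete before it is encoded in Atomese: within one kind, a valid assignment is a bijection between the five people and the five values. The Python sketch below (hypothetical labels, not OpenCog code) shows what "one and only one" rules out, and how large the search space per kind is before the puzzle facts prune it.

```python
# Toy model of the exclusivity constraint for one kind (nationality):
# each of five people holds exactly one value, and each value is held
# by exactly one person -- i.e., a valid assignment is a bijection.
from itertools import permutations

people = ["p1", "p2", "p3", "p4", "p5"]
nationalities = ["Englishman", "Swede", "Dane", "Norwegian", "German"]

def is_valid_assignment(assignment):
    """assignment: dict person -> nationality; valid iff it is a bijection."""
    return (set(assignment) == set(people)
            and sorted(assignment.values()) == sorted(nationalities))

# Every permutation of the values over the people is a valid candidate:
candidates = [dict(zip(people, perm)) for perm in permutations(nationalities)]
print(len(candidates), all(is_valid_assignment(c) for c in candidates))
# prints: 120 True
```

So each kind starts with 5! = 120 candidate distributions; the puzzle facts must narrow this down to one.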

The remainder of this page is devoted to setting up and defining the kinds.

XXX Caution: it is not clear that everything below is actually needed; the sketch below does not obviously lead to a correct mechanism for solving the puzzle; and requires a considerable amount of additional mechanism to move forward. It really belongs on a different page ... Everything below this point is under construction, to be taken under advisement. You have been warned. XXX

### Kinds

Let's define the main kinds used in the puzzle.

```
(ConceptNode "nationality")
(ConceptNode "pet")
(ConceptNode "drink")
(ConceptNode "tobacco brand")
(ConceptNode "house color")
(ConceptNode "street address")
```

There are six main kinds, which act as supertypes. The last kind, "street address", refers to the ordering of the houses: first, second, third, and so on. It is rather different from the other five: the first five kinds are basically attributes that a person may have. (Yes, the color of the house that one lives in is as much an attribute as the brand of tobacco that one smokes; one can choose to live in a yellow house in the same way that one chooses to keep dogs.) The street address can also be thought of as an attribute, but it additionally encodes location information: it is impossible to determine neighbor relations without the explicit ordering of a street address. Four of the Einstein-puzzle facts deal with neighbor relations; two facts state explicit locations.

### House color kinds

Next we'll define the house color subtypes.

```
(InheritanceLink (stv 1 1)
    (ConceptNode "redHouse" (stv 0.01 1))
    (ConceptNode "house color")
)
(InheritanceLink (stv 1 1)
    (ConceptNode "greenHouse" (stv 0.01 1))
    (ConceptNode "house color")
)
(InheritanceLink (stv 1 1)
    (ConceptNode "whiteHouse" (stv 0.01 1))
    (ConceptNode "house color")
)
(InheritanceLink (stv 1 1)
    (ConceptNode "yellowHouse" (stv 0.01 1))
    (ConceptNode "house color")
)
(InheritanceLink (stv 1 1)
    (ConceptNode "blueHouse" (stv 0.01 1))
    (ConceptNode "house color")
)
```

And likewise for the street-address subtypes:

```
(InheritanceLink (stv 1 1)
    (ConceptNode "firstHouse" (stv 0.01 1))
    (ConceptNode "street address")
)
(InheritanceLink (stv 1 1)
    (ConceptNode "secondHouse" (stv 0.01 1))
    (ConceptNode "street address")
)
(InheritanceLink (stv 1 1)
    (ConceptNode "thirdHouse" (stv 0.01 1))
    (ConceptNode "street address")
)
(InheritanceLink (stv 1 1)
    (ConceptNode "fourthHouse" (stv 0.01 1))
    (ConceptNode "street address")
)
(InheritanceLink (stv 1 1)
    (ConceptNode "fifthHouse" (stv 0.01 1))
    (ConceptNode "street address")
)
```

Now the system will know that 1% of the objects in the world are "redHouse" objects, and that "redHouse" inherits from "house color". It will also know that 1% of the objects in the world are "firstHouse" objects, and that "firstHouse" inherits from "street address". However, the system does not understand that these can be the same houses.

Let's continue by defining the inheritance for the different nationalities, pets, tobacco brands and drinks. To save space, just one of each is written out here; you can get the full set from the complete Scheme file.

```
(InheritanceLink (stv 1 1)
    (ConceptNode "Englishman" (stv 0.01 1))
    (ConceptNode "nationality")
)

(InheritanceLink (stv 1 1)
    (ConceptNode "tea" (stv 0.01 1))
    (ConceptNode "drink")
)

(InheritanceLink (stv 1 1)
    (ConceptNode "Pall Mall" (stv 0.01 1))
    (ConceptNode "tobacco brand")
)

(InheritanceLink (stv 1 1)
    (ConceptNode "dogs" (stv 0.01 1))
    (ConceptNode "pet")
)
```

## Next Steps

Now that you have the Scheme file, it's time to set up PLN in Python (since the C++ version is deprecated). See Using PLN in PyCharm and Einstein's Puzzle - Preparing Python for details.

## Q&A

Any questions?

Please leave them on the talk page.