OpenCog By Way of Wikipedia
One way to gain general background relevant for understanding OpenCog is via Wikipedia...
(To be clear: the pages linked below don't say anything about OpenCog directly. But they fill you in on ideas related to OpenCog -- stuff that, if you understand it, will help you understand what we're doing with OpenCog a lot better.... No pretense of completeness is made, and Wikipedia pages are always subject to change. Reader beware ... and enjoy!)
-- The "atomspace" is a graph database whose nodes and links carry types.
-- Many of the Atom types are for representing logical assertions; in this respect the atomspace vaguely resembles the Datalog subset of Prolog.
-- Among other knowledge, the Atomspace should be able to store the kind of data that Cyc stores, in a vaguely similar way, except that the atomspace generalizes Cyc's true/false truth values to floating-point truth values.
-- The floating-point "truth values" attached to Atoms can be interpreted as probabilities (enabling Bayesian inference), as fuzzy-logic truth values, or as weights in Markov networks (e.g. hidden Markov models) and artificial neural networks. By contrast, Cyc has to use "microtheories" to resolve inconsistencies.
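As a concrete (and heavily simplified) illustration, here is a toy typed-hypergraph store with floating-point truth values. All of the names below (Atom, AtomSpace, TruthValue, the type strings) are stand-ins invented for this sketch, not the real OpenCog API:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TruthValue:
    strength: float    # degree of truth in [0, 1], not just True/False
    confidence: float  # how much evidence backs up the strength

@dataclass(frozen=True)
class Atom:
    type: str          # e.g. "ConceptNode", "InheritanceLink"
    name: str = ""     # nodes have names ...
    out: tuple = ()    # ... links point at other atoms

class AtomSpace:
    """A toy store mapping atoms to their truth values."""
    def __init__(self):
        self._tv = {}

    def add(self, atom, tv):
        self._tv[atom] = tv
        return atom

    def tv(self, atom):
        return self._tv[atom]

# "Cats are mammals", held with strength 0.98 rather than a bare True:
space = AtomSpace()
cat = space.add(Atom("ConceptNode", "cat"), TruthValue(1.0, 0.9))
mammal = space.add(Atom("ConceptNode", "mammal"), TruthValue(1.0, 0.9))
inh = space.add(Atom("InheritanceLink", out=(cat, mammal)),
                TruthValue(0.98, 0.95))
```

The point of the sketch is just the shape of the data: typed nodes, typed links over nodes, and graded truth values attached to both.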
The above gives a flavor of what the atomspace can hold and represent. We (the OpenCog team) have experimented with all of the above, but have not yet scaled it up in a big way (except in a few narrow commercial applications). Two big items that work with the atomspace are:
-- PLN: a probabilistic logic reasoner that performs forward- and backward-chaining inference in a probabilistic manner.
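To give a feel for what probabilistic forward chaining means, here is a toy chainer that repeatedly applies a single deduction rule until nothing new can be derived. The strength formula below is a naive independence assumption invented for this sketch, not PLN's actual deduction formula:

```python
def deduce(s_ab, s_bc):
    """Naive probabilistic deduction: chain two conditional strengths.
    (A crude stand-in for PLN's real deduction rule.)"""
    return s_ab * s_bc

def forward_chain(rules):
    """rules: dict {(premise, conclusion): strength}.
    Keep applying deduction until no new implication appears."""
    rules = dict(rules)
    changed = True
    while changed:
        changed = False
        for (a, b), s_ab in list(rules.items()):
            for (b2, c), s_bc in list(rules.items()):
                if b == b2 and a != c and (a, c) not in rules:
                    rules[(a, c)] = deduce(s_ab, s_bc)
                    changed = True
    return rules

derived = forward_chain({("cat", "mammal"): 0.98,
                         ("mammal", "animal"): 0.99})
# derived now also contains ("cat", "animal"), with strength 0.98 * 0.99
```

Backward chaining runs the same rule in reverse: starting from a target conclusion, it searches for premises that would justify it.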
-- MOSES: a genetic-programming learner. Don't confuse genetic algorithms with genetic programming: genetic programming uses a genetic algorithm to learn programs. In the case of OpenCog, it learns graphs that represent logical expressions, i.e. simple programs.
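The distinction can be made concrete with a toy genetic-programming run: a population of small expression trees is evolved (selection plus mutation) toward a target function. This only illustrates the program-evolution idea behind MOSES, with invented representations; it is not MOSES itself:

```python
import random

OPS = ("+", "*")
TERMS = ("x", 1)

def random_tree(depth=3):
    """A random expression tree, as nested tuples."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    return (random.choice(OPS), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == "x":
        return x
    if isinstance(tree, int):
        return tree
    op, a, b = tree
    va, vb = evaluate(a, x), evaluate(b, x)
    return va + vb if op == "+" else va * vb

def fitness(tree):
    # Negative squared error against the target f(x) = x*x + x.
    return -sum((evaluate(tree, x) - (x * x + x)) ** 2 for x in range(-3, 4))

def mutate(tree):
    # Replace a random subtree with a fresh random one.
    if random.random() < 0.3 or not isinstance(tree, tuple):
        return random_tree(2)
    op, a, b = tree
    return (op, mutate(a), b) if random.random() < 0.5 else (op, a, mutate(b))

def evolve(generations=30, pop_size=40):
    random.seed(0)  # deterministic toy run
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]  # truncation selection
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return max(pop, key=fitness)

best = evolve()
```

A plain genetic algorithm would evolve a fixed-length bit string; genetic programming, as above, evolves variable-shaped program trees.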
-- The pattern matcher. This is a lower-level tool. It is possibly mis-named, as it is far more powerful than an "ordinary" pattern matcher: it is really a subgraph-isomorphism solver. Call it a graph satisfiability solver, or a graph query language, like SQL but for graphs. It is vaguely DPLL-like-ish in how it works, kind-of-ish.
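A drastically simplified version of the idea: treat the graph as a list of ground triples and the query as triples containing variables, then search for variable bindings that satisfy every clause at once. This unification-over-triples sketch is far simpler than the real pattern matcher, which operates on typed hypergraphs:

```python
def match(pattern, graph):
    """pattern: list of (subj, rel, obj) triples; '$'-prefixed = variable.
    graph: list of ground triples. Yields consistent bindings (dicts)."""
    def unify(clause, fact, binding):
        b = dict(binding)
        for p, f in zip(clause, fact):
            if p.startswith("$"):
                if b.get(p, f) != f:   # variable already bound elsewhere?
                    return None
                b[p] = f
            elif p != f:               # constants must match exactly
                return None
        return b

    def solve(clauses, binding):
        if not clauses:
            yield binding
            return
        for fact in graph:
            b = unify(clauses[0], fact, binding)
            if b is not None:
                yield from solve(clauses[1:], b)

    yield from solve(pattern, {})

graph = [("cat", "isa", "mammal"),
         ("mammal", "isa", "animal"),
         ("stone", "isa", "mineral")]
# Query: what inherits from something that inherits from "animal"?
results = list(match([("$x", "isa", "$y"), ("$y", "isa", "animal")], graph))
```

The "satisfiability" flavor comes from the backtracking: a clause that fails under the current bindings forces the search back to try different facts, much as DPLL backtracks over variable assignments.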
-- A natural language subsystem. This includes components for dependency parsing and for turning dependency parses into logical expressions, as well as tools for language generation (e.g. microplanning and surface realization).
-- The flow of attention through the system is regulated by ECAN (Economic Attention Networks), a customized variant of attractor neural networks.
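A toy model of the "economic" aspect: each atom holds a short-term-importance (STI) budget, and each step diffuses a fraction of that budget to its neighbors, so the total attention "currency" is conserved while focus flows along links. The graph, rate, and values are invented for illustration:

```python
def spread(sti, edges, rate=0.2):
    """One diffusion step: each atom sends `rate` of its STI,
    split evenly among its neighbors. Total STI is conserved."""
    out = dict(sti)
    for atom, neighbors in edges.items():
        if not neighbors:
            continue
        share = sti[atom] * rate / len(neighbors)
        for n in neighbors:
            out[atom] -= share
            out[n] += share
    return out

# Stimulate "cat"; importance then leaks toward related atoms.
sti = {"cat": 100.0, "mammal": 0.0, "animal": 0.0}
edges = {"cat": ["mammal"], "mammal": ["animal"], "animal": []}
for _ in range(3):
    sti = spread(sti, edges)
```

The real ECAN adds rent, wages, and a long-term-importance value on top of this kind of diffusion, which is what gives it its attractor-network dynamics.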
-- Creativity is achieved in the system via many means; one of these is concept blending.
-- Action selection (how does the system choose what to do?) is handled via OpenPsi, a customized version of the Psi model of motivated action.
-- Experimentation is ongoing with DeSTIN and other deep machine learning algorithms, to recognize patterns in perceptual data and feed these patterns into the atomspace
-- Goal-driven learning of procedures is an important aspect of OpenCog learning (alongside other learning algorithms); this is a kind of reinforcement learning.
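For the flavor of it, here is tabular Q-learning on a five-state corridor whose goal is the rightmost cell; the learned Q-table encodes a "procedure" (keep moving right) for reaching the goal. Everything here (states, rewards, constants) is invented for illustration and stands in for the general idea, not OpenCog's actual implementation:

```python
import random

def train(episodes=200, alpha=0.5, gamma=0.9, epsilon=0.1):
    """Tabular Q-learning on a 5-state corridor; goal = rightmost state."""
    random.seed(0)                            # deterministic toy run
    n_states, actions = 5, (+1, -1)           # move right / move left
    q = {(s, a): 0.0 for s in range(n_states) for a in actions}
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            if random.random() < epsilon:     # occasionally explore
                a = random.choice(actions)
            else:                             # otherwise act greedily
                a = max(actions, key=lambda a: q[(s, a)])
            s2 = min(max(s + a, 0), n_states - 1)
            reward = 1.0 if s2 == n_states - 1 else 0.0
            target = reward + gamma * max(q[(s2, b)] for b in actions)
            q[(s, a)] += alpha * (target - q[(s, a)])
            s = s2
    return q

q = train()
# Read the greedy action for each non-goal state off the learned Q-table:
policy = {s: max((+1, -1), key=lambda a: q[(s, a)]) for s in range(4)}
```

The "goal-driven" part is the reward signal: the only feedback is whether the goal state was reached, yet the value estimates propagate backward until every state knows which action leads toward it.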