CAUTION: The content here is mostly obsolete! I think the arrow used below refers to the ArrowLink, but I'm not really sure. The general mechanism for specifying the signatures of complex structures requires the use of the SignatureLink.
See atom types for a general description of types; see type hierarchy for a discussion of the current inheritance hierarchy, and see type constructor for a discussion of working with type signatures -- e.g. TypeNode, SignatureLink, TypeChoiceLink, TypeSetLink and so on.
Inheritance Between Higher-Order Types
This page deals with the somewhat subtle matter of inheritance between higher-order types. This is needed, for example, when one wants to cross over or mutate two complex schemata in an evolutionary learning context. One encounters questions like: when mutation replaces a schema that takes integer input, can it replace it with one that takes general numerical input? How about vice versa? These questions get more complex when the inputs and outputs of a schema may themselves be schemata with complex higher-order types. However, they can be dealt with elegantly using some basic mathematical rules.
(NOTE: this material was largely worked out by Lukasz Kaiser and Cassio Pennachin, in collaboration with Ben Goertzel)
Denote the type of a mapping from type T to type S as T —> S. Use the shorthand inh to mean inherits from. Then the basic rule we use is that
(T1 —> S1) inh (T2 —> S2) iff (T2 inh T1) and (S1 inh S2)
In other words, we assume higher-order type inheritance is contravariant in the input. The reason is that, if R1 = T1 —> S1 is to be a special case of R2 = T2 —> S2, then one has to be able to use R1 everywhere one uses R2. This means that any input R2 takes has to also be taken by R1 (hence T2 inherits from T1). And it means that the outputs R1 gives must be able to be accepted by any function that accepts outputs of R2 (hence S1 inherits from S2).
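The rule above can be sketched as a small recursive check in Python. Elementary types are modeled as strings and a mapping type T —> S as the tuple ("arrow", T, S); the base hierarchy used here is an illustrative assumption, not OCP's actual one.

```python
# BASE_INH[t] is the set of types t inherits from (reflexive).
# This small hierarchy is an assumption made for illustration.
BASE_INH = {
    "INT": {"INT", "NUMBER"},
    "FLOAT": {"FLOAT", "NUMBER"},
    "NUMBER": {"NUMBER"},
}

def inh(a, b):
    """True if type a inherits from type b.

    Elementary types are strings; a mapping type T -> S is
    modeled as the tuple ("arrow", T, S).
    """
    if isinstance(a, str) and isinstance(b, str):
        return b in BASE_INH.get(a, {a})
    if isinstance(a, str) or isinstance(b, str):
        return False  # an elementary type never matches an arrow type
    # (T1 -> S1) inh (T2 -> S2) iff (T2 inh T1) and (S1 inh S2):
    # contravariant in the input, covariant in the output.
    _, t1, s1 = a
    _, t2, s2 = b
    return inh(t2, t1) and inh(s1, s2)

# NUMBER -> FLOAT inherits from INT -> FLOAT, but not vice versa.
assert inh(("arrow", "NUMBER", "FLOAT"), ("arrow", "INT", "FLOAT"))
assert not inh(("arrow", "INT", "FLOAT"), ("arrow", "NUMBER", "FLOAT"))
```

Because the check recurses, the same function handles arbitrarily nested arrow types.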
This type of issue comes up in programming language design fairly frequently, and there are a number of research papers debating the pros and cons of contravariance versus covariance for complex type inheritance. However, for the purpose of schema type inheritance in OCP, the greater logical consistency of the contravariant approach holds sway.
For instance, in this approach, INT —> INT is not a subtype of NO —> INT (where NO denotes NUMBER), because NO —> INT is the type that includes all functions which take a real and return an int, and an INT —> INT function does not take a real. Rather, the containment is the other way around: every NO —> INT function is an example of an INT —> INT function. For example, consider the NO —> INT function that takes every real number and rounds it up to the nearest integer. Considered as an INT —> INT function, this is simply the identity function: it is the function that takes an integer and rounds it up to the nearest integer.
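The rounding example can be made concrete with Python's standard library, where `math.ceil` plays the role of the NO —> INT function (a sketch, not part of the original text):

```python
import math

# math.ceil maps any real number to the smallest integer >= it,
# so it can be viewed as a NO -> INT function.
assert math.ceil(2.4) == 3

# Restricted to integer inputs -- i.e. viewed as an INT -> INT
# function -- ceiling is just the identity: rounding an integer
# up to the nearest integer changes nothing.
for n in (-2, 0, 7):
    assert math.ceil(n) == n
```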
Tupling of types, on the other hand, is covariant. If one has an ordered pair whose elements are of different types, say (T1, T2), then we have
(T1, S1) inh (T2, S2) iff (T1 inh T2) and (S1 inh S2)
As a mnemonic formula, we may say
(general —> specific) inherits from (specific —> general)
(specific, specific) inherits from (general, general)
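The covariant pairing rule can be sketched the same way; as before, the hierarchy dict is an illustrative assumption.

```python
# BASE_INH[t] is the set of types t inherits from (reflexive);
# an illustrative hierarchy, assumed for this sketch.
BASE_INH = {
    "INT": {"INT", "NUMBER"},
    "FLOAT": {"FLOAT", "NUMBER"},
    "NUMBER": {"NUMBER"},
}

def elem_inh(a, b):
    """True if elementary type a inherits from elementary type b."""
    return b in BASE_INH.get(a, {a})

def pair_inh(p, q):
    # (T1, S1) inh (T2, S2) iff (T1 inh T2) and (S1 inh S2):
    # covariant in both components.
    (t1, s1), (t2, s2) = p, q
    return elem_inh(t1, t2) and elem_inh(s1, s2)

# (specific, specific) inherits from (general, general) ...
assert pair_inh(("INT", "FLOAT"), ("NUMBER", "NUMBER"))
# ... but not the other way around.
assert not pair_inh(("NUMBER", "NUMBER"), ("INT", "FLOAT"))
```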
In schema learning, we will also have use for abstract type constructions, such as
(T1, T2) where T1 inherits from T2
Notationally, we will refer to variable types as Xv1, Xv2, etc., and then denote the inheritance relationships by using numerical indices, e.g. using
[1 inh 2]
to denote that
Xv1 inh Xv2
So for example,
( INT, VOID ) inh ( Xv1 , Xv2 )
is true, because there are no restrictions on the variable types, and we can simply assign Xv1 = INT, Xv2 = VOID.
On the other hand,
( INT, VOID ) inh ( Xv1, Xv2 ), [ 1 inh 2 ]
is false because the restriction Xv1 inh Xv2 is imposed, but it's not true that INT inh VOID.
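Checking a concrete tuple against a tuple of variable types with constraints can be sketched as follows; the `matches` helper and the hierarchy (including VOID) are illustrative assumptions.

```python
# An assumed elementary hierarchy for illustration; BASE_INH[t] is
# the (reflexive) set of types t inherits from.
BASE_INH = {
    "INT": {"INT", "NUMBER"},
    "FLOAT": {"FLOAT", "NUMBER"},
    "NUMBER": {"NUMBER"},
    "VOID": {"VOID"},
}

def elem_inh(a, b):
    """True if elementary type a inherits from elementary type b."""
    return b in BASE_INH.get(a, {a})

def matches(concrete, constraints=()):
    """True if the concrete tuple inherits from (Xv1, ..., XvN).

    concrete: elementary types assigned positionally to Xv1..XvN.
    constraints: pairs (i, j), 1-indexed, meaning Xvi inh Xvj.
    With no constraints the match always succeeds, since the
    variables are unrestricted.
    """
    assign = dict(enumerate(concrete, start=1))
    return all(elem_inh(assign[i], assign[j]) for i, j in constraints)

# ( INT, VOID ) inh ( Xv1, Xv2 ): holds, no restrictions.
assert matches(("INT", "VOID"))
# With [1 inh 2] it fails, since INT does not inherit from VOID.
assert not matches(("INT", "VOID"), [(1, 2)])
```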
The following list gives some examples of type inheritance, using the elementary types INT, FLOAT (FL), NUMBER (NO), CHAR and STRING (STR), with the elementary type inheritance relationships
- INT inh NUMBER
- FLOAT inh NUMBER
- CHAR inh STRING
- ( NO —> FL ) inh ( INT —> FL )
- ( FL —> INT ) inh ( FL —> NO )
- ( ( INT —> FL ) —> ( FL —> INT ) ) inh ( ( NO —> FL ) —> ( FL —> NO ) )
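The three arrow-type examples above can be verified mechanically with a small recursive checker (a sketch; arrow types are modeled as ("arrow", T, S) tuples, and the hierarchy encodes only INT inh NO and FL inh NO):

```python
# Elementary hierarchy from the list above (reflexive).
BASE_INH = {
    "INT": {"INT", "NO"},
    "FL": {"FL", "NO"},
    "NO": {"NO"},
}

def inh(a, b):
    """True if type a inherits from type b (contravariant arrows)."""
    if isinstance(a, str) and isinstance(b, str):
        return b in BASE_INH.get(a, {a})
    if isinstance(a, str) or isinstance(b, str):
        return False
    _, t1, s1 = a
    _, t2, s2 = b
    # Contravariant in the input, covariant in the output.
    return inh(t2, t1) and inh(s1, s2)

A = ("arrow", "NO", "FL")   # NO -> FL
B = ("arrow", "INT", "FL")  # INT -> FL
C = ("arrow", "FL", "INT")  # FL -> INT
D = ("arrow", "FL", "NO")   # FL -> NO

assert inh(A, B)                          # ( NO -> FL ) inh ( INT -> FL )
assert inh(C, D)                          # ( FL -> INT ) inh ( FL -> NO )
assert inh(("arrow", B, C), ("arrow", A, D))  # the nested example
```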