Difference between revisions of "Directory talk:Jon Awbrey/Papers/Cactus Language"

MyWikiBiz, Author Your Legacy — Friday November 15, 2024
Jump to navigationJump to search
(copy old & new versions for comparison)
(move notes to talk page)
 
(34 intermediate revisions by the same user not shown)
Line 1: Line 1:
==Fragmata==
+
==Notes Found in a Cactus Patch==
 +
 
 +
: '''''Note.'''  This is a collection of fragments from previous discussions that I plan to use in documenting the cactus graph syntax for propositional logic.''
 +
 
 +
===Cactus Language===
 +
 
 +
Table 13 illustrates the ''existential interpretation'' of cactus graphs and cactus expressions by providing English translations for a few of the most basic and commonly occurring forms.
 +
 
 +
Even though I do most of my thinking in the existential interpretation, I will continue to speak of these forms as ''logical graphs'', because I think it is an important fact about them that the formal validity of the axioms and theorems is not dependent on the choice between the entitative and the existential interpretations.
 +
 
 +
The first extension is the ''reflective extension of logical graphs'' (RefLog).  It is obtained by generalizing the negation operator "<math>\texttt{(~)}</math>" in a certain way, calling "<math>\texttt{(~)}</math>" the ''controlled'', ''moderated'', or ''reflective'' negation operator of order 1, then adding another such operator for each finite <math>k = 2, 3, \ldots .</math>
 +
 
 +
In sum, these operators are symbolized by bracketed argument lists as follows:  "<math>\texttt{(~)}</math>", "<math>\texttt{(~,~)}</math>", "<math>\texttt{(~,~,~)}</math>", &hellip;, where the number of slots is the order of the reflective negation operator in question.
 +
             
 +
The cactus graph and the cactus expression shown here are both described as a ''spike''.
 +
 
 +
{| align="center" cellpadding="6" width="90%"
 +
| align="center" |
 +
<pre>
 +
o---------------------------------------o
 +
|                                      |
 +
|                  o                  |
 +
|                  |                  |
 +
|                  @                  |
 +
|                                      |
 +
o---------------------------------------o
 +
|                  ( )                  |
 +
o---------------------------------------o
 +
</pre>
 +
|}
 +
 
 +
The rule of reduction for a lobe is:
 +
 
 +
{| align="center" cellpadding="6" width="90%"
 +
| align="center" |
 +
<pre>
 +
o---------------------------------------o
 +
|                                      |
 +
|  x_1  x_2  ...  x_k                |
 +
|  o-----o--- ... ---o                |
 +
|    \              /                  |
 +
|    \            /                  |
 +
|      \          /                    |
 +
|      \        /                    |
 +
|        \      /                      |
 +
|        \    /                      |
 +
|          \  /                        |
 +
|          \ /                        |
 +
|            @      =      @            |
 +
|                                      |
 +
o---------------------------------------o
 +
</pre>
 +
|}
 +
 
 +
if and only if exactly one of the <math>x_j\!</math> is a spike.
  
* http://suo.ieee.org/ontology/msg03586.html
+
In Ref Log, an expression of the form <math>\texttt{((}~ e_1 ~\texttt{),(}~ e_2 ~\texttt{),(}~ \ldots ~\texttt{),(}~ e_k ~\texttt{))}</math>
* http://stderr.org/pipermail/arisbe/2001-April/000409.html
+
expresses the fact that ''exactly one of the <math>e_j\!</math> is true''. Expressions of this form are called ''universal partition'' expressions, and
* http://stderr.org/pipermail/arisbe/2001-December/001230.html
+
they parse into a type of graph called a ''painted and rooted cactus'' (PARC):
* http://stderr.org/pipermail/inquiry/2003-March/000114.html
 
* http://stderr.org/pipermail/inquiry/2005-February/002360.html
 
  
==Alternative Versions==
+
{| align="center" cellpadding="6" width="90%"
 +
| align="center" |
 +
<pre>
 +
o---------------------------------------o
 +
|                                      |
 +
|  e_1  e_2  ...  e_k                |
 +
|  o    o          o                |
 +
|  |    |          |                |
 +
|  o-----o--- ... ---o                |
 +
|    \              /                  |
 +
|    \            /                  |
 +
|      \          /                    |
 +
|      \        /                    |
 +
|        \      /                      |
 +
|        \    /                      |
 +
|          \  /                        |
 +
|          \ /                        |
 +
|            @                          |
 +
|                                      |
 +
o---------------------------------------o
 +
</pre>
 +
|}
  
'''Old'''
+
{| align="center" cellpadding="6" width="90%"
 +
| align="center" |
 +
<pre>
 +
o---------------------------------------o
 +
|                                      |
 +
| ( x1, x2, ..., xk )  =  [blank]      |
 +
|                                      |
 +
| iff                                  |
 +
|                                      |
 +
| Just one of the arguments            |
 +
| x1, x2, ..., xk  =  ()                |
 +
|                                      |
 +
o---------------------------------------o
 +
</pre>
 +
|}
  
A notation in the style of <math>S :> T\!</math> is now introduced, to be read among many others in this manifold of ways:
+
The interpretation of these operators, read as assertions about the values of their listed arguments, is as follows:
  
{| align="center" cellpadding="4" width="90%"
+
{| align="center" cellpadding="6" width="90%"
 +
| Existential Interpretation:
 +
| Just one of the k argument is false.
 
|-
 
|-
| <math>S\ \operatorname{covers}\ T</math>
+
| Entitative  Interpretation:
|-
+
| Not just one of the k arguments is true.
| <math>S\ \operatorname{governs}\ T</math>
+
|}
|-
+
 
| <math>S\ \operatorname{rules}\ T</math>
+
{| align="center" cellpadding="6" width="90%"
|-
+
| align="center" |
| <math>S\ \operatorname{subsumes}\ T</math>
+
<pre>
|-
+
o-------------------o-------------------o-------------------o
| <math>S\ \operatorname{types~over}\ T</math>
+
|      Graph      |      String      |    Translation    |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|        @        |        " "        |      true.      |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|        o        |                  |                  |
 +
|        |        |                  |                  |
 +
|        @        |        ( )        |      untrue.      |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|        r        |                  |                  |
 +
|        @        |        r        |        r.        |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|        r        |                  |                  |
 +
|        o        |                  |                  |
 +
|        |        |                  |                  |
 +
|        @        |        (r)        |      not r.      |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      r s t      |                  |                  |
 +
|        @        |      r s t      |  r and s and t.  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      r s t      |                  |                  |
 +
|      o o o      |                  |                  |
 +
|        \|/        |                  |                  |
 +
|        o        |                  |                  |
 +
|        |        |                  |                  |
 +
|        @        |    ((r)(s)(t))    |    r or s or t.  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |    r implies s.  |
 +
|        r  s    |                  |                  |
 +
|        o---o    |                  |    if r then s.  |
 +
|        |        |                  |                  |
 +
|        @        |      (r (s))      |    no r sans s.  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      r  s      |                  |                  |
 +
|      o---o      |                  | r exclusive-or s. |
 +
|        \ /        |                  |                  |
 +
|        @        |      (r , s)      | r not equal to s. |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      r  s      |                  |                  |
 +
|      o---o      |                  |                  |
 +
|        \ /        |                  |                  |
 +
|        o        |                  | r if & only if s. |
 +
|        |        |                  |                  |
 +
|        @        |    ((r , s))    | r equates with s. |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      r  s  t      |                  |                  |
 +
|      o--o--o      |                  |                  |
 +
|      \   /      |                  |                  |
 +
|        \ /        |                  |  just one false  |
 +
|        @        |    (r , s , t)    |  out of r, s, t.  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      r  s  t      |                  |                  |
 +
|      o  o  o      |                  |                  |
 +
|      |  |  |      |                  |                  |
 +
|      o--o--o      |                  |                  |
 +
|      \  /      |                  |                  |
 +
|        \ /        |                  |  just one true  |
 +
|        @        |  ((r),(s),(t))  |  among r, s, t.  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |  genus t over    |
 +
|        r  s      |                  |  species r, s.  |
 +
|        o  o      |                  |                  |
 +
|      t  |  |      |                  |  partition t    |
 +
|      o--o--o      |                  |  among r & s.    |
 +
|      \  /      |                  |                  |
 +
|        \ /        |                  |  whole pie t:    |
 +
|        @        |  ( t ,(r),(s))  |  slices r, s.    |
 +
o-------------------o-------------------o-------------------o
 +
</pre>
 +
|}
 +
 
 +
{| align="center" cellpadding="6" width="90%"
 +
| align="center" |
 +
<pre>
 +
Table 13.  The Existential Interpretation
 +
o-------------------o-------------------o-------------------o
 +
|  Cactus Graph    | Cactus Expression |    Existential    |
 +
|                  |                  |  Interpretation  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|        @        |        " "        |      true.      |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|        o        |                  |                  |
 +
|        |        |                  |                  |
 +
|        @        |        ( )        |      untrue.      |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|        a        |                  |                  |
 +
|        @        |        a        |        a.        |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|        a        |                  |                  |
 +
|        o        |                  |                  |
 +
|        |        |                  |                  |
 +
|        @        |        (a)        |      not a.      |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      a b c      |                  |                  |
 +
|        @        |      a b c      |  a and b and c.  |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      a b c      |                  |                  |
 +
|      o o o      |                  |                  |
 +
|        \|/        |                  |                  |
 +
|        o        |                  |                  |
 +
|        |        |                  |                  |
 +
|        @        |    ((a)(b)(c))    |    a or b or c.  |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|                  |                  |    a implies b.  |
 +
|        a  b    |                  |                  |
 +
|        o---o    |                  |    if a then b.  |
 +
|        |        |                  |                  |
 +
|        @        |      (a (b))      |    no a sans b.  |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      a  b      |                  |                  |
 +
|      o---o      |                  | a exclusive-or b. |
 +
|        \ /        |                  |                  |
 +
|        @        |      (a , b)      | a not equal to b. |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      a  b      |                  |                  |
 +
|      o---o      |                  |                  |
 +
|        \ /        |                  |                  |
 +
|        o        |                  | a if & only if b. |
 +
|        |        |                  |                  |
 +
|        @        |    ((a , b))    | a equates with b. |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      a  b  c      |                  |                  |
 +
|      o--o--o      |                  |                  |
 +
|      \  /      |                  |                  |
 +
|        \ /        |                  |  just one false  |
 +
|        @        |    (a , b , c)    |  out of a, b, c.  |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      a  b  c      |                  |                  |
 +
|      o  o  o      |                  |                  |
 +
|      |  |  |      |                  |                  |
 +
|      o--o--o      |                  |                  |
 +
|      \   /      |                  |                  |
 +
|        \ /        |                  |  just one true  |
 +
|        @        |  ((a),(b),(c))  |  among a, b, c.  |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|                  |                  |  genus a over    |
 +
|        b  c      |                  |  species b, c.  |
 +
|        o  o      |                  |                  |
 +
|      a  |  |      |                  |  partition a    |
 +
|      o--o--o      |                  |  among b & c.    |
 +
|      \  /      |                  |                  |
 +
|        \ /        |                  |  whole pie a:    |
 +
|        @        |  ( a ,(b),(c))  |  slices b, c.    |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
</pre>
 +
|}
 +
 
 +
{| align="center" cellpadding="6" width="90%"
 +
| align="center" |
 +
<pre>
 +
Table 14.  The Entitative Interpretation
 +
o-------------------o-------------------o-------------------o
 +
|  Cactus Graph    | Cactus Expression |    Entitative    |
 +
|                  |                  |  Interpretation  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|        @        |        " "        |      untrue.      |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|        o        |                  |                  |
 +
|        |        |                  |                  |
 +
|        @        |        ( )        |      true.      |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|        a        |                  |                  |
 +
|        @        |        a        |        a.        |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|        a        |                  |                  |
 +
|        o        |                  |                  |
 +
|        |        |                  |                  |
 +
|        @        |        (a)        |      not a.      |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      a b c      |                  |                  |
 +
|        @        |      a b c      |    a or b or c.  |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      a b c      |                  |                  |
 +
|      o o o      |                  |                  |
 +
|        \|/        |                  |                  |
 +
|        o        |                  |                  |
 +
|        |        |                  |                  |
 +
|        @        |    ((a)(b)(c))    |  a and b and c.  |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|                  |                  |    a implies b.  |
 +
|                  |                  |                  |
 +
|        o a      |                  |    if a then b.  |
 +
|        |        |                  |                  |
 +
|        @ b      |      (a) b        |    not a, or b.  |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      a  b      |                  |                  |
 +
|      o---o      |                  | a if & only if b. |
 +
|        \ /        |                  |                  |
 +
|        @        |      (a , b)      | a equates with b. |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      a  b      |                  |                  |
 +
|      o---o      |                  |                  |
 +
|        \ /        |                  |                  |
 +
|        o        |                  | a exclusive-or b. |
 +
|        |        |                  |                  |
 +
|        @        |    ((a , b))    | a not equal to b. |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      a  b  c      |                  |                  |
 +
|      o--o--o      |                  |                  |
 +
|      \  /      |                  |                  |
 +
|        \ /        |                  | not just one true |
 +
|        @        |    (a , b , c)    | out of a, b, c.  |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      a  b  c      |                  |                  |
 +
|      o--o--o      |                  |                  |
 +
|      \   /      |                  |                  |
 +
|        \ /        |                  |                  |
 +
|        o        |                  |                  |
 +
|        |        |                  |  just one true  |
 +
|        @        |  ((a , b , c))  |  among a, b, c.  |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
|                  |                  |                  |
 +
|      a            |                  |                  |
 +
|      o            |                  |  genus a over    |
 +
|      |  b  c      |                  |  species b, c.  |
 +
|      o--o--o      |                  |                  |
 +
|      \  /      |                  |  partition a    |
 +
|        \ /        |                  |  among b & c.    |
 +
|        o        |                  |                  |
 +
|        |        |                  |  whole pie a:    |
 +
|        @        |  ( a ,(b),(c))  |  slices b, c.    |
 +
|                  |                  |                  |
 +
o-------------------o-------------------o-------------------o
 +
</pre>
 +
|}
 +
 
 +
{| align="center" cellpadding="6" width="90%"
 +
| align="center" |
 +
<pre>
 +
o-----------------o-----------------o-----------------o-----------------o
 +
|      Graph      |    String      |  Entitative    |  Existential  |
 +
o-----------------o-----------------o-----------------o-----------------o
 +
|                |                |                |                |
 +
|        @        |      " "      |    untrue.    |      true.      |
 +
o-----------------o-----------------o-----------------o-----------------o
 +
|                |                |                |                |
 +
|        o        |                |                |                |
 +
|        |        |                |                |                |
 +
|        @        |      ( )      |      true.      |    untrue.    |
 +
o-----------------o-----------------o-----------------o-----------------o
 +
|                |                |                |                |
 +
|        r        |                |                |                |
 +
|        @        |        r        |        r.      |        r.      |
 +
o-----------------o-----------------o-----------------o-----------------o
 +
|                |                |                |                |
 +
|        r        |                |                |                |
 +
|        o        |                |                |                |
 +
|        |        |                |                |                |
 +
|        @        |      (r)      |      not r.    |      not r.    |
 +
o-----------------o-----------------o-----------------o-----------------o
 +
|                |                |                |                |
 +
|      r s t      |                |                |                |
 +
|        @        |      r s t      |  r or s or t.  |  r and s and t. |
 +
o-----------------o-----------------o-----------------o-----------------o
 +
|                |                |                |                |
 +
|      r s t      |                |                |                |
 +
|      o o o      |                |                |                |
 +
|      \|/      |                |                |                |
 +
|        o        |                |                |                |
 +
|        |        |                |                |                |
 +
|        @        |  ((r)(s)(t))  |  r and s and t. |  r or s or t.  |
 +
o-----------------o-----------------o-----------------o-----------------o
 +
|                |                |                |  r implies s.  |
 +
|                |                |                |                |
 +
|        o r      |                |                |  if r then s.  |
 +
|        |        |                |                |                |
 +
|        @ s      |      (r) s      |  not r, or s    |  no r sans s.  |
 +
o-----------------o-----------------o-----------------o-----------------o
 +
|                |                |                |  r implies s.  |
 +
|        r  s    |                |                |                |
 +
|        o---o    |                |                |  if r then s.  |
 +
|        |        |                |                |                |
 +
|        @        |    (r (s))    |                |  no r sans s.  |
 +
o-----------------o-----------------o-----------------o-----------------o
 +
|                |                |                |                |
 +
|      r  s      |                |                |                |
 +
|      o---o      |                |                |r exclusive-or s.|
 +
|      \ /      |                |                |                |
 +
|        @        |    (r , s)    |                |r not equal to s.|
 +
o-----------------o-----------------o-----------------o-----------------o
 +
|                |                |                |                |
 +
|      r  s      |                |                |                |
 +
|      o---o      |                |                |                |
 +
|      \ /      |                |                |                |
 +
|        o        |                |                |r if & only if s.|
 +
|        |        |                |                |                |
 +
|        @        |    ((r , s))    |                |r equates with s.|
 +
o-----------------o-----------------o-----------------o-----------------o
 +
|                |                |                |                |
 +
|    r  s  t    |                |                |                |
 +
|    o--o--o    |                |                |                |
 +
|      \  /      |                |                |                |
 +
|      \ /       |                |                | just one false  |
 +
|        @        |  (r , s , t)  |                | out of r, s, t. |
 +
o-----------------o-----------------o-----------------o-----------------o
 +
|                |                |                |                |
 +
|    r  s  t    |                |                |                |
 +
|    o  o  o    |                |                |                |
 +
|    |  |  |    |                |                |                |
 +
|     o--o--o    |                |                |                |
 +
|     \   /      |                |                |                |
 +
|      \ /      |                |                |  just one true  |
 +
|        @        |  ((r),(s),(t))  |                |  among r, s, t. |
 +
o-----------------o-----------------o-----------------o-----------------o
 +
|                |                |                |  genus t over   |
 +
|        r  s    |                |                |  species r, s.  |
 +
|        o  o    |                |                |                |
 +
|    t  |  |    |                |                |  partition t    |
 +
|    o--o--o    |                |                |  among r & s.  |
 +
|      \  /      |                |                |                |
 +
|      \ /      |                |                |  whole pie t:  |
 +
|        @        |  ( t ,(r),(s))  |                |  slices r, s.  |
 +
o-----------------o-----------------o-----------------o-----------------o
 +
</pre>
 
|}
 
|}
  
The form <math>S :> T\!</math> is here recruited for polymorphic employment in at least the following types of roles:
+
===Zeroth Order Logic===
 +
 
 +
<pre>
 +
Here is a scaled-down version of one of my very first applications,
 +
having to do with the demographic variables in a survey data base.
 +
 
 +
This Example illustrates the use of 2-variate logical forms
 +
for expressing and reasoning about the logical constraints
 +
that are involved in the following types of situations:
 +
 
 +
1.  Distinction:    A =/= B
 +
    Also known as:  logical inequality, exclusive disjunction
 +
    Represented as:  ( A , B )
 +
    Graphed as:
 +
    |
 +
    |  A  B
 +
    |  o---o
 +
    |    \ /
 +
    |    @
 +
 
 +
2.  Equality:        A = B
 +
    Also known as:  logical equivalence, if and only if, A <=> B
 +
    Represented as:  (( A , B ))
 +
    Graphed as:
 +
    |
 +
    |  A  B
 +
    |  o---o
 +
    |    \ /
 +
    |    o
 +
    |    |
 +
    |    @
 +
 
 +
3.  Implication:    A => B
 +
    Also known as:  entailment, if-then
 +
    Represented as:  ( A ( B ))
 +
    Graphed as:
 +
    |
 +
    |  A  B
 +
    |  o---o
 +
    |  |
 +
    |  @
 +
 
 +
Example of a proposition expressing a "zeroth order theory" (ZOT):
 +
 
 +
Consider the following text, written in what I am calling "Ref Log",
 +
also known as the "Cactus Language" synpropositional logic:
 +
 
 +
|  ( male  , female )
 +
|  (( boy  , male child ))
 +
|  (( girl , female child ))
 +
|  ( child ( human ))
 +
 
 +
Graphed as:
 +
 
 +
|                  boy  male    girl  female
 +
|                    o---o child    o---o child
 +
|  male  female      \ /            \ /          child  human
 +
|    o---o            o              o              o---o
 +
|      \ /              |              |              |
 +
|      @              @              @              @|
 +
 
 +
Nota Bene.  Due to graphic constraints -- no, the other
 +
kind of graphic constraints -- of the immediate medium,
 +
I am forced to string out the logical conjuncts of the
 +
actual cactus graph for this situation, one that might
 +
sufficiently be reasoned out from the exhibit supra by
 +
fusing together the four roots of the severed cactus.
 +
 
 +
Either of these expressions, text or graph, is equivalent to
 +
what would otherwise be written in a more ordinary syntax as:
 +
 
 +
|  male  =/=  female
 +
|  boy  <=>  male child
 +
|  girl  <=>  female child
 +
|  child  =>  human
 +
 
 +
This is a actually a single proposition, a conjunction of four lines:
 +
one distinction, two equations, and one implication.  Together these
 +
amount to a set of definitions conjointly constraining the logical
 +
compatibility of the six feature names that appear.  They may be
 +
thought of as sculpting out a space of models that is some subset
 +
of the 2^6 = 64 possible interpretations, and thereby shaping some
 +
universe of discourse.
 +
 
 +
Once this backdrop is defined, it is possible to "query" this universe,
 +
simply by conjoining additional propositions in further constraint of
 +
the underlying set of models.  This has many uses, as we shall see.
 +
 
 +
We are considering an Example of a propositional expression
 +
that is formed on the following "alphabet" or "lexicon" of
 +
six "logical features" or "boolean variables":
 +
 
 +
$A$  =  {"boy", "child", "female", "girl", "human", "male"}.
 +
 
 +
The expression is this:
 +
 
 +
|  ( male  , female )
 +
|  (( boy  , male child ))
 +
|  (( girl , female child ))
 +
|  ( child ( human ))
 +
 
 +
Putting it very roughly -- and putting off a better description
 +
of it till later -- we may think of this expression as notation
 +
for a boolean function f : %B%^6 -> %B%.  This is what we might
 +
call the "abstract type" of the function, but we will also find
 +
it convenient on many occasions to represent the points of this
 +
particular copy of the space %B%^6 in terms of the positive and
 +
negative versions of the features from $A$ that serve to encase
 +
them as logical "cells", as they are called in the venn diagram
 +
picture of the corresponding universe of discourse X = [$A$].
 +
 
 +
Just for concreteness, this form of representation begins and ends:
 +
 
 +
<0,0,0,0,0,0>  =  (boy)(child)(female)(girl)(human)(male),
 +
<0,0,0,0,0,1>  =  (boy)(child)(female)(girl)(human) male ,
 +
<0,0,0,0,1,0>  =  (boy)(child)(female)(girl) human (male),
 +
<0,0,0,0,1,1>  =  (boy)(child)(female)(girl) human  male ,
 +
...
 +
<1,1,1,1,0,0>  =  boy  child  female  girl (human)(male),
 +
<1,1,1,1,0,1>  =  boy  child  female  girl (human) male ,
 +
<1,1,1,1,1,0>  =  boy  child  female  girl  human (male),
 +
<1,1,1,1,1,1>  =  boy  child  female  girl  human  male .
 +
 
 +
I continue with the previous Example, that I bring forward and sum up here:
 +
 
 +
|                  boy  male          girl  female
 +
|                    o---o child          o---o child
 +
|  male  female      \ /                  \ /              child  human
 +
|    o---o            o                    o                    o--o
 +
|      \ /            |                    |                    |
 +
|      @              @                    @                    @
 +
|
 +
| (male , female)((boy , male child))((girl , female child))(child (human))
 +
 
 +
For my master's piece in Quantitative Psychology (Michigan State, 1989),
 +
I wrote a program, "Theme One" (TO) by name, that among its other duties
 +
operates to process the expressions of the cactus language in many of the
 +
most pressing ways that we need in order to be able to use it effectively
 +
as a propositional calculus.  The operational component of TO where one
 +
does the work of this logical modeling is called "Study", and the core
 +
of the logical calculator deep in the heart of this Study section is
 +
a suite of computational functions that evolve a particular species
 +
of "normal form", analogous to a "disjunctive normal form" (DNF),
 +
from whatever expression they are prebendered as their input.
 +
 
 +
This "canonical", "normal", or "stable" form of logical expression --
 +
I'll refine the distinctions among these subforms all in good time --
 +
permits succinct depiction as an "arboreal boolean expansion" (ABE).
 +
 
 +
Once again, the graphic limitations of this space prevail against
 +
any disposition that I might have to lay out a really substantial
 +
case before you, of the brand that might have a chance to impress
 +
you with the aptitude of this ilk of ABE in rooting out the truth
 +
of many a complexly obscurely subtly adamant whetstone of our wit.
 +
 
 +
So let me just illustrate the way of it with one conjunct of our Example.
 +
What follows will be a sequence of expressions, each one after the first
 +
being logically equal to the one that precedes it:
 +
 
 +
Step 1
 +
 
 +
|    g    fc
 +
|    o---o
 +
|      \ /
 +
|      o
 +
|      |
 +
|      @
 +
 
 +
Step 2
 +
 
 +
|                  o
 +
|        fc        |  fc
 +
|    o---o        o---o
 +
|      \ /          \ /
 +
|      o            o
 +
|      |            |
 +
|    g o-------------o--o g
 +
|        \          /
 +
|        \        /
 +
|          \      /
 +
|          \    /
 +
|            \  /
 +
|            \ /
 +
|              @
 +
 
 +
Step 3
 +
 
 +
|      f c
 +
|      o
 +
|      |            f c
 +
|      o            o
 +
|      |            |
 +
|    g o-------------o--o g
 +
|        \          /
 +
|        \        /
 +
|          \      /
 +
|          \    /
 +
|            \  /
 +
|            \ /
 +
|              @
 +
 
 +
Step 4
 +
 
 +
|          o
 +
|          |
 +
|    c o  o c          o
 +
|      |  |            |
 +
|      o  o      c o  o c
 +
|      |  |        |  |
 +
|    f o---o--o f  f o---o--o f
 +
|        \ /          \ /
 +
|      g o-------------o--o g
 +
|          \          /
 +
|          \        /
 +
|            \      /
 +
|            \    /
 +
|              \  /
 +
|              \ /
 +
|                @
 +
 
 +
Step 5
 +
 
 +
|          o      c o
 +
|      c  |        |
 +
|    f o---o--o f  f o---o--o f
 +
|        \ /          \ /
 +
|      g o-------------o--o g
 +
|          \          /
 +
|          \        /
 +
|            \      /
 +
|            \    /
 +
|              \  /
 +
|              \ /
 +
|                @
 +
 
 +
Step 6
 +
 
 +
|                                      o
 +
|                                      |
 +
|          o                      o  o
 +
|          |                      |  |
 +
|    c o---o--o c      o        c o---o--o c
 +
|        \ /            |            \ /
 +
|      f o-------------o--o f      f o-------------o--o f
 +
|          \          /              \          /
 +
|          \        /                \        /
 +
|            \      /                  \      /
 +
|            \    /                    \    /
 +
|              \  /                      \  /
 +
|              \ /                        \ /
 +
|              g o---------------------------o--o g
 +
|                \                        /
 +
|                  \                      /
 +
|                  \                    /
 +
|                    \                  /
 +
|                    \                /
 +
|                      \              /
 +
|                      \            /
 +
|                        \          /
 +
|                        \        /
 +
|                          \      /
 +
|                          \    /
 +
|                            \  /
 +
|                            \ /
 +
|                              @
 +
 
 +
Step 7
 +
 
 +
|          o                      o
 +
|          |                      |
 +
|    c o---o--o c      o        c o---o--o c
 +
|        \ /            |            \ /
 +
|      f o-------------o--o f      f o-------------o--o f
 +
|          \          /              \          /
 +
|          \        /                \        /
 +
|            \      /                  \      /
 +
|            \    /                    \    /
 +
|              \  /                      \  /
 +
|              \ /                        \ /
 +
|              g o---------------------------o--o g
 +
|                \                        /
 +
|                  \                      /
 +
|                  \                    /
 +
|                    \                  /
 +
|                    \                /
 +
|                      \              /
 +
|                      \            /
 +
|                        \          /
 +
|                        \        /
 +
|                          \      /
 +
|                          \    /
 +
|                            \  /
 +
|                            \ /
 +
|                              @
 +
 
 +
This last expression is the ABE of the input expression.
 +
It can be transcribed into ordinary logical language as:
 +
 
 +
| either girl and
 +
|        either female and
 +
|              either child and true
 +
|              or not child and false
 +
|        or not female and false
 +
| or not girl and
 +
|        either female and
 +
|              either child and false
 +
|              or not child and true
 +
|        or not female and true
 +
 
 +
The expression "((girl , female child))" is sufficiently evaluated
 +
by considering its logical values on the coordinate tuples of %B%^3,
 +
or its indications on the cells of the associated venn diagram that
 +
depicts the universe of discourse, namely, on these eight arguments:
 +
     
 +
<1, 1, 1>  =  girl  female  child ,
 +
<1, 1, 0>  =  girl  female (child),
 +
<1, 0, 1>  =  girl (female) child ,
 +
<1, 0, 0>  =  girl (female)(child),
 +
<0, 1, 1>  =  (girl) female  child ,
 +
<0, 1, 0>  =  (girl) female (child),
 +
<0, 0, 1>  =  (girl)(female) child ,
 +
<0, 0, 0>  =  (girl)(female)(child).
 +
 
 +
The ABE output expression tells us the logical values of
 +
the input expression on each of these arguments, doing so
 +
by attaching the values to the leaves of a tree, and acting
 +
as an "efficient" or "lazy" evaluator in the sense that the
 +
process that generates the tree follows each path only up to
 +
the point in the tree where it can determine the values on the
 +
entire subtree beyond that point.  Thus, the ABE tree tells us:
 +
 
 +
girl  female  child  -> 1
 +
girl  female (child)  -> 0
 +
girl (female) -> 0
 +
(girl) female  child  -> 0
 +
(girl) female (child)  -> 1
 +
(girl)(female) -> 1
 +
 
 +
Picking out the interpretations that yield the truth of the expression,
 +
and expanding the corresponding partial argument tuples, we arrive at
 +
the following interpretations that satisfy the input expression:
 +
 
 +
girl  female  child  -> 1
 +
(girl) female (child)  -> 1
 +
(girl)(female) child  -> 1
 +
(girl)(female)(child)  -> 1
 +
 
 +
In sum, if it's a female and a child, then it's a girl,
 +
and if it's either not a female or not a child or both,
 +
then it's not a girl.
 +
 
 +
Brief Automata
 +
 
 +
By way of providing a simple illustration of Cook's Theorem,
 +
that "Propositional Satisfiability is NP-Complete", here is
 +
an exposition of one way to translate Turing Machine set-ups
 +
into propositional expressions, employing the Ref Log Syntax
 +
for Prop Calc that I described in a couple of earlier notes:
 +
 
 +
Notation:
 +
 
 +
Stilt(k)  =  Space and Time Limited Turing Machine,
 +
            with k units of space and k units of time.
 +
 
 +
Stunt(k)  =  Space and Time Limited Turing Machine,
 +
            for computing the parity of a bit string,
 +
            with Number of Tape cells of input equal to k.
 +
 
 +
I will follow the pattern of the discussion in the book of
 +
Herbert Wilf, 'Algorithms & Complexity' (1986), pages 188-201,
 +
but translate into Ref Log, which is more efficient with respect
 +
to the number of propositional clauses that are required.
 +
 
 +
Parity Machine
 +
 
 +
|                    1/1/+1
 +
|                  ------->
 +
|              /\ /        \ /\
 +
|      0/0/+1  ^  0          1  ^  0/0/+1
 +
|              \/|\        /|\/
 +
|                | <------- |
 +
|        #/#/-1  |  1/1/+1  |  #/#/-1
 +
|                |          |
 +
|                v          v
 +
|                #          *
 +
 
 +
o-------o--------o-------------o---------o------------o
 +
| State | Symbol | Next Symbol | Ratchet | Next State |
 +
|  Q  |  S    |    S'      |  dR    |    Q'    |
 +
o-------o--------o-------------o---------o------------o
 +
|  0  |  0    |    0      |  +1    |    0      |
 +
|  0  |  1    |    1      |  +1    |    1      |
 +
|  0  |  #    |    #      |  -1    |    #      |
 +
|  1  |  0    |    0      |  +1    |    1      |
 +
|  1  |  1    |    1      |  +1    |    0      |
 +
|  1  |  #    |    #      |  -1    |    *      |
 +
o-------o--------o-------------o---------o------------o
 +
 
 +
The TM has a "finite automaton" (FA) as its component.
 +
Let us refer to this particular FA by the name of "M".
 +
 
 +
The "tape-head" (that is, the "read-unit") will be called "H".
 +
The "registers" are also called "tape-cells" or "tape-squares".
 +
 
 +
In order to consider how the finitely "stilted" rendition of this TM
 +
can be translated into the form of a purely propositional description,
 +
one now fixes k and limits the discussion to talking about a Stilt(k),
 +
which is really not a true TM anymore but a finite automaton in disguise.
 +
 
 +
In this example, for the sake of a minimal illustration, we choose k = 2,
 +
and discuss Stunt(2).  Since the zeroth tape cell and the last tape cell
 +
are occupied with bof and eof marks "#", this amounts to only one digit
 +
of significant computation.
 +
 
 +
To translate Stunt(2) into propositional form we use
 +
the following collection of propositional variables:
 +
 
 +
For the "Present State Function" QF : P -> Q,
 +
 
 +
{p0_q#, p0_q*, p0_q0, p0_q1,
 +
p1_q#, p1_q*, p1_q0, p1_q1,
 +
p2_q#, p2_q*, p2_q0, p2_q1,
 +
p3_q#, p3_q*, p3_q0, p3_q1}
 +
 
 +
The propositional expression of the form "pi_qj" says:
 +
 
 +
| At the point-in-time p_i,
 +
| the finite machine M is in the state q_j.
 +
 
 +
For the "Present Register Function" RF : P -> R,
 +
 
 +
{p0_r0, p0_r1, p0_r2, p0_r3,
 +
p1_r0, p1_r1, p1_r2, p1_r3,
 +
p2_r0, p2_r1, p2_r2, p2_r3,
 +
p3_r0, p3_r1, p3_r2, p3_r3}
 +
 
 +
The propositional expression of the form "pi_rj" says:
 +
 
 +
| At the point-in-time p_i,
 +
| the tape-head H is on the tape-cell r_j.
 +
 
 +
For the "Present Symbol Function" SF : P -> (R -> S),
 +
 
 +
{p0_r0_s#, p0_r0_s*, p0_r0_s0, p0_r0_s1,
 +
p0_r1_s#, p0_r1_s*, p0_r1_s0, p0_r1_s1,
 +
p0_r2_s#, p0_r2_s*, p0_r2_s0, p0_r2_s1,
 +
p0_r3_s#, p0_r3_s*, p0_r3_s0, p0_r3_s1,
 +
p1_r0_s#, p1_r0_s*, p1_r0_s0, p1_r0_s1,
 +
p1_r1_s#, p1_r1_s*, p1_r1_s0, p1_r1_s1,
 +
p1_r2_s#, p1_r2_s*, p1_r2_s0, p1_r2_s1,
 +
p1_r3_s#, p1_r3_s*, p1_r3_s0, p1_r3_s1,
 +
p2_r0_s#, p2_r0_s*, p2_r0_s0, p2_r0_s1,
 +
p2_r1_s#, p2_r1_s*, p2_r1_s0, p2_r1_s1,
 +
p2_r2_s#, p2_r2_s*, p2_r2_s0, p2_r2_s1,
 +
p2_r3_s#, p2_r3_s*, p2_r3_s0, p2_r3_s1,
 +
p3_r0_s#, p3_r0_s*, p3_r0_s0, p3_r0_s1,
 +
p3_r1_s#, p3_r1_s*, p3_r1_s0, p3_r1_s1,
 +
p3_r2_s#, p3_r2_s*, p3_r2_s0, p3_r2_s1,
 +
p3_r3_s#, p3_r3_s*, p3_r3_s0, p3_r3_s1}
 +
 
 +
The propositional expression of the form "pi_rj_sk" says:
 +
 
 +
| At the point-in-time p_i,
 +
| the tape-cell r_j bears the mark s_k.
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~INPUTS~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
Here are the Initial Conditions
 +
for the two possible inputs to the
 +
Ref Log redaction of this Parity TM:
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~INPUT~0~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
Initial Conditions:
 +
 
 +
p0_q0
 +
 
 +
p0_r1
 +
 
 +
p0_r0_s#
 +
p0_r1_s0
 +
p0_r2_s#
 +
 
 +
The Initial Conditions are given by a logical conjunction
 +
that is composed of 5 basic expressions, altogether stating:
 +
 
 +
| At the point-in-time p_0, M is in the state q_0, and
 +
| At the point-in-time p_0, H is on the cell  r_1, and
 +
| At the point-in-time p_0, cell r_0 bears the mark "#", and
 +
| At the point-in-time p_0, cell r_1 bears the mark "0", and
 +
| At the point-in-time p_0, cell r_2 bears the mark "#".
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~INPUT~1~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
Initial Conditions:
 +
 
 +
p0_q0
 +
 
 +
p0_r1
 +
 
 +
p0_r0_s#
 +
p0_r1_s1
 +
p0_r2_s#
 +
 
 +
The Initial Conditions are given by a logical conjunction
 +
that is composed of 5 basic expressions, altogether stating:
 +
 
 +
| At the point-in-time p_0, M is in the state q_0, and
 +
| At the point-in-time p_0, H is on the cell  r_1, and
 +
| At the point-in-time p_0, cell r_0 bears the mark "#", and
 +
| At the point-in-time p_0, cell r_1 bears the mark "1", and
 +
| At the point-in-time p_0, cell r_2 bears the mark "#".
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~PROGRAM~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
And here, yet again, just to store it nearby,
 +
is the logical rendition of the TM's program:
 +
 
 +
Mediate Conditions:
 +
 
 +
( p0_q#  ( p1_q# ))
 +
( p0_q*  ( p1_q* ))
 +
 
 +
( p1_q#  ( p2_q# ))
 +
( p1_q*  ( p2_q* ))
 +
 
 +
Terminal Conditions:
 +
 
 +
(( p2_q# )( p2_q* ))
 +
 
 +
State Partition:
 +
 
 +
(( p0_q0 ),( p0_q1 ),( p0_q# ),( p0_q* ))
 +
(( p1_q0 ),( p1_q1 ),( p1_q# ),( p1_q* ))
 +
(( p2_q0 ),( p2_q1 ),( p2_q# ),( p2_q* ))
 +
 
 +
Register Partition:
 +
 
 +
(( p0_r0 ),( p0_r1 ),( p0_r2 ))
 +
(( p1_r0 ),( p1_r1 ),( p1_r2 ))
 +
(( p2_r0 ),( p2_r1 ),( p2_r2 ))
 +
 
 +
Symbol Partition:
 +
 
 +
(( p0_r0_s0 ),( p0_r0_s1 ),( p0_r0_s# ))
 +
(( p0_r1_s0 ),( p0_r1_s1 ),( p0_r1_s# ))
 +
(( p0_r2_s0 ),( p0_r2_s1 ),( p0_r2_s# ))
 +
 
 +
(( p1_r0_s0 ),( p1_r0_s1 ),( p1_r0_s# ))
 +
(( p1_r1_s0 ),( p1_r1_s1 ),( p1_r1_s# ))
 +
(( p1_r2_s0 ),( p1_r2_s1 ),( p1_r2_s# ))
 +
 
 +
(( p2_r0_s0 ),( p2_r0_s1 ),( p2_r0_s# ))
 +
(( p2_r1_s0 ),( p2_r1_s1 ),( p2_r1_s# ))
 +
(( p2_r2_s0 ),( p2_r2_s1 ),( p2_r2_s# ))
 +
 
 +
Interaction Conditions:
 +
 
 +
(( p0_r0 ) p0_r0_s0 ( p1_r0_s0 ))
 +
(( p0_r0 ) p0_r0_s1 ( p1_r0_s1 ))
 +
(( p0_r0 ) p0_r0_s# ( p1_r0_s# ))
 +
 
 +
(( p0_r1 ) p0_r1_s0 ( p1_r1_s0 ))
 +
(( p0_r1 ) p0_r1_s1 ( p1_r1_s1 ))
 +
(( p0_r1 ) p0_r1_s# ( p1_r1_s# ))
 +
 
 +
(( p0_r2 ) p0_r2_s0 ( p1_r2_s0 ))
 +
(( p0_r2 ) p0_r2_s1 ( p1_r2_s1 ))
 +
(( p0_r2 ) p0_r2_s# ( p1_r2_s# ))
 +
 
 +
(( p1_r0 ) p1_r0_s0 ( p2_r0_s0 ))
 +
(( p1_r0 ) p1_r0_s1 ( p2_r0_s1 ))
 +
(( p1_r0 ) p1_r0_s# ( p2_r0_s# ))
 +
 
 +
(( p1_r1 ) p1_r1_s0 ( p2_r1_s0 ))
 +
(( p1_r1 ) p1_r1_s1 ( p2_r1_s1 ))
 +
(( p1_r1 ) p1_r1_s# ( p2_r1_s# ))
 +
 
 +
(( p1_r2 ) p1_r2_s0 ( p2_r2_s0 ))
 +
(( p1_r2 ) p1_r2_s1 ( p2_r2_s1 ))
 +
(( p1_r2 ) p1_r2_s# ( p2_r2_s# ))
 +
 
 +
Transition Relations:
 +
 
 +
( p0_q0  p0_r1  p0_r1_s0  ( p1_q0  p1_r2  p1_r1_s0 ))
 +
( p0_q0  p0_r1  p0_r1_s1  ( p1_q1  p1_r2  p1_r1_s1 ))
 +
( p0_q0  p0_r1  p0_r1_s#  ( p1_q#  p1_r0  p1_r1_s# ))
 +
( p0_q0  p0_r2  p0_r2_s#  ( p1_q#  p1_r1  p1_r2_s# ))
 +
 
 +
( p0_q1  p0_r1  p0_r1_s0  ( p1_q1  p1_r2  p1_r1_s0 ))
 +
( p0_q1  p0_r1  p0_r1_s1  ( p1_q0  p1_r2  p1_r1_s1 ))
 +
( p0_q1  p0_r1  p0_r1_s#  ( p1_q*  p1_r0  p1_r1_s# ))
 +
( p0_q1  p0_r2  p0_r2_s#  ( p1_q*  p1_r1  p1_r2_s# ))
 +
 
 +
( p1_q0  p1_r1  p1_r1_s0  ( p2_q0  p2_r2  p2_r1_s0 ))
 +
( p1_q0  p1_r1  p1_r1_s1  ( p2_q1  p2_r2  p2_r1_s1 ))
 +
( p1_q0  p1_r1  p1_r1_s#  ( p2_q#  p2_r0  p2_r1_s# ))
 +
( p1_q0  p1_r2  p1_r2_s#  ( p2_q#  p2_r1  p2_r2_s# ))
 +
 
 +
( p1_q1  p1_r1  p1_r1_s0  ( p2_q1  p2_r2  p2_r1_s0 ))
 +
( p1_q1  p1_r1  p1_r1_s1  ( p2_q0  p2_r2  p2_r1_s1 ))
 +
( p1_q1  p1_r1  p1_r1_s#  ( p2_q*  p2_r0  p2_r1_s# ))
 +
( p1_q1  p1_r2  p1_r2_s#  ( p2_q*  p2_r1  p2_r2_s# ))
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~INTERPRETATION~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
Interpretation of the Propositional Program:
 +
 
 +
Mediate Conditions:
 +
 
 +
( p0_q#  ( p1_q# ))
 +
( p0_q*  ( p1_q* ))
 +
 
 +
( p1_q#  ( p2_q# ))
 +
( p1_q*  ( p2_q* ))
 +
 
 +
In Ref Log, an expression of the form "( X ( Y ))"
 +
expresses an implication or an if-then proposition:
 +
"Not X without Y",  "If X then Y",  "X => Y",  etc.
 +
 
 +
A text string expression of the form "( X ( Y ))"
 +
parses to a graphical data-structure of the form:
 +
 
 +
    X  Y
 +
    o---o
 +
    |
 +
    @
 +
 
 +
All together, these Mediate Conditions state:
 +
 
 +
| If at p_0  M is in state q_#, then at p_1  M is in state q_#, and
 +
| If at p_0  M is in state q_*, then at p_1  M is in state q_*, and
 +
| If at p_1  M is in state q_#, then at p_2  M is in state q_#, and
 +
| If at p_1  M is in state q_*, then at p_2  M is in state q_*.
 +
 
 +
Terminal Conditions:
 +
 
 +
(( p2_q# )( p2_q* ))
 +
 
 +
In Ref Log, an expression of the form "(( X )( Y ))"
 +
expresses a disjunction "X or Y" and it parses into:
 +
 
 +
    X  Y
 +
    o  o
 +
    \ /
 +
      o
 +
      |
 +
      @
 +
 
 +
In effect, the Terminal Conditions state:
 +
 
 +
| At p_2,  M is in state q_#, or
 +
| At p_2,  M is in state q_*.
 +
 
 +
State Partition:
 +
 
 +
(( p0_q0 ),( p0_q1 ),( p0_q# ),( p0_q* ))
 +
(( p1_q0 ),( p1_q1 ),( p1_q# ),( p1_q* ))
 +
(( p2_q0 ),( p2_q1 ),( p2_q# ),( p2_q* ))
 +
 
 +
In Ref Log, an expression of the form "(( e_1 ),( e_2 ),( ... ),( e_k ))"
 +
expresses the fact that "exactly one of the e_j is true, for j = 1 to k".
 +
Expressions of this form are called "universal partition" expressions, and
 +
they parse into a type of graph called a "painted and rooted cactus" (PARC):
 +
 
 +
    e_1  e_2  ...  e_k
 +
    o    o          o
 +
    |    |          |
 +
    o-----o--- ... ---o
 +
      \              /
 +
      \            /
 +
        \          /
 +
        \        /
 +
          \      /
 +
          \    /
 +
            \  /
 +
            \ /
 +
              @
 +
 
 +
The State Partition expresses the conditions that:
 +
 
 +
| At each of the points-in-time p_i, for i = 0 to 2,
 +
| M can be in exactly one state q_j, for j in the set {0, 1, #, *}.
 +
 
 +
Register Partition:
 +
 
 +
(( p0_r0 ),( p0_r1 ),( p0_r2 ))
 +
(( p1_r0 ),( p1_r1 ),( p1_r2 ))
 +
(( p2_r0 ),( p2_r1 ),( p2_r2 ))
 +
 
 +
The Register Partition expresses the conditions that:
 +
 
 +
| At each of the points-in-time p_i, for i = 0 to 2,
 +
| H can be on exactly one cell  r_j, for j = 0 to 2.
 +
 
 +
Symbol Partition:
 +
 
 +
(( p0_r0_s0 ),( p0_r0_s1 ),( p0_r0_s# ))
 +
(( p0_r1_s0 ),( p0_r1_s1 ),( p0_r1_s# ))
 +
(( p0_r2_s0 ),( p0_r2_s1 ),( p0_r2_s# ))
 +
 
 +
(( p1_r0_s0 ),( p1_r0_s1 ),( p1_r0_s# ))
 +
(( p1_r1_s0 ),( p1_r1_s1 ),( p1_r1_s# ))
 +
(( p1_r2_s0 ),( p1_r2_s1 ),( p1_r2_s# ))
 +
 
 +
(( p2_r0_s0 ),( p2_r0_s1 ),( p2_r0_s# ))
 +
(( p2_r1_s0 ),( p2_r1_s1 ),( p2_r1_s# ))
 +
(( p2_r2_s0 ),( p2_r2_s1 ),( p2_r2_s# ))
 +
 
 +
The Symbol Partition expresses the conditions that:
 +
 
 +
| At each of the points-in-time p_i, for i in {0, 1, 2},
 +
| in each of the tape-registers r_j, for j in {0, 1, 2},
 +
| there can be exactly one sign s_k, for k in {0, 1, #}.
 +
 
 +
Interaction Conditions:
 +
 
 +
(( p0_r0 ) p0_r0_s0 ( p1_r0_s0 ))
 +
(( p0_r0 ) p0_r0_s1 ( p1_r0_s1 ))
 +
(( p0_r0 ) p0_r0_s# ( p1_r0_s# ))
 +
 
 +
(( p0_r1 ) p0_r1_s0 ( p1_r1_s0 ))
 +
(( p0_r1 ) p0_r1_s1 ( p1_r1_s1 ))
 +
(( p0_r1 ) p0_r1_s# ( p1_r1_s# ))
 +
 
 +
(( p0_r2 ) p0_r2_s0 ( p1_r2_s0 ))
 +
(( p0_r2 ) p0_r2_s1 ( p1_r2_s1 ))
 +
(( p0_r2 ) p0_r2_s# ( p1_r2_s# ))
 +
 
 +
(( p1_r0 ) p1_r0_s0 ( p2_r0_s0 ))
 +
(( p1_r0 ) p1_r0_s1 ( p2_r0_s1 ))
 +
(( p1_r0 ) p1_r0_s# ( p2_r0_s# ))
 +
 
 +
(( p1_r1 ) p1_r1_s0 ( p2_r1_s0 ))
 +
(( p1_r1 ) p1_r1_s1 ( p2_r1_s1 ))
 +
(( p1_r1 ) p1_r1_s# ( p2_r1_s# ))
 +
 
 +
(( p1_r2 ) p1_r2_s0 ( p2_r2_s0 ))
 +
(( p1_r2 ) p1_r2_s1 ( p2_r2_s1 ))
 +
(( p1_r2 ) p1_r2_s# ( p2_r2_s# ))
 +
 
 +
In briefest terms, the Interaction Conditions merely express
 +
the circumstance that the sign in a tape-cell cannot change
 +
between two points-in-time unless the tape-head is over the
 +
cell in question at the initial one of those points-in-time.
 +
All that we have to do is to see how they manage to say this.
 +
 
 +
In Ref Log, an expression of the following form:
 +
 
 +
"(( p<i>_r<j> ) p<i>_r<j>_s<k> ( p<i+1>_r<j>_s<k> ))",
 +
 
 +
and which parses as the graph:
 +
 
 +
      p<i>_r<j> o  o  p<i+1>_r<j>_s<k>
 +
                  \ /
 +
    p<i>_r<j>_s<k> o
 +
                  |
 +
                  @
 +
 
 +
can be read in the form of the following implication:
 +
 
 +
| If
 +
| at the point-in-time p<i>, the tape-cell r<j> bears the mark s<k>,
 +
| but it is not the case that
 +
| at the point-in-time p<i>, the tape-head is on the tape-cell r<j>.
 +
| then
 +
| at the point-in-time p<i+1>, the tape-cell r<j> bears the mark s<k>.
 +
 
 +
Folks among us of a certain age and a peculiar manner of acculturation will
 +
recognize these as the "Frame Conditions" for the change of state of the TM.
 +
 
 +
Transition Relations:
 +
 
 +
( p0_q0  p0_r1  p0_r1_s0  ( p1_q0  p1_r2  p1_r1_s0 ))
 +
( p0_q0  p0_r1  p0_r1_s1  ( p1_q1  p1_r2  p1_r1_s1 ))
 +
( p0_q0  p0_r1  p0_r1_s#  ( p1_q#  p1_r0  p1_r1_s# ))
 +
( p0_q0  p0_r2  p0_r2_s#  ( p1_q#  p1_r1  p1_r2_s# ))
 +
 
 +
( p0_q1  p0_r1  p0_r1_s0  ( p1_q1  p1_r2  p1_r1_s0 ))
 +
( p0_q1  p0_r1  p0_r1_s1  ( p1_q0  p1_r2  p1_r1_s1 ))
 +
( p0_q1  p0_r1  p0_r1_s#  ( p1_q*  p1_r0  p1_r1_s# ))
 +
( p0_q1  p0_r2  p0_r2_s#  ( p1_q*  p1_r1  p1_r2_s# ))
 +
 
 +
( p1_q0  p1_r1  p1_r1_s0  ( p2_q0  p2_r2  p2_r1_s0 ))
 +
( p1_q0  p1_r1  p1_r1_s1  ( p2_q1  p2_r2  p2_r1_s1 ))
 +
( p1_q0  p1_r1  p1_r1_s#  ( p2_q#  p2_r0  p2_r1_s# ))
 +
( p1_q0  p1_r2  p1_r2_s#  ( p2_q#  p2_r1  p2_r2_s# ))
 +
 
 +
( p1_q1  p1_r1  p1_r1_s0  ( p2_q1  p2_r2  p2_r1_s0 ))
 +
( p1_q1  p1_r1  p1_r1_s1  ( p2_q0  p2_r2  p2_r1_s1 ))
 +
( p1_q1  p1_r1  p1_r1_s#  ( p2_q*  p2_r0  p2_r1_s# ))
 +
( p1_q1  p1_r2  p1_r2_s#  ( p2_q*  p2_r1  p2_r2_s# ))
 +
 
 +
The Transition Conditions merely serve to express,
 +
by means of 16 complex implication expressions,
 +
the data of the TM table that was given above.
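As a cross-check, here is a sketch, my own reconstruction from the clauses
just listed rather than the generator actually used, that recovers the sixteen
Transition Relation clauses from a small table for the parity machine.  An
entry reads:  (state, symbol) -> (next_state, head_move, symbol_written),
with head_move in {-1, +1}:

    table = {
        ("q0", "s0"): ("q0", +1, "s0"),
        ("q0", "s1"): ("q1", +1, "s1"),
        ("q0", "s#"): ("q#", -1, "s#"),
        ("q1", "s0"): ("q1", +1, "s0"),
        ("q1", "s1"): ("q0", +1, "s1"),
        ("q1", "s#"): ("q*", -1, "s#"),
    }

    def clause(p, q, r, s):
        q2, move, s2 = table[(q, s)]
        return (f"( p{p}_{q}  p{p}_r{r}  p{p}_r{r}_{s}  "
                f"( p{p+1}_{q2}  p{p+1}_r{r + move}  p{p+1}_r{r}_{s2} ))")

    clauses = []
    for p in (0, 1):                      # the two steps covered by the listing
        for q in ("q0", "q1"):
            for r, s in [(1, "s0"), (1, "s1"), (1, "s#"), (2, "s#")]:
                clauses.append(clause(p, q, r, s))

    assert len(clauses) == 16
    print("\n".join(clauses))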
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~OUTPUTS~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
And here are the outputs of the computation,
 +
as emulated by its propositional rendition,
 +
and as actually generated within that form
 +
of transmogrification by the program that
 +
I wrote for finding all of the satisfying
 +
interpretations (truth-value assignments)
 +
of propositional expressions in Ref Log:
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~OUTPUT~0~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
Output Conditions:
 +
 
 +
p0_q0
 +
  p0_r1
 +
  p0_r0_s#
 +
    p0_r1_s0
 +
    p0_r2_s#
 +
      p1_q0
 +
      p1_r2
 +
        p1_r2_s#
 +
        p1_r0_s#
 +
          p1_r1_s0
 +
          p2_q#
 +
            p2_r1
 +
            p2_r0_s#
 +
              p2_r1_s0
 +
              p2_r2_s#
 +
 
 +
The Output Conditions amount to the sole satisfying interpretation,
 +
that is, a "sequence of truth-value assignments" (SOTVA) that make
 +
the entire proposition come out true, and they state the following:
 +
 
 +
| At the point-in-time p_0, M is in the state q_0,      and
 +
| At the point-in-time p_0, H is on the cell  r_1,      and
 +
| At the point-in-time p_0, cell r_0 bears the mark "#", and
 +
| At the point-in-time p_0, cell r_1 bears the mark "0", and
 +
| At the point-in-time p_0, cell r_2 bears the mark "#", and
 +
|
 +
| At the point-in-time p_1, M is in the state q_0,      and
 +
| At the point-in-time p_1, H is on the cell  r_2,      and
 +
| At the point-in-time p_1, cell r_0 bears the mark "#", and
 +
| At the point-in-time p_1, cell r_1 bears the mark "0", and
 +
| At the point-in-time p_1, cell r_2 bears the mark "#", and
 +
|
 +
| At the point-in-time p_2, M is in the state q_#,      and
 +
| At the point-in-time p_2, H is on the cell  r_1,      and
 +
| At the point-in-time p_2, cell r_0 bears the mark "#", and
 +
| At the point-in-time p_2, cell r_1 bears the mark "0", and
 +
| At the point-in-time p_2, cell r_2 bears the mark "#".
 +
 
 +
In brief, the output for our purposes being the symbol that rests
 +
under the tape-head H when the machine M reaches a rest state,
 +
we are now amazed by the remarkable result that Parity(0) = 0.
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~OUTPUT~1~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
Output Conditions:
 +
 
 +
p0_q0
 +
  p0_r1
 +
  p0_r0_s#
 +
    p0_r1_s1
 +
    p0_r2_s#
 +
      p1_q1
 +
      p1_r2
 +
        p1_r2_s#
 +
        p1_r0_s#
 +
          p1_r1_s1
 +
          p2_q*
 +
            p2_r1
 +
            p2_r0_s#
 +
              p2_r1_s1
 +
              p2_r2_s#
 +
 
 +
The Output Conditions amount to the sole satisfying interpretation,
 +
that is, a "sequence of truth-value assignments" (SOTVA) that make
 +
the entire proposition come out true, and they state the following:
 +
 
 +
| At the point-in-time p_0, M is in the state q_0,      and
 +
| At the point-in-time p_0, H is on the cell  r_1,      and
 +
| At the point-in-time p_0, cell r_0 bears the mark "#", and
 +
| At the point-in-time p_0, cell r_1 bears the mark "1", and
 +
| At the point-in-time p_0, cell r_2 bears the mark "#", and
 +
|
 +
| At the point-in-time p_1, M is in the state q_1,      and
 +
| At the point-in-time p_1, H is on the cell  r_2,      and
 +
| At the point-in-time p_1, cell r_0 bears the mark "#", and
 +
| At the point-in-time p_1, cell r_1 bears the mark "1", and
 +
| At the point-in-time p_1, cell r_2 bears the mark "#", and
 +
|
 +
| At the point-in-time p_2, M is in the state q_*,      and
 +
| At the point-in-time p_2, H is on the cell  r_1,      and
 +
| At the point-in-time p_2, cell r_0 bears the mark "#", and
 +
| At the point-in-time p_2, cell r_1 bears the mark "1", and
 +
| At the point-in-time p_2, cell r_2 bears the mark "#".
 +
 
 +
In brief, the output for our purposes being the symbol that rests
 +
under the tape-head H when the machine M reaches a rest state,
 +
we are now amazed by the remarkable result that Parity(1) = 1.
 +
 
 +
I realized after sending that last bunch of bits that there is room
 +
for confusion about what is the input/output of the Study module of
 +
the Theme One program as opposed to what is the input/output of the
 +
"finitely approximated turing automaton" (FATA).  So here is better
 +
delineation of what's what.  The input to Study is a text file that
 +
is known as LogFile(Whatever) and the output of Study is a sequence
 +
of text files that summarize the various canonical and normal forms
 +
that it generates.  For short, let us call these NormFile(Whatelse).
 +
With that in mind, here are the actual IO's of Study, excluding the
 +
glosses in square brackets:
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~INPUT~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
[Input To Study = FATA Initial Conditions + FATA Program Conditions]
 +
 
 +
[FATA Initial Conditions For Input 0]
  
p0_q0

p0_r1

p0_r0_s#
p0_r1_s0
p0_r2_s#

[FATA Program Conditions For Parity Machine]

[Mediate Conditions]

( p0_q#  ( p1_q# ))
( p0_q*  ( p1_q* ))
( p1_q#  ( p2_q# ))
( p1_q*  ( p2_q* ))

[Terminal Conditions]

(( p2_q# )( p2_q* ))

[State Partition]

+
(( p0_q0 ),( p0_q1 ),( p0_q# ),( p0_q* ))
 +
(( p1_q0 ),( p1_q1 ),( p1_q# ),( p1_q* ))
 +
(( p2_q0 ),( p2_q1 ),( p2_q# ),( p2_q* ))
 +
 
 +
[Register Partition]
 +
 
 +
(( p0_r0 ),( p0_r1 ),( p0_r2 ))
 +
(( p1_r0 ),( p1_r1 ),( p1_r2 ))
 +
(( p2_r0 ),( p2_r1 ),( p2_r2 ))
 +
 
 +
[Symbol Partition]
 +
 
 +
(( p0_r0_s0 ),( p0_r0_s1 ),( p0_r0_s# ))
 +
(( p0_r1_s0 ),( p0_r1_s1 ),( p0_r1_s# ))
 +
(( p0_r2_s0 ),( p0_r2_s1 ),( p0_r2_s# ))
 +
 
 +
(( p1_r0_s0 ),( p1_r0_s1 ),( p1_r0_s# ))
 +
(( p1_r1_s0 ),( p1_r1_s1 ),( p1_r1_s# ))
 +
(( p1_r2_s0 ),( p1_r2_s1 ),( p1_r2_s# ))
 +
 
 +
(( p2_r0_s0 ),( p2_r0_s1 ),( p2_r0_s# ))
 +
(( p2_r1_s0 ),( p2_r1_s1 ),( p2_r1_s# ))
 +
(( p2_r2_s0 ),( p2_r2_s1 ),( p2_r2_s# ))
 +
 
 +
[Interaction Conditions]
 +
 
 +
(( p0_r0 ) p0_r0_s0 ( p1_r0_s0 ))
 +
(( p0_r0 ) p0_r0_s1 ( p1_r0_s1 ))
 +
(( p0_r0 ) p0_r0_s# ( p1_r0_s# ))
 +
 
 +
(( p0_r1 ) p0_r1_s0 ( p1_r1_s0 ))
 +
(( p0_r1 ) p0_r1_s1 ( p1_r1_s1 ))
 +
(( p0_r1 ) p0_r1_s# ( p1_r1_s# ))
 +
 
 +
(( p0_r2 ) p0_r2_s0 ( p1_r2_s0 ))
 +
(( p0_r2 ) p0_r2_s1 ( p1_r2_s1 ))
 +
(( p0_r2 ) p0_r2_s# ( p1_r2_s# ))
 +
 
 +
(( p1_r0 ) p1_r0_s0 ( p2_r0_s0 ))
 +
(( p1_r0 ) p1_r0_s1 ( p2_r0_s1 ))
 +
(( p1_r0 ) p1_r0_s# ( p2_r0_s# ))
 +
 
 +
(( p1_r1 ) p1_r1_s0 ( p2_r1_s0 ))
 +
(( p1_r1 ) p1_r1_s1 ( p2_r1_s1 ))
 +
(( p1_r1 ) p1_r1_s# ( p2_r1_s# ))
 +
 
 +
(( p1_r2 ) p1_r2_s0 ( p2_r2_s0 ))
 +
(( p1_r2 ) p1_r2_s1 ( p2_r2_s1 ))
 +
(( p1_r2 ) p1_r2_s# ( p2_r2_s# ))
 +
 
 +
[Transition Relations]
 +
 
 +
( p0_q0  p0_r1  p0_r1_s0  ( p1_q0  p1_r2  p1_r1_s0 ))
 +
( p0_q0  p0_r1  p0_r1_s1  ( p1_q1  p1_r2  p1_r1_s1 ))
 +
( p0_q0  p0_r1  p0_r1_s#  ( p1_q#  p1_r0  p1_r1_s# ))
 +
( p0_q0  p0_r2  p0_r2_s#  ( p1_q#  p1_r1  p1_r2_s# ))
 +
 
 +
( p0_q1  p0_r1  p0_r1_s0  ( p1_q1  p1_r2  p1_r1_s0 ))
 +
( p0_q1  p0_r1  p0_r1_s1  ( p1_q0  p1_r2  p1_r1_s1 ))
 +
( p0_q1  p0_r1  p0_r1_s#  ( p1_q*  p1_r0  p1_r1_s# ))
 +
( p0_q1  p0_r2  p0_r2_s#  ( p1_q*  p1_r1  p1_r2_s# ))
 +
 
 +
( p1_q0  p1_r1  p1_r1_s0  ( p2_q0  p2_r2  p2_r1_s0 ))
 +
( p1_q0  p1_r1  p1_r1_s1  ( p2_q1  p2_r2  p2_r1_s1 ))
 +
( p1_q0  p1_r1  p1_r1_s#  ( p2_q#  p2_r0  p2_r1_s# ))
 +
( p1_q0  p1_r2  p1_r2_s#  ( p2_q#  p2_r1  p2_r2_s# ))
 +
 
 +
( p1_q1  p1_r1  p1_r1_s0  ( p2_q1  p2_r2  p2_r1_s0 ))
 +
( p1_q1  p1_r1  p1_r1_s1  ( p2_q0  p2_r2  p2_r1_s1 ))
 +
( p1_q1  p1_r1  p1_r1_s#  ( p2_q*  p2_r0  p2_r1_s# ))
 +
( p1_q1  p1_r2  p1_r2_s#  ( p2_q*  p2_r1  p2_r2_s# ))
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~OUTPUT~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
[Output Of Study = FATA Output For Input 0]
 +
 
 +
p0_q0
 +
  p0_r1
 +
  p0_r0_s#
 +
    p0_r1_s0
 +
    p0_r2_s#
 +
      p1_q0
 +
      p1_r2
 +
        p1_r2_s#
 +
        p1_r0_s#
 +
          p1_r1_s0
 +
          p2_q#
 +
            p2_r1
 +
            p2_r0_s#
 +
              p2_r1_s0
 +
              p2_r2_s#
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
Turing automata, finitely approximated or not, make my head spin and
 +
my tape go loopy, and I still believe 'twere a far better thing I do
 +
if I work up to that level of complexity in a more gracile graduated
 +
manner.  So let us return to our Example in this gradual progress to
 +
that vastly more well-guarded grail of our long-term pilgrim's quest:
 +
 
 +
|                  boy  male          girl  female
 +
|                    o---o child          o---o child
 +
|  male  female      \ /                  \ /              child  human
 +
|    o---o            o                    o                    o--o
 +
|      \ /            |                    |                    |
 +
|      @              @                    @                    @
 +
|
 +
| (male , female)((boy , male child))((girl , female child))(child (human))
 +
 
 +
One section of the Theme One program has a suite of utilities that fall
 +
under the title "Theme One Study" ("To Study", or just "TOS" for short).
 +
To Study is to read and to parse a so-called and a generally so-suffixed
 +
"log" file, and then to conjoin what is called a "query", which is really
 +
just an additional propositional expression that imposes a further logical
 +
constraint on the input expression.
 +
 
 +
The Figure roughly sketches the conjuncts of the graph-theoretic
 +
data structure that the parser would commit to memory on reading
 +
the appropriate log file that contains the text along the bottom.
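For readers who want something concrete to play with, here is a minimal
Python sketch of such a parser.  It is my own toy, not the Theme One parser,
and it only handles the paint-and-lobe syntax used in this Example, turning
each lobe into a tuple of branches and each branch into a list of its paints
(variable names) and sub-lobes:

    import re

    TOKENS = re.compile(r"[(),]|[A-Za-z_][A-Za-z_0-9]*")

    def parse(text):
        tokens = TOKENS.findall(text)

        def branch(pos):
            items = []
            while pos < len(tokens) and tokens[pos] not in (")", ","):
                if tokens[pos] == "(":
                    lobe, pos = [], pos + 1
                    while True:
                        sub, pos = branch(pos)
                        lobe.append(sub)
                        if tokens[pos] == ")":
                            pos += 1
                            break
                        pos += 1              # skip ","
                    items.append(tuple(lobe))
                else:
                    items.append(tokens[pos])
                    pos += 1
            return items, pos

        root, _ = branch(0)
        return root

    print(parse("(child (human))"))
    # [(['child', (['human'],)],)]
    print(parse("(male , female)((boy , male child))((girl , female child))(child (human))"))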
 +
 
 +
I will now explain the various sorts of things that the TOS utility
 +
can do with the log file that describes the universe of discourse in
 +
our present Example.
 +
 
 +
Theme One Study is built around a suite of four successive generators
 +
of "normal forms" for propositional expressions, just to use that term
 +
in a very approximate way.  The functions that compute these normal forms
 +
are called "Model", "Tenor", "Canon", and "Sense", and so we may refer to
 +
their text-style outputs as the "mod", "ten", "can", and "sen" files.
 +
 
 +
Though it could be any propositional expression on the same vocabulary
 +
$A$ = {"boy", "child", "female", "girl", "human", "male"}, more usually
 +
the query is a simple conjunction of one or more positive features that
 +
we want to focus on or perhaps to filter out of the logical model space.
 +
On our first run through this Example, we take the log file proposition
 +
as it is, with no extra riders.
 +
 
 +
| Procedural Note.  To Study Model displays a running tab of how much
 +
| free memory space it has left.  On some of the harder problems that
 +
| you may think of to give it, Model may run out of free memory and
 +
| terminate, abnormally exiting Theme One.  Sometimes it helps to:
 +
|
 +
| 1.  Rephrase the problem in logically equivalent
 +
|    but rhetorically more felicitous ways.
 +
|
 +
| 2.  Think of additional facts that are taken for granted but not
 +
|    made explicit and that cannot be logically inferred by Model.
 +
 
 +
After Model has finished, it is ready to write out its mod file,
 +
which you may choose to show on the screen or save to a named file.
 +
Mod files are usually too long to see (or to care to see) all at once
 +
on the screen, so it is very often best to save them for later replay.
 +
In our Example the Model function yields a mod file that looks like so:
 +
 
 +
Model Output and
 +
Mod File Example
 +
o-------------------o
 +
| male              |
 +
|  female -        |  1
 +
|  (female )        |
 +
|  girl -          |  2
 +
|  (girl )        |
 +
|    child          |
 +
|    boy          |
 +
|      human *      |  3 *
 +
|      (human ) -  |  4
 +
|    (boy ) -      |  5
 +
|    (child )      |
 +
|    boy -        |  6
 +
|    (boy ) *      |  7 *
 +
| (male )          |
 +
|  female          |
 +
|  boy -          |  8
 +
|  (boy )          |
 +
|    child          |
 +
|    girl          |
 +
|      human *      |  9 *
 +
|      (human ) -  | 10
 +
|    (girl ) -    | 11
 +
|    (child )      |
 +
|    girl -        | 12
 +
|    (girl ) *    | 13 *
 +
|  (female ) -      | 14
 +
o-------------------o
 +
 
 +
Counting the stars "*" that indicate true interpretations
 +
and the bars "-" that indicate false interpretations of
 +
the input formula, we can see that the Model function,
 +
out of the 64 possible interpretations, has actually
 +
gone through the work of making just 14 evaluations,
 +
all in order to find the 4 models that are allowed
 +
by the input definitions.
 +
 
 +
To be clear about what this output means, the starred paths
 +
indicate all of the complete specifications of objects in the
 +
universe of discourse, that is, all of the consistent feature
 +
conjunctions of maximum length, as permitted by the definitions
 +
that are given in the log file.
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
Let's take a little break from the Example in progress
 +
and look at where we are and what we have been doing from
 +
computational, logical, and semiotic perspectives.  Because,
 +
after all, as is usually the case, we should not let our focus
 +
and our fascination with this particular Example prevent us from
 +
recognizing it, and all that we do with it, as just an Example of
 +
much broader paradigms and predicaments and principles, not to say
 +
but a glimmer of ultimately more concernful and fascinating objects.
 +
 
 +
I chart the progression that we have just passed through in this way:
 +
 
 +
|                    Parse
 +
|      Sign A  o-------------->o  Sign 1
 +
|            ^                |
 +
|            /                |
 +
|          /                  |
 +
|          /                  |
 +
| Object  o                    |  Transform
 +
|          ^                  |
 +
|          \                  |
 +
|            \                |
 +
|            \                v
 +
|      Sign B  o<--------------o  Sign 2
 +
|                    Verse
 +
|
 +
| Figure.  Computation As Sign Transformation
 +
 
 +
In the present case, the Object is an objective situation
 +
or a state of affairs, in effect, a particular pattern of
 +
feature concurrences occurring to us in that world through
 +
which we find ourselves most frequently faring, willy-nilly,
 +
and the Signs are different tokens and different types of
 +
data structures that we somehow or other find it useful
 +
to devise or to discover for the sake of representing
 +
current objects to ourselves on a recurring basis.
 +
 
 +
But not all signs, not even signs of a single object, are alike
 +
in every other respect that one might name, not even with respect
 +
to their powers of relating, significantly, to that common object.
 +
 
 +
And that is what our whole business of computation busies itself about,
 +
when it minds its business best, that is, transmuting signs into signs
 +
in ways that augment their powers of relating significantly to objects.
 +
 
 +
We have seen how the Model function and the mod output format
 +
indicate all of the complete specifications of objects in the
 +
universe of discourse, that is, all of the consistent feature
 +
conjunctions of maximal specificity that are permitted by the
 +
constraints or the definitions that are given in the log file.
 +
 
 +
To help identify these specifications of particular cells in
 +
the universe of discourse, the next function and output format,
 +
called "Tenor", edits the mod file to give only the true paths,
 +
in effect, the "positive models", that are by default what we
 +
usually mean when we say "models", and not the "anti-models"
 +
or the "negative models" that fail to satisfy the formula
 +
in question.
 +
 
 +
In the present Example the Tenor function
 +
generates a Ten file that looks like this:
 +
 
 +
Tenor Output and
 +
Ten File Example
 +
o-------------------o
 +
| male              |
 +
|  (female )        |
 +
|  (girl )        |
 +
|    child          |
 +
|    boy          |
 +
|      human *      | <1>
 +
|    (child )      |
 +
|    (boy ) *      | <2>
 +
| (male )          |
 +
|  female          |
 +
|  (boy )          |
 +
|    child          |
 +
|    girl          |
 +
|      human *      | <3>
 +
|    (child )      |
 +
|    (girl ) *    | <4>
 +
o-------------------o
 +
 
 +
As I said, the Tenor function just abstracts a transcript of the models,
 +
that is, the satisfying interpretations, that were already interspersed
 +
throughout the complete Model output.  These specifications, or feature
 +
conjunctions, with the positive and the negative features listed in the
 +
order of their actual budding on the "arboreal boolean expansion" twigs,
 +
may be gathered and arranged in this antherypulogical flowering bouquet:
 +
 
 +
1.  male  (female ) (girl )  child    boy    human  *
 +
2.  male  (female ) (girl ) (child ) (boy  )          *
 +
3.  (male )  female  (boy  )  child    girl    human  *
 +
4.  (male )  female  (boy  ) (child ) (girl )          *
 +
 
 +
Notice that Model, as reflected in this abstract, did not consider
 +
the six positive features in the same order along each path.  This
 +
is because the algorithm was designed to proceed opportunistically
 +
in its attempt to reduce the original proposition through a series
 +
of case-analytic considerations and the resulting simplifications.
 +
 
 +
Notice, too, that Model is something of a lazy evaluator, quitting work
 +
when and if a value is determined by less than the full set of variables.
 +
This is the reason why paths <2> and <4> are not ostensibly of the maximum
 +
length.  According to this lazy mode of understanding, any path that is not
 +
specified on a set of features really stands for the whole bundle of paths
 +
that are derived by freely varying those features.  Thus, specifications
 +
<2> and <4> summarize four models altogether, with the logical choice
 +
between "human" and "not human" being left open at the point where
 +
they leave off their branches in the relevant deciduous tree.
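To make the arithmetic concrete, here is a brute-force check in Python, my own
and quite unlike Model's opportunistic case analysis, which confirms that the
log file proposition has exactly 6 satisfying assignments among the 64
possible, and that these collapse into the 4 Tenor paths once the free choice
of "human" in paths <2> and <4> is taken into account:

    from itertools import product

    # The log file proposition:
    # (male , female)((boy , male child))((girl , female child))(child (human))
    def satisfies(boy, child, female, girl, human, male):
        return (
            (male != female)                  # ( male , female )
            and (boy == (male and child))     # (( boy , male child ))
            and (girl == (female and child))  # (( girl , female child ))
            and ((not child) or human)        # ( child ( human ))
        )

    names = ["boy", "child", "female", "girl", "human", "male"]
    models = [dict(zip(names, vals))
              for vals in product([False, True], repeat=6)
              if satisfies(*vals)]

    # Six full assignments, which collapse into the four Tenor paths because
    # paths <2> and <4> leave the feature "human" unspecified.
    print(len(models))   # 6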
 +
 
 +
The last two functions in the Study section, "Canon" and "Sense",
 +
extract further derivatives of the normal forms that are produced
 +
by Model and Tenor.  Both of these functions take the set of model
 +
paths and simply throw away the negative labels.  You may think of
 +
these as the "rose colored glasses" or "job interview" normal forms,
 +
in that they try to say everything that's true, so long as it can be
 +
expressed in positive terms.  Generally, this would mean losing a lot
 +
of information, and the result could no longer be expected to have the
 +
property of remaining logically equivalent to the original proposition.
 +
 
 +
Fortunately, however, it seems that this type of positive projection of
 +
the whole truth is just what is possible, most needed, and most clear in
 +
many of the "natural" examples, that is, in examples that arise from the
 +
domains of natural language and natural conceptual kinds.  In these cases,
 +
where most of the logical features are redundantly coded, for example, in
 +
the way that "adult" = "not child" and "child" = "not adult", the positive
 +
feature bearing redacts are often sufficiently expressive all by themselves.
 +
 
 +
Canon merely censors its printing of the negative labels as it traverses the
 +
model tree.  This leaves the positive labels in their original columns of the
 +
outline form, giving it a slightly skewed appearance.  This can be misleading
 +
unless you already know what you are looking for.  However, this Canon format
 +
is computationally quick, and frequently suffices, especially if you already
 +
have a likely clue about what to expect in the way of a question's outcome.
 +
 
 +
In the present Example the Canon function
 +
generates a Can file that looks like this:
 +
 
 +
Canon Output and
 +
Can File Example
 +
o-------------------o
 +
| male              |
 +
|    child          |
 +
|    boy          |
 +
|      human        |
 +
|  female          |
 +
|    child          |
 +
|    girl          |
 +
|      human        |
 +
o-------------------o
 +
 
 +
The Sense function does the extra work that is required
 +
to place the positive labels of the model tree at their
 +
proper level in the outline.
 +
 
 +
In the present Example the Sense function
 +
generates a Sen file that looks like this:
 +
 
 +
Sense Output and
 +
Sen File Example
 +
o-------------------o
 +
| male              |
 +
|  child            |
 +
|  boy            |
 +
|    human          |
 +
| female            |
 +
|  child            |
 +
|  girl            |
 +
|    human          |
 +
o-------------------o
 +
 
 +
The Canon and Sense outlines for this Example illustrate a certain
 +
type of general circumstance that needs to be noted at this point.
 +
Recall the model paths or the feature specifications that were
 +
numbered <2> and <4> in the listing of the output for Tenor.
 +
These paths, in effect, reflected Model's discovery that
 +
the venn diagram cells for male or female non-children
 +
and male or female non-humans were not excluded by
 +
the definitions that were given in the Log file.
 +
In the abstracts given by Canon and Sense, the
 +
specifications <2> and <4> have been subsumed,
 +
or absorbed unmarked, under the general topics
 +
of their respective genders, male or female.
 +
This happens because no purely positive
 +
features were supplied to distinguish
 +
the non-child and non-human cases.
 +
 
 +
That completes the discussion of
 +
this six-dimensional Example.
 +
 
 +
Nota Bene, for possible future use.  In the larger current of work
 +
with respect to which this meander of a conduit was initially both
 +
diversionary and tributary, before those high and dry regensquirm
 +
years when it turned into an intellectual interglacial oxbow lake,
 +
I once had in mind a scape in which expressions in a definitional
 +
lattice were ordered according to their simplicity on some scale
 +
or another, and in this setting the word "sense" was actually an
 +
acronym for "semantically equivalent next-simplest expression".
 +
 
 +
| If this is starting to sound a little bit familiar,
 +
| it may be because the relationship between the two
 +
| kinds of pictures of propositions, namely:
 +
|
 +
| 1.  Propositions about things in general, here,
 +
|    about the times when certain facts are true,
 +
|    having the form of functions f : X -> B,
 +
|
 +
| 2.  Propositions about binary codes, here, about
 +
|    the bit-vector labels on venn diagram cells,
 +
|    having the form of functions f' : B^k -> B,
 +
|
 +
| is an epically old story, one that I, myself,
 +
| have related once or twice upon a time before,
 +
| to wit, at least, at the following two cites:
 +
|
 +
| http://suo.ieee.org/email/msg01251.html
 +
| http://suo.ieee.org/email/msg01293.html
 +
|
 +
| There, and now here, once more, and again, it may be observed
 +
| that the relation is one whereby the proposition f : X -> B,
 +
| the one about things and times and mores in general, factors
 +
| into a coding function c : X -> B^k, followed by a derived
 +
| proposition f' : B^k -> B that judges the resulting codes.
 +
|
 +
|                        f
 +
|                  X o------>o B
 +
|                      \    ^
 +
|  c = <x_1, ..., x_k> \  / f'
 +
|                        v /
 +
|                        o
 +
|                        B^k
 +
|
 +
| You may remember that this was supposed to illustrate
 +
| the "factoring" of a proposition f : X -> B = {0, 1}
 +
| into the composition f'(c(x)), where c : X -> B^k is
 +
| the "coding" of each x in X as an k-bit string in B^k,
 +
| and where f' is the mapping of codes into a co-domain
 +
| that we interpret as t-f-values, B = {0, 1} = {F, T}.
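Here is a toy rendering of that factorization in Python, with made-up objects
and features chosen only for illustration:

    # A hypothetical universe X and two features, used only to show the shape
    # of the factorization f = f' o c.
    X = ["sparrow", "oak", "granite"]

    def living_thing(x): return x in ("sparrow", "oak")
    def animal(x):       return x == "sparrow"

    def c(x):
        # coding c : X -> B^2, one bit per feature
        return (living_thing(x), animal(x))

    def f_prime(bits):
        # derived proposition f' : B^2 -> B, "a living thing that is not an animal"
        living, is_animal = bits
        return living and not is_animal

    def f(x):
        # the applied proposition f : X -> B
        return living_thing(x) and not animal(x)

    assert all(f(x) == f_prime(c(x)) for x in X)   # f factors as f'(c(x))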
 +
 
 +
In short, there is the standard equivocation ("systematic ambiguity"?) as to
 +
whether we are talking about the "applied" and concretely typed proposition
 +
f : X -> B or the "pure" and abstractly typed proposition f' : B^k -> B.
 +
Or we can think of the latter object as the approximate code icon of
 +
the former object.
 +
 
 +
Anyway, these types of formal objects are the sorts of things that
 +
I take to be the denotational objects of propositional expressions.
 +
These objects, along with their invarious and insundry mathematical
 +
properties, are the orders of things that I am talking about when
 +
I refer to the "invariant structures in these objects themselves".
 +
 
 +
"Invariant" means "invariant under a suitable set of transformations",
 +
in this case the translations between various languages that preserve
 +
the objects and the structures in question.  In extremest generality,
 +
this is what universal constructions in category theory are all about.
 +
 
 +
In summation, the functions f : X -> B and f' : B^k -> B have invariant, formal,
 +
mathematical, objective properties that any adequate language might eventually
 +
evolve to express, only some languages express them more obscurely than others.
 +
 
 +
To be perfectly honest, I continue to be surprised that anybody in this group
 +
has trouble with this.  There are perfectly apt and familiar examples in the
 +
contrast between roman numerals and arabic numerals, or the contrast between
 +
redundant syntaxes, like those that use the pentalphabet {~, &, v, =>, <=>},
 +
and trimmer syntaxes, like those used in existential and conceptual graphs.
 +
Every time somebody says "Let's take {~, &, v, =>, <=>} as an operational
 +
basis for logic" it's just like that old joke that mathematicians tell on
 +
engineers where the ingenue in question says "1 is a prime, 2 is a prime,
 +
3 is a prime, 4 is a prime, ..." -- and I know you think that I'm being
 +
hyperbolic, but I'm really only up to parabolas here ...
 +
 
 +
I have already refined my criticism so that it does not apply to
 +
the spirit of FOL or KIF or whatever, but only to the letters of
 +
specific syntactic proposals.  There is a fact of the matter as
 +
to whether a concrete language provides a clean or a cluttered
 +
basis for representing the identified set of formal objects.
 +
And it shows up in pragmatic realities like the efficiency
 +
of real time concept formation, concept use, learnability,
 +
reasoning power, and just plain good use of real time.
 +
These are the dire consequences that I learned in my
 +
very first tries at mathematically oriented theorem
 +
automation, and the only factor that has obscured
 +
them in mainstream work since then is the speed
 +
with which folks can now do all of the same
 +
old dumb things that they used to do on
 +
their way to kludging out the answers.
 +
 
 +
It seems to be darn near impossible to explain to the
 +
centurion all of the neat stuff that he's missing by
 +
sticking to his old roman numerals.  He just keeps
 +
on reckoning that what he can't count must be of
 +
no account at all.  There is way too much stuff
 +
that these original syntaxes keep us from even
 +
beginning to discuss, like differential logic,
 +
just for starters.
 +
 
 +
Our next Example illustrates the use of the Cactus Language
 +
for representing "absolute" and "relative" partitions, also
 +
known as "complete" and "contingent" classifications of the
 +
universe of discourse, all of which amounts to divvying it
 +
up into mutually exclusive regions, exhaustive or not, as
 +
one frequently needs in situations involving a genus and
 +
its sundry species, and frequently pictures in the form
 +
of a venn diagram that looks just like a "pie chart".
 +
 
 +
Example.  Partition, Genus & Species
 +
 
 +
The idea that one needs for expressing partitions
 +
in cactus expressions can be summed up like this:
 +
 
 +
| If the propositional expression
 +
|
 +
| "( p , q , r , ... )"
 +
|
 +
| means that just one of
 +
|
 +
| p, q, r, ... is false,
 +
|
 +
| then the propositional expression
 +
|
 +
| "((p),(q),(r), ... )"
 +
|
 +
| must mean that just one of
 +
|
 +
| (p), (q), (r), ... is false,
 +
|
 +
| in other words, that just one of
 +
|
 +
| p, q, r, ... is true.
 +
 
 +
Thus we have an efficient means to express and to enforce
 +
a partition of the space of models, in effect, to maintain
 +
the condition that a number of features or propositions are
 +
to be held in mutually exclusive and exhaustive disjunction.
 +
This supplies a much needed bridge between the binary domain
 +
of two values and any other domain with a finite number of
 +
feature values.
 +
 
 +
Another variation on this theme allows one to maintain the
 +
subsumption of many separate species under an explicit genus.
 +
To see this, let us examine the following form of expression:
 +
 
 +
( q , ( q_1 ) , ( q_2 ) , ( q_3 ) ).
 +
 
 +
Now consider what it would mean for this to be true.  We see two cases:
 +
 
 +
1.  If the proposition q is true, then exactly one of the
 +
    propositions (q_1), (q_2), (q_3) must be false, and so
 +
    just one of the propositions q_1, q_2, q_3 must be true.
 +
 
 +
2.  If the proposition q is false, then every one of the
 +
    propositions (q_1), (q_2), (q_3) must be true, and so
 +
    each one of the propositions q_1, q_2, q_3 must be false.
 +
    In short, if q is false then all of the other q's are also.
    (Both cases are checked in the sketch below.)
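And here is the promised sketch, a Python truth-table check of my own that
runs through all 16 assignments and confirms the two cases just described:

    from itertools import product

    def just_one_false(*xs):      # ( x_1 , x_2 , ... , x_k )
        return sum(1 for x in xs if not x) == 1

    def genus_species(q, q1, q2, q3):
        # ( q , ( q_1 ) , ( q_2 ) , ( q_3 ) )
        return just_one_false(q, not q1, not q2, not q3)

    for q, q1, q2, q3 in product([False, True], repeat=4):
        # Case 1:  q true  ->  exactly one of q_1, q_2, q_3 true.
        # Case 2:  q false ->  none of q_1, q_2, q_3 true.
        expected = (q and sum([q1, q2, q3]) == 1) or (not q and not any([q1, q2, q3]))
        assert genus_species(q, q1, q2, q3) == expected
    print("The form behaves exactly as described in cases 1 and 2.")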
 +
 
 +
Figures 1 and 2 illustrate this type of situation.
 +
 
 +
Figure 1 is the venn diagram of a 4-dimensional universe of discourse
 +
X = [q, q_1, q_2, q_3], conventionally named after the gang of four
 +
logical features that generate it.  Strictly speaking, X is made up
 +
of two layers, the position space X of abstract type %B%^4, and the
 +
proposition space X^ = (X -> %B%) of abstract type %B%^4 -> %B%,
 +
but it is commonly lawful enough to sign the signature of both
 +
spaces with the same X, and thus to give the power of attorney
 +
for the propositions to the so-indicted position space thereof.
 +
 
 +
Figure 1 also makes use of the convention whereby the regions
 +
or the subsets of the universe of discourse that correspond
 +
to the basic features q, q_1, q_2, q_3 are labelled with
 +
the parallel set of upper case letters Q, Q_1, Q_2, Q_3.
 +
 
 +
|                        o
 +
|                      / \
 +
|                      /  \
 +
|                    /    \
 +
|                    /      \
 +
|                  o        o
 +
|                  /%\      /%\
 +
|                /%%%\    /%%%\
 +
|                /%%%%%\  /%%%%%\
 +
|              /%%%%%%%\ /%%%%%%%\
 +
|              o%%%%%%%%%o%%%%%%%%%o
 +
|            / \%%%%%%%/ \%%%%%%%/ \
 +
|            /  \%%%%%/  \%%%%%/  \
 +
|          /    \%%%/    \%%%/    \
 +
|          /      \%/      \%/      \
 +
|        o        o        o        o
 +
|        / \      /%\      / \      / \
 +
|      /  \    /%%%\    /  \    /  \
 +
|      /    \  /%%%%%\  /    \  /    \
 +
|    /      \ /%%%%%%%\ /      \ /      \
 +
|    o        o%%%%%%%%%o        o        o
 +
|    ·\      / \%%%%%%%/ \      / \      /·
 +
|    · \    /  \%%%%%/  \    /  \    / ·
 +
|    ·  \  /    \%%%/    \  /    \  /  ·
 +
|    ·  \ /      \%/      \ /      \ /  ·
 +
|    ·    o        o        o        o    ·
 +
|    ·    ·\      / \      / \      /·    ·
 +
|    ·    · \    /  \    /  \    / ·    ·
 +
|    ·    ·  \  /    \  /    \  /  ·    ·
 +
|    · Q  ·  \ /      \ /      \ /  ·Q_3 ·
 +
|    ··········o        o        o··········
 +
|        ·    \      /%\      /    ·
 +
|        ·      \    /%%%\    /      ·
 +
|        ·      \  /%%%%%\  /      ·
 +
|        · Q_1    \ /%%%%%%%\ /    Q_2 ·
 +
|        ··········o%%%%%%%%%o··········
 +
|                    \%%%%%%%/
 +
|                    \%%%%%/
 +
|                      \%%%/
 +
|                      \%/
 +
|                        o
 +
|
 +
| Figure 1.  Genus Q and Species Q_1, Q_2, Q_3
 +
 
 +
Figure 2 is another form of venn diagram that one often uses,
 +
where one collapses the unindited cells and leaves only the
 +
models of the proposition in question.  Some people would
 +
call the transformation that changes from the first form
 +
to the next form an operation of "taking the quotient",
 +
but I tend to think of it as the "soap bubble picture"
 +
or more exactly the "wire & thread & soap film" model
 +
of the universe of discourse, where one pops out of
 +
consideration the sections of the soap film that
 +
stretch across the anti-model regions of space.
 +
 
 +
o-------------------------------------------------o
 +
|                                                |
 +
|  X                                              |
 +
|                                                |
 +
|                        o                        |
 +
|                      / \                      |
 +
|                      /  \                      |
 +
|                    /    \                    |
 +
|                    /      \                    |
 +
|                  /        \                  |
 +
|                  o    Q_1    o                  |
 +
|                / \        / \                |
 +
|                /  \      /  \                |
 +
|              /    \    /    \              |
 +
|              /      \  /      \              |
 +
|            /        \ /        \            |
 +
|            /          Q          \            |
 +
|          /            |            \          |
 +
|          /            |            \          |
 +
|        /      Q_2    |    Q_3      \        |
 +
|        /              |              \        |
 +
|      /                |                \      |
 +
|      o-----------------o-----------------o      |
 +
|                                                |
 +
|                                                |
 +
|                                                |
 +
o-------------------------------------------------o
 +
 
 +
Figure 2.  Genus Q and Species Q_1, Q_2, Q_3
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
Example.  Partition, Genus & Species (cont.)
 +
 
 +
Last time we considered in general terms how the forms
 +
of complete partition and contingent partition operate
 +
to maintain mutually disjoint and possibly exhaustive
 +
categories of positions in a universe of discourse.
 +
 
 +
This time we contemplate another concrete Example of
 +
near minimal complexity, designed to demonstrate how
 +
the forms of partition and subsumption can interact
 +
in structuring a space of feature specifications.
 +
 
 +
In this Example, we describe a universe of discourse
 +
in terms of the following vocabulary of five features:
 +
 
 +
| L.  living_thing
 +
|
 +
| N.  non_living
 +
|
 +
| A.  animal
 +
|
 +
| V.  vegetable
 +
|
 +
| M.  mineral
 +
 
 +
Let us construe these features as being subject to four constraints:
 +
 
 +
| 1.  Everything is either a living_thing or non_living, but not both.
 +
|
 +
| 2.  Everything is either animal, vegetable, or mineral,
 +
|    but no two of these together.
 +
|
 +
| 3.  A living_thing is either animal or vegetable, but not both,
 +
|    and everything animal or vegetable is a living_thing.
 +
|
 +
| 4.  Everything mineral is non_living.
 +
 
 +
These notions and constructions are expressed in the Log file shown below:
 +
 
 +
Logical Input File
 +
o-------------------------------------------------o
 +
|                                                |
 +
|  ( living_thing , non_living )                |
 +
|                                                |
 +
|  (( animal ),( vegetable ),( mineral ))        |
 +
|                                                |
 +
|  ( living_thing ,( animal ),( vegetable ))    |
 +
|                                                |
 +
|  ( mineral ( non_living ))                    |
 +
|                                                |
 +
o-------------------------------------------------o
 +
 
 +
The cactus expression in this file is the expression
 +
of a "zeroth order theory" (ZOT), one that can be
 +
paraphrased in more ordinary language to say:
 +
 
 +
Translation
 +
o-------------------------------------------------o
 +
|                                                |
 +
|  living_thing  =/=  non_living                |
 +
|                                                |
 +
|  par : all -> {animal, vegetable, mineral}    |
 +
|                                                |
 +
|  par : living_thing -> {animal, vegetable}    |
 +
|                                                |
 +
|  mineral => non_living                        |
 +
|                                                |
 +
o-------------------------------------------------o
 +
 
 +
Here, "par : all -> {p, q, r}" is short for an assertion
 +
that the universe as a whole is partitioned into subsets
 +
that correspond to the features p, q, r.
 +
 
 +
Also, "par : q -> {r, s}" asserts that "Q partitions into R and S.
 +
 
 +
It is probably enough just to list the outputs of Model, Tenor, and Sense
 +
when run on the preceding Log file.  Using the same format and labeling as
 +
before, we may note that Model has, from 2^5 = 32 possible interpretations,
 +
made 11 evaluations, and found 3 models answering the generic descriptions
 +
that were imposed by the logical input file.
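As an independent cross-check, my own and outside of Theme One, a brute-force
pass over the 32 interpretations in Python confirms that exactly three of them
satisfy the four constraints in the Log file:

    from itertools import product

    def just_one_false(*xs):      # ( x_1 , ... , x_k )
        return sum(1 for x in xs if not x) == 1

    def just_one_true(*xs):       # (( x_1 ), ... ,( x_k ))
        return just_one_false(*(not x for x in xs))

    models = []
    for L, N, A, V, M in product([False, True], repeat=5):
        if (just_one_false(L, N)                     # ( living_thing , non_living )
                and just_one_true(A, V, M)           # (( animal ),( vegetable ),( mineral ))
                and just_one_false(L, not A, not V)  # ( living_thing ,( animal ),( vegetable ))
                and not (M and not N)):              # ( mineral ( non_living ))
            models.append((L, N, A, V, M))

    assert len(models) == 3   # the three starred models in the outlines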
 +
 
 +
Model Outline
 +
o------------------------o
 +
| living_thing          |
 +
|  non_living -          |  1
 +
|  (non_living )        |
 +
|  mineral -            |  2
 +
|  (mineral )          |
 +
|    animal              |
 +
|    vegetable -        |  3
 +
|    (vegetable ) *    |  4 *
 +
|    (animal )          |
 +
|    vegetable *        |  5 *
 +
|    (vegetable ) -    |  6
 +
| (living_thing )        |
 +
|  non_living            |
 +
|  animal -            |  7
 +
|  (animal )            |
 +
|    vegetable -        |  8
 +
|    (vegetable )        |
 +
|    mineral *          |  9 *
 +
|    (mineral ) -      | 10
 +
|  (non_living ) -      | 11
 +
o------------------------o
 +
 
 +
Tenor Outline
 +
o------------------------o
 +
| living_thing          |
 +
|  (non_living )        |
 +
|  (mineral )          |
 +
|    animal              |
 +
|    (vegetable ) *    | <1>
 +
|    (animal )          |
 +
|    vegetable *        | <2>
 +
| (living_thing )        |
 +
|  non_living            |
 +
|  (animal )            |
 +
|    (vegetable )        |
 +
|    mineral *          | <3>
 +
o------------------------o
 +
 
 +
Sense Outline
 +
o------------------------o
 +
| living_thing          |
 +
|  animal                |
 +
|  vegetable            |
 +
| non_living            |
 +
|  mineral              |
 +
o------------------------o
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
Example.  Molly's World
 +
 
 +
I think that we are finally ready to tackle a more respectable example.
 +
The Example known as "Molly's World" is borrowed from the literature on
 +
computational learning theory, adapted with a few changes from the example
 +
called "Molly’s Problem" in the paper "Learning With Hints" by Dana Angluin.
 +
By way of setting up the problem, I quote Angluin's motivational description:
 +
 
 +
| Imagine that you have become acquainted with an alien named Molly from the
 +
| planet Ornot, who is currently employed in a day-care center.  She is quite
 +
| good at propositional logic, but a bit weak on knowledge of Earth.  So you
 +
| decide to formulate the beginnings of a propositional theory to help her
 +
| label things in her immediate environment.
 +
|
 +
| Angluin, Dana, "Learning With Hints", pages 167-181, in:
 +
| David Haussler & Leonard Pitt (eds.), 'Proceedings of the 1988 Workshop
 +
| on Computational Learning Theory', Morgan Kaufmann, San Mateo, CA, 1989.
 +
 
 +
The purpose of this quaint pretext is, of course, to make sure that the
 +
reader appreciates the constraints of the problem:  that no extra savvy
 +
is fair, all facts must be presumed or deduced on the immediate premises.
 +
 
 +
My use of this example is not directly relevant to the purposes of the
 +
discussion from which it is taken, so I simply give my version of it
 +
without comment on those issues.
 +
 
 +
Here is my rendition of the initial knowledge base delimiting Molly’s World:
 +
 
 +
Logical Input File:  Molly.Log
 +
o---------------------------------------------------------------------o
 +
|                                                                    |
 +
| ( object ,( toy ),( vehicle ))                                      |
 +
| (( small_size ),( medium_size ),( large_size ))                    |
 +
| (( two_wheels ),( three_wheels ),( four_wheels ))                  |
 +
| (( no_seat ),( one_seat ),( few_seats ),( many_seats ))            |
 +
| ( object ,( scooter ),( bike ),( trike ),( car ),( bus ),( wagon )) |
 +
| ( two_wheels    no_seat            ,( scooter ))                    |
 +
| ( two_wheels    one_seat    pedals ,( bike ))                      |
 +
| ( three_wheels  one_seat    pedals ,( trike ))                      |
 +
| ( four_wheels  few_seats  doors  ,( car ))                        |
 +
| ( four_wheels  many_seats  doors  ,( bus ))                        |
 +
| ( four_wheels  no_seat    handle ,( wagon ))                      |
 +
| ( scooter          ( toy  small_size ))                            |
 +
| ( wagon            ( toy  small_size ))                            |
 +
| ( trike            ( toy  small_size ))                            |
 +
| ( bike  small_size  ( toy ))                                        |
 +
| ( bike  medium_size ( vehicle ))                                    |
 +
| ( bike  large_size  )                                              |
 +
| ( car              ( vehicle  large_size ))                        |
 +
| ( bus              ( vehicle  large_size ))                        |
 +
| ( toy              ( object ))                                    |
 +
| ( vehicle          ( object ))                                    |
 +
|                                                                    |
 +
o---------------------------------------------------------------------o
 +
 
 +
All of the logical forms that are used in the preceding Log file
 +
will probably be familiar from earlier discussions.  The purpose
 +
of one or two constructions may, however, be a little obscure,
 +
so I will insert a few words of additional explanation here:
 +
 
 +
The rule "( bike large_size )", for example, merely
 +
says that nothing can be both a bike and large_size.
 +
 
 +
The rule "( three_wheels one_seat pedals ,( trike ))" says that anything
 +
with all the features of three_wheels, one_seat, and pedals is excluded
 +
from being anything but a trike.  In short, anything with just those
 +
three features is equivalent to a trike.
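A quick truth-table check in Python, my own gloss on the rule and not anything
from the Log file machinery, confirms that reading:

    from itertools import product

    def rule(three_wheels, one_seat, pedals, trike):
        # ( three_wheels  one_seat  pedals ,( trike )) : just one branch false
        branches = [three_wheels and one_seat and pedals, not trike]
        return sum(1 for b in branches if not b) == 1

    for w, s, p, t in product([False, True], repeat=4):
        assert rule(w, s, p, t) == ((w and s and p) == t)
    print("The rule holds exactly when the three features are equivalent to trike.")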
 +
 
 +
Recall that the form "( p , q )" may be interpreted to assert either
 +
the exclusive disjunction or the logical inequivalence of p and q.
 +
 
 +
The rules have been stated in this particular way simply
 +
to imitate the style of rules in the reference example.
 +
 
 +
This last point does bring up an important issue, the question
 +
of "rhetorical" differences in expression and their potential
 +
impact on the "pragmatics" of computation.  Unfortunately,
 +
I will have to abbreviate my discussion of this topic for
 +
now, and only mention in passing the following facts.
 +
 
 +
Logically equivalent expressions, even though they must lead
 +
to logically equivalent normal forms, may have very different
 +
characteristics when it comes to the efficiency of processing.
 +
 
 +
For instance, consider the following four forms:
 +
 
 +
| 1.  (( p , q ))
 +
|
 +
| 2.  ( p ,( q ))
 +
|
 +
| 3.  (( p ), q )
 +
|
 +
| 4.  ((( p ),( q )))
 +
 
 +
All of these are equally succinct ways of maintaining that
 +
p is logically equivalent to q, yet each can have different
 +
effects on the route that Model takes to arrive at an answer.
 +
Apparently, some equalities are more equal than others.
 +
 
 +
These effects occur partly because the algorithm chooses to make cases
 +
of variables on a basis of leftmost shallowest first, but their impact
 +
can be complicated by the interactions that each expression has with
 +
the context that it occupies.  The main lesson to take away from all
 +
of this, at least, for the time being, is that it is probably better
 +
not to bother too much about these problems, but just to experiment
 +
with different ways of expressing equivalent pieces of information
 +
until you get a sense of what works best in various situations.
 +
 
 +
I think that you will be happy to see only the
 +
ultimate Sense of Molly’s World, so here it is:
 +
 
 +
Sense Outline:  Molly.Sen
 +
o------------------------o
 +
| object                |
 +
|  two_wheels            |
 +
|  no_seat              |
 +
|    scooter            |
 +
|    toy                |
 +
|      small_size        |
 +
|  one_seat            |
 +
|    pedals              |
 +
|    bike              |
 +
|      small_size        |
 +
|      toy              |
 +
|      medium_size      |
 +
|      vehicle          |
 +
|  three_wheels          |
 +
|  one_seat            |
 +
|    pedals              |
 +
|    trike              |
 +
|      toy              |
 +
|      small_size      |
 +
|  four_wheels          |
 +
|  few_seats            |
 +
|    doors              |
 +
|    car                |
 +
|      vehicle          |
 +
|      large_size      |
 +
|  many_seats          |
 +
|    doors              |
 +
|    bus                |
 +
|      vehicle          |
 +
|      large_size      |
 +
|  no_seat              |
 +
|    handle              |
 +
|    wagon              |
 +
|      toy              |
 +
|      small_size      |
 +
o------------------------o
 +
 
 +
This outline is not the Sense of the unconstrained Log file,
 +
but the result of running Model with a query on the single
 +
feature "object".  Using this focus helps the Modeler
 +
to make more relevant Sense of Molly’s World.
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
DM = Douglas McDavid
 +
 
 +
DM: This, again, is an example of how real issues of ontology are
 +
    so often trivialized at the expense of technicalities.  I just
 +
    had a burger, some fries, and a Coke.  I would say all that was
 +
    non-living and non-mineral.  A virus, I believe is non-animal,
 +
    non-vegetable, but living (and non-mineral).  Teeth, shells,
 +
    and bones are virtually pure mineral, but living.  These are
 +
    the kinds of issues that are truly "ontological," in my
 +
    opinion.  You are not the only one to push them into
 +
    the background as of lesser importance.  See the
 +
    discussion of "18-wheelers" in John Sowa's book.
 +
 
 +
it's not my example, and from what you say, it's not your example either.
 +
copied it out of a book or a paper somewhere, too long ago to remember.
 +
i am assuming that the author or tradition from which it came must have
 +
seen some kind of sense in it.  tell you what, write out your own theory
 +
of "what is" in so many variables, more or less, publish it in a book or
 +
a paper, and then folks will tell you that they dispute each and every
 +
thing that you have just said, and it won't really matter all that much
 +
how complex it is or how subtle you are.  that has been the way of all
 +
ontology for about as long as anybody can remember or even read about.
 +
me?  i don't have sufficient arrogance to be an ontologist, and you
 +
know that's saying a lot, as i can't even imagine a way to convince
 +
myself that i believe i know "what is", really and truly for sure
 +
like some folks just seem to do.  so i am working to improve our
 +
technical ability to do logic, which is mostly a job of shooting
 +
down the more serious delusions that we often get ourselves into.
 +
can i be of any use to ontologists?  i dunno.  i guess it depends
 +
on how badly they are attached to some of the delusions of knowing
 +
what their "common" sense tells them everybody ought to already know,
 +
but that every attempt to check that out in detail tells them it just
 +
ain't so.  a problem for which denial was just begging to be invented,
 +
and so it was.
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
Example.  Molly's World (cont.)
 +
 
 +
In preparation for a contingently possible future discussion,
 +
I need to attach a few parting thoughts to the case workup
 +
of Molly's World that may not seem terribly relevant to
 +
the present setting, but whose pertinence I hope will
 +
become clearer in time.
 +
 
 +
The logical paradigm from which this Example was derived is that
 +
of "Zeroth Order Horn Clause Theories".  The clauses at issue
 +
in these theories are allowed to be of just three kinds:
 +
 
 +
| 1.  p & q & r & ... => z
 +
|
 +
| 2.  z
 +
|
 +
| 3.  ~[p & q & r & ...]
 +
 
 +
Here, the proposition letters "p", "q", "r", ..., "z"
 +
are restricted to being single positive features, not
 +
themselves negated or otherwise complex expressions.
 +
 
 +
In the Cactus Language or Existential Graph syntax
 +
these forms would take on the following appearances:
 +
 
 +
| 1.  ( p q r ... ( z ))
 +
|
 +
| 2.    z
 +
|
 +
| 3.  ( p q r ... )
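By way of illustration, and strictly as my own sketch rather than anything in
Theme One, a small Python helper can render each of the three clause shapes
into the cactus syntax just shown:

    def horn_to_cactus(premises, conclusion=None):
        # premises is a list of positive feature names, conclusion a single one
        body = " ".join(premises)
        if conclusion is None:                 # 3.  ~[p & q & r & ...]
            return f"( {body} )"
        if not premises:                       # 2.  z
            return conclusion
        return f"( {body} ( {conclusion} ))"   # 1.  p & q & r & ... => z

    print(horn_to_cactus(["p", "q", "r"], "z"))   # ( p q r ( z ))
    print(horn_to_cactus([], "z"))                # z
    print(horn_to_cactus(["p", "q", "r"]))        # ( p q r )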
 +
 
 +
The style of deduction in Horn clause logics is essentially
 +
proof-theoretic in character, with the main burden of proof
 +
falling on implication relations ("=>") and on "projective"
 +
forms of inference, that is, information-losing inferences
 +
like modus ponens and resolution.  Cf. [Llo], [MaW].
 +
 
 +
In contrast, the method used here is substantially model-theoretic,
 +
the stress being to start from more general forms of expression for
 +
laying out facts (for example, distinctions, equations, partitions)
 +
and to work toward results that maintain logical equivalence with
 +
their origins.
 +
 
 +
What all of this has to do with the output above is this:
 +
From the perspective that is adopted in the present work,
 +
almost any theory, for example, the one that is founded
 +
on the postulates of Molly's World, will have far more
 +
models than the implicational and inferential mode of
 +
reasoning is designed to discover.  We will be forced
 +
to confront them, however, if we try to run Model on
 +
a large set of implications.
 +
 
 +
The typical Horn clause interpreter gets around this
 +
difficulty only by a stratagem that takes clauses to
 +
mean something other than what they say, that is, by
 +
distorting the principles of semantics in practice.
 +
Our Model, on the other hand, has no such finesse.
 +
 
 +
This explains why it was necessary to impose the
 +
prerequisite "object" constraint on the Log file
 +
for Molly's World.  It supplied no more than what
 +
we usually take for granted, in order to obtain
 +
a set of models that we would normally think of
 +
as being the intended import of the definitions.
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
Example.  Jets & Sharks
 +
 
 +
The propositional calculus based on the boundary operator, that is,
 +
the multigrade logical connective of the form "( , , , ... )" can be
 +
interpreted in a way that resembles the logic of activation states and
 +
competition constraints in certain neural network models.  One way to do
 +
this is by interpreting the blank or unmarked state as the resting state
 +
of a neural pool, the bound or marked state as its activated state, and
 +
by representing a mutually inhibitory pool of neurons p, q, r by means
 +
of the expression "( p , q , r )".  To illustrate this possibility,
 +
I transcribe into cactus language expressions a notorious example
 +
from the "parallel distributed processing" (PDP) paradigm [McR]
 +
and work through two of the associated exercises as portrayed
 +
in this format.
 +
 
 +
Logical Input File:  JAS  =  ZOT(Jets And Sharks)
 +
o----------------------------------------------------------------o
 +
|                                                                |
 +
|  (( art    ),( al  ),( sam  ),( clyde ),( mike  ),            |
 +
|  ( jim    ),( greg ),( john ),( doug  ),( lance ),            |
 +
|  ( george ),( pete ),( fred ),( gene  ),( ralph ),            |
 +
|  ( phil  ),( ike  ),( nick ),( don  ),( ned  ),( karl ),  |
 +
|  ( ken    ),( earl ),( rick ),( ol    ),( neal  ),( dave ))  |
 +
|                                                                |
 +
|  ( jets , sharks )                                            |
 +
|                                                                |
 +
|  ( jets ,                                                      |
 +
|    ( art    ),( al  ),( sam  ),( clyde ),( mike  ),          |
 +
|    ( jim    ),( greg ),( john ),( doug  ),( lance ),          |
 +
|    ( george ),( pete ),( fred ),( gene  ),( ralph ))          |
 +
|                                                                |
 +
|  ( sharks ,                                                    |
 +
|    ( phil ),( ike  ),( nick ),( don ),( ned  ),( karl ),      |
 +
|    ( ken  ),( earl ),( rick ),( ol  ),( neal ),( dave ))      |
 +
|                                                                |
 +
|  (( 20's ),( 30's ),( 40's ))                                  |
 +
|                                                                |
 +
|  ( 20's ,                                                      |
 +
|    ( sam    ),( jim  ),( greg ),( john ),( lance ),            |
 +
|    ( george ),( pete ),( fred ),( gene ),( ken  ))            |
 +
|                                                                |
 +
|  ( 30's ,                                                      |
 +
|    ( al  ),( mike ),( doug ),( ralph ),                      |
 +
|    ( phil ),( ike  ),( nick ),( don  ),                      |
 +
|    ( ned  ),( rick ),( ol  ),( neal  ),( dave ))              |
 +
|                                                                |
 +
|  ( 40's ,                                                      |
 +
|    ( art ),( clyde ),( karl ),( earl ))                        |
 +
|                                                                |
 +
|  (( junior_high ),( high_school ),( college ))                |
 +
|                                                                |
 +
|  ( junior_high ,                                              |
 +
|    ( art  ),( al    ),( clyde  ),( mike  ),( jim ),            |
 +
|    ( john ),( lance ),( george ),( ralph ),( ike ))            |
 +
|                                                                |
 +
|  ( high_school ,                                              |
 +
|    ( greg ),( doug ),( pete ),( fred ),( nick ),              |
 +
|    ( karl ),( ken  ),( earl ),( rick ),( neal ),( dave ))      |
 +
|                                                                |
 +
|  ( college ,                                                  |
 +
|    ( sam ),( gene ),( phil ),( don ),( ned ),( ol ))          |
 +
|                                                                |
 +
|  (( single ),( married ),( divorced ))                        |
 +
|                                                                |
 +
|  ( single ,                                                    |
 +
|    ( art  ),( sam  ),( clyde ),( mike ),                      |
 +
|    ( doug  ),( pete ),( fred  ),( gene ),                      |
 +
|    ( ralph ),( ike  ),( nick  ),( ken  ),( neal ))            |
 +
|                                                                |
 +
|  ( married ,                                                  |
 +
|    ( al  ),( greg ),( john ),( lance ),( phil ),              |
 +
|    ( don ),( ned  ),( karl ),( earl  ),( ol  ))              |
 +
|                                                                |
 +
|  ( divorced ,                                                  |
 +
|    ( jim ),( george ),( rick ),( dave ))                      |
 +
|                                                                |
 +
|  (( bookie ),( burglar ),( pusher ))                          |
 +
|                                                                |
 +
|  ( bookie ,                                                    |
 +
|    ( sam  ),( clyde ),( mike ),( doug ),                      |
 +
|    ( pete ),( ike  ),( ned  ),( karl ),( neal ))              |
 +
|                                                                |
 +
|  ( burglar ,                                                  |
 +
|    ( al    ),( jim ),( john ),( lance ),                      |
 +
|    ( george ),( don ),( ken  ),( earl  ),( rick ))            |
 +
|                                                                |
 +
|  ( pusher ,                                                    |
 +
|    ( art  ),( greg ),( fred ),( gene ),                      |
 +
|    ( ralph ),( phil ),( nick ),( ol  ),( dave ))              |
 +
|                                                                |
 +
o----------------------------------------------------------------o
 +
 
 +
We now apply Study to the proposition that
 +
defines the Jets and Sharks knowledge base,
 +
that is to say, the knowledge that we are
 +
given about the Jets and Sharks, not the
 +
knowledge that the Jets and Sharks have.
 +
 
 +
With a query on the name "ken" we obtain the following
 +
output, giving all of the features associated with Ken:
 +
 
 +
Sense Outline:  JAS & Ken
 +
o---------------------------------------o
 +
| ken                                  |
 +
|  sharks                              |
 +
|  20's                                |
 +
|    high_school                        |
 +
|    single                            |
 +
|      burglar                          |
 +
o---------------------------------------o
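
As a rough, independent cross-check on that outline (this is not how the
Study function computes it), one can read the five membership clauses of
JAS that mention Ken directly as sets and ask which groups contain him:

# Only the clauses of the JAS log file that mention "ken" are transcribed here.
groups = {
    "sharks": {"phil", "ike", "nick", "don", "ned", "karl",
               "ken", "earl", "rick", "ol", "neal", "dave"},
    "20's": {"sam", "jim", "greg", "john", "lance",
             "george", "pete", "fred", "gene", "ken"},
    "high_school": {"greg", "doug", "pete", "fred", "nick",
                    "karl", "ken", "earl", "rick", "neal", "dave"},
    "single": {"art", "sam", "clyde", "mike", "doug", "pete", "fred",
               "gene", "ralph", "ike", "nick", "ken", "neal"},
    "burglar": {"al", "jim", "john", "lance", "george",
                "don", "ken", "earl", "rick"},
}

print([group for group, members in groups.items() if "ken" in members])
# expected: ['sharks', "20's", 'high_school', 'single', 'burglar']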
 +
 
 +
With a query on the two features "college" and "sharks"
 +
we obtain the following outline of all of the features
 +
that satisfy these constraints:
 +
 
 +
Sense Outline:  JAS & College & Sharks
 +
o---------------------------------------o
 +
| college                              |
 +
|  sharks                              |
 +
|  30's                                |
 +
|    married                            |
 +
|    bookie                            |
 +
|      ned                              |
 +
|    burglar                          |
 +
|      don                              |
 +
|    pusher                            |
 +
|      phil                            |
 +
|      ol                              |
 +
o---------------------------------------o
 +
 
 +
From this we discover that all college Sharks
 +
are 30-something and married.  Furthermore,
 +
we have a complete listing of their names
 +
broken down by occupation, as I have no
 +
doubt that all of them will be in time.
 +
 
 +
| Reference:
 +
|
 +
| McClelland, James L. & Rumelhart, David E.,
 +
|'Explorations in Parallel Distributed Processing:
 +
| A Handbook of Models, Programs, and Exercises',
 +
| MIT Press, Cambridge, MA, 1988.
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
One of the issues that has kept me pondering, weak and weary,
 +
and burning not a few barrels of midnight oil
 +
over the past eleventeen years or so is the relationship
 +
among divers and sundry "styles of inference", by which
 +
I mean particular choices of inference paradigms, rules,
 +
or schemata.  The chief breakpoint seems to lie between
 +
information-losing and information-maintaining modes of
 +
inference, also called "implicational" and "equational",
 +
or "projective" and "preservative" brands, respectively.
 +
 
 +
Since it appears to be mostly the implicational and projective
 +
styles of inference that are more familiar to folks hereabouts,
 +
I will start off this subdiscussion by introducing a number of
 +
risibly simple but reasonably manageable examples of the other
 +
brand of inference, treated as equational reasoning approaches
 +
to problems about satisfying "zeroth order constraints" (ZOC's).
 +
 
 +
Applications of a Propositional Calculator:
 +
Constraint Satisfaction Problems.
 +
Jon Awbrey, April 24, 1995.
 +
 
 +
The Four Houses Puzzle
 +
 
 +
Constructed on the model of the "Five Houses Puzzle" in [VaH, 132-136].
 +
 
 +
Problem Statement.  Four people with different nationalities live in the
 +
first four houses of a street.  They practice four distinct professions,
 +
and each of them has a favorite animal, all of them different.  The four
 +
houses are painted different colors.  The following facts are known:
 +
 
 +
|  1.  The Englander lives in the first house on the left.
 +
|  2.  The doctor lives in the second house.
 +
|  3.  The third house is painted red.
 +
|  4.  The zebra is a favorite in the fourth house.
 +
|  5.  The person in the first house has a dog.
 +
|  6.  The Japanese lives in the third house.
 +
|  7.  The red house is on the left of the yellow one.
 +
|  8.  They breed snails in the house to the right of the doctor.
 +
|  9.  The Englander lives next to the green house.
 +
| 10.  The fox is in the house next to the diplomat.
 +
| 11.  The Spaniard likes zebras.
 +
| 12.  The Japanese is a painter.
 +
| 13.  The Italian lives in the green house.
 +
| 14.  The violinist lives in the yellow house.
 +
| 15.  The dog is a pet in the blue house.
 +
| 16.  The doctor keeps a fox.
 +
 
 +
The problem is to find all of the assignments of
 +
features to houses that satisfy these requirements.
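
Before turning to the ZOT encoding, here is a brute-force sketch in Python
(my own illustration, not the Theme One machinery) that checks the sixteen
facts directly over all ways of assigning nationalities, colors, professions,
and animals to the four houses; its single surviving assignment should agree
with Table 1 further below.

from itertools import permutations

NATIONS = ("england", "spain", "japan", "italy")
COLORS  = ("green", "red", "yellow", "blue")
JOBS    = ("painter", "diplomat", "violinist", "doctor")
PETS    = ("dog", "zebra", "fox", "snails")

def posn(row, item):
    """House number (1..4) carrying the given feature."""
    return row.index(item) + 1

for nation in permutations(NATIONS):
    if posn(nation, "england") != 1 or posn(nation, "japan") != 3:        # 1, 6
        continue
    for color in permutations(COLORS):
        if (posn(color, "red") != 3                                       # 3
                or posn(color, "red") + 1 != posn(color, "yellow")        # 7
                or abs(posn(nation, "england") - posn(color, "green")) != 1   # 9
                or posn(nation, "italy") != posn(color, "green")):        # 13
            continue
        for job in permutations(JOBS):
            if (posn(job, "doctor") != 2                                  # 2
                    or posn(job, "painter") != posn(nation, "japan")      # 12
                    or posn(job, "violinist") != posn(color, "yellow")):  # 14
                continue
            for pet in permutations(PETS):
                if (posn(pet, "zebra") != 4                               # 4
                        or posn(pet, "dog") != 1                          # 5
                        or posn(pet, "snails") != posn(job, "doctor") + 1           # 8
                        or abs(posn(pet, "fox") - posn(job, "diplomat")) != 1       # 10
                        or posn(nation, "spain") != posn(pet, "zebra")              # 11
                        or posn(pet, "dog") != posn(color, "blue")                  # 15
                        or posn(job, "doctor") != posn(pet, "fox")):                # 16
                    continue
                for house in range(4):
                    print(house + 1, nation[house], color[house],
                          job[house], pet[house])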
 +
 
 +
Logical Input File:  House^4.Log
 +
o---------------------------------------------------------------------o
 +
|                                                                    |
 +
|  eng_1  doc_2  red_3  zeb_4  dog_1  jap_3                          |
 +
|                                                                    |
 +
|  (( red_1  yel_2 ),( red_2  yel_3 ),( red_3  yel_4 ))              |
 +
|  (( doc_1  sna_2 ),( doc_2  sna_3 ),( doc_3  sna_4 ))              |
 +
|                                                                    |
 +
|  (( eng_1  gre_2 ),                                                |
 +
|  ( eng_2  gre_3 ),( eng_2  gre_1 ),                                |
 +
|  ( eng_3  gre_4 ),( eng_3  gre_2 ),                                |
 +
|                    ( eng_4  gre_3 ))                                |
 +
|                                                                    |
 +
|  (( dip_1  fox_2 ),                                                |
 +
|  ( dip_2  fox_3 ),( dip_2  fox_1 ),                                |
 +
|  ( dip_3  fox_4 ),( dip_3  fox_2 ),                                |
 +
|                    ( dip_4  fox_3 ))                                |
 +
|                                                                    |
 +
|  (( spa_1 zeb_1 ),( spa_2 zeb_2 ),( spa_3 zeb_3 ),( spa_4 zeb_4 ))  |
 +
|  (( jap_1 pai_1 ),( jap_2 pai_2 ),( jap_3 pai_3 ),( jap_4 pai_4 ))  |
 +
|  (( ita_1 gre_1 ),( ita_2 gre_2 ),( ita_3 gre_3 ),( ita_4 gre_4 ))  |
 +
|                                                                    |
 +
|  (( yel_1 vio_1 ),( yel_2 vio_2 ),( yel_3 vio_3 ),( yel_4 vio_4 ))  |
 +
|  (( blu_1 dog_1 ),( blu_2 dog_2 ),( blu_3 dog_3 ),( blu_4 dog_4 ))  |
 +
|                                                                    |
 +
|  (( doc_1 fox_1 ),( doc_2 fox_2 ),( doc_3 fox_3 ),( doc_4 fox_4 ))  |
 +
|                                                                    |
 +
|  ((                                                                |
 +
|                                                                    |
 +
|  (( eng_1 ),( eng_2 ),( eng_3 ),( eng_4 ))                          |
 +
|  (( spa_1 ),( spa_2 ),( spa_3 ),( spa_4 ))                          |
 +
|  (( jap_1 ),( jap_2 ),( jap_3 ),( jap_4 ))                          |
 +
|  (( ita_1 ),( ita_2 ),( ita_3 ),( ita_4 ))                          |
 +
|                                                                    |
 +
|  (( eng_1 ),( spa_1 ),( jap_1 ),( ita_1 ))                          |
 +
|  (( eng_2 ),( spa_2 ),( jap_2 ),( ita_2 ))                          |
 +
|  (( eng_3 ),( spa_3 ),( jap_3 ),( ita_3 ))                          |
 +
|  (( eng_4 ),( spa_4 ),( jap_4 ),( ita_4 ))                          |
 +
|                                                                    |
 +
|  (( gre_1 ),( gre_2 ),( gre_3 ),( gre_4 ))                          |
 +
|  (( red_1 ),( red_2 ),( red_3 ),( red_4 ))                          |
 +
|  (( yel_1 ),( yel_2 ),( yel_3 ),( yel_4 ))                          |
 +
|  (( blu_1 ),( blu_2 ),( blu_3 ),( blu_4 ))                          |
 +
|                                                                    |
 +
|  (( gre_1 ),( red_1 ),( yel_1 ),( blu_1 ))                          |
 +
|  (( gre_2 ),( red_2 ),( yel_2 ),( blu_2 ))                          |
 +
|  (( gre_3 ),( red_3 ),( yel_3 ),( blu_3 ))                          |
 +
|  (( gre_4 ),( red_4 ),( yel_4 ),( blu_4 ))                          |
 +
|                                                                    |
 +
|  (( pai_1 ),( pai_2 ),( pai_3 ),( pai_4 ))                          |
 +
|  (( dip_1 ),( dip_2 ),( dip_3 ),( dip_4 ))                          |
 +
|  (( vio_1 ),( vio_2 ),( vio_3 ),( vio_4 ))                          |
 +
|  (( doc_1 ),( doc_2 ),( doc_3 ),( doc_4 ))                          |
 +
|                                                                    |
 +
|  (( pai_1 ),( dip_1 ),( vio_1 ),( doc_1 ))                          |
 +
|  (( pai_2 ),( dip_2 ),( vio_2 ),( doc_2 ))                          |
 +
|  (( pai_3 ),( dip_3 ),( vio_3 ),( doc_3 ))                          |
 +
|  (( pai_4 ),( dip_4 ),( vio_4 ),( doc_4 ))                          |
 +
|                                                                    |
 +
|  (( dog_1 ),( dog_2 ),( dog_3 ),( dog_4 ))                          |
 +
|  (( zeb_1 ),( zeb_2 ),( zeb_3 ),( zeb_4 ))                          |
 +
|  (( fox_1 ),( fox_2 ),( fox_3 ),( fox_4 ))                          |
 +
|  (( sna_1 ),( sna_2 ),( sna_3 ),( sna_4 ))                          |
 +
|                                                                    |
 +
|  (( dog_1 ),( zeb_1 ),( fox_1 ),( sna_1 ))                          |
 +
|  (( dog_2 ),( zeb_2 ),( fox_2 ),( sna_2 ))                          |
 +
|  (( dog_3 ),( zeb_3 ),( fox_3 ),( sna_3 ))                          |
 +
|  (( dog_4 ),( zeb_4 ),( fox_4 ),( sna_4 ))                          |
 +
|                                                                    |
 +
|  ))                                                                |
 +
|                                                                    |
 +
o---------------------------------------------------------------------o
 +
 
 +
Sense Outline:  House^4.Sen
 +
o-----------------------------o
 +
| eng_1                      |
 +
|  doc_2                      |
 +
|  red_3                    |
 +
|    zeb_4                    |
 +
|    dog_1                  |
 +
|      jap_3                  |
 +
|      yel_4                |
 +
|        sna_3                |
 +
|        gre_2              |
 +
|          dip_1              |
 +
|          fox_2            |
 +
|            spa_4            |
 +
|            pai_3          |
 +
|              ita_2          |
 +
|              vio_4        |
 +
|                blu_1        |
 +
o-----------------------------o
 +
 
 +
Table 1.  Solution to the Four Houses Puzzle
 +
o------------o------------o------------o------------o------------o
 +
|            | House 1    | House 2    | House 3    | House 4    |
 +
o------------o------------o------------o------------o------------o
 +
| Nation    | England    | Italy      | Japan      | Spain      |
 +
| Color      | blue      | green      | red        | yellow    |
 +
| Profession | diplomat  | doctor    | painter    | violinist  |
 +
| Animal    | dog        | fox        | snails    | zebra      |
 +
o------------o------------o------------o------------o------------o
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
First off, I do not trivialize the "real issues of ontology", indeed,
 +
it is precisely my estimate of the non-trivial difficulty of this task,
 +
of formulating the types of "generic ontology" that we propose to do here,
 +
that forces me to choose and to point out the inescapability of the approach
 +
that I am currently taking, which is to enter on the necessary preliminary of
 +
building up the logical tools that we need to tackle the ontology task proper.
 +
And I would say, to the contrary, that it is those who think we can arrive at
 +
a working general ontology by sitting on the porch shooting the breeze about
 +
"what it is" until the cows come home -- that is, the method for which it
 +
has become cliche to indict the Ancient Greeks, though, if truth be told,
 +
we'd have to look to the pre-socratics and the pre-stoics to find a good
 +
match for the kinds of revelation that are common hereabouts -- I would
 +
say that it's those folks who trivialize the "real issues of ontology".
 +
 
 +
A person, living in our times, who is serious about knowing the being of things,
 +
really only has one choice -- to pick what tiny domain of things he or she just
 +
has to know about the most, thence to hie away to the adept gurus of the matter
 +
in question, forgetting the rest, because "general ontology" is a no-go these days.
 +
It is presently in a state like astronomy before telescopes, and that means not
 +
entirely able to distinguish itself from astrology and other psychically projective
 +
exercises of wishful and dreadful thinking like that.
 +
 
 +
So I am busy grinding lenses ...
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
DM = Douglas McDavid
 +
 
 +
DM: Thanks for both the original and additional response.  I'm not trying to
 +
    single you out, as I have been picking  on various postings in a similar
 +
    manner ever since I started contributing to this discussion.  I agree with
 +
    you that the task of this working group is non-trivially difficult.  In fact,
 +
    I believe we are still a long way from a clear and useful agreement about what
 +
    constitutes "upper" ontology, and what it would mean to standardize it.  However,
 +
    I don't agree that the only place to make progress is in tiny domains of things.
 +
    I've contributed the thought that a fundamental, upper-level concept is the
 +
    concept of system, and that that would be a good place to begin.  And I'll
 +
    never be able to refrain from evaluating the content as well as the form
 +
    of any examples presented for consideration here.  Probably should
 +
    accompany these comments with a ;-)
 +
 
 +
There will never be a standard universal ontology
 +
of the absolute essential imperturbable monolithic
 +
variety that some people still dream of in their
 +
fantasies of spectating on and speculating about
 +
a pre-relativistically non-participatory universe
 +
from their singular but isolated god's-eye views.
 +
The bells tolled for that one many years ago,
 +
but some of the more blithe of the blissful
 +
islanders have just not gotten the news yet.
 +
 
 +
But there is still a lot to do that would be useful
 +
under the banner of a "standard upper ontology",
 +
if only we stay loose in our interpretation
 +
of what that implies in practical terms.
 +
 
 +
One likely approach to the problem would be to take
 +
a hint from the afore-allusioned history of physics --
 +
to inquire for whom, else, the bell tolls -- and to
 +
see if there are any bits of wisdom from that prior
 +
round of collective experience that can be adapted
 +
by dint of analogy to our present predicament.
 +
I happen to think that there are.
 +
 
 +
And there the answer was, not to try and force a return,
 +
though lord knows they all gave it their very best shot,
 +
to an absolute and imperturbable framework of existence,
 +
but to see the reciprocal participant relation that all
 +
partakers have to the constitution of that framing, yes,
 +
even unto those who would abdictators and abstainees be.
 +
 
 +
But what does that imply about some shred of a standard?
 +
It means that we are better off seeking, not a standard,
 +
one-size-fits-all ontology, but more standard resources
 +
for trying to interrelate diverse points of view and to
 +
transform the data that's gathered from one perspective
 +
in ways that it can most appropriately be compared with
 +
the data that is gathered from other standpoints on the
 +
splendorous observational scenes and theorematic stages.
 +
 
 +
That is what I am working on.
 +
And it hasn't been merely
 +
for a couple of years.
 +
 
 +
As to this bit:
 +
 
 +
o-------------------------------------------------o
 +
|                                                |
 +
|  ( living_thing , non_living )                |
 +
|                                                |
 +
|  (( animal ),( vegetable ),( mineral ))        |
 +
|                                                |
 +
|  ( living_thing ,( animal ),( vegetable ))    |
 +
|                                                |
 +
|  ( mineral ( non_living ))                    |
 +
|                                                |
 +
o-------------------------------------------------o
 +
 
 +
My 5-dimensional Example, that I borrowed from some indifferent source
 +
of what is commonly recognized as "common sense" -- and I think rather
 +
obviously designed more for the classification of pre-modern species
 +
of whole critters and pure matters of natural substance than the
 +
motley mixture of un/natural and in/organic conglouterites that
 +
we find served up on the menu of modernity -- was not intended
 +
even so much as a toy ontology, but simply as an expository
 +
example, concocted for the sake of illustrating the sorts
 +
of logical interaction that occur among four different
 +
patterns of logical constraint, all of which types
 +
arise all the time no matter what the domain, and
 +
which I believe that my novel forms of expression,
 +
syntactically speaking, express quite succinctly,
 +
especially when you contemplate the complexities
 +
of the computation that may flow and must follow
 +
from even these meagre propositional expressions.
 +
 
 +
Yes, systems -- but -- even here usage differs in significant ways.
 +
I have spent ten years now trying to integrate my earlier efforts
 +
under an explicit systems banner, but even within the bounds of
 +
a systems engineering programme at one site there is a wide
 +
semantic dispersion that issues from this word "system".
 +
I am committed, and in writing, to taking what we so
 +
glibly and prospectively call "intelligent systems"
 +
seriously as dynamical systems.  That has many
 +
consequences, and I have to pick and choose
 +
which of those I may be suited to follow.
 +
 
 +
But that is too long a story for now ...
 +
 
 +
";-)"?
 +
 
 +
Somehow that has always looked like
 +
the Cheshire Cat's grin to me ...
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
By way of catering to popular demand, I have decided to
 +
render this symposium a bit more à la carte, and thus to
 +
serve up as faster food than heretofore a choice selection
 +
of the more sumptuous bits that I have in my logical larder,
 +
not yet full fare, by any means, but a sample of what might
 +
one day approach to being an abundantly moveable feast of
 +
ontological contents and general metaphysical delights.
 +
I'll leave it to you to name your poison, as it were.
 +
 
 +
Applications of a Propositional Calculator:
 +
Constraint Satisfaction Problems.
 +
Jon Awbrey, April 24, 1995.
 +
 
 +
Fabric Knowledge Base
 +
Based on the example in [MaW, pages 8-16].
 +
 
 +
Logical Input File: Fab.Log
 +
o---------------------------------------------------------------------o
 +
|                                                                    |
 +
| (has_floats , plain_weave )                                        |
 +
| (has_floats ,(twill_weave ),(satin_weave ))                        |
 +
|                                                                    |
 +
| (plain_weave ,                                                      |
 +
|  (plain_weave  one_color ),                                        |
 +
|  (color_groups  ),                                                  |
 +
|  (grouped_warps ),                                                  |
 +
|  (some_thicker  ),                                                  |
 +
|  (crossed_warps ),                                                  |
 +
|  (loop_threads  ),                                                  |
 +
|  (plain_weave  flannel ))                                          |
 +
|                                                                    |
 +
| (plain_weave  one_color  cotton  balanced  smooth  ,(percale ))    |
 +
| (plain_weave  one_color  cotton            sheer  ,(organdy ))    |
 +
| (plain_weave  one_color  silk              sheer  ,(organza ))    |
 +
|                                                                    |
 +
| (plain_weave  color_groups  warp_stripe  fill_stripe ,(plaid  ))  |
 +
| (plaid        equal_stripe                          ,(gingham ))  |
 +
|                                                                    |
 +
| (plain_weave  grouped_warps ,(basket_weave ))                      |
 +
|                                                                    |
 +
| (basket_weave  typed ,                                              |
 +
|  (type_2_to_1 ),                                                    |
 +
|  (type_2_to_2 ),                                                    |
 +
|  (type_4_to_4 ))                                                    |
 +
|                                                                    |
 +
| (basket_weave  typed  type_2_to_1  thicker_fill  ,(oxford      )) |
 +
| (basket_weave  typed (type_2_to_2 ,                                |
 +
|                      type_4_to_4 ) same_thickness ,(monks_cloth )) |
 +
| (basket_weave (typed )              rough  open    ,(hopsacking  )) |
 +
|                                                                    |
 +
| (typed (basket_weave ))                                            |
 +
|                                                                    |
 +
| (basket_weave ,(oxford ),(monks_cloth ),(hopsacking ))              |
 +
|                                                                    |
 +
| (plain_weave  some_thicker ,(ribbed_weave ))                      |
 +
|                                                                    |
 +
| (ribbed_weave ,(small_rib ),(medium_rib ),(heavy_rib ))            |
 +
| (ribbed_weave ,(flat_rib  ),(round_rib ))                          |
 +
|                                                                    |
 +
| (ribbed_weave  thicker_fill          ,(cross_ribbed ))              |
 +
| (cross_ribbed  small_rib  flat_rib  ,(faille      ))              |
 +
| (cross_ribbed  small_rib  round_rib ,(grosgrain    ))              |
 +
| (cross_ribbed  medium_rib  round_rib ,(bengaline    ))              |
 +
| (cross_ribbed  heavy_rib  round_rib ,(ottoman      ))              |
 +
|                                                                    |
 +
| (cross_ribbed ,(faille ),(grosgrain ),(bengaline ),(ottoman ))      |
 +
|                                                                    |
 +
| (plain_weave  crossed_warps ,(leno_weave  ))                        |
 +
| (leno_weave  open          ,(marquisette ))                        |
 +
| (plain_weave  loop_threads  ,(pile_weave ))                        |
 +
|                                                                    |
 +
| (pile_weave ,(fill_pile ),(warp_pile ))                            |
 +
| (pile_weave ,(cut ),(uncut ))                                      |
 +
|                                                                    |
 +
| (pile_weave  warp_pile  cut                  ,(velvet    ))        |
 +
| (pile_weave  fill_pile  cut    aligned_pile  ,(corduroy  ))        |
 +
| (pile_weave  fill_pile  cut    staggered_pile ,(velveteen ))        |
 +
| (pile_weave  fill_pile  uncut  reversible    ,(terry    ))        |
 +
|                                                                    |
 +
| (pile_weave  fill_pile  cut ( (aligned_pile , staggered_pile ) ))  |
 +
|                                                                    |
 +
| (pile_weave ,(velvet ),(corduroy ),(velveteen ),(terry ))          |
 +
|                                                                    |
 +
| (plain_weave ,                                                      |
 +
|  (percale    ),(organdy    ),(organza    ),(plaid  ),            |
 +
|  (oxford    ),(monks_cloth ),(hopsacking ),                        |
 +
|  (faille    ),(grosgrain  ),(bengaline  ),(ottoman ),            |
 +
|  (leno_weave ),(pile_weave  ),(plain_weave  flannel ))            |
 +
|                                                                    |
 +
| (twill_weave ,                                                      |
 +
|  (warp_faced ),                                                    |
 +
|  (filling_faced ),                                                  |
 +
|  (even_twill ),                                                    |
 +
|  (twill_weave  flannel ))                                          |
 +
|                                                                    |
 +
| (twill_weave  warp_faced  colored_warp  white_fill ,(denim ))      |
 +
| (twill_weave  warp_faced  one_color                ,(drill ))      |
 +
| (twill_weave  even_twill  diagonal_rib            ,(serge ))      |
 +
|                                                                    |
 +
| (twill_weave  warp_faced (                                          |
 +
|  (one_color ,                                                      |
 +
|  ((colored_warp )(white_fill )) )                                  |
 +
| ))                                                                  |
 +
|                                                                    |
 +
| (twill_weave  warp_faced ,(denim ),(drill ))                        |
 +
| (twill_weave  even_twill ,(serge ))                                |
 +
|                                                                    |
 +
| ((                                                                  |
 +
|    (  ((plain_weave )(twill_weave ))                              |
 +
|        ((cotton      )(wool        )) napped ,(flannel ))          |
 +
| ))                                                                  |
 +
|                                                                    |
 +
| (satin_weave ,(warp_floats ),(fill_floats ))                        |
 +
|                                                                    |
 +
| (satin_weave ,(satin_weave smooth ),(satin_weave napped ))          |
 +
| (satin_weave ,(satin_weave cotton ),(satin_weave silk  ))          |
 +
|                                                                    |
 +
| (satin_weave  warp_floats  smooth        ,(satin    ))            |
 +
| (satin_weave  fill_floats  smooth        ,(sateen  ))            |
 +
| (satin_weave              napped  cotton ,(moleskin ))            |
 +
|                                                                    |
 +
| (satin_weave ,(satin ),(sateen ),(moleskin ))                      |
 +
|                                                                    |
 +
o---------------------------------------------------------------------o
 +
 
 +
| Reference [MaW]
 +
|
 +
| Maier, David & Warren, David S.,
 +
|'Computing with Logic:  Logic Programming with Prolog',
 +
| Benjamin/Cummings, Menlo Park, CA, 1988.
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
I think that it might be a good idea to go back to a simpler example
 +
of a constraint satisfaction problem, and to discuss the elements of
 +
its expression as a ZOT in a less cluttered setting before advancing
 +
onward once again to problems on the order of the Four Houses Puzzle.
 +
 
 +
| Applications of a Propositional Calculator:
 +
| Constraint Satisfaction Problems.
 +
| Jon Awbrey, April 24, 1995.
 +
 
 +
Graph Coloring
 +
 
 +
Based on the discussion in [Wil, page 196].
 +
 
 +
One is given three colors, say, orange, silver, indigo,
 +
and a graph on four nodes that has the following shape:
 +
 
 +
|         1
|         o
|        / \
|       /   \
|    4 o-----o 2
|       \   /
|        \ /
|         o
|         3
 +
 
 +
The problem is to color the nodes of the graph
 +
in such a way that no pair of nodes that are
 +
adjacent in the graph, that is, linked by
 +
an edge, get the same color.
 +
 
 +
The objective situation that is to be achieved can be represented
 +
in a so-called "declarative" fashion, in effect, by employing the
 +
cactus language as a very simple sort of declarative programming
 +
language, and depicting the prospective solution to the problem
 +
as a ZOT.
 +
 
 +
To do this, begin by declaring the following set of
 +
twelve boolean variables or "zeroth order features":
 +
 
 +
{1_orange, 1_silver, 1_indigo,
 +
2_orange, 2_silver, 2_indigo,
 +
3_orange, 3_silver, 3_indigo,
 +
4_orange, 4_silver, 4_indigo}
 +
 
 +
The interpretation to keep in mind will be such that
 +
the feature name of the form "<node i>_<color j>"
 +
says that the node i is assigned the color j.
 +
 
 +
Logical Input File:  Color.Log
 +
o----------------------------------------------------------------------o
 +
|                                                                      |
 +
|  (( 1_orange ),( 1_silver ),( 1_indigo ))                            |
 +
|  (( 2_orange ),( 2_silver ),( 2_indigo ))                            |
 +
|  (( 3_orange ),( 3_silver ),( 3_indigo ))                            |
 +
|  (( 4_orange ),( 4_silver ),( 4_indigo ))                            |
 +
|                                                                      |
 +
|  ( 1_orange  2_orange )( 1_silver  2_silver )( 1_indigo  2_indigo )  |
 +
|  ( 1_orange  4_orange )( 1_silver  4_silver )( 1_indigo  4_indigo )  |
 +
|  ( 2_orange  3_orange )( 2_silver  3_silver )( 2_indigo  3_indigo )  |
 +
|  ( 2_orange  4_orange )( 2_silver  4_silver )( 2_indigo  4_indigo )  |
 +
|  ( 3_orange  4_orange )( 3_silver  4_silver )( 3_indigo  4_indigo )  |
 +
|                                                                      |
 +
o----------------------------------------------------------------------o
 +
 
 +
The first stanza of verses declares that
 +
every node is assigned exactly one color.
 +
 
 +
The second stanza of verses declares that
 +
no adjacent nodes get the very same color.
 +
 
 +
Each satisfying interpretation of this ZOT,
 +
taken also as a program, corresponds to what
 +
graffitists call a "coloring" of the graph.
 +
 
 +
Theme One's Model interpreter, when we set
 +
it to work on this ZOT, will array  before
 +
our eyes all of the colorings of the graph.
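
For a rough cross-check by brute force (my own sketch, independent of the
Model interpreter), one can simply try all 3^4 assignments of colors to
nodes and keep the ones where the endpoints of every edge differ; the
count should agree with the outline shown next.

from itertools import product

edges = ((1, 2), (1, 4), (2, 3), (2, 4), (3, 4))   # the graph drawn above
colors = ("orange", "silver", "indigo")

colorings = []
for choice in product(colors, repeat=4):
    assignment = dict(zip((1, 2, 3, 4), choice))
    if all(assignment[i] != assignment[j] for i, j in edges):
        colorings.append(assignment)

for assignment in colorings:
    print(" ".join(f"{node}_{assignment[node]}" for node in (1, 2, 3, 4)))
print(len(colorings), "colorings")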
 +
 
 +
Sense Outline:  Color.Sen
 +
o-----------------------------o
 +
| 1_orange                    |
 +
|  2_silver                  |
 +
|  3_orange                  |
 +
|    4_indigo                |
 +
|  2_indigo                  |
 +
|  3_orange                  |
 +
|    4_silver                |
 +
| 1_silver                    |
 +
|  2_orange                  |
 +
|  3_silver                  |
 +
|    4_indigo                |
 +
|  2_indigo                  |
 +
|  3_silver                  |
 +
|    4_orange                |
 +
| 1_indigo                    |
 +
|  2_orange                  |
 +
|  3_indigo                  |
 +
|    4_silver                |
 +
|  2_silver                  |
 +
|  3_indigo                  |
 +
|    4_orange                |
 +
o-----------------------------o
 +
 
 +
| Reference [Wil]
 +
|
 +
| Wilf, Herbert S.,
 +
|'Algorithms and Complexity',
 +
| Prentice-Hall, Englewood Cliffs, NJ, 1986.
 +
|
 +
| Nota Bene.  There is a wrong Figure in some
 +
| printings of the book, that does not match
 +
| the description of the Example that is
 +
| given in the text.
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
Let us continue to examine the properties of the cactus language
 +
as a minimal style of declarative programming language.  Even in
 +
the likes of this zeroth order microcosm one can observe, and on
 +
a good day still more clearly for the lack of other distractions,
 +
many of the buzz words that will spring into full bloom, almost
 +
as if from nowhere, to become the first order of business in the
 +
latter day logical organa, plus combinators, plus lambda calculi.
 +
 
 +
By way of homage to the classics of the art, I can hardly pass
 +
this way without paying my dues to the next sample of examples.
 +
 
 +
N Queens Problem
 +
 
 +
I will give the ZOT that describes the N Queens Problem for N = 5,
 +
since that is the most that I and my old 286 could do when last I
 +
wrote up this Example.
 +
 
 +
The problem is now to write a "zeroth order program" (ZOP) that
 +
describes the following objective:  To place 5 chess queens on
 +
a 5 by 5 chessboard so that no queen attacks any other queen.
 +
 
 +
It is clear that there can be at most one queen on each row
 +
of the board and so by dint of regal necessity, exactly one
 +
queen in each row of the desired array.  This gambit allows
 +
us to reduce the problem to one of picking a permutation of
 +
five things in five places, and this affords us sufficient
 +
clue to begin down a likely path toward the intended object,
 +
by recruiting the following phalanx of 25 logical variables:
 +
 
 +
Literal Input File:  Q5.Lit
 +
o---------------------------------------o
 +
|                                      |
 +
|  q1_r1, q1_r2, q1_r3, q1_r4, q1_r5,  |
 +
|  q2_r1, q2_r2, q2_r3, q2_r4, q2_r5,  |
 +
|  q3_r1, q3_r2, q3_r3, q3_r4, q3_r5,  |
 +
|  q4_r1, q4_r2, q4_r3, q4_r4, q4_r5,  |
 +
|  q5_r1, q5_r2, q5_r3, q5_r4, q5_r5.  |
 +
|                                      |
 +
o---------------------------------------o
 +
 
 +
Thus we seek to define a function, of abstract type f : %B%^25 -> %B%,
 +
whose fibre of truth f^(-1)(%1%) is a set of interpretations, each of
 +
whose elements bears the abstract type of a point in the space %B%^25,
 +
and whose reading will inform us of our desired set of configurations.
 +
 
 +
Logical Input File:  Q5.Log
 +
o------------------------------------------------------------o
 +
|                                                            |
 +
|  ((q1_r1 ),(q1_r2 ),(q1_r3 ),(q1_r4 ),(q1_r5 ))            |
 +
|  ((q2_r1 ),(q2_r2 ),(q2_r3 ),(q2_r4 ),(q2_r5 ))            |
 +
|  ((q3_r1 ),(q3_r2 ),(q3_r3 ),(q3_r4 ),(q3_r5 ))            |
 +
|  ((q4_r1 ),(q4_r2 ),(q4_r3 ),(q4_r4 ),(q4_r5 ))            |
 +
|  ((q5_r1 ),(q5_r2 ),(q5_r3 ),(q5_r4 ),(q5_r5 ))            |
 +
|                                                            |
 +
|  ((q1_r1 ),(q2_r1 ),(q3_r1 ),(q4_r1 ),(q5_r1 ))            |
 +
|  ((q1_r2 ),(q2_r2 ),(q3_r2 ),(q4_r2 ),(q5_r2 ))            |
 +
|  ((q1_r3 ),(q2_r3 ),(q3_r3 ),(q4_r3 ),(q5_r3 ))            |
 +
|  ((q1_r4 ),(q2_r4 ),(q3_r4 ),(q4_r4 ),(q5_r4 ))            |
 +
|  ((q1_r5 ),(q2_r5 ),(q3_r5 ),(q4_r5 ),(q5_r5 ))            |
 +
|                                                            |
 +
|  ((                                                        |
 +
|                                                            |
 +
|  (q1_r1 q2_r2 )(q1_r1 q3_r3 )(q1_r1 q4_r4 )(q1_r1 q5_r5 )  |
 +
|                (q2_r2 q3_r3 )(q2_r2 q4_r4 )(q2_r2 q5_r5 )  |
 +
|                              (q3_r3 q4_r4 )(q3_r3 q5_r5 )  |
 +
|                                            (q4_r4 q5_r5 )  |
 +
|                                                            |
 +
|  (q1_r2 q2_r3 )(q1_r2 q3_r4 )(q1_r2 q4_r5 )                |
 +
|                (q2_r3 q3_r4 )(q2_r3 q4_r5 )                |
 +
|                              (q3_r4 q4_r5 )                |
 +
|                                                            |
 +
|  (q1_r3 q2_r4 )(q1_r3 q3_r5 )                              |
 +
|                (q2_r4 q3_r5 )                              |
 +
|                                                            |
 +
|  (q1_r4 q2_r5 )                                            |
 +
|                                                            |
 +
|  (q2_r1 q3_r2 )(q2_r1 q4_r3 )(q2_r1 q5_r4 )                |
 +
|                (q3_r2 q4_r3 )(q3_r2 q5_r4 )                |
 +
|                              (q4_r3 q5_r4 )                |
 +
|                                                            |
 +
|  (q3_r1 q4_r2 )(q3_r1 q5_r3 )                              |
 +
|                (q4_r2 q5_r3 )                              |
 +
|                                                            |
 +
|  (q4_r1 q5_r2 )                                            |
 +
|                                                            |
 +
|  (q1_r5 q2_r4 )(q1_r5 q3_r3 )(q1_r5 q4_r2 )(q1_r5 q5_r1 )  |
 +
|                (q2_r4 q3_r3 )(q2_r4 q4_r2 )(q2_r4 q5_r1 )  |
 +
|                              (q3_r3 q4_r2 )(q3_r3 q5_r1 )  |
 +
|                                            (q4_r2 q5_r1 )  |
 +
|                                                            |
 +
|  (q2_r5 q3_r4 )(q2_r5 q4_r3 )(q2_r5 q5_r2 )                |
 +
|                (q3_r4 q4_r3 )(q3_r4 q5_r2 )                |
 +
|                              (q4_r3 q5_r2 )                |
 +
|                                                            |
 +
|  (q3_r5 q4_r4 )(q3_r5 q5_r3 )                              |
 +
|                (q4_r4 q5_r3 )                              |
 +
|                                                            |
 +
|  (q4_r5 q5_r4 )                                            |
 +
|                                                            |
 +
|  (q1_r4 q2_r3 )(q1_r4 q3_r2 )(q1_r4 q4_r1 )                |
 +
|                (q2_r3 q3_r2 )(q2_r3 q4_r1 )                |
 +
|                              (q3_r2 q4_r1 )                |
 +
|                                                            |
 +
|  (q1_r3 q2_r2 )(q1_r3 q3_r1 )                              |
 +
|                (q2_r2 q3_r1 )                              |
 +
|                                                            |
 +
|  (q1_r2 q2_r1 )                                            |
 +
|                                                            |
 +
|  ))                                                        |
 +
|                                                            |
 +
o------------------------------------------------------------o
 +
 
 +
The vanguard of this logical regiment consists of two
 +
stock'a'block platoons, the pattern of whose features
 +
is the usual sort of array for conveying permutations.
 +
Between the stations of their respective offices they
 +
serve to warrant that all of the interpretations that
 +
are left standing on the field of valor at the end of
 +
the day will be ones that tell of permutations 5 by 5.
 +
The rest of the ruck and the runt of the mill in this
 +
regimental logos are there to cover the diagonal bias
 +
against attacking queens that is our protocol to suit.
 +
 
 +
And here is the issue of the day:
 +
 
 +
Sense Output:  Q5.Sen
 +
o-------------------o
 +
| q1_r1            |
 +
|  q2_r3            |
 +
|  q3_r5          |
 +
|    q4_r2          |
 +
|    q5_r4        | <1>
 +
|  q2_r4            |
 +
|  q3_r2          |
 +
|    q4_r5          |
 +
|    q5_r3        | <2>
 +
| q1_r2            |
 +
|  q2_r4            |
 +
|  q3_r1          |
 +
|    q4_r3          |
 +
|    q5_r5        | <3>
 +
|  q2_r5            |
 +
|  q3_r3          |
 +
|    q4_r1          |
 +
|    q5_r4        | <4>
 +
| q1_r3            |
 +
|  q2_r1            |
 +
|  q3_r4          |
 +
|    q4_r2          |
 +
|    q5_r5        | <5>
 +
|  q2_r5            |
 +
|  q3_r2          |
 +
|    q4_r4          |
 +
|    q5_r1        | <6>
 +
| q1_r4            |
 +
|  q2_r1            |
 +
|  q3_r3          |
 +
|    q4_r5          |
 +
|    q5_r2        | <7>
 +
|  q2_r2            |
 +
|  q3_r5          |
 +
|    q4_r3          |
 +
|    q5_r1        | <8>
 +
| q1_r5            |
 +
|  q2_r2            |
 +
|  q3_r4          |
 +
|    q4_r1          |
 +
|    q5_r3        | <9>
 +
|  q2_r3            |
 +
|  q3_r1          |
 +
|    q4_r4          |
 +
|    q5_r2        | <A>
 +
o-------------------o
 +
 
 +
The number at least checks with all of the best authorities,
 +
so I can breathe a sigh of relief on that account, at least.
 +
I am sure that there just has to be a more clever way to do
 +
this, that is to say, within the bounds of ZOT reason alone,
 +
but the above is the best that I could figure out with the
 +
time that I had at the time.
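
For what it is worth, the count is easy to cross-check outside the ZOT
framework with a few lines of Python (my own sketch, not the program under
discussion): enumerate the row permutations and discard any arrangement in
which two queens share a diagonal.

from itertools import permutations

solutions = []
for rows in permutations(range(1, 6)):        # rows[c] = row of the queen in column c+1
    if all(abs(rows[i] - rows[j]) != j - i    # no two queens on a common diagonal
           for i in range(5) for j in range(i + 1, 5)):
        solutions.append(rows)

for k, rows in enumerate(solutions, 1):
    print(k, " ".join(f"q{c + 1}_r{r}" for c, r in enumerate(rows)))
print(len(solutions), "solutions")            # ten, matching <1> through <A> above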
 +
 
 +
References:  [BaC, 166], [VaH, 122], [Wir, 143].
 +
 
 +
[BaC]  Ball, W.W. Rouse, & Coxeter, H.S.M.,
 +
      'Mathematical Recreations and Essays',
 +
      13th ed., Dover, New York, NY, 1987.
 +
 
 +
[VaH]  Van Hentenryck, Pascal,
 +
       'Constraint Satisfaction in Logic Programming',
 +
      MIT Press, Cambridge, MA, 1989.
 +
 
 +
[Wir]  Wirth, Niklaus,
 +
      'Algorithms + Data Structures = Programs',
 +
      Prentice-Hall, Englewood Cliffs, NJ, 1976.
 +
 
 +
http://mathworld.wolfram.com/QueensProblem.html
 +
http://www.research.att.com/cgi-bin/access.cgi/as/njas/sequences/eisA.cgi?Anum=000170
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
I turn now to another golden oldie of a constraint satisfaction problem
 +
that I would like to give here a slightly new spin, but not so much for
 +
the sake of these trifling novelties as from a sense of old time's ache
 +
and a duty to -- well, what's the opposite of novelty?
 +
 
 +
Phobic Apollo
  
+
| Suppose Peter, Paul, and Jane are musicians. One of them plays
 +
| saxophone, another plays guitar, and the third plays drums.  As
 +
| it happens, one of them is afraid of things associated with the
 +
| number 13, another of them is afraid of cats, and the third is
 +
| afraid of heights.  You also know that Peter and the guitarist
 +
| skydive, that Paul and the saxophone player enjoy cats, and
 +
| that the drummer lives in apartment 13 on the 13th floor.
 
|
 
|
+
| Soon we will want to use these facts to reason
 +
| about whether or not certain identity relations
 +
| hold or are excluded. Assume X(Peter, Guitarist)
 +
| means "the person who is Peter is not the person who
 +
| plays the guitar".  In this notation, the facts become:
 
|
 
|
+
| 1. X(Peter, Guitarist)
 +
| 2.  X(Peter, Fears Heights)
 +
| 3.  X(Guitarist, Fears Heights)
 +
| 4.  X(Paul, Fears Cats)
 +
| 5.  X(Paul, Saxophonist)
 +
| 6.  X(Saxophonist, Fears Cats)
 +
| 7.  X(Drummer, Fears 13)
 +
| 8.  X(Drummer, Fears Heights)
 +
|
 +
| Exercise attributed to Kenneth D. Forbus, pages 449-450 in:
 +
| Patrick Henry Winston, 'Artificial Intelligence', 2nd ed.,
 +
| Addison-Wesley, Reading, MA, 1984.
 +
 
 +
Here is one way to represent these facts in the form of a ZOT
 +
and use it as a logical program to draw a succinct conclusion:
 +
 
 +
Logical Input File:  ConSat.Log
 +
o-----------------------------------------------------------------------o
 +
|                                                                      |
 +
|  (( pete_plays_guitar ),( pete_plays_sax ),( pete_plays_drums ))      |
 +
|  (( paul_plays_guitar ),( paul_plays_sax ),( paul_plays_drums ))      |
 +
|  (( jane_plays_guitar ),( jane_plays_sax ),( jane_plays_drums ))      |
 +
|                                                                      |
 +
|  (( pete_plays_guitar ),( paul_plays_guitar ),( jane_plays_guitar ))  |
 +
|  (( pete_plays_sax    ),( paul_plays_sax    ),( jane_plays_sax    ))  |
 +
|  (( pete_plays_drums  ),( paul_plays_drums  ),( jane_plays_drums  ))  |
 +
|                                                                      |
 +
|  (( pete_fears_13 ),( pete_fears_cats ),( pete_fears_height ))        |
 +
|  (( paul_fears_13 ),( paul_fears_cats ),( paul_fears_height ))        |
 +
|  (( jane_fears_13 ),( jane_fears_cats ),( jane_fears_height ))        |
 +
|                                                                      |
 +
|  (( pete_fears_13    ),( paul_fears_13    ),( jane_fears_13    ))  |
 +
|  (( pete_fears_cats  ),( paul_fears_cats  ),( jane_fears_cats  ))  |
 +
|  (( pete_fears_height ),( paul_fears_height ),( jane_fears_height ))  |
 +
|                                                                      |
 +
|  ((                                                                  |
 +
|                                                                      |
 +
|  ( pete_plays_guitar )                                                |
 +
|  ( pete_fears_height )                                                |
 +
|                                                                      |
 +
|  ( pete_plays_guitar  pete_fears_height )                            |
 +
|  ( paul_plays_guitar  paul_fears_height )                            |
 +
|  ( jane_plays_guitar  jane_fears_height )                            |
 +
|                                                                      |
 +
|  ( paul_fears_cats )                                                  |
 +
|  ( paul_plays_sax  )                                                  |
 +
|                                                                      |
 +
|  ( pete_plays_sax  pete_fears_cats )                                  |
 +
|  ( paul_plays_sax  paul_fears_cats )                                  |
 +
|  ( jane_plays_sax  jane_fears_cats )                                  |
 +
|                                                                      |
 +
|  ( pete_plays_drums  pete_fears_13 )                                  |
 +
|  ( paul_plays_drums  paul_fears_13 )                                  |
 +
|  ( jane_plays_drums  jane_fears_13 )                                  |
 +
|                                                                      |
 +
|  ( pete_plays_drums  pete_fears_height )                              |
 +
|  ( paul_plays_drums  paul_fears_height )                              |
 +
|  ( jane_plays_drums  jane_fears_height )                              |
 +
|                                                                      |
 +
|  ))                                                                  |
 +
|                                                                      |
 +
o-----------------------------------------------------------------------o
 +
 
 +
Sense Outline:  ConSat.Sen
 +
o-----------------------------o
 +
| pete_plays_drums            |
 +
|  paul_plays_guitar          |
 +
|  jane_plays_sax            |
 +
|    pete_fears_cats          |
 +
|    paul_fears_13          |
 +
|      jane_fears_height      |
 +
o-----------------------------o
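
The same conclusion can be cross-checked by brute force in a few lines of
Python (my own sketch, not the Study or Model functions): try every way of
matching people to instruments and to fears, and keep the matchings that
are consistent with the eight exclusions listed above.

from itertools import permutations

people = ("pete", "paul", "jane")

for instruments in permutations(("guitar", "sax", "drums")):
    plays = dict(zip(people, instruments))
    for fears in permutations(("13", "cats", "height")):
        fear = dict(zip(people, fears))
        if (plays["pete"] != "guitar"                                     # 1
                and fear["pete"] != "height"                              # 2
                and all(not (plays[p] == "guitar" and fear[p] == "height")
                        for p in people)                                  # 3
                and fear["paul"] != "cats"                                # 4
                and plays["paul"] != "sax"                                # 5
                and all(not (plays[p] == "sax" and fear[p] == "cats")
                        for p in people)                                  # 6
                and all(not (plays[p] == "drums" and fear[p] == "13")
                        for p in people)                                  # 7
                and all(not (plays[p] == "drums" and fear[p] == "height")
                        for p in people)):                                # 8
            print(plays, fear)    # the lone survivor should match the outline above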
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
Phobic Apollo (cont.)
 +
 
 +
It might be instructive to review various aspects
 +
of how the Theme One Study function actually went
 +
about arriving at its answer to that last problem.
 +
Just to prove that my program and I really did do
 +
our homework on that Phobic Apollo ConSat problem,
 +
and didn't just provoke some Oracle or other data
 +
base server to give it away, here is the middling
 +
output of the Model function as run on ConSat.Log:
 +
 
 +
Model Outline:  ConSat.Mod
 +
o-------------------------------------------------o
 +
| pete_plays_guitar -                            |
 +
| (pete_plays_guitar )                            |
 +
|  pete_plays_sax                                |
 +
|  pete_plays_drums -                            |
 +
|  (pete_plays_drums )                          |
 +
|    paul_plays_sax -                            |
 +
|    (paul_plays_sax )                            |
 +
|    jane_plays_sax -                            |
 +
|    (jane_plays_sax )                          |
 +
|      paul_plays_guitar                          |
 +
|      paul_plays_drums -                        |
 +
|      (paul_plays_drums )                      |
 +
|        jane_plays_guitar -                      |
 +
|        (jane_plays_guitar )                    |
 +
|        jane_plays_drums                        |
 +
|          pete_fears_13                          |
 +
|          pete_fears_cats -                    |
 +
|          (pete_fears_cats )                    |
 +
|            pete_fears_height -                  |
 +
|            (pete_fears_height )                |
 +
|            paul_fears_13 -                    |
 +
|            (paul_fears_13 )                    |
 +
|              jane_fears_13 -                    |
 +
|              (jane_fears_13 )                  |
 +
|              paul_fears_cats -                |
 +
|              (paul_fears_cats )                |
 +
|                paul_fears_height -              |
 +
|                (paul_fears_height ) -          |
 +
|          (pete_fears_13 )                      |
 +
|          pete_fears_cats -                    |
 +
|          (pete_fears_cats )                    |
 +
|            pete_fears_height -                  |
 +
|            (pete_fears_height ) -              |
 +
|        (jane_plays_drums ) -                  |
 +
|      (paul_plays_guitar )                      |
 +
|      paul_plays_drums                          |
 +
|        jane_plays_drums -                      |
 +
|        (jane_plays_drums )                      |
 +
|        jane_plays_guitar                      |
 +
|          pete_fears_13                          |
 +
|          pete_fears_cats -                    |
 +
|          (pete_fears_cats )                    |
 +
|            pete_fears_height -                  |
 +
|            (pete_fears_height )                |
 +
|            paul_fears_13 -                    |
 +
|            (paul_fears_13 )                    |
 +
|              jane_fears_13 -                    |
 +
|              (jane_fears_13 )                  |
 +
|              paul_fears_cats -                |
 +
|              (paul_fears_cats )                |
 +
|                paul_fears_height -              |
 +
|                (paul_fears_height ) -          |
 +
|          (pete_fears_13 )                      |
 +
|          pete_fears_cats -                    |
 +
|          (pete_fears_cats )                    |
 +
|            pete_fears_height -                  |
 +
|            (pete_fears_height ) -              |
 +
|        (jane_plays_guitar ) -                  |
 +
|      (paul_plays_drums ) -                    |
 +
|  (pete_plays_sax )                              |
 +
|  pete_plays_drums                              |
 +
|    paul_plays_drums -                          |
 +
|    (paul_plays_drums )                          |
 +
|    jane_plays_drums -                          |
 +
|    (jane_plays_drums )                        |
 +
|      paul_plays_guitar                          |
 +
|      paul_plays_sax -                          |
 +
|      (paul_plays_sax )                        |
 +
|        jane_plays_guitar -                      |
 +
|        (jane_plays_guitar )                    |
 +
|        jane_plays_sax                          |
 +
|          pete_fears_13 -                        |
 +
|          (pete_fears_13 )                      |
 +
|          pete_fears_cats                      |
 +
|            pete_fears_height -                  |
 +
|            (pete_fears_height )                |
 +
|            paul_fears_cats -                  |
 +
|            (paul_fears_cats )                  |
 +
|              jane_fears_cats -                  |
 +
|              (jane_fears_cats )                |
 +
|              paul_fears_13                    |
 +
|                paul_fears_height -              |
 +
|                (paul_fears_height )            |
 +
|                jane_fears_13 -                |
 +
|                (jane_fears_13 )                |
 +
|                  jane_fears_height *            |
 +
|                  (jane_fears_height ) -        |
 +
|              (paul_fears_13 )                  |
 +
|                paul_fears_height -              |
 +
|                (paul_fears_height ) -          |
 +
|          (pete_fears_cats )                    |
 +
|            pete_fears_height -                  |
 +
|            (pete_fears_height ) -              |
 +
|        (jane_plays_sax ) -                    |
 +
|      (paul_plays_guitar )                      |
 +
|      paul_plays_sax -                          |
 +
|      (paul_plays_sax ) -                      |
 +
|  (pete_plays_drums ) -                        |
 +
o-------------------------------------------------o
 +
 
 +
This is just the traversal of the "arboreal boolean expansion" (ABE) tree
 +
that the Model function germinates from the propositional expression that we
 +
planted in the file Consat.Log, which works to describe the facts of the
 +
situation in question.  Since there are 18 logical feature names in this
 +
propositional expression, we are literally talking about a function that
 +
enjoys the abstract type f : %B%^18 -> %B%.  If I had wanted to evaluate
 +
this function by expressly writing out its truth table, then it would've
 +
required 2^18 = 262144 rows.  Now I didn't bother to count, but I'm sure
 +
that the above output does not have anywhere near that many lines, so it
 +
must be that my program, and maybe even its author, has done a couple of
 +
things along the way that are moderately intelligent.  At least, we hope.
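
Here, for the record, is a bare-bones sketch in Python -- not the Theme One
program itself, and the function names are mine -- of the kind of pruned,
depth-first expansion that keeps the tree above so much smaller than a full
truth table:  assign one variable at a time and abandon a branch as soon as
some constraint is already falsified.

def expand(variables, constraints, assignment=None, depth=0):
    """Print an indented expansion tree; '-' marks a pruned branch."""
    if assignment is None:
        assignment = {}
    if any(c(assignment) is False for c in constraints):
        print("  " * depth + "-")                      # branch dies here
        return
    if len(assignment) == len(variables):
        print("  " * depth + "* " + repr(assignment))  # a surviving model
        return
    v = variables[len(assignment)]
    for value in (True, False):
        print("  " * depth + (v if value else "(" + v + ")"))
        expand(variables, constraints, dict(assignment, **{v: value}), depth + 1)

def at_most_one(names):
    # Violated as soon as two of the named variables are already true.
    return lambda a: sum(a.get(n, False) for n in names) <= 1

def at_least_one(names):
    # Only decidable once every named variable has a value.
    return lambda a: any(a[n] for n in names) if all(n in a for n in names) else True

# Tiny demo:  "exactly one of x, y, z", the same shape of constraint that
# the ((e_1),(e_2),...,(e_k)) forms in Consat.Log express.
names = ["x", "y", "z"]
expand(names, [at_most_one(names), at_least_one(names)])

Run on that toy constraint it prints an indented tree in the same spirit as
the listing above, with "-" marking dead branches and "*" marking the models
that survive.
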
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
AK = Antti Karttunen
 +
JA = Jon Awbrey
 +
 
 +
AK: Am I (and other SeqFanaticians) missing something from this thread?
 +
 
 +
AK: Your previous message on seqfan (headers below) is a bit of the same topic,
 +
    but does it belong to the same thread?  Where I could obtain the other
 +
    messages belonging to those two threads?  (I'm just now starting to
 +
    study "mathematical logic", and its relations to combinatorics are
 +
    very interesting.)  Is this "cactus" language documented anywhere?
 +
 
 +
here i was just following a courtesy of copying people
 +
when i reference their works, in this case neil's site:
 +
 
 +
http://www.research.att.com/cgi-bin/access.cgi/as/njas/sequences/eisA.cgi?Anum=000170
 +
 
 +
but then i thought that the seqfantasians might be amused, too.
 +
 
 +
the bit on higher order propositions, in particular,
 +
those of type h : (B^2 -> B) -> B, i sent because
 +
of the significance that 2^2^2^2 = 65536 took on
 +
for us around that time.  & the ho, ho, ho joke.
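
just to spell out the arithmetic behind that number,
a quick check in python:

k = 2
num_functions = 2 ** (2 ** k)            # |B^2 -> B|  =  2^4  =  16
num_higher_order = 2 ** num_functions    # |(B^2 -> B) -> B|  =  2^16  =  65536
print(num_functions, num_higher_order)   # 16 65536
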
 +
 
 +
"zeroth order logic" (zol) is just another name for
 +
the propositional calculus or the sentential logic
 +
that comes before "first order logic" (fol), aka
 +
first intens/tional logic, quantificational logic,
 +
or predicate calculus, depending on who you talk to.
 +
 
 +
the line of work that i have been doing derives from
 +
the ideas of c.s. peirce (1839-1914), who developed
 +
a couple of systems of "logical graphs", actually,
 +
two variant interpretations of the same abstract
 +
structures, called "entitative" and "existential"
 +
graphs.  he organized his system into "alpha",
 +
"beta", and "gamma" layers, roughly equivalent
 +
to our propositional, quantificational, and
 +
modal levels of logic today.
 +
 
 +
on the more contemporary scene, peirce's entitative interpretation
 +
of logical graphs was revived and extended by george spencer brown
 +
in his book 'laws of form', while the existential interpretation
 +
has flourished in the development of "conceptual graphs" by
 +
john f sowa and a community of growing multitudes.
 +
 
 +
a passel of links:
 +
 
 +
http://members.door.net/arisbe/
 +
http://www.enolagaia.com/GSB.html
 +
http://www.cs.uah.edu/~delugach/CG/
 +
http://www.jfsowa.com/
 +
http://www.jfsowa.com/cg/
 +
http://www.jfsowa.com/peirce/ms514w.htm
 +
http://users.bestweb.net/~sowa/
 +
http://users.bestweb.net/~sowa/peirce/ms514.htm
 +
 
 +
i have mostly focused on "alpha" (prop calc or zol) --
 +
though the "func conception of quant logic" thread was
 +
a beginning try at saying how the same line of thought
 +
might be extended to 1st, 2nd, & higher order logics --
 +
and i devised a particular graph & string syntax that
 +
is based on a species of cacti, officially described as
 +
the "reflective extension of logical graphs" (ref log),
 +
but more lately just referred to as "cactus language".
 +
 
 +
it turns out that one can do many interesting things
 +
with prop calc if one has an efficient enough syntax
 +
and a powerful enough interpreter for it, even using
 +
it as a very minimal sort of declarative programming
 +
language, hence, the current thread was directed to
 +
applying "zeroth order theories" (zot's) as brands
 +
of "zeroth order programs" (zop's) to a set of old
 +
constraint satisfaction and knowledge rep examples.
 +
 
 +
more recent expositions of the cactus language have been directed
 +
toward what some people call "ontology engineering" -- it sounds
 +
so much cooler than "taxonomy" -- and so these are found in the
 +
ieee standard upper ontology working group discussion archives.
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
Let's now pause and reflect on the mix of abstract and concrete material
 +
that we have cobbled together in the spectacle of this "World Of Zero" (WOZ),
 +
since I believe that we may have seen enough, if we look at it right, to
 +
illustrate a few of the more salient phenomena that would normally begin
 +
to weigh in as a major force only on a much larger scale.  Now, it's not
 +
exactly like this impoverished sample, all by itself, could determine us
 +
to draw just the right generalizations, or force us to see the shape and
 +
flow of its immanent law -- it is much too sparse a scattering of points
 +
to tease out the lines of its up and coming generations quite so clearly --
 +
but it can be seen to exemplify many of the more significant themes that
 +
we know evolve in more substantial environments, that is, On Beyond Zero,
 +
since we have already seen them, "tho' obscur'd", in these higher realms.
 +
 
 +
One of the themes that I want to keep an eye on as this discussion
 +
develops is the subject that might be called "computation as semiosis".
 +
 
 +
In this light, any calculus worth its salt must be capable of helping
 +
us do two things, calculation, of course, but also analysis.  This is
 +
probably one of the reasons why the ordinary sort of differential and
 +
integral calculus over quantitative domains is frequently referred to
 +
as "real analysis", or even just "analysis".  It seems quite clear to
 +
me that any adequate logical calculus, in many ways expected to serve
 +
as a qualitative analogue of analytic geometry in the way that it can
 +
be used to describe configurations in logically circumscribed domains,
 +
ought to qualify in both dimensions, namely, analysis and computation.
 +
 
 +
With all of these various features of the situation in mind, then, we come
 +
to the point of viewing analysis and computation as just so many different
 +
kinds of "sign transformations in respect of pragmata" (STIROP's).  Taking
 +
this insight to heart, let us next work to assemble a comprehension of our
 +
concrete examples, set in the medium of the abstract calculi that allow us
 +
to express their qualitative patterns, that may hope to be an increment or
 +
two less inchoate than we have seen so far, and that may even permit us to
 +
catch the action of these fading fleeting sign transformations on the wing.
 +
 
 +
Here is how I picture our latest round of examples
 +
as filling out the framework of this investigation:
 +
 
 +
o-----------------------------o-----------------------------o
 +
|    Objective Framework    |  Interpretive Framework    |
 +
o-----------------------------o-----------------------------o
 +
|                                                          |
 +
|                              s_1 = Logue(o)      |      |
 +
|                              /                    |      |
 +
|                            /                      |      |
 +
|                            @                      |      |
 +
|                          ·  \                      |      |
 +
|                        ·    \                    |      |
 +
|                      ·        i_1 = Model(o)      v      |
 +
|                    ·          s_2 = Model(o)      |      |
 +
|                  ·          /                    |      |
 +
|                ·            /                      |      |
 +
|    Object = o · · · · · · @                      |      |
 +
|                ·            \                      |      |
 +
|                  ·          \                    |      |
 +
|                    ·          i_2 = Tenor(o)      v      |
 +
|                      ·        s_3 = Tenor(o)      |      |
 +
|                        ·    /                    |      |
 +
|                          ·  /                      |      |
 +
|                            @                      |      |
 +
|                            \                      |      |
 +
|                              \                    |      |
 +
|                              i_3 = Sense(o)      v      |
 +
|                                                          |
 +
o-----------------------------------------------------------o
 +
Figure.  Computation As Semiotic Transformation
 +
 
 +
The Figure shows three distinct sign triples of the form <o, s, i>, where
 +
o = ostensible objective = the observed, indicated, or intended situation.
 +
 
 +
| A.  <o, Logue(o), Model(o)>
|
| B.  <o, Model(o), Tenor(o)>
|
| C.  <o, Tenor(o), Sense(o)>
 +
 
 +
Let us bring these several signs together in one place,
 +
to compare and contrast their common and their diverse
 +
characters, and to think about why we make such a fuss
 +
about passing from one to the other in the first place.
 +
 
 +
1.  Logue(o)  =  Consat.Log
 +
o-----------------------------------------------------------------------o
 +
|                                                                      |
 +
|  (( pete_plays_guitar ),( pete_plays_sax ),( pete_plays_drums ))      |
 +
|  (( paul_plays_guitar ),( paul_plays_sax ),( paul_plays_drums ))      |
 +
|  (( jane_plays_guitar ),( jane_plays_sax ),( jane_plays_drums ))      |
 +
|                                                                      |
 +
|  (( pete_plays_guitar ),( paul_plays_guitar ),( jane_plays_guitar ))  |
 +
|  (( pete_plays_sax    ),( paul_plays_sax    ),( jane_plays_sax    ))  |
 +
|  (( pete_plays_drums  ),( paul_plays_drums  ),( jane_plays_drums  ))  |
 +
|                                                                      |
 +
|  (( pete_fears_13 ),( pete_fears_cats ),( pete_fears_height ))        |
 +
|  (( paul_fears_13 ),( paul_fears_cats ),( paul_fears_height ))        |
 +
|  (( jane_fears_13 ),( jane_fears_cats ),( jane_fears_height ))        |
 +
|                                                                      |
 +
|  (( pete_fears_13    ),( paul_fears_13    ),( jane_fears_13    ))  |
 +
|  (( pete_fears_cats  ),( paul_fears_cats  ),( jane_fears_cats  ))  |
 +
|  (( pete_fears_height ),( paul_fears_height ),( jane_fears_height ))  |
 +
|                                                                      |
 +
|  ((                                                                  |
 +
|                                                                      |
 +
|  ( pete_plays_guitar )                                                |
 +
|  ( pete_fears_height )                                                |
 +
|                                                                      |
 +
|  ( pete_plays_guitar  pete_fears_height )                            |
 +
|  ( paul_plays_guitar  paul_fears_height )                            |
 +
|  ( jane_plays_guitar  jane_fears_height )                            |
 +
|                                                                      |
 +
|  ( paul_fears_cats )                                                  |
 +
|  ( paul_plays_sax  )                                                  |
 +
|                                                                      |
 +
|  ( pete_plays_sax  pete_fears_cats )                                  |
 +
|  ( paul_plays_sax  paul_fears_cats )                                  |
 +
|  ( jane_plays_sax  jane_fears_cats )                                  |
 +
|                                                                      |
 +
|  ( pete_plays_drums  pete_fears_13 )                                  |
 +
|  ( paul_plays_drums  paul_fears_13 )                                  |
 +
|  ( jane_plays_drums  jane_fears_13 )                                  |
 +
|                                                                      |
 +
|  ( pete_plays_drums  pete_fears_height )                              |
 +
|  ( paul_plays_drums  paul_fears_height )                              |
 +
|  ( jane_plays_drums  jane_fears_height )                              |
 +
|                                                                      |
 +
|  ))                                                                  |
 +
|                                                                      |
 +
o-----------------------------------------------------------------------o
 +
 
 +
2.  Model(o)  =  Consat.Mod  ><>  http://suo.ieee.org/ontology/msg03718.html
 +
 
 +
3.  Tenor(o)  =  Consat.Ten (Just The Gist Of It)
 +
o-------------------------------------------------o
 +
| (pete_plays_guitar )                            | <01> -
 +
|  (pete_plays_sax )                              | <02> -
 +
|  pete_plays_drums                              | <03> +
 +
|    (paul_plays_drums )                          | <04> -
 +
|    (jane_plays_drums )                        | <05> -
 +
|      paul_plays_guitar                          | <06> +
 +
|      (paul_plays_sax )                        | <07> -
 +
|        (jane_plays_guitar )                    | <08> -
 +
|        jane_plays_sax                          | <09> +
 +
|          (pete_fears_13 )                      | <10> -
 +
|          pete_fears_cats                      | <11> +
 +
|            (pete_fears_height )                | <12> -
 +
|            (paul_fears_cats )                  | <13> -
 +
|              (jane_fears_cats )                | <14> -
 +
|              paul_fears_13                    | <15> +
 +
|                (paul_fears_height )            | <16> -
 +
|                (jane_fears_13 )                | <17> -
 +
|                  jane_fears_height *            | <18> +
 +
o-------------------------------------------------o
 +
 
 +
4.  Sense(o)  =  Consat.Sen
 +
o-------------------------------------------------o
 +
| pete_plays_drums                                | <03>
 +
|  paul_plays_guitar                              | <06>
 +
|  jane_plays_sax                                | <09>
 +
|    pete_fears_cats                              | <11>
 +
|    paul_fears_13                              | <15>
 +
|      jane_fears_height                          | <18>
 +
o-------------------------------------------------o
 +
 
 +
As one proceeds through the subsessions of the Theme One Study session,
 +
the computation transforms its larger "signs", in this case text files,
 +
from one to the next, in the sequence:  Logue, Model, Tenor, and Sense.
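
As a cross-check on that sequence, here is a minimal sketch in Python -- not
the Theme One program, and the variable names are mine -- that re-solves the
Consat example straight from the constraints in Consat.Log.  The twelve
"exactly one of" expressions force the plays and fears relations to be
one-to-one pairings between the three people and the three instruments or
fears, so it is enough to try the 3! x 3! = 36 pairings and test the
remaining negative clauses.

from itertools import permutations

people = ["pete", "paul", "jane"]

for instruments in permutations(["guitar", "sax", "drums"]):
    for fears in permutations(["13", "cats", "height"]):
        plays = dict(zip(people, instruments))
        fear  = dict(zip(people, fears))
        ok = (plays["pete"] != "guitar"            # (pete_plays_guitar)
              and fear["pete"] != "height"         # (pete_fears_height)
              and fear["paul"] != "cats"           # (paul_fears_cats)
              and plays["paul"] != "sax"           # (paul_plays_sax)
              and all(not (plays[p] == "guitar" and fear[p] == "height")
                      for p in people)             # (x_plays_guitar x_fears_height)
              and all(not (plays[p] == "sax" and fear[p] == "cats")
                      for p in people)             # (x_plays_sax x_fears_cats)
              and all(not (plays[p] == "drums" and fear[p] == "13")
                      for p in people)             # (x_plays_drums x_fears_13)
              and all(not (plays[p] == "drums" and fear[p] == "height")
                      for p in people))            # (x_plays_drums x_fears_height)
        if ok:
            print(plays, fear)

Only one pairing survives, and it is exactly the model written out in the
Tenor and Sense listings above.
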
 +
 
 +
Let us see if we can pin down, on sign-theoretic grounds,
 +
why this very sort of exercise is so routinely necessary.
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
We were in the middle of pursuing several questions about
 +
sign relational transformations in general, in particular,
 +
the following Example of a sign transformation that arose
 +
in the process of setting up and solving a classical sort
 +
of constraint satisfaction problem.
 +
 
 +
o-----------------------------o-----------------------------o
 +
|    Objective Framework    |  Interpretive Framework    |
 +
o-----------------------------o-----------------------------o
 +
|                                                          |
 +
|                              s_1 = Logue(o)      |      |
 +
|                              /                    |      |
 +
|                            /                      |      |
 +
|                            @                      |      |
 +
|                          ·  \                      |      |
 +
|                        ·    \                    |      |
 +
|                      ·        i_1 = Model(o)      v      |
 +
|                    ·          s_2 = Model(o)      |      |
 +
|                  ·          /                    |      |
 +
|                ·            /                      |      |
 +
|    Object = o · · · · · · @                      |      |
 +
|                ·            \                      |      |
 +
|                  ·          \                    |      |
 +
|                    ·          i_2 = Tenor(o)      v      |
 +
|                      ·        s_3 = Tenor(o)      |      |
 +
|                        ·    /                    |      |
 +
|                          ·  /                      |      |
 +
|                            @                      |      |
 +
|                            \                      |      |
 +
|                              \                    |      |
 +
|                              i_3 = Sense(o)      v      |
 +
|                                                          |
 +
o-----------------------------------------------------------o
 +
Figure.  Computation As Semiotic Transformation
 +
 
 +
1.  Logue(o)  =  Consat.Log
 +
o-----------------------------------------------------------------------o
 +
|                                                                      |
 +
|  (( pete_plays_guitar ),( pete_plays_sax ),( pete_plays_drums ))      |
 +
|  (( paul_plays_guitar ),( paul_plays_sax ),( paul_plays_drums ))      |
 +
|  (( jane_plays_guitar ),( jane_plays_sax ),( jane_plays_drums ))      |
 +
|                                                                      |
 +
|  (( pete_plays_guitar ),( paul_plays_guitar ),( jane_plays_guitar ))  |
 +
|  (( pete_plays_sax    ),( paul_plays_sax    ),( jane_plays_sax    ))  |
 +
|  (( pete_plays_drums  ),( paul_plays_drums  ),( jane_plays_drums  ))  |
 +
|                                                                      |
 +
|  (( pete_fears_13 ),( pete_fears_cats ),( pete_fears_height ))        |
 +
|  (( paul_fears_13 ),( paul_fears_cats ),( paul_fears_height ))        |
 +
|  (( jane_fears_13 ),( jane_fears_cats ),( jane_fears_height ))        |
 +
|                                                                      |
 +
|  (( pete_fears_13    ),( paul_fears_13    ),( jane_fears_13    ))  |
 +
|  (( pete_fears_cats  ),( paul_fears_cats  ),( jane_fears_cats  ))  |
 +
|  (( pete_fears_height ),( paul_fears_height ),( jane_fears_height ))  |
 +
|                                                                      |
 +
|  ((                                                                  |
 +
|                                                                      |
 +
|  ( pete_plays_guitar )                                                |
 +
|  ( pete_fears_height )                                                |
 +
|                                                                      |
 +
|  ( pete_plays_guitar  pete_fears_height )                            |
 +
|  ( paul_plays_guitar  paul_fears_height )                            |
 +
|  ( jane_plays_guitar  jane_fears_height )                            |
 +
|                                                                      |
 +
|  ( paul_fears_cats )                                                  |
 +
|  ( paul_plays_sax  )                                                  |
 +
|                                                                      |
 +
|  ( pete_plays_sax  pete_fears_cats )                                  |
 +
|  ( paul_plays_sax  paul_fears_cats )                                  |
 +
|  ( jane_plays_sax  jane_fears_cats )                                  |
 +
|                                                                      |
 +
|  ( pete_plays_drums  pete_fears_13 )                                  |
 +
|  ( paul_plays_drums  paul_fears_13 )                                  |
 +
|  ( jane_plays_drums  jane_fears_13 )                                  |
 +
|                                                                      |
 +
|  ( pete_plays_drums  pete_fears_height )                              |
 +
|  ( paul_plays_drums  paul_fears_height )                              |
 +
|  ( jane_plays_drums  jane_fears_height )                              |
 +
|                                                                      |
 +
|  ))                                                                  |
 +
|                                                                      |
 +
o-----------------------------------------------------------------------o
 +
 
 +
2.  Model(o)  =  Consat.Mod  ><>  http://suo.ieee.org/ontology/msg03718.html
 +
 
 +
3.  Tenor(o)  =  Consat.Ten (Just The Gist Of It)
 +
o-------------------------------------------------o
 +
| (pete_plays_guitar )                            | <01> -
 +
|  (pete_plays_sax )                              | <02> -
 +
|  pete_plays_drums                              | <03> +
 +
|    (paul_plays_drums )                          | <04> -
 +
|    (jane_plays_drums )                        | <05> -
 +
|      paul_plays_guitar                          | <06> +
 +
|      (paul_plays_sax )                        | <07> -
 +
|        (jane_plays_guitar )                    | <08> -
 +
|        jane_plays_sax                          | <09> +
 +
|          (pete_fears_13 )                      | <10> -
 +
|          pete_fears_cats                      | <11> +
 +
|            (pete_fears_height )                | <12> -
 +
|            (paul_fears_cats )                  | <13> -
 +
|              (jane_fears_cats )                | <14> -
 +
|              paul_fears_13                    | <15> +
 +
|                (paul_fears_height )            | <16> -
 +
|                (jane_fears_13 )                | <17> -
 +
|                  jane_fears_height *            | <18> +
 +
o-------------------------------------------------o
 +
 
 +
4.  Sense(o)  =  Consat.Sen
 +
o-------------------------------------------------o
 +
| pete_plays_drums                                | <03>
 +
|  paul_plays_guitar                              | <06>
 +
|  jane_plays_sax                                | <09>
 +
|    pete_fears_cats                              | <11>
 +
|    paul_fears_13                              | <15>
 +
|      jane_fears_height                          | <18>
 +
o-------------------------------------------------o
 +
 
 +
We can worry later about the proper use of quotation marks
 +
in discussing such a case, where the file name "Yada.Yak"
 +
denotes a piece of text that expresses a proposition that
 +
describes an objective situation or an intentional object,
 +
but whatever the case it is clear that we are knee & neck
 +
deep in a sign relational situation of a modest complexity.
 +
 
 +
I think that the right sort of analogy might help us
 +
to sort it out, or at least to tell what's important
 +
from the things that are less so.  The paradigm that
 +
comes to mind for me is the type of context in maths
 +
where we talk about the "locus" or the "solution set"
 +
of an equation, and here we think of the equation as
 +
denoting its solution set or describing a locus, say,
 +
a point or a curve or a surface or so on up the scale.
 +
 
 +
In this figure of speech, we might say for instance:
 +
 
 +
| o is
 +
| what "x^3 - 3x^2 + 3x - 1 = 0" denotes is
 +
| what "(x-1)^3 = 0" denotes is
 +
| what "1" denotes
 +
| is 1.
 +
 
 +
Making explicit the assumptive interpretations
 +
that the context probably enfolds in this case,
 +
we assume this description of the solution set:
 +
 
 +
{x in the Reals : x^3 - 3x^2 + 3x - 1 = 0} = {1}.
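
A quick sanity check, in plain Python, that the two expressions agree and
that 1 is the only real root:

# x^3 - 3x^2 + 3x - 1 coincides with (x - 1)^3, whose only real zero is
# x = 1, so the solution set is {1}.
for x in [-2.0, -1.0, 0.0, 0.5, 1.0, 2.0, 3.0]:
    assert abs((x**3 - 3*x**2 + 3*x - 1) - (x - 1)**3) < 1e-9
print("value at x = 1:", 1**3 - 3*1**2 + 3*1 - 1)   # prints 0
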
 +
 
 +
In sign relational terms, we have the 3-tuples:
 +
 
 +
| <o, "x^3 - 3x^2 + 3x - 1 = 0", "(x-1)^3 = 0">
 +
|
 +
| <o, "(x-1)^3 = 0", "1">
 +
|
 +
| <o, "1", "1">
 +
 
 +
As it turns out we discover that the
 +
object o was really just 1 all along.
 +
 
 +
But why do we put ourselves through the rigors of these
 +
transformations at all?  If 1 is what we mean, why not
 +
just say "1" in the first place and be done with it?
 +
A person who asks a question like that has forgotten
 +
how we keep getting ourselves into these quandaries,
 +
and who it is that assigns the problems, for it is
 +
Nature herself who is the taskmistress here and the
 +
problems are set in the manner that she determines,
 +
not in the style to which we would like to become
 +
accustomed.  The best that we can demand of our
 +
various and sundry calculi is that they afford
 +
us with the nets and the snares more readily
 +
to catch the shape of the problematic game
 +
as it flies up before us on its own wings,
 +
and only then to tame it to the amenable
 +
demeanors that we find to our liking.
 +
 
 +
In sum, the first place is not ours to take.
 +
We are but poor second players in this game.
 +
 
 +
That understood, I can now lay out our present Example
 +
along the lines of this familiar mathematical exercise.
 +
 
 +
| o is
 +
| what Consat.Log denotes is
 +
| what Consat.Mod denotes is
 +
| what Consat.Ten denotes is
 +
| what Consat.Sen denotes.
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
It will be good to keep this picture before us a while longer:
 +
 
 +
o-----------------------------o-----------------------------o
 +
|    Objective Framework    |  Interpretive Framework    |
 +
o-----------------------------o-----------------------------o
 +
|                                                          |
 +
|                              s_1  =  Logue(o)    |      |
 +
|                              /                    |      |
 +
|                            /                      |      |
 +
|                            @                      |      |
 +
|                          ·  \                      |      |
 +
|                        ·    \                    |      |
 +
|                      ·        i_1  =  Model(o)    v      |
 +
|                    ·          s_2  =  Model(o)    |      |
 +
|                  ·          /                    |      |
 +
|                ·            /                      |      |
 +
|  Object  =  o · · · · · · @                      |      |
 +
|                ·            \                      |      |
 +
|                  ·          \                    |      |
 +
|                    ·          i_2  =  Tenor(o)    v      |
 +
|                      ·        s_3  =  Tenor(o)    |      |
 +
|                        ·    /                    |      |
 +
|                          ·  /                      |      |
 +
|                            @                      |      |
 +
|                            \                      |      |
 +
|                              \                    |      |
 +
|                              i_3  =  Sense(o)    v      |
 +
|                                                          |
 +
o-----------------------------------------------------------o
 +
Figure.  Computation As Semiotic Transformation
 +
 
 +
The labels that decorate the syntactic plane and indicate
 +
the semiotic transitions in the interpretive panel of the
 +
framework point us to text files whose contents rest here:
 +
 
 +
http://suo.ieee.org/ontology/msg03722.html
 +
 
 +
The reason that I am troubling myself -- and no doubt you --
 +
with the details of this Example is because it highlights
 +
a number of the thistles that we will have to grasp if we
 +
ever want to escape from the traps of YARNBOL and YARWARS
 +
in which so many of our fairweather fiends are seeking to
 +
ensnare us, and not just us -- the whole web of the world.
 +
 
 +
YARNBOL  =  Yet Another Roman Numeral Based Ontology Language.
 +
YARWARS  =  Yet Another Representation Without A Reasoning System.
 +
 
 +
In order to avoid this, or to reverse the trend once it gets started,
 +
we just have to remember what a dynamic living process a computation
 +
really is, precisely because it is meant to serve as an iconic image
 +
of dynamic, deliberate, purposeful transformations that we are bound
 +
to go through and to carry out in a hopeful pursuit of the solutions
 +
to the many real live problems that life and society place before us.
 +
So I take it rather seriously.
 +
 
 +
Okay, back to the grindstone.
 +
 
 +
The question is:  "Why are these trips necessary?"
 +
 
 +
How come we don't just have one proper expression
 +
for each situation under the sun, or all possible
 +
suns, I guess, for some, and just use that on any
 +
appearance, instance, occasion of that situation?
 +
 
 +
Why is it ever necessary to begin with an obscure description
 +
of a situation? -- for that is exactly what the propositional
 +
expression called "Logue(o)", for Example, the Consat.Log file,
 +
really is.
 +
 
 +
Maybe I need to explain that first.
 +
 
 +
The first three items of syntax -- Logue(o), Model(o), Tenor(o) --
 +
are all just so many different propositional expressions that
 +
denote one and the same logical-valued function p : X -> %B%,
 +
and one whose abstract image we may well enough describe as
 +
a boolean function of the abstract type q : %B%^k -> %B%,
 +
where k happens to be 18 in the present Consat Example.
 +
 
 +
If we were to write out the truth table for q : %B%^18 -> %B%
 +
it would take 2^18 = 262144 rows.  Using the bold letter #x#
 +
for a coordinate tuple, writing #x# = <x_1, ..., x_18>, each
 +
row of the table would have the form <x_1, ..., x_18, q(#x#)>.
 +
And the function q is such that all rows evaluate to %0% save one.
 +
 
 +
Each of the four different formats expresses this fact about q
 +
in its own way.  The first three are logically equivalent, and
 +
the last one is the maximally determinate positive implication
 +
of what the others all say.
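
To make that claim concrete at a scale one can eyeball, here is a toy
version in Python, with two variables in place of eighteen:  an "obscure"
expression and a "succinct" expression of the same boolean function agree
on every row, and the function has just one satisfying row.

from itertools import product

obscure  = lambda x, y: (x or y) and ((not x) or y) and (x or (not y))
succinct = lambda x, y: x and y

for x, y in product([False, True], repeat=2):
    assert obscure(x, y) == succinct(x, y)
print("equivalent on all 4 rows; the one model is x = y = True")

The analogue of the Sense step here would be just the positive conjunction
"x y" that picks out that single row.
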
 +
 
 +
From this point of view, the logical computation that we went through,
 +
in the sequence Logue, Model, Tenor, Sense, was a process of changing
 +
from an obscure sign of the objective proposition to a more organized
 +
arrangement of its satisfying or unsatisfying interpretations, to the
 +
most succinct possible expression of the same meaning, to an adequate
 +
positive projection of it that is useful enough in the proper context.
 +
 
 +
This is the sort of mill -- it's called "computation" -- that we have
 +
to be able to put our representations through on a recurrent, regular,
 +
routine basis, that is, if we expect them to have any utility at all.
 +
And it is only when we have started to do that in genuinely effective
 +
and efficient ways, that we can even begin to think about facilitating
 +
any bit of qualitative conceptual analysis through computational means.
 +
 
 +
And as far as the qualitative side of logical computation
 +
and conceptual analysis goes, we have barely even started.
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
We are contemplating the sequence of initial and normal forms
 +
for the Consat problem and we have noted the following system
 +
of logical relations, taking the enchained expressions of the
 +
objective situation o in a pairwise associated way, of course:
 +
 
 +
Logue(o)  <=>  Model(o)  <=>  Tenor(o)  =>  Sense(o).
 +
 
 +
The specifics of the propositional expressions are cited here:
 +
 
 +
http://suo.ieee.org/ontology/msg03722.html
 +
 
 +
If we continue to pursue the analogy that we made with the form
 +
of mathematical activity commonly known as "solving equations",
 +
then there are many salient features of this type of logical
 +
problem solving endeavor that suddenly leap into the light.
 +
 
 +
First of all, we notice the importance of "equational reasoning"
 +
in mathematics, by which I mean, not just the quantitative type
 +
of equation that forms the matter of the process, but also the
 +
qualitative type of equation, or the "logical equivalence",
 +
that connects each expression along the way, right up to
 +
the penultimate stage, when we are satisfied in a given
 +
context to take a projective implication of the total
 +
knowledge of the situation that we have been taking
 +
some pains to preserve at every intermediate stage
 +
of the game.
 +
 
 +
This general pattern or strategy of inference, working its way through
 +
phases of "equational" or "total information preserving" inference and
 +
phases of "implicational" or "selective information losing" inference,
 +
is actually very common throughout mathematics, and I have in mind to
 +
examine its character in greater detail and in a more general setting.
 +
 
 +
Just as the barest hint of things to come along these lines, you might
 +
consider the question of what would constitute the equational analogue
 +
of modus ponens, in other words the scheme of inference that goes from
 +
x and x=>y to y.  Well the answer is a scheme of inference that passes
 +
from x and x=>y to x&y, and then being reversible, back again.  I will
 +
explore the rationale and the utility of this gambit in future reports.
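
The equivalence that underwrites that reversible step can be checked by
brute force over the four cases -- a minimal sketch in Python:

from itertools import product

# x & (x => y) is logically equivalent to x & y, the information-preserving
# (reversible) form of modus ponens described above.
for x, y in product([False, True], repeat=2):
    x_implies_y = (not x) or y
    assert (x and x_implies_y) == (x and y)
print("x & (x => y)  <=>  x & y   holds in all 4 cases")
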
 +
 
 +
One observation that we can make already at this point,
 +
however, is that these schemes of equational reasoning,
 +
or reversible inference, remain poorly developed among
 +
our currently prevailing styles of inference in logic,
 +
their potentials for applied logical software hardly
 +
being broached in our presently available systems.
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
Extra Examples
 +
 
 +
1.  Propositional logic example.
 +
Files:  Alpha.lex + Prop.log
 +
Ref:    [Cha, 20, Example 2.12]
 +
 
 +
2.  Chemical synthesis problem.
 +
Files:  Chem.*
 +
Ref:    [Cha, 21, Example 2.13]
 +
 
 +
3.  N Queens problem.
 +
Files:  Queen*.*,  Q8.*,  Q5.*
 +
Refs:  [BaC, 166], [VaH, 122], [Wir, 143].
 +
Notes:  Only the 5 Queens example will run in 640K memory.
 +
        Use the "Queen.lex" file to load the "Q5.eg*" log files.
 +
 
 +
4.  Five Houses puzzle.
 +
Files:  House.*
 +
Ref:    [VaH, 132].
 +
Notes:  Will not run in 640K memory.
 +
 
 +
5.  Graph coloring example.
 +
Files:  Color.*
 +
Ref:    [Wil, 196].
 +
 
 +
6.  Examples of Cook's Theorem in computational complexity,
 +
    that propositional satisfiability is NP-complete.
 +
 
 +
Files:  StiltN.* = "Space and Time Limited Turing Machine",
 +
        with N units of space and N units of time.
 +
        StuntN.* = "Space and Time Limited Turing Machine",
 +
        for computing the parity of a bit string,
 +
        with Number of Tape cells of input equal to N.
 +
Ref:    [Wil, 188-201].
 +
Notes:  Can only run Turing machine example for input of size 2.
 +
        Since the last tape cell is used for an end-of-file marker,
 +
        this amounts to only one significant digit of computation.
 +
        Use the "Stilt3.lex" file  to load the "Stunt2.egN" files.
 +
        Their Sense file outputs appear on the "Stunt2.seN" files.
 +
 
 +
7.  Fabric knowledge base.
 +
Files:  Fabric.*, Fab.*
 +
Ref:    [MaW, 8-16].
 +
 
 +
8.  Constraint Satisfaction example.
 +
Files:  Consat1.*, Consat2.*
 +
Ref:    [Win, 449, Exercise 3-9].
 +
Notes:  Attributed to Kenneth D. Forbus.
  
The form "S :> T" is here recruited for polymorphic
+
References
employment in at least the following types of roles:
 
  
1To signify that an individually named or quoted string T is
+
| Angluin, Dana,
    being typed as a sentence S of the language of interest !L!.
+
|"Learning with Hints", in
 +
|'Proceedings of the 1988 Workshop on Computational Learning Theory',
 +
| edited by D. Haussler & L. Pitt, Morgan Kaufmann, San Mateo, CA, 1989.
 +
 
 +
| Ball, W.W. Rouse, & Coxeter, H.S.M.,
 +
|'Mathematical Recreations and Essays', 13th ed.,
 +
| Dover, New York, NY, 1987.
 +
 
 +
| Chang, Chin-Liang & Lee, Richard Char-Tung,
 +
|'Symbolic Logic and Mechanical Theorem Proving',
 +
| Academic Press, New York, NY, 1973.
 +
 
 +
| Denning, Peter J., Dennis, Jack B., and Qualitz, Joseph E.,
 +
|'Machines, Languages, and Computation',
 +
| Prentice-Hall, Englewood Cliffs, NJ, 1978.
 +
 
 +
| Edelman, Gerald M.,
 +
|'Topobiology:  An Introduction to Molecular Embryology',
 +
| Basic Books, New York, NY, 1988.
 +
 
 +
| Lloyd, J.W.,
 +
|'Foundations of Logic Programming',
 +
| Springer-Verlag, Berlin, 1984.
 +
 
 +
| Maier, David & Warren, David S.,
 +
|'Computing with Logic:  Logic Programming with Prolog',
 +
| Benjamin/Cummings, Menlo Park, CA, 1988.
 +
 
 +
| McClelland, James L. and Rumelhart, David E.,
 +
|'Explorations in Parallel Distributed Processing:
 +
| A Handbook of Models, Programs, and Exercises',
 +
| MIT Press, Cambridge, MA, 1988.
 +
 
 +
| Peirce, Charles Sanders,
 +
|'Collected Papers of Charles Sanders Peirce',
 +
| edited by Charles Hartshorne, Paul Weiss, & Arthur W. Burks,
 +
| Harvard University Press, Cambridge, MA, 1931-1960.
 +
 
 +
| Peirce, Charles Sanders,
 +
|'The New Elements of Mathematics',
 +
| edited by Carolyn Eisele, Mouton, The Hague, 1976.
 +
 
 +
|'Charles S. Peirce:  Selected Writings;  Values in a Universe of Chance',
 +
| edited by Philip P. Wiener, Dover, New York, NY, 1966.
 +
 
 +
| Spencer Brown, George,
 +
|'Laws of Form',
 +
| George Allen & Unwin, London, UK, 1969.
 +
 
 +
| Van Hentenryck, Pascal,
 +
|'Constraint Satisfaction in Logic Programming',
 +
| MIT Press, Cambridge, MA, 1989.
 +
 
 +
| Wilf, Herbert S.,
 +
|'Algorithms and Complexity',
 +
| Prentice-Hall, Englewood Cliffs, NJ, 1986.
 +
 
 +
| Winston, Patrick Henry,
 +
|'Artificial Intelligence', 2nd ed.,
 +
| Addison-Wesley, Reading, MA, 1984.
 +
 
 +
| Wirth, Niklaus,
 +
|'Algorithms + Data Structures = Programs',
 +
| Prentice-Hall, Englewood Cliffs, NJ, 1976.
 +
 
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
</pre>
 +
 
 +
 
 +
==Fragmata==
 +
 
 +
===Cactus Language, Operator Variables, Reflection===
 +
 
 +
====NKS Forum (Mar 2005)====
 +
 
 +
* http://forum.wolframscience.com/showthread.php?threadid=755
 +
 
 +
===Cactus Rules===
 +
 
 +
====Inquiry List (Mar 2004)====
 +
 
 +
* http://stderr.org/pipermail/inquiry/2004-March/thread.html#1265
 +
 
 +
====Ontology List (Mar 2004)====
 +
 
 +
* http://suo.ieee.org/ontology/thrd1.html#05486
 +
* http://suo.ieee.org/ontology/thrd2.html#05489
 +
 
 +
===Cactus Town Cartoons===
 +
 
 +
====Arisbe List (Dec 2001)====
 +
 
 +
* http://stderr.org/pipermail/arisbe/2001-December/thread.html#1214
 +
# http://stderr.org/pipermail/arisbe/2001-December/001214.html
 +
# http://stderr.org/pipermail/arisbe/2001-December/001217.html
 +
 
 +
===Extensions Of Logical Graphs===
 +
 
 +
====Ontology List (Dec 2001)====
 +
 
 +
* http://suo.ieee.org/ontology/msg03586.html
 +
 
 +
====Arisbe List (Dec 2001)====
 +
 
 +
* http://stderr.org/pipermail/arisbe/2001-December/thread.html#1230
 +
 
 +
===Logic In Graphs===
 +
 
 +
====NKS Forum (Feb 2005)====
 +
 
 +
* http://forum.wolframscience.com/showthread.php?threadid=726
 +
 
 +
===Propositional Equation Reasoning Systems===
 +
 
 +
====Arisbe List (Mar&ndash;Apr 2001)====
 +
 
 +
* http://stderr.org/pipermail/arisbe/2001-March/thread.html#380
 +
* http://stderr.org/pipermail/arisbe/2001-April/thread.html#407
 +
# http://stderr.org/pipermail/arisbe/2001-March/000380.html
 +
# http://stderr.org/pipermail/arisbe/2001-April/000407.html
 +
# http://stderr.org/pipermail/arisbe/2001-April/000409.html
 +
# http://stderr.org/pipermail/arisbe/2001-April/000411.html
 +
# http://stderr.org/pipermail/arisbe/2001-April/000412.html
 +
# http://stderr.org/pipermail/arisbe/2001-April/000413.html
 +
# http://stderr.org/pipermail/arisbe/2001-April/000416.html
 +
# http://stderr.org/pipermail/arisbe/2001-April/000417.html
 +
# http://stderr.org/pipermail/arisbe/2001-April/000421.html
 +
# http://stderr.org/pipermail/arisbe/2001-April/000427.html
 +
# http://stderr.org/pipermail/arisbe/2001-April/000428.html
 +
# http://stderr.org/pipermail/arisbe/2001-April/000430.html
 +
# http://stderr.org/pipermail/arisbe/2001-April/000432.html
 +
# http://stderr.org/pipermail/arisbe/2001-April/000433.html
 +
# http://stderr.org/pipermail/arisbe/2001-April/000434.html
 +
# http://stderr.org/pipermail/arisbe/2001-April/000435.html
 +
 
 +
====Inquiry List (Mar 2003)====
 +
 
 +
* http://stderr.org/pipermail/inquiry/2003-March/thread.html#126
 +
 
 +
===Theme One Program : Logical Cacti===
 +
 
 +
====Inquiry List (Mar 2003)====
 +
 
 +
* http://stderr.org/pipermail/inquiry/2003-March/000114.html
 +
 
 +
====Inquiry List (Feb 2005)====
 +
 
 +
* http://stderr.org/pipermail/inquiry/2005-February/002360.html
 +
 
 +
==Alternate Version? Editing Fragment?==
 +
 
 +
<pre>
 +
o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o
 +
 
 +
In this Subsection, I finally bring together many of what may
 +
have appeared to be wholly independent threads of development,
 +
in the hope of paying off a percentage of my promissory notes,
 +
even if a goodly number my creditors have no doubt long since
 +
forgotten, if not exactly forgiven the debentures in question.
 +
 
 +
For ease of reference, I repeat here a couple of the
 +
definitions that are needed again in this discussion.
 +
 
 +
| A "boolean connection" of degree k, also known as a "boolean function"
 +
| on k variables, is a map of the form F : %B%^k -> %B%.  In other words,
 +
| a boolean connection of degree k is a proposition about things in the
 +
| universe of discourse X = %B%^k.
 +
|
 +
| An "imagination" of degree k on X is a k-tuple of propositions
 +
| about things in the universe X.  By way of displaying the kinds
 +
| of notation that are used to express this idea, the imagination
 +
| #f# = <f_1, ..., f_k> can be given as a sequence of indicator
 +
| functions f_j : X -> %B%, for j = 1 to k.  All of these features
 +
| of the typical imagination #f# can be summed up in either one of
 +
| two ways:  either in the form of a membership statement, stating
 +
| words to the effect that #f# belongs to the space (X -> %B%)^k,
 +
| or in the form of the type declaration that #f# : (X -> %B%)^k,
 +
| though perhaps the latter specification is slightly more precise
 +
| than the former.
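
A small sketch in Python of the two notions just defined, with %B% modelled
as Python's bool;  the particular functions chosen below are illustrative
only.

def F(*bits):                      # a boolean connection F : B^k -> B
    return all(bits)               # here:  k-ary conjunction

f_1 = lambda x: x > 0              # indicator functions on X = the integers
f_2 = lambda x: x % 2 == 0
imagination = (f_1, f_2)           # #f# = <f_1, f_2> in (X -> B)^2

# evaluating the connection on the imagination's values at sample points
print([F(*(f(x) for f in imagination)) for x in (-2, 3, 4)])   # [False, False, True]
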
  
The definition of the "stretch" operation and the uses of the
various brands of denotational operators can be reviewed here:

055.  http://suo.ieee.org/email/msg07466.html
057.  http://suo.ieee.org/email/msg07469.html
070.  http://suo.ieee.org/ontology/msg03473.html
071.  http://suo.ieee.org/ontology/msg03479.html

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

The form "S :> T" is here recruited for polymorphic
employment in at least the following types of roles:

1.  To signify that an individually named or quoted string T is
    being typed as a sentence S of the language of interest !L!.

2.  To express the fact or to make the assertion that each member
    of a specified set of strings T c !A!* also belongs to the
    syntactic category S, the one that qualifies a string as
    being a sentence in the relevant formal language !L!.

3.  To specify the intension or to signify the intention that every
    string that fits the conditions of the abstract type T must also
    fall under the grammatical heading of a sentence, as indicated by
    the type name "S", all within the target language !L!.

In these types of situation the letter "S", that signifies the type of
a sentence in the language of interest, is called the "initial symbol"
or the "sentence symbol" of a candidate formal grammar for the language,
while any number of letters like "T", signifying other types of strings
that are necessary to a reasonable account or a rational reconstruction
of the sentences that belong to the language, are collectively referred
to as "intermediate symbols".

Combining the singleton set {"S"} whose sole member is the initial symbol
with the set !Q! that assembles together all of the intermediate symbols
results in the set {"S"} |_| !Q! of "non-terminal symbols".  Completing
the package, the alphabet !A! of the language is also known as the set
of "terminal symbols".  In this discussion, I will adopt the convention
that !Q! is the set of intermediate symbols, but I will often use "q"
as a typical variable that ranges over all of the non-terminal symbols,
q in {"S"} |_| !Q!.  Finally, it is convenient to refer to all of the
symbols in {"S"} |_| !Q! |_| !A! as the "augmented alphabet" of the
prospective grammar for the language, and accordingly to describe
the strings in ({"S"} |_| !Q! |_| !A!)* as the "augmented strings",
in effect, expressing the forms that are superimposed on a language
by one of its conceivable grammars.  In certain settings it becomes
desirable to separate the augmented strings that contain the symbol
"S" from all other sorts of augmented strings.  In these situations,
the strings in the disjoint union {"S"} |_| (!Q! |_| !A!)* are known
as the "sentential forms" of the associated grammar.
 
 
</pre>
 
</pre>

Latest revision as of 21:01, 29 June 2009

Notes Found in a Cactus Patch

Note. This is a collection of fragments from previous discussions that I plan to use in documenting the cactus graph syntax for propositional logic.

Cactus Language

Table 13 illustrates the existential interpretation of cactus graphs and cactus expressions by providing English translations for a few of the most basic and commonly occurring forms.

Even though I do most of my thinking in the existential interpretation, I will continue to speak of these forms as logical graphs, because I think it is an important fact about them that the formal validity of the axioms and theorems is not dependent on the choice between the entitative and the existential interpretations.

The first extension is the reflective extension of logical graphs (RefLog). It is obtained by generalizing the negation operator "\(\texttt{(~)}\)" in a certain way, calling "\(\texttt{(~)}\)" the controlled, moderated, or reflective negation operator of order 1, then adding another such operator for each finite \(k = 2, 3, \ldots .\)

In sum, these operators are symbolized by bracketed argument lists as follows: "\(\texttt{(~)}\)", "\(\texttt{(~,~)}\)", "\(\texttt{(~,~,~)}\)", …, where the number of slots is the order of the reflective negation operator in question.

The cactus graph and the cactus expression shown here are both described as a spike.

o---------------------------------------o
|                                       |
|                   o                   |
|                   |                   |
|                   @                   |
|                                       |
o---------------------------------------o
|                  ( )                  |
o---------------------------------------o

The rule of reduction for a lobe is:

o---------------------------------------o
|                                       |
|  x_1   x_2   ...   x_k                |
|   o-----o--- ... ---o                 |
|    \               /                  |
|     \             /                   |
|      \           /                    |
|       \         /                     |
|        \       /                      |
|         \     /                       |
|          \   /                        |
|           \ /                         |
|            @      =      @            |
|                                       |
o---------------------------------------o

if and only if exactly one of the \(x_j\!\) is a spike.

In Ref Log, an expression of the form \(\texttt{((}~ e_1 ~\texttt{),(}~ e_2 ~\texttt{),(}~ \ldots ~\texttt{),(}~ e_k ~\texttt{))}\) expresses the fact that exactly one of the \(e_j\!\) is true. Expressions of this form are called universal partition expressions, and they parse into a type of graph called a painted and rooted cactus (PARC):

o---------------------------------------o
|                                       |
|  e_1   e_2   ...   e_k                |
|   o     o           o                 |
|   |     |           |                 |
|   o-----o--- ... ---o                 |
|    \               /                  |
|     \             /                   |
|      \           /                    |
|       \         /                     |
|        \       /                      |
|         \     /                       |
|          \   /                        |
|           \ /                         |
|            @                          |
|                                       |
o---------------------------------------o
o---------------------------------------o
|                                       |
| ( x1, x2, ..., xk )  =  [blank]       |
|                                       |
| iff                                   |
|                                       |
| Just one of the arguments             |
| x1, x2, ..., xk  =  ()                |
|                                       |
o---------------------------------------o

The interpretation of these operators, read as assertions about the values of their listed arguments, is as follows:

Existential Interpretation: Just one of the k arguments is false.
Entitative Interpretation: Not just one of the k arguments is true.
o-------------------o-------------------o-------------------o
|       Graph       |      String       |    Translation    |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|         @         |        " "        |       true.       |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|         o         |                   |                   |
|         |         |                   |                   |
|         @         |        ( )        |      untrue.      |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|         r         |                   |                   |
|         @         |         r         |         r.        |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|         r         |                   |                   |
|         o         |                   |                   |
|         |         |                   |                   |
|         @         |        (r)        |       not r.      |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|       r s t       |                   |                   |
|         @         |       r s t       |   r and s and t.  |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|       r s t       |                   |                   |
|       o o o       |                   |                   |
|        \|/        |                   |                   |
|         o         |                   |                   |
|         |         |                   |                   |
|         @         |    ((r)(s)(t))    |    r or s or t.   |
o-------------------o-------------------o-------------------o
|                   |                   |    r implies s.   |
|         r   s     |                   |                   |
|         o---o     |                   |    if r then s.   |
|         |         |                   |                   |
|         @         |      (r (s))      |    no r sans s.   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|       r   s       |                   |                   |
|       o---o       |                   | r exclusive-or s. |
|        \ /        |                   |                   |
|         @         |      (r , s)      | r not equal to s. |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|       r   s       |                   |                   |
|       o---o       |                   |                   |
|        \ /        |                   |                   |
|         o         |                   | r if & only if s. |
|         |         |                   |                   |
|         @         |     ((r , s))     | r equates with s. |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|      r  s  t      |                   |                   |
|      o--o--o      |                   |                   |
|       \   /       |                   |                   |
|        \ /        |                   |  just one false   |
|         @         |    (r , s , t)    |  out of r, s, t.  |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|      r  s  t      |                   |                   |
|      o  o  o      |                   |                   |
|      |  |  |      |                   |                   |
|      o--o--o      |                   |                   |
|       \   /       |                   |                   |
|        \ /        |                   |   just one true   |
|         @         |   ((r),(s),(t))   |   among r, s, t.  |
o-------------------o-------------------o-------------------o
|                   |                   |   genus t over    |
|         r  s      |                   |   species r, s.   |
|         o  o      |                   |                   |
|      t  |  |      |                   |   partition t     |
|      o--o--o      |                   |   among r & s.    |
|       \   /       |                   |                   |
|        \ /        |                   |   whole pie t:    |
|         @         |   ( t ,(r),(s))   |   slices r, s.    |
o-------------------o-------------------o-------------------o
Table 13.  The Existential Interpretation
o-------------------o-------------------o-------------------o
|   Cactus Graph    | Cactus Expression |    Existential    |
|                   |                   |  Interpretation   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|         @         |        " "        |       true.       |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|         o         |                   |                   |
|         |         |                   |                   |
|         @         |        ( )        |      untrue.      |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|         a         |                   |                   |
|         @         |         a         |         a.        |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|         a         |                   |                   |
|         o         |                   |                   |
|         |         |                   |                   |
|         @         |        (a)        |       not a.      |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|       a b c       |                   |                   |
|         @         |       a b c       |   a and b and c.  |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|       a b c       |                   |                   |
|       o o o       |                   |                   |
|        \|/        |                   |                   |
|         o         |                   |                   |
|         |         |                   |                   |
|         @         |    ((a)(b)(c))    |    a or b or c.   |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|                   |                   |    a implies b.   |
|         a   b     |                   |                   |
|         o---o     |                   |    if a then b.   |
|         |         |                   |                   |
|         @         |      (a (b))      |    no a sans b.   |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|       a   b       |                   |                   |
|       o---o       |                   | a exclusive-or b. |
|        \ /        |                   |                   |
|         @         |      (a , b)      | a not equal to b. |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|       a   b       |                   |                   |
|       o---o       |                   |                   |
|        \ /        |                   |                   |
|         o         |                   | a if & only if b. |
|         |         |                   |                   |
|         @         |     ((a , b))     | a equates with b. |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|      a  b  c      |                   |                   |
|      o--o--o      |                   |                   |
|       \   /       |                   |                   |
|        \ /        |                   |  just one false   |
|         @         |    (a , b , c)    |  out of a, b, c.  |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|      a  b  c      |                   |                   |
|      o  o  o      |                   |                   |
|      |  |  |      |                   |                   |
|      o--o--o      |                   |                   |
|       \   /       |                   |                   |
|        \ /        |                   |   just one true   |
|         @         |   ((a),(b),(c))   |   among a, b, c.  |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|                   |                   |   genus a over    |
|         b  c      |                   |   species b, c.   |
|         o  o      |                   |                   |
|      a  |  |      |                   |   partition a     |
|      o--o--o      |                   |   among b & c.    |
|       \   /       |                   |                   |
|        \ /        |                   |   whole pie a:    |
|         @         |   ( a ,(b),(c))   |   slices b, c.    |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
Table 14.  The Entitative Interpretation
o-------------------o-------------------o-------------------o
|   Cactus Graph    | Cactus Expression |    Entitative     |
|                   |                   |  Interpretation   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|         @         |        " "        |      untrue.      |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|         o         |                   |                   |
|         |         |                   |                   |
|         @         |        ( )        |       true.       |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|         a         |                   |                   |
|         @         |         a         |         a.        |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|         a         |                   |                   |
|         o         |                   |                   |
|         |         |                   |                   |
|         @         |        (a)        |       not a.      |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|       a b c       |                   |                   |
|         @         |       a b c       |    a or b or c.   |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|       a b c       |                   |                   |
|       o o o       |                   |                   |
|        \|/        |                   |                   |
|         o         |                   |                   |
|         |         |                   |                   |
|         @         |    ((a)(b)(c))    |   a and b and c.  |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|                   |                   |    a implies b.   |
|                   |                   |                   |
|         o a       |                   |    if a then b.   |
|         |         |                   |                   |
|         @ b       |      (a) b        |    not a, or b.   |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|       a   b       |                   |                   |
|       o---o       |                   | a if & only if b. |
|        \ /        |                   |                   |
|         @         |      (a , b)      | a equates with b. |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|       a   b       |                   |                   |
|       o---o       |                   |                   |
|        \ /        |                   |                   |
|         o         |                   | a exclusive-or b. |
|         |         |                   |                   |
|         @         |     ((a , b))     | a not equal to b. |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|      a  b  c      |                   |                   |
|      o--o--o      |                   |                   |
|       \   /       |                   |                   |
|        \ /        |                   | not just one true |
|         @         |    (a , b , c)    | out of a, b, c.   |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|      a  b  c      |                   |                   |
|      o--o--o      |                   |                   |
|       \   /       |                   |                   |
|        \ /        |                   |                   |
|         o         |                   |                   |
|         |         |                   |   just one true   |
|         @         |   ((a , b , c))   |   among a, b, c.  |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
|                   |                   |                   |
|      a            |                   |                   |
|      o            |                   |   genus a over    |
|      |  b  c      |                   |   species b, c.   |
|      o--o--o      |                   |                   |
|       \   /       |                   |   partition a     |
|        \ /        |                   |   among b & c.    |
|         o         |                   |                   |
|         |         |                   |   whole pie a:    |
|         @         | (((a) , b , c ))  |   slices b, c.    |
|                   |                   |                   |
o-------------------o-------------------o-------------------o
o-----------------o-----------------o-----------------o-----------------o
|      Graph      |     String      |   Entitative    |   Existential   |
o-----------------o-----------------o-----------------o-----------------o
|                 |                 |                 |                 |
|        @        |       " "       |     untrue.     |      true.      |
o-----------------o-----------------o-----------------o-----------------o
|                 |                 |                 |                 |
|        o        |                 |                 |                 |
|        |        |                 |                 |                 |
|        @        |       ( )       |      true.      |     untrue.     |
o-----------------o-----------------o-----------------o-----------------o
|                 |                 |                 |                 |
|        r        |                 |                 |                 |
|        @        |        r        |        r.       |        r.       |
o-----------------o-----------------o-----------------o-----------------o
|                 |                 |                 |                 |
|        r        |                 |                 |                 |
|        o        |                 |                 |                 |
|        |        |                 |                 |                 |
|        @        |       (r)       |      not r.     |      not r.     |
o-----------------o-----------------o-----------------o-----------------o
|                 |                 |                 |                 |
|      r s t      |                 |                 |                 |
|        @        |      r s t      |   r or s or t.  |  r and s and t. |
o-----------------o-----------------o-----------------o-----------------o
|                 |                 |                 |                 |
|      r s t      |                 |                 |                 |
|      o o o      |                 |                 |                 |
|       \|/       |                 |                 |                 |
|        o        |                 |                 |                 |
|        |        |                 |                 |                 |
|        @        |   ((r)(s)(t))   |  r and s and t. |   r or s or t.  |
o-----------------o-----------------o-----------------o-----------------o
|                 |                 |  r implies s.   |                 |
|                 |                 |                 |                 |
|        o r      |                 |  if r then s.   |                 |
|        |        |                 |                 |                 |
|        @ s      |      (r) s      |  not r, or s.   |                 |
o-----------------o-----------------o-----------------o-----------------o
|                 |                 |                 |   r implies s.  |
|        r   s    |                 |                 |                 |
|        o---o    |                 |                 |   if r then s.  |
|        |        |                 |                 |                 |
|        @        |     (r (s))     |                 |   no r sans s.  |
o-----------------o-----------------o-----------------o-----------------o
|                 |                 |                 |                 |
|      r   s      |                 |                 |                 |
|      o---o      |                 |                 |r exclusive-or s.|
|       \ /       |                 |                 |                 |
|        @        |     (r , s)     |                 |r not equal to s.|
o-----------------o-----------------o-----------------o-----------------o
|                 |                 |                 |                 |
|      r   s      |                 |                 |                 |
|      o---o      |                 |                 |                 |
|       \ /       |                 |                 |                 |
|        o        |                 |                 |r if & only if s.|
|        |        |                 |                 |                 |
|        @        |    ((r , s))    |                 |r equates with s.|
o-----------------o-----------------o-----------------o-----------------o
|                 |                 |                 |                 |
|     r  s  t     |                 |                 |                 |
|     o--o--o     |                 |                 |                 |
|      \   /      |                 |                 |                 |
|       \ /       |                 |                 | just one false  |
|        @        |   (r , s , t)   |                 | out of r, s, t. |
o-----------------o-----------------o-----------------o-----------------o
|                 |                 |                 |                 |
|     r  s  t     |                 |                 |                 |
|     o  o  o     |                 |                 |                 |
|     |  |  |     |                 |                 |                 |
|     o--o--o     |                 |                 |                 |
|      \   /      |                 |                 |                 |
|       \ /       |                 |                 |  just one true  |
|        @        |  ((r),(s),(t))  |                 |  among r, s, t. |
o-----------------o-----------------o-----------------o-----------------o
|                 |                 |                 |  genus t over   |
|        r  s     |                 |                 |  species r, s.  |
|        o  o     |                 |                 |                 |
|     t  |  |     |                 |                 |  partition t    |
|     o--o--o     |                 |                 |  among r & s.   |
|      \   /      |                 |                 |                 |
|       \ /       |                 |                 |  whole pie t:   |
|        @        |  ( t ,(r),(s))  |                 |  slices r, s.   |
o-----------------o-----------------o-----------------o-----------------o

Zeroth Order Logic

Here is a scaled-down version of one of my very first applications,
having to do with the demographic variables in a survey data base.

This Example illustrates the use of 2-variate logical forms
for expressing and reasoning about the logical constraints
that are involved in the following types of situations:

1.  Distinction:     A =/= B
    Also known as:   logical inequality, exclusive disjunction
    Represented as:  ( A , B )
    Graphed as:
    |
    |   A   B
    |   o---o
    |    \ /
    |     @

2.  Equality:        A = B
    Also known as:   logical equivalence, if and only if, A <=> B
    Represented as:  (( A , B ))
    Graphed as:
    |
    |   A   B
    |   o---o
    |    \ /
    |     o
    |     |
    |     @

3.  Implication:     A => B
    Also known as:   entailment, if-then
    Represented as:  ( A ( B ))
    Graphed as:
    |
    |   A   B
    |   o---o
    |   |
    |   @

Example of a proposition expressing a "zeroth order theory" (ZOT):

Consider the following text, written in what I am calling "Ref Log",
also known as the "Cactus Language" syntax for propositional logic:

|   ( male  , female )
|   (( boy  , male child ))
|   (( girl , female child ))
|   ( child ( human ))

Graphed as:

|                   boy   male     girl   female
|                     o---o child     o---o child
|  male   female       \ /             \ /          child   human
|     o---o             o               o               o---o
|      \ /              |               |               |
|       @               @               @               @

Nota Bene.  Due to graphic constraints -- no, the other
kind of graphic constraints -- of the immediate medium,
I am forced to string out the logical conjuncts of the
actual cactus graph for this situation, one that might
sufficiently be reasoned out from the exhibit supra by
fusing together the four roots of the severed cactus.

Either of these expressions, text or graph, is equivalent to
what would otherwise be written in a more ordinary syntax as:

|  male  =/=  female
|  boy   <=>  male child
|  girl  <=>  female child
|  child  =>  human

This is actually a single proposition, a conjunction of four lines:
one distinction, two equations, and one implication.  Together these
amount to a set of definitions conjointly constraining the logical
compatibility of the six feature names that appear.  They may be
thought of as sculpting out a space of models that is some subset
of the 2^6 = 64 possible interpretations, and thereby shaping some
universe of discourse.
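
Just to make that count concrete, here is a minimal brute-force sketch
in Python (my own illustrative code, no part of Theme One) that runs
through all 2^6 = 64 interpretations of the six features and keeps the
ones satisfying the four definitions.  It turns up 6 complete truth
assignments, which collapse into the 4 model paths reported by the
Study functions further below, once one notes that two of those paths
leave the choice of "human" open.

from itertools import product

FEATURES = ("boy", "child", "female", "girl", "human", "male")

def satisfies(boy, child, female, girl, human, male):
    # ( male , female )         :  male is not equal to female
    # (( boy , male child ))    :  boy if and only if male and child
    # (( girl , female child )) :  girl if and only if female and child
    # ( child ( human ))        :  if child then human
    return ((male != female)
            and (boy == (male and child))
            and (girl == (female and child))
            and ((not child) or human))

models = [bits for bits in product((False, True), repeat=6)
          if satisfies(*bits)]

print(len(models), "satisfying assignments out of 64")
for bits in models:
    print(" ".join(f if b else "(" + f + ")"
                   for f, b in zip(FEATURES, bits)))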

Once this backdrop is defined, it is possible to "query" this universe,
simply by conjoining additional propositions in further constraint of
the underlying set of models.  This has many uses, as we shall see.

We are considering an Example of a propositional expression
that is formed on the following "alphabet" or "lexicon" of
six "logical features" or "boolean variables":

$A$  =  {"boy", "child", "female", "girl", "human", "male"}.

The expression is this:

|   ( male  , female )
|   (( boy  , male child ))
|   (( girl , female child ))
|   ( child ( human ))

Putting it very roughly -- and putting off a better description
of it till later -- we may think of this expression as notation
for a boolean function f : %B%^6 -> %B%.  This is what we might
call the "abstract type" of the function, but we will also find
it convenient on many occasions to represent the points of this
particular copy of the space %B%^6 in terms of the positive and
negative versions of the features from $A$ that serve to encase
them as logical "cells", as they are called in the venn diagram
picture of the corresponding universe of discourse X = [$A$].

Just for concreteness, this form of representation begins and ends:

<0,0,0,0,0,0>  =  (boy)(child)(female)(girl)(human)(male),
<0,0,0,0,0,1>  =  (boy)(child)(female)(girl)(human) male ,
<0,0,0,0,1,0>  =  (boy)(child)(female)(girl) human (male),
<0,0,0,0,1,1>  =  (boy)(child)(female)(girl) human  male ,
...
<1,1,1,1,0,0>  =   boy  child  female  girl (human)(male),
<1,1,1,1,0,1>  =   boy  child  female  girl (human) male ,
<1,1,1,1,1,0>  =   boy  child  female  girl  human (male),
<1,1,1,1,1,1>  =   boy  child  female  girl  human  male .

I continue with the previous Example, that I bring forward and sum up here:

|                  boy   male          girl   female
|                    o---o child          o---o child
|  male   female      \ /                  \ /               child  human
|     o---o            o                    o                    o--o
|      \ /             |                    |                    |
|       @              @                    @                    @
|
| (male , female)((boy , male child))((girl , female child))(child (human))

For my master's piece in Quantitative Psychology (Michigan State, 1989),
I wrote a program, "Theme One" (TO) by name, that among its other duties
operates to process the expressions of the cactus language in many of the
most pressing ways that we need in order to be able to use it effectively
as a propositional calculus.  The operational component of TO where one
does the work of this logical modeling is called "Study", and the core
of the logical calculator deep in the heart of this Study section is
a suite of computational functions that evolve a particular species
of "normal form", analogous to a "disjunctive normal form" (DNF),
from whatever expression they are given as their input.

This "canonical", "normal", or "stable" form of logical expression --
I'll refine the distinctions among these subforms all in good time --
permits succinct depiction as an "arboreal boolean expansion" (ABE).

Once again, the graphic limitations of this space prevail against
any disposition that I might have to lay out a really substantial
case before you, of the brand that might have a chance to impress
you with the aptitude of this ilk of ABE in rooting out the truth
of many a complexly obscurely subtly adamant whetstone of our wit.

So let me just illustrate the way of it with one conjunct of our Example.
What follows will be a sequence of expressions, each one after the first
being logically equal to the one that precedes it:

Step 1

|    g    fc
|     o---o
|      \ /
|       o
|       |
|       @

Step 2

|                   o
|         fc        |   fc
|     o---o         o---o
|      \ /           \ /
|       o             o
|       |             |
|     g o-------------o--o g
|        \           /
|         \         /
|          \       /
|           \     /
|            \   /
|             \ /
|              @

Step 3

|      f c
|       o
|       |            f c
|       o             o
|       |             |
|     g o-------------o--o g
|        \           /
|         \         /
|          \       /
|           \     /
|            \   /
|             \ /
|              @

Step 4

|           o
|           |
|     c o   o c           o
|       |   |             |
|       o   o       c o   o c
|       |   |         |   |
|     f o---o--o f  f o---o--o f
|        \ /           \ /
|       g o-------------o--o g
|          \           /
|           \         /
|            \       /
|             \     /
|              \   /
|               \ /
|                @

Step 5

|           o       c o
|       c   |         |
|     f o---o--o f  f o---o--o f
|        \ /           \ /
|       g o-------------o--o g
|          \           /
|           \         /
|            \       /
|             \     /
|              \   /
|               \ /
|                @

Step 6

|                                       o
|                                       |
|           o                       o   o
|           |                       |   |
|     c o---o--o c      o         c o---o--o c
|        \ /            |            \ /
|       f o-------------o--o f      f o-------------o--o f
|          \           /               \           /
|           \         /                 \         /
|            \       /                   \       /
|             \     /                     \     /
|              \   /                       \   /
|               \ /                         \ /
|              g o---------------------------o--o g
|                 \                         /
|                  \                       /
|                   \                     /
|                    \                   /
|                     \                 /
|                      \               /
|                       \             /
|                        \           /
|                         \         /
|                          \       /
|                           \     /
|                            \   /
|                             \ /
|                              @

Step 7

|           o                       o
|           |                       |
|     c o---o--o c      o         c o---o--o c
|        \ /            |            \ /
|       f o-------------o--o f      f o-------------o--o f
|          \           /               \           /
|           \         /                 \         /
|            \       /                   \       /
|             \     /                     \     /
|              \   /                       \   /
|               \ /                         \ /
|              g o---------------------------o--o g
|                 \                         /
|                  \                       /
|                   \                     /
|                    \                   /
|                     \                 /
|                      \               /
|                       \             /
|                        \           /
|                         \         /
|                          \       /
|                           \     /
|                            \   /
|                             \ /
|                              @

This last expression is the ABE of the input expression.
It can be transcribed into ordinary logical language as:

| either girl and
|        either female and
|               either child and true
|               or not child and false
|        or not female and false
| or not girl and
|        either female and
|               either child and false
|               or not child and true
|        or not female and true

The expression "((girl , female child))" is sufficiently evaluated
by considering its logical values on the coordinate tuples of %B%^3,
or its indications on the cells of the associated venn diagram that
depicts the universe of discourse, namely, on these eight arguments:
      
<1, 1, 1>  =   girl  female  child ,
<1, 1, 0>  =   girl  female (child),
<1, 0, 1>  =   girl (female) child ,
<1, 0, 0>  =   girl (female)(child),
<0, 1, 1>  =  (girl) female  child ,
<0, 1, 0>  =  (girl) female (child),
<0, 0, 1>  =  (girl)(female) child ,
<0, 0, 0>  =  (girl)(female)(child).

The ABE output expression tells us the logical values of
the input expression on each of these arguments, doing so
by attaching the values to the leaves of a tree, and acting
as an "efficient" or "lazy" evaluator in the sense that the
process that generates the tree follows each path only up to
the point in the tree where it can determine the values on the
entire subtree beyond that point.  Thus, the ABE tree tells us:

 girl  female  child   -> 1
 girl  female (child)  -> 0
 girl (female) -> 0
(girl) female  child   -> 0
(girl) female (child)  -> 1
(girl)(female) -> 1

Picking out the interpretations that yield the truth of the expression,
and expanding the corresponding partial argument tuples, we arrive at
the following interpretations that satisfy the input expression:

 girl  female  child   -> 1
(girl) female (child)  -> 1
(girl)(female) child   -> 1
(girl)(female)(child)  -> 1

In sum, if it's a female and a child, then it's a girl,
and if it's either not a female or not a child or both,
then it's not a girl.
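
For those who would rather see the case analysis in code than in
pictures, here is a minimal sketch of the same lazy expansion, applied
to ((girl , female child)) read as girl <=> (female and child).  The
variable order and the early stopping mirror the tree above; the code
is an illustration only and assumes nothing about the actual Theme One
data structures.

ORDER = ("girl", "female", "child")

def value(a):
    # Partial evaluation of girl <=> (female and child).
    # Returns True or False once the partial assignment a settles it,
    # else None.
    girl = a.get("girl")
    if a.get("female") is False or a.get("child") is False:
        rhs = False
    elif a.get("female") is True and a.get("child") is True:
        rhs = True
    else:
        rhs = None
    if girl is None or rhs is None:
        return None
    return girl == rhs

def expand(a):
    v = value(a)
    if v is not None:
        path = " ".join(x if a[x] else "(" + x + ")"
                        for x in ORDER if x in a)
        print(path, "->", int(v))
        return
    x = next(x for x in ORDER if x not in a)   # next undecided variable
    expand({**a, x: True})
    expand({**a, x: False})

expand({})   # prints the same six lines as the ABE listing above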

Brief Automata

By way of providing a simple illustration of Cook's Theorem,
that "Propositional Satisfiability is NP-Complete", here is
an exposition of one way to translate Turing Machine set-ups
into propositional expressions, employing the Ref Log Syntax
for Prop Calc that I described in a couple of earlier notes:

Notation:

Stilt(k)  =  Space and Time Limited Turing Machine,
             with k units of space and k units of time.

Stunt(k)  =  Space and Time Limited Turing Machine,
             for computing the parity of a bit string,
             with Number of Tape cells of input equal to k.

I will follow the pattern of the discussion in the book of
Herbert Wilf, 'Algorithms & Complexity' (1986), pages 188-201,
but translate into Ref Log, which is more efficient with respect
to the number of propositional clauses that are required.

Parity Machine

|                    1/1/+1
|                   ------->
|               /\ /        \ /\
|      0/0/+1  ^  0          1  ^  0/0/+1
|               \/|\        /|\/
|                 | <------- |
|         #/#/-1  |  1/1/+1  |  #/#/-1
|                 |          |
|                 v          v
|                 #          *

o-------o--------o-------------o---------o------------o
| State | Symbol | Next Symbol | Ratchet | Next State |
|   Q   |   S    |     S'      |   dR    |     Q'     |
o-------o--------o-------------o---------o------------o
|   0   |   0    |     0       |   +1    |     0      |
|   0   |   1    |     1       |   +1    |     1      |
|   0   |   #    |     #       |   -1    |     #      |
|   1   |   0    |     0       |   +1    |     1      |
|   1   |   1    |     1       |   +1    |     0      |
|   1   |   #    |     #       |   -1    |     *      |
o-------o--------o-------------o---------o------------o
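
Read as a program, the table above is easy to run directly.  Here is
a minimal simulation sketch (hypothetical code, not the propositional
rendition developed below) that reproduces the two traces worked out
later on, confirming that Parity(0) = 0 and Parity(1) = 1.

# (state, symbol) -> (symbol written, head move, next state),
# transcribed from the table above.
DELTA = {
    ("0", "0"): ("0", +1, "0"),
    ("0", "1"): ("1", +1, "1"),
    ("0", "#"): ("#", -1, "#"),
    ("1", "0"): ("0", +1, "1"),
    ("1", "1"): ("1", +1, "0"),
    ("1", "#"): ("#", -1, "*"),
}

def run(bits):
    tape = ["#"] + list(bits) + ["#"]    # bof and eof marks on the ends
    head, state = 1, "0"                 # head starts on the first input cell
    while state not in ("#", "*"):       # "#" and "*" are the rest states
        write, move, state = DELTA[(state, tape[head])]
        tape[head] = write
        head += move
    return state, tape[head]             # rest state, symbol under the head

print(run("0"))   # ('#', '0')  :  Parity(0) = 0
print(run("1"))   # ('*', '1')  :  Parity(1) = 1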

The TM has a "finite automaton" (FA) as its component.
Let us refer to this particular FA by the name of "M".

The "tape-head" (that is, the "read-unit") will be called "H".
The "registers" are also called "tape-cells" or "tape-squares".

In order to consider how the finitely "stilted" rendition of this TM
can be translated into the form of a purely propositional description,
one now fixes k and limits the discussion to talking about a Stilt(k),
which is really not a true TM anymore but a finite automaton in disguise.

In this example, for the sake of a minimal illustration, we choose k = 2,
and discuss Stunt(2).  Since the zeroth tape cell and the last tape cell
are occupied with bof and eof marks "#", this amounts to only one digit
of significant computation.

To translate Stunt(2) into propositional form we use
the following collection of propositional variables:

For the "Present State Function" QF : P -> Q,

{p0_q#, p0_q*, p0_q0, p0_q1,
 p1_q#, p1_q*, p1_q0, p1_q1,
 p2_q#, p2_q*, p2_q0, p2_q1,
 p3_q#, p3_q*, p3_q0, p3_q1}

The propositional expression of the form "pi_qj" says:

| At the point-in-time p_i,
| the finite machine M is in the state q_j.

For the "Present Register Function" RF : P -> R,

{p0_r0, p0_r1, p0_r2, p0_r3,
 p1_r0, p1_r1, p1_r2, p1_r3,
 p2_r0, p2_r1, p2_r2, p2_r3,
 p3_r0, p3_r1, p3_r2, p3_r3}

The propositional expression of the form "pi_rj" says:

| At the point-in-time p_i,
| the tape-head H is on the tape-cell r_j.

For the "Present Symbol Function" SF : P -> (R -> S),

{p0_r0_s#, p0_r0_s*, p0_r0_s0, p0_r0_s1,
 p0_r1_s#, p0_r1_s*, p0_r1_s0, p0_r1_s1,
 p0_r2_s#, p0_r2_s*, p0_r2_s0, p0_r2_s1,
 p0_r3_s#, p0_r3_s*, p0_r3_s0, p0_r3_s1,
 p1_r0_s#, p1_r0_s*, p1_r0_s0, p1_r0_s1,
 p1_r1_s#, p1_r1_s*, p1_r1_s0, p1_r1_s1,
 p1_r2_s#, p1_r2_s*, p1_r2_s0, p1_r2_s1,
 p1_r3_s#, p1_r3_s*, p1_r3_s0, p1_r3_s1,
 p2_r0_s#, p2_r0_s*, p2_r0_s0, p2_r0_s1,
 p2_r1_s#, p2_r1_s*, p2_r1_s0, p2_r1_s1,
 p2_r2_s#, p2_r2_s*, p2_r2_s0, p2_r2_s1,
 p2_r3_s#, p2_r3_s*, p2_r3_s0, p2_r3_s1,
 p3_r0_s#, p3_r0_s*, p3_r0_s0, p3_r0_s1,
 p3_r1_s#, p3_r1_s*, p3_r1_s0, p3_r1_s1,
 p3_r2_s#, p3_r2_s*, p3_r2_s0, p3_r2_s1,
 p3_r3_s#, p3_r3_s*, p3_r3_s0, p3_r3_s1}

The propositional expression of the form "pi_rj_sk" says:

| At the point-in-time p_i,
| the tape-cell r_j bears the mark s_k.

o~~~~~~~~~o~~~~~~~~~o~~INPUTS~~o~~~~~~~~~o~~~~~~~~~o

Here are the Initial Conditions
for the two possible inputs to the
Ref Log redaction of this Parity TM:

o~~~~~~~~~o~~~~~~~~~o~INPUT~0~o~~~~~~~~~o~~~~~~~~~o

Initial Conditions:

p0_q0

p0_r1

p0_r0_s#
p0_r1_s0
p0_r2_s#

The Initial Conditions are given by a logical conjunction
that is composed of 5 basic expressions, altogether stating:

| At the point-in-time p_0, M is in the state q_0, and
| At the point-in-time p_0, H is on the cell  r_1, and
| At the point-in-time p_0, cell r_0 bears the mark "#", and
| At the point-in-time p_0, cell r_1 bears the mark "0", and
| At the point-in-time p_0, cell r_2 bears the mark "#".

o~~~~~~~~~o~~~~~~~~~o~INPUT~1~o~~~~~~~~~o~~~~~~~~~o

Initial Conditions:

p0_q0

p0_r1

p0_r0_s#
p0_r1_s1
p0_r2_s#

The Initial Conditions are given by a logical conjunction
that is composed of 5 basic expressions, altogether stating:

| At the point-in-time p_0, M is in the state q_0, and
| At the point-in-time p_0, H is on the cell  r_1, and
| At the point-in-time p_0, cell r_0 bears the mark "#", and
| At the point-in-time p_0, cell r_1 bears the mark "1", and
| At the point-in-time p_0, cell r_2 bears the mark "#".

o~~~~~~~~~o~~~~~~~~~o~PROGRAM~o~~~~~~~~~o~~~~~~~~~o

And here, yet again, just to store it nearby,
is the logical rendition of the TM's program:

Mediate Conditions:

( p0_q#  ( p1_q# ))
( p0_q*  ( p1_q* ))

( p1_q#  ( p2_q# ))
( p1_q*  ( p2_q* ))

Terminal Conditions:

(( p2_q# )( p2_q* ))

State Partition:

(( p0_q0 ),( p0_q1 ),( p0_q# ),( p0_q* ))
(( p1_q0 ),( p1_q1 ),( p1_q# ),( p1_q* ))
(( p2_q0 ),( p2_q1 ),( p2_q# ),( p2_q* ))

Register Partition:

(( p0_r0 ),( p0_r1 ),( p0_r2 ))
(( p1_r0 ),( p1_r1 ),( p1_r2 ))
(( p2_r0 ),( p2_r1 ),( p2_r2 ))

Symbol Partition:

(( p0_r0_s0 ),( p0_r0_s1 ),( p0_r0_s# ))
(( p0_r1_s0 ),( p0_r1_s1 ),( p0_r1_s# ))
(( p0_r2_s0 ),( p0_r2_s1 ),( p0_r2_s# ))

(( p1_r0_s0 ),( p1_r0_s1 ),( p1_r0_s# ))
(( p1_r1_s0 ),( p1_r1_s1 ),( p1_r1_s# ))
(( p1_r2_s0 ),( p1_r2_s1 ),( p1_r2_s# ))

(( p2_r0_s0 ),( p2_r0_s1 ),( p2_r0_s# ))
(( p2_r1_s0 ),( p2_r1_s1 ),( p2_r1_s# ))
(( p2_r2_s0 ),( p2_r2_s1 ),( p2_r2_s# ))

Interaction Conditions:

(( p0_r0 ) p0_r0_s0 ( p1_r0_s0 ))
(( p0_r0 ) p0_r0_s1 ( p1_r0_s1 ))
(( p0_r0 ) p0_r0_s# ( p1_r0_s# ))

(( p0_r1 ) p0_r1_s0 ( p1_r1_s0 ))
(( p0_r1 ) p0_r1_s1 ( p1_r1_s1 ))
(( p0_r1 ) p0_r1_s# ( p1_r1_s# ))

(( p0_r2 ) p0_r2_s0 ( p1_r2_s0 ))
(( p0_r2 ) p0_r2_s1 ( p1_r2_s1 ))
(( p0_r2 ) p0_r2_s# ( p1_r2_s# ))

(( p1_r0 ) p1_r0_s0 ( p2_r0_s0 ))
(( p1_r0 ) p1_r0_s1 ( p2_r0_s1 ))
(( p1_r0 ) p1_r0_s# ( p2_r0_s# ))

(( p1_r1 ) p1_r1_s0 ( p2_r1_s0 ))
(( p1_r1 ) p1_r1_s1 ( p2_r1_s1 ))
(( p1_r1 ) p1_r1_s# ( p2_r1_s# ))

(( p1_r2 ) p1_r2_s0 ( p2_r2_s0 ))
(( p1_r2 ) p1_r2_s1 ( p2_r2_s1 ))
(( p1_r2 ) p1_r2_s# ( p2_r2_s# ))

Transition Relations:

( p0_q0  p0_r1  p0_r1_s0  ( p1_q0  p1_r2  p1_r1_s0 ))
( p0_q0  p0_r1  p0_r1_s1  ( p1_q1  p1_r2  p1_r1_s1 ))
( p0_q0  p0_r1  p0_r1_s#  ( p1_q#  p1_r0  p1_r1_s# ))
( p0_q0  p0_r2  p0_r2_s#  ( p1_q#  p1_r1  p1_r2_s# ))

( p0_q1  p0_r1  p0_r1_s0  ( p1_q1  p1_r2  p1_r1_s0 ))
( p0_q1  p0_r1  p0_r1_s1  ( p1_q0  p1_r2  p1_r1_s1 ))
( p0_q1  p0_r1  p0_r1_s#  ( p1_q*  p1_r0  p1_r1_s# ))
( p0_q1  p0_r2  p0_r2_s#  ( p1_q*  p1_r1  p1_r2_s# ))

( p1_q0  p1_r1  p1_r1_s0  ( p2_q0  p2_r2  p2_r1_s0 ))
( p1_q0  p1_r1  p1_r1_s1  ( p2_q1  p2_r2  p2_r1_s1 ))
( p1_q0  p1_r1  p1_r1_s#  ( p2_q#  p2_r0  p2_r1_s# ))
( p1_q0  p1_r2  p1_r2_s#  ( p2_q#  p2_r1  p2_r2_s# ))

( p1_q1  p1_r1  p1_r1_s0  ( p2_q1  p2_r2  p2_r1_s0 ))
( p1_q1  p1_r1  p1_r1_s1  ( p2_q0  p2_r2  p2_r1_s1 ))
( p1_q1  p1_r1  p1_r1_s#  ( p2_q*  p2_r0  p2_r1_s# ))
( p1_q1  p1_r2  p1_r2_s#  ( p2_q*  p2_r1  p2_r2_s# ))
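
All of the partition and interaction clauses above follow a fixed
pattern in their indices, so they can be generated mechanically.  Here
is a minimal Python sketch (my own illustrative generator, not the one
behind Theme One) that prints the State, Register, and Symbol
Partitions and the Interaction Conditions for Stunt(2) in the same
form as they are listed above.

TIMES   = range(3)                  # points-in-time p_0, p_1, p_2
CELLS   = range(3)                  # tape-cells r_0, r_1, r_2
STATES  = ("0", "1", "#", "*")      # machine states q_0, q_1, q_#, q_*
SYMBOLS = ("0", "1", "#")           # tape marks s_0, s_1, s_#

def partition(terms):
    # "(( e_1 ),( e_2 ),( ... ),( e_k ))" : exactly one e_j is true.
    return "(( " + " ),( ".join(terms) + " ))"

for i in TIMES:                                         # State Partition
    print(partition(f"p{i}_q{q}" for q in STATES))

for i in TIMES:                                         # Register Partition
    print(partition(f"p{i}_r{j}" for j in CELLS))

for i in TIMES:                                         # Symbol Partition
    for j in CELLS:
        print(partition(f"p{i}_r{j}_s{k}" for k in SYMBOLS))

for i in (0, 1):                                        # Interaction Conditions
    for j in CELLS:
        for k in SYMBOLS:
            print(f"(( p{i}_r{j} ) p{i}_r{j}_s{k} ( p{i+1}_r{j}_s{k} ))")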

o~~~~~~~~~o~~~~~~~~~o~INTERPRETATION~o~~~~~~~~~o~~~~~~~~~o

Interpretation of the Propositional Program:

Mediate Conditions:

( p0_q#  ( p1_q# ))
( p0_q*  ( p1_q* ))

( p1_q#  ( p2_q# ))
( p1_q*  ( p2_q* ))

In Ref Log, an expression of the form "( X ( Y ))"
expresses an implication or an if-then proposition:
"Not X without Y",  "If X then Y",  "X => Y",  etc.

A text string expression of the form "( X ( Y ))"
parses to a graphical data-structure of the form:

    X   Y
    o---o
    |
    @

All together, these Mediate Conditions state:

| If at p_0  M is in state q_#, then at p_1  M is in state q_#, and
| If at p_0  M is in state q_*, then at p_1  M is in state q_*, and
| If at p_1  M is in state q_#, then at p_2  M is in state q_#, and
| If at p_1  M is in state q_*, then at p_2  M is in state q_*.

Terminal Conditions:

(( p2_q# )( p2_q* ))

In Ref Log, an expression of the form "(( X )( Y ))"
expresses a disjunction "X or Y" and it parses into:

    X   Y
    o   o
     \ /
      o
      |
      @

In effect, the Terminal Conditions state:

| At p_2,  M is in state q_#, or
| At p_2,  M is in state q_*.

State Partition:

(( p0_q0 ),( p0_q1 ),( p0_q# ),( p0_q* ))
(( p1_q0 ),( p1_q1 ),( p1_q# ),( p1_q* ))
(( p2_q0 ),( p2_q1 ),( p2_q# ),( p2_q* ))

In Ref Log, an expression of the form "(( e_1 ),( e_2 ),( ... ),( e_k ))"
expresses the fact that "exactly one of the e_j is true, for j = 1 to k".
Expressions of this form are called "universal partition" expressions, and
they parse into a type of graph called a "painted and rooted cactus" (PARC):

    e_1   e_2   ...   e_k
     o     o           o
     |     |           |
     o-----o--- ... ---o
      \               /
       \             /
        \           /
         \         /
          \       /
           \     /
            \   /
             \ /
              @

The State Partition expresses the conditions that:

| At each of the points-in-time p_i, for i = 0 to 2,
| M can be in exactly one state q_j, for j in the set {0, 1, #, *}.
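
Read as boolean operations, the three clause shapes used in this
program are easy to spell out.  Here is a small sketch of their
existential readings as ordinary Python functions, an illustration
only, not the Theme One parser.

def implies(x, y):
    # "( X ( Y ))"  :  if X then Y
    return (not x) or y

def either(x, y):
    # "(( X )( Y ))"  :  X or Y
    return x or y

def exactly_one(*es):
    # "(( e_1 ),( e_2 ),( ... ),( e_k ))"  :  exactly one e_j is true
    return sum(bool(e) for e in es) == 1

# For instance, the State Partition at p_0 says that M is in
# exactly one of the states q_0, q_1, q_#, q_* at that moment:
print(exactly_one(True, False, False, False))   # True
print(exactly_one(True, True, False, False))    # False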

Register Partition:

(( p0_r0 ),( p0_r1 ),( p0_r2 ))
(( p1_r0 ),( p1_r1 ),( p1_r2 ))
(( p2_r0 ),( p2_r1 ),( p2_r2 ))

The Register Partition expresses the conditions that:

| At each of the points-in-time p_i, for i = 0 to 2,
| H can be on exactly one cell  r_j, for j = 0 to 2.

Symbol Partition:

(( p0_r0_s0 ),( p0_r0_s1 ),( p0_r0_s# ))
(( p0_r1_s0 ),( p0_r1_s1 ),( p0_r1_s# ))
(( p0_r2_s0 ),( p0_r2_s1 ),( p0_r2_s# ))

(( p1_r0_s0 ),( p1_r0_s1 ),( p1_r0_s# ))
(( p1_r1_s0 ),( p1_r1_s1 ),( p1_r1_s# ))
(( p1_r2_s0 ),( p1_r2_s1 ),( p1_r2_s# ))

(( p2_r0_s0 ),( p2_r0_s1 ),( p2_r0_s# ))
(( p2_r1_s0 ),( p2_r1_s1 ),( p2_r1_s# ))
(( p2_r2_s0 ),( p2_r2_s1 ),( p2_r2_s# ))

The Symbol Partition expresses the conditions that:

| At each of the points-in-time p_i, for i in {0, 1, 2},
| in each of the tape-registers r_j, for j in {0, 1, 2}, 
| there can be exactly one sign s_k, for k in {0, 1, #}.

Interaction Conditions:

(( p0_r0 ) p0_r0_s0 ( p1_r0_s0 ))
(( p0_r0 ) p0_r0_s1 ( p1_r0_s1 ))
(( p0_r0 ) p0_r0_s# ( p1_r0_s# ))

(( p0_r1 ) p0_r1_s0 ( p1_r1_s0 ))
(( p0_r1 ) p0_r1_s1 ( p1_r1_s1 ))
(( p0_r1 ) p0_r1_s# ( p1_r1_s# ))

(( p0_r2 ) p0_r2_s0 ( p1_r2_s0 ))
(( p0_r2 ) p0_r2_s1 ( p1_r2_s1 ))
(( p0_r2 ) p0_r2_s# ( p1_r2_s# ))

(( p1_r0 ) p1_r0_s0 ( p2_r0_s0 ))
(( p1_r0 ) p1_r0_s1 ( p2_r0_s1 ))
(( p1_r0 ) p1_r0_s# ( p2_r0_s# ))

(( p1_r1 ) p1_r1_s0 ( p2_r1_s0 ))
(( p1_r1 ) p1_r1_s1 ( p2_r1_s1 ))
(( p1_r1 ) p1_r1_s# ( p2_r1_s# ))

(( p1_r2 ) p1_r2_s0 ( p2_r2_s0 ))
(( p1_r2 ) p1_r2_s1 ( p2_r2_s1 ))
(( p1_r2 ) p1_r2_s# ( p2_r2_s# ))

In briefest terms, the Interaction Conditions merely express
the circumstance that the sign in a tape-cell cannot change
between two points-in-time unless the tape-head is over the
cell in question at the initial one of those points-in-time.
All that we have to do is to see how they manage to say this.

In Ref Log, an expression of the following form:

"(( p<i>_r<j> ) p<i>_r<j>_s<k> ( p<i+1>_r<j>_s<k> ))",

and which parses as the graph:

       p<i>_r<j> o   o  p<i+1>_r<j>_s<k>
                  \ /
    p<i>_r<j>_s<k> o
                   |
                   @

can be read in the form of the following implication:

| If
| at the point-in-time p<i>, the tape-cell r<j> bears the mark s<k>,
| but it is not the case that
| at the point-in-time p<i>, the tape-head is on the tape-cell r<j>,
| then
| at the point-in-time p<i+1>, the tape-cell r<j> bears the mark s<k>.

Folks among us of a certain age and a peculiar manner of acculturation will
recognize these as the "Frame Conditions" for the change of state of the TM.
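
As a quick check that the clause shape really does say this, here is
a small sketch (illustrative code only) verifying over all eight cases
that the existential reading of "(( A ) B ( C ))" agrees with the
implication "if B and not A, then C".

from itertools import product

def frame_clause(head_on_cell, mark_now, mark_next):
    # "(( p<i>_r<j> ) p<i>_r<j>_s<k> ( p<i+1>_r<j>_s<k> ))" read
    # existentially:  it is false that the head is off the cell, the
    # mark is there now, and the mark is gone at the next point-in-time.
    return not ((not head_on_cell) and mark_now and (not mark_next))

def frame_implication(head_on_cell, mark_now, mark_next):
    # If the cell bears the mark but the head is not on the cell,
    # then the cell still bears the mark at the next point-in-time.
    return mark_next if (mark_now and not head_on_cell) else True

assert all(frame_clause(*v) == frame_implication(*v)
           for v in product((False, True), repeat=3))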

Transition Relations:

( p0_q0  p0_r1  p0_r1_s0  ( p1_q0  p1_r2  p1_r1_s0 ))
( p0_q0  p0_r1  p0_r1_s1  ( p1_q1  p1_r2  p1_r1_s1 ))
( p0_q0  p0_r1  p0_r1_s#  ( p1_q#  p1_r0  p1_r1_s# ))
( p0_q0  p0_r2  p0_r2_s#  ( p1_q#  p1_r1  p1_r2_s# ))

( p0_q1  p0_r1  p0_r1_s0  ( p1_q1  p1_r2  p1_r1_s0 ))
( p0_q1  p0_r1  p0_r1_s1  ( p1_q0  p1_r2  p1_r1_s1 ))
( p0_q1  p0_r1  p0_r1_s#  ( p1_q*  p1_r0  p1_r1_s# ))
( p0_q1  p0_r2  p0_r2_s#  ( p1_q*  p1_r1  p1_r2_s# ))

( p1_q0  p1_r1  p1_r1_s0  ( p2_q0  p2_r2  p2_r1_s0 ))
( p1_q0  p1_r1  p1_r1_s1  ( p2_q1  p2_r2  p2_r1_s1 ))
( p1_q0  p1_r1  p1_r1_s#  ( p2_q#  p2_r0  p2_r1_s# ))
( p1_q0  p1_r2  p1_r2_s#  ( p2_q#  p2_r1  p2_r2_s# ))

( p1_q1  p1_r1  p1_r1_s0  ( p2_q1  p2_r2  p2_r1_s0 ))
( p1_q1  p1_r1  p1_r1_s1  ( p2_q0  p2_r2  p2_r1_s1 ))
( p1_q1  p1_r1  p1_r1_s#  ( p2_q*  p2_r0  p2_r1_s# ))
( p1_q1  p1_r2  p1_r2_s#  ( p2_q*  p2_r1  p2_r2_s# ))

The Transition Conditions merely serve to express,
by means of 16 complex implication expressions,
the data of the TM table that was given above.

o~~~~~~~~~o~~~~~~~~~o~~OUTPUTS~~o~~~~~~~~~o~~~~~~~~~o

And here are the outputs of the computation,
as emulated by its propositional rendition,
and as actually generated within that form
of transmogrification by the program that
I wrote for finding all of the satisfying
interpretations (truth-value assignments)
of propositional expressions in Ref Log:

o~~~~~~~~~o~~~~~~~~~o~OUTPUT~0~o~~~~~~~~~o~~~~~~~~~o

Output Conditions:

 p0_q0
  p0_r1
   p0_r0_s#
    p0_r1_s0
     p0_r2_s#
      p1_q0
       p1_r2
        p1_r2_s#
         p1_r0_s#
          p1_r1_s0
           p2_q#
            p2_r1
             p2_r0_s#
              p2_r1_s0
               p2_r2_s#

The Output Conditions amount to the sole satisfying interpretation,
that is, a "sequence of truth-value assignments" (SOTVA) that make
the entire proposition come out true, and they state the following:

| At the point-in-time p_0, M is in the state q_0,       and
| At the point-in-time p_0, H is on the cell  r_1,       and
| At the point-in-time p_0, cell r_0 bears the mark "#", and
| At the point-in-time p_0, cell r_1 bears the mark "0", and
| At the point-in-time p_0, cell r_2 bears the mark "#", and
|
| At the point-in-time p_1, M is in the state q_0,       and 
| At the point-in-time p_1, H is on the cell  r_2,       and
| At the point-in-time p_1, cell r_0 bears the mark "#", and
| At the point-in-time p_1, cell r_1 bears the mark "0", and
| At the point-in-time p_1, cell r_2 bears the mark "#", and
|
| At the point-in-time p_2, M is in the state q_#,       and 
| At the point-in-time p_2, H is on the cell  r_1,       and
| At the point-in-time p_2, cell r_0 bears the mark "#", and
| At the point-in-time p_2, cell r_1 bears the mark "0", and
| At the point-in-time p_2, cell r_2 bears the mark "#".

In brief, the output for our sake being the symbol that rests
under the tape-head H when the machine M gets to a rest state,
we are now amazed by the remarkable result that Parity(0) = 0.

o~~~~~~~~~o~~~~~~~~~o~OUTPUT~1~o~~~~~~~~~o~~~~~~~~~o

Output Conditions:

 p0_q0
  p0_r1
   p0_r0_s#
    p0_r1_s1
     p0_r2_s#
      p1_q1
       p1_r2
        p1_r2_s#
         p1_r0_s#
          p1_r1_s1
           p2_q*
            p2_r1
             p2_r0_s#
              p2_r1_s1
               p2_r2_s#

The Output Conditions amount to the sole satisfying interpretation,
that is, a "sequence of truth-value assignments" (SOTVA) that make
the entire proposition come out true, and they state the following:

| At the point-in-time p_0, M is in the state q_0,       and
| At the point-in-time p_0, H is on the cell  r_1,       and
| At the point-in-time p_0, cell r_0 bears the mark "#", and
| At the point-in-time p_0, cell r_1 bears the mark "1", and
| At the point-in-time p_0, cell r_2 bears the mark "#", and
|
| At the point-in-time p_1, M is in the state q_1,       and 
| At the point-in-time p_1, H is on the cell  r_2,       and
| At the point-in-time p_1, cell r_0 bears the mark "#", and
| At the point-in-time p_1, cell r_1 bears the mark "1", and
| At the point-in-time p_1, cell r_2 bears the mark "#", and
|
| At the point-in-time p_2, M is in the state q_*,       and 
| At the point-in-time p_2, H is on the cell  r_1,       and
| At the point-in-time p_2, cell r_0 bears the mark "#", and
| At the point-in-time p_2, cell r_1 bears the mark "1", and
| At the point-in-time p_2, cell r_2 bears the mark "#".

In brief, the output for our sake being the symbol that rests
under the tape-head H when the machine M gets to a rest state,
we are now amazed by the remarkable result that Parity(1) = 1.

I realized after sending that last bunch of bits that there is room
for confusion about what is the input/output of the Study module of
the Theme One program as opposed to what is the input/output of the
"finitely approximated turing automaton" (FATA).  So here is better
delineation of what's what.  The input to Study is a text file that
is known as LogFile(Whatever) and the output of Study is a sequence
of text files that summarize the various canonical and normal forms
that it generates.  For short, let us call these NormFile(Whatelse).
With that in mind, here are the actual IO's of Study, excluding the
glosses in square brackets:

o~~~~~~~~~o~~~~~~~~~o~~INPUT~~o~~~~~~~~~o~~~~~~~~~o

[Input To Study = FATA Initial Conditions + FATA Program Conditions]

[FATA Initial Conditions For Input 0]

p0_q0

p0_r1

p0_r0_s#
p0_r1_s0
p0_r2_s#

[FATA Program Conditions For Parity Machine]

[Mediate Conditions]

( p0_q#  ( p1_q# ))
( p0_q*  ( p1_q* ))

( p1_q#  ( p2_q# ))
( p1_q*  ( p2_q* ))

[Terminal Conditions]

(( p2_q# )( p2_q* ))

[State Partition]

(( p0_q0 ),( p0_q1 ),( p0_q# ),( p0_q* ))
(( p1_q0 ),( p1_q1 ),( p1_q# ),( p1_q* ))
(( p2_q0 ),( p2_q1 ),( p2_q# ),( p2_q* ))

[Register Partition]

(( p0_r0 ),( p0_r1 ),( p0_r2 ))
(( p1_r0 ),( p1_r1 ),( p1_r2 ))
(( p2_r0 ),( p2_r1 ),( p2_r2 ))

[Symbol Partition]

(( p0_r0_s0 ),( p0_r0_s1 ),( p0_r0_s# ))
(( p0_r1_s0 ),( p0_r1_s1 ),( p0_r1_s# ))
(( p0_r2_s0 ),( p0_r2_s1 ),( p0_r2_s# ))

(( p1_r0_s0 ),( p1_r0_s1 ),( p1_r0_s# ))
(( p1_r1_s0 ),( p1_r1_s1 ),( p1_r1_s# ))
(( p1_r2_s0 ),( p1_r2_s1 ),( p1_r2_s# ))

(( p2_r0_s0 ),( p2_r0_s1 ),( p2_r0_s# ))
(( p2_r1_s0 ),( p2_r1_s1 ),( p2_r1_s# ))
(( p2_r2_s0 ),( p2_r2_s1 ),( p2_r2_s# ))

[Interaction Conditions]

(( p0_r0 ) p0_r0_s0 ( p1_r0_s0 ))
(( p0_r0 ) p0_r0_s1 ( p1_r0_s1 ))
(( p0_r0 ) p0_r0_s# ( p1_r0_s# ))

(( p0_r1 ) p0_r1_s0 ( p1_r1_s0 ))
(( p0_r1 ) p0_r1_s1 ( p1_r1_s1 ))
(( p0_r1 ) p0_r1_s# ( p1_r1_s# ))

(( p0_r2 ) p0_r2_s0 ( p1_r2_s0 ))
(( p0_r2 ) p0_r2_s1 ( p1_r2_s1 ))
(( p0_r2 ) p0_r2_s# ( p1_r2_s# ))

(( p1_r0 ) p1_r0_s0 ( p2_r0_s0 ))
(( p1_r0 ) p1_r0_s1 ( p2_r0_s1 ))
(( p1_r0 ) p1_r0_s# ( p2_r0_s# ))

(( p1_r1 ) p1_r1_s0 ( p2_r1_s0 ))
(( p1_r1 ) p1_r1_s1 ( p2_r1_s1 ))
(( p1_r1 ) p1_r1_s# ( p2_r1_s# ))

(( p1_r2 ) p1_r2_s0 ( p2_r2_s0 ))
(( p1_r2 ) p1_r2_s1 ( p2_r2_s1 ))
(( p1_r2 ) p1_r2_s# ( p2_r2_s# ))

[Transition Relations]

( p0_q0  p0_r1  p0_r1_s0  ( p1_q0  p1_r2  p1_r1_s0 ))
( p0_q0  p0_r1  p0_r1_s1  ( p1_q1  p1_r2  p1_r1_s1 ))
( p0_q0  p0_r1  p0_r1_s#  ( p1_q#  p1_r0  p1_r1_s# ))
( p0_q0  p0_r2  p0_r2_s#  ( p1_q#  p1_r1  p1_r2_s# ))

( p0_q1  p0_r1  p0_r1_s0  ( p1_q1  p1_r2  p1_r1_s0 ))
( p0_q1  p0_r1  p0_r1_s1  ( p1_q0  p1_r2  p1_r1_s1 ))
( p0_q1  p0_r1  p0_r1_s#  ( p1_q*  p1_r0  p1_r1_s# ))
( p0_q1  p0_r2  p0_r2_s#  ( p1_q*  p1_r1  p1_r2_s# ))

( p1_q0  p1_r1  p1_r1_s0  ( p2_q0  p2_r2  p2_r1_s0 ))
( p1_q0  p1_r1  p1_r1_s1  ( p2_q1  p2_r2  p2_r1_s1 ))
( p1_q0  p1_r1  p1_r1_s#  ( p2_q#  p2_r0  p2_r1_s# ))
( p1_q0  p1_r2  p1_r2_s#  ( p2_q#  p2_r1  p2_r2_s# ))

( p1_q1  p1_r1  p1_r1_s0  ( p2_q1  p2_r2  p2_r1_s0 ))
( p1_q1  p1_r1  p1_r1_s1  ( p2_q0  p2_r2  p2_r1_s1 ))
( p1_q1  p1_r1  p1_r1_s#  ( p2_q*  p2_r0  p2_r1_s# ))
( p1_q1  p1_r2  p1_r2_s#  ( p2_q*  p2_r1  p2_r2_s# ))

o~~~~~~~~~o~~~~~~~~~o~~OUTPUT~~o~~~~~~~~~o~~~~~~~~~o

[Output Of Study = FATA Output For Input 0]

 p0_q0
  p0_r1
   p0_r0_s#
    p0_r1_s0
     p0_r2_s#
      p1_q0
       p1_r2
        p1_r2_s#
         p1_r0_s#
          p1_r1_s0
           p2_q#
            p2_r1
             p2_r0_s#
              p2_r1_s0
               p2_r2_s#

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Turing automata, finitely approximated or not, make my head spin and
my tape go loopy, and I still believe 'twere a far better thing I do
if I work up to that level of complexity in a more gracile graduated
manner.  So let us return to our Example in this gradual progress to
that vastly more well-guarded grail of our long-term pilgrim's quest:

|                  boy   male          girl   female
|                    o---o child          o---o child
|  male   female      \ /                  \ /               child  human
|     o---o            o                    o                    o--o
|      \ /             |                    |                    |
|       @              @                    @                    @
|
| (male , female)((boy , male child))((girl , female child))(child (human))

One section of the Theme One program has a suite of utilities that fall
under the title "Theme One Study" ("To Study", or just "TOS" for short).
To Study is to read and parse a so-called (and generally so-suffixed)
"log" file, and then to conjoin what is called a "query", which is really
just an additional propositional expression that imposes a further logical
constraint on the input expression.

The Figure roughly sketches the conjuncts of the graph-theoretic
data structure that the parser would commit to memory on reading
the appropriate log file that contains the text along the bottom.

I will now explain the various sorts of things that the TOS utility
can do with the log file that describes the universe of discourse in
our present Example.

Theme One Study is built around a suite of four successive generators
of "normal forms" for propositional expressions, just to use that term
in a very approximate way.  The functions that compute these normal forms
are called "Model", "Tenor", "Canon", and "Sense", and so we may refer to
their text-style outputs as the "mod", "ten", "can", and "sen" files.

Though it could be any propositional expression on the same vocabulary
$A$ = {"boy", "child", "female", "girl", "human", "male"}, more usually
the query is a simple conjunction of one or more positive features that
we want to focus on or perhaps to filter out of the logical model space.
On our first run through this Example, we take the log file proposition
as it is, with no extra riders.

| Procedural Note.  TO Study Model displays a running tab of how much
| free memory space it has left.  On some of the harder problems that
| you may think of to give it, Model may run out of free memory and
| terminate, abnormally exiting Theme One.  Sometimes it helps to:
|
| 1.  Rephrase the problem in logically equivalent
|     but rhetorically more felicitous ways.
|
| 2.  Think of additional facts that are taken for granted but not
|     made explicit and that cannot be logically inferred by Model.

After Model has finished, it is ready to write out its mod file,
which you may choose to show on the screen or save to a named file.
Mod files are usually too long to see (or to care to see) all at once
on the screen, so it is very often best to save them for later replay.
In our Example the Model function yields a mod file that looks like so:

Model Output and
Mod File Example
o-------------------o
| male              |
|  female -         |  1
|  (female )        |
|   girl -          |  2
|   (girl )         |
|    child          |
|     boy           |
|      human *      |  3 *
|      (human ) -   |  4
|     (boy ) -      |  5
|    (child )       |
|     boy -         |  6
|     (boy ) *      |  7 *
| (male )           |
|  female           |
|   boy -           |  8
|   (boy )          |
|    child          |
|     girl          |
|      human *      |  9 *
|      (human ) -   | 10
|     (girl ) -     | 11
|    (child )       |
|     girl -        | 12
|     (girl ) *     | 13 *
|  (female ) -      | 14
o-------------------o

Counting the stars "*" that indicate true interpretations
and the bars "-" that indicate false interpretations of
the input formula, we can see that the Model function,
out of the 64 possible interpretations, has actually
gone through the work of making just 14 evaluations,
all in order to find the 4 models that are allowed
by the input definitions.

To be clear about what this output means, the starred paths
indicate all of the complete specifications of objects in the
universe of discourse, that is, all of the consistent feature
conjunctions of maximum length, as permitted by the definitions
that are given in the log file.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Let's take a little break from the Example in progress
and look at where we are and what we have been doing from
computational, logical, and semiotic perspectives.  Because,
after all, as is usually the case, we should not let our focus
and our fascination with this particular Example prevent us from
recognizing it, and all that we do with it, as just an Example of
much broader paradigms and predicaments and principles, not to say
but a glimmer of ultimately more concernful and fascinating objects.

I chart the progression that we have just passed through in this way:

|                    Parse
|      Sign A  o-------------->o  Sign 1
|             ^                |
|            /                 |
|           /                  |
|          /                   |
| Object  o                    |  Transform
|          ^                   |
|           \                  |
|            \                 |
|             \                v
|      Sign B  o<--------------o  Sign 2
|                    Verse
|
| Figure.  Computation As Sign Transformation

In the present case, the Object is an objective situation
or a state of affairs, in effect, a particular pattern of
feature concurrences occurring to us in that world through
which we find ourselves most frequently faring, willy-nilly,
and the Signs are different tokens and different types of
data structures that we somehow or other find it useful
to devise or to discover for the sake of representing
current objects to ourselves on a recurring basis.

But not all signs, not even signs of a single object, are alike
in every other respect that one might name, not even with respect
to their powers of relating, significantly, to that common object.

And that is what our whole business of computation busies itself about,
when it minds its business best, that is, transmuting signs into signs
in ways that augment their powers of relating significantly to objects.

We have seen how the Model function and the mod output format
indicate all of the complete specifications of objects in the
universe of discourse, that is, all of the consistent feature
conjunctions of maximal specificity that are permitted by the
constraints or the definitions that are given in the log file.

To help identify these specifications of particular cells in
the universe of discourse, the next function and output format,
called "Tenor", edits the mod file to give only the true paths,
in effect, the "positive models", that are by default what we
usually mean when we say "models", and not the "anti-models"
or the "negative models" that fail to satisfy the formula
in question.

In the present Example the Tenor function
generates a Ten file that looks like this:

Tenor Output and
Ten File Example
o-------------------o
| male              |
|  (female )        |
|   (girl )         |
|    child          |
|     boy           |
|      human *      | <1>
|    (child )       |
|     (boy ) *      | <2>
| (male )           |
|  female           |
|   (boy )          |
|    child          |
|     girl          |
|      human *      | <3>
|    (child )       |
|     (girl ) *     | <4>
o-------------------o

As I said, the Tenor function just abstracts a transcript of the models,
that is, the satisfying interpretations, that were already interspersed
throughout the complete Model output.  These specifications, or feature
conjunctions, with the positive and the negative features listed in the
order of their actual budding on the "arboreal boolean expansion" twigs,
may be gathered and arranged in this antherypulogical flowering bouquet:

1.   male   (female ) (girl )  child    boy     human   *
2.   male   (female ) (girl ) (child ) (boy  )          *
3.  (male )  female   (boy  )  child    girl    human   *
4.  (male )  female   (boy  ) (child ) (girl )          *

Notice that Model, as reflected in this abstract, did not consider
the six positive features in the same order along each path.  This
is because the algorithm was designed to proceed opportunistically
in its attempt to reduce the original proposition through a series
of case-analytic considerations and the resulting simplifications.

Notice, too, that Model is something of a lazy evaluator, quitting work
when and if a value is determined by less than the full set of variables.
This is the reason why paths <2> and <4> are not ostensibly of the maximum
length.  According to this lazy mode of understanding, any path that is not
specified on a set of features really stands for the whole bundle of paths
that are derived by freely varying those features.  Thus, specifications
<2> and <4> summarize four models altogether, with the logical choice
between "human" and "not human" being left open at the point where
they leave off their branches in the relevant deciduous tree.
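
The way an unspecified feature stands for a free choice can be spelled
out in a few lines of Python.  This is only an illustrative sketch
under my own naming, not the program's internals:  it expands a partial
specification into the bundle of total interpretations it summarizes.

from itertools import product

def expand(partial, all_features):
    """Expand a partial specification (a dict of the features it fixes)
    into every total interpretation obtained by freely varying the
    features it leaves unspecified."""
    free = [f for f in all_features if f not in partial]
    for values in product([False, True], repeat=len(free)):
        yield dict(partial, **dict(zip(free, values)))

# A path like <2>, which fixes 5 of the 6 features and leaves "human"
# open, thus stands for 2 total models; paths <2> and <4> together
# stand for 4.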

The last two functions in the Study section, "Canon" and "Sense",
extract further derivatives of the normal forms that are produced
by Model and Tenor.  Both of these functions take the set of model
paths and simply throw away the negative labels.  You may think of
these as the "rose colored glasses" or "job interview" normal forms,
in that they try to say everything that's true, so long as it can be
expressed in positive terms.  Generally, this would mean losing a lot
of information, and the result could no longer be expected to have the
property of remaining logically equivalent to the original proposition.

Fortunately, however, it seems that this type of positive projection of
the whole truth is just what is possible, most needed, and most clear in
many of the "natural" examples, that is, in examples that arise from the
domains of natural language and natural conceptual kinds.  In these cases,
where most of the logical features are redundantly coded, for example, in
the way that "adult" = "not child" and "child" = "not adult", the positive
feature bearing redacts are often sufficiently expressive all by themselves.

Canon merely censors its printing of the negative labels as it traverses the
model tree.  This leaves the positive labels in their original columns of the
outline form, giving it a slightly skewed appearance.  This can be misleading
unless you already know what you are looking for.  However, this Canon format
is computationally quick, and frequently suffices, especially if you already
have a likely clue about what to expect in the way of a question's outcome.

In the present Example the Canon function
generates a Can file that looks like this:

Canon Output and
Can File Example
o-------------------o
| male              |
|    child          |
|     boy           |
|      human        |
|  female           |
|    child          |
|     girl          |
|      human        |
o-------------------o

The Sense function does the extra work that is required
to place the positive labels of the model tree at their
proper level in the outline.
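
A rough Python paraphrase of the difference between the two formats is
given below.  The data layout is assumed, not taken from the program,
and the sketch ignores the way the real outlines merge shared prefixes
into a single tree.

def canon(paths):
    """Print only the positive labels, but leave each one in the column
    it occupied in the full model outline (the "skewed" Canon look).
    Each path is a list of (feature, value) pairs in branch order."""
    for path in paths:
        for depth, (feature, value) in enumerate(path):
            if value:
                print(" " * depth + feature)

def sense(paths):
    """Print only the positive labels, re-indented so that each one
    sits at its proper level among the positives."""
    for path in paths:
        level = 0
        for feature, value in path:
            if value:
                print(" " * level + feature)
                level += 1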

In the present Example the Sense function
generates a Sen file that looks like this:

Sense Output and
Sen File Example
o-------------------o
| male              |
|  child            |
|   boy             |
|    human          |
| female            |
|  child            |
|   girl            |
|    human          |
o-------------------o

The Canon and Sense outlines for this Example illustrate a certain
type of general circumstance that needs to be noted at this point.
Recall the model paths or the feature specifications that were
numbered <2> and <4> in the listing of the output for Tenor.
These paths, in effect, reflected Model's discovery that
the venn diagram cells for male or female non-children
and male or female non-humans were not excluded by
the definitions that were given in the Log file.
In the abstracts given by Canon and Sense, the
specifications <2> and <4> have been subsumed,
or absorbed unmarked, under the general topics
of their respective genders, male or female.
This happens because no purely positive
features were supplied to distinguish
the non-child and non-human cases.

That completes the discussion of
this six-dimensional Example.

Nota Bene, for possible future use.  In the larger current of work
with respect to which this meander of a conduit was initially both
diversionary and tributary, before those high and dry regensquirm
years when it turned into an intellectual interglacial oxbow lake,
I once had in mind a scape in which expressions in a definitional
lattice were ordered according to their simplicity on some scale
or another, and in this setting the word "sense" was actually an
acronym for "semantically equivalent next-simplest expression".

| If this is starting to sound a little bit familiar,
| it may be because the relationship between the two
| kinds of pictures of propositions, namely:
|
| 1.  Propositions about things in general, here,
|     about the times when certain facts are true,
|     having the form of functions f : X -> B,
|
| 2.  Propositions about binary codes, here, about
|     the bit-vector labels on venn diagram cells,
|     having the form of functions f' : B^k -> B,
|
| is an epically old story, one that I, myself,
| have related once or twice upon a time before,
| to wit, at least, at the following two cites:
|
| http://suo.ieee.org/email/msg01251.html
| http://suo.ieee.org/email/msg01293.html
|
| There, and now here, once more, and again, it may be observed
| that the relation is one whereby the proposition f : X -> B,
| the one about things and times and mores in general, factors
| into a coding function c : X -> B^k, followed by a derived
| proposition f' : B^k -> B that judges the resulting codes.
|
|                         f
|                   X o------>o B
|                      \     ^
|   c = <x_1, ..., x_k> \   / f'
|                        v /
|                         o
|                        B^k
|
| You may remember that this was supposed to illustrate
| the "factoring" of a proposition f : X -> B = {0, 1}
| into the composition f'(c(x)), where c : X -> B^k is
| the "coding" of each x in X as an k-bit string in B^k,
| and where f' is the mapping of codes into a co-domain
| that we interpret as t-f-values, B = {0, 1} = {F, T}.

In short, there is the standard equivocation ("systematic ambiguity"?) as to
whether we are talking about the "applied" and concretely typed proposition
f : X -> B or the "pure" and abstractly typed proposition f' : B^k -> B.
Or we can think of the latter object as the approximate code icon of
the former object. 
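
A toy rendering of this factorization in Python may help fix the idea.
The domain, the features, and the function names below are hypothetical,
chosen only to show the shape f = f'(c(x)), with c packing things into
bit vectors and f' judging the resulting codes.

X = ["socrates", "rover", "pebble"]          # a made-up domain of things

def c(x):
    """Coding function c : X -> B^2, packing each thing into a bit vector."""
    return (x in ("socrates", "rover"),      # bit 1: living_thing
            x == "socrates")                 # bit 2: human

def f_prime(bits):
    """Derived proposition f' : B^2 -> B, judging the codes."""
    living_thing, human = bits
    return living_thing and human

def f(x):
    """The concretely typed proposition f : X -> B factors as f'(c(x))."""
    return f_prime(c(x))

print([x for x in X if f(x)])                # ['socrates']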

Anyway, these types of formal objects are the sorts of things that
I take to be the denotational objects of propositional expressions.
These objects, along with their invarious and insundry mathematical
properties, are the orders of things that I am talking about when
I refer to the "invariant structures in these objects themselves".

"Invariant" means "invariant under a suitable set of transformations",
in this case the translations between various languages that preserve
the objects and the structures in question.  In extremest generality,
this is what universal constructions in category theory are all about.

In summation, the functions f : X -> B and f' : B^k -> B have invariant, formal,
mathematical, objective properties that any adequate language might eventually
evolve to express, only some languages express them more obscurely than others.

To be perfectly honest, I continue to be surprised that anybody in this group
has trouble with this.  There are perfectly apt and familiar examples in the
contrast between roman numerals and arabic numerals, or the contrast between
redundant syntaxes, like those that use the pentalphabet {~, &, v, =>, <=>},
and trimmer syntaxes, like those used in existential and conceptual graphs.
Every time somebody says "Let's take {~, &, v, =>, <=>} as an operational
basis for logic" it's just like that old joke that mathematicians tell on
engineers where the ingenue in question says "1 is a prime, 2 is a prime,
3 is a prime, 4 is a prime, ..." -- and I know you think that I'm being
hyperbolic, but I'm really only up to parabolas here ...

I have already refined my criticism so that it does not apply to
the spirit of FOL or KIF or whatever, but only to the letters of
specific syntactic proposals.  There is a fact of the matter as
to whether a concrete language provides a clean or a cluttered
basis for representing the identified set of formal objects.
And it shows up in pragmatic realities like the efficiency
of real time concept formation, concept use, learnability,
reasoning power, and just plain good use of real time.
These are the dire consequences that I learned in my
very first tries at mathematically oriented theorem
automation, and the only factor that has obscured
them in mainstream work since then is the speed
with which folks can now do all of the same
old dumb things that they used to do on
their way to kludging out the answers.

It seems to be darn near impossible to explain to the
centurion all of the neat stuff that he's missing by
sticking to his old roman numerals.  He just keeps
on reckoning that what he can't count must be of
no account at all.  There is way too much stuff
that these original syntaxes keep us from even
beginning to discuss, like differential logic,
just for starters.

Our next Example illustrates the use of the Cactus Language
for representing "absolute" and "relative" partitions, also
known as "complete" and "contingent" classifications of the
universe of discourse, all of which amounts to divvying it
up into mutually exclusive regions, exhaustive or not, as
one frequently needs in situations involving a genus and
its sundry species, and frequently pictures in the form
of a venn diagram that looks just like a "pie chart".

Example.  Partition, Genus & Species

The idea that one needs for expressing partitions
in cactus expressions can be summed up like this:

| If the propositional expression
|
| "( p , q , r , ... )"
|
| means that just one of
|
| p, q, r, ... is false,
|
| then the propositional expression
|
| "((p),(q),(r), ... )"
|
| must mean that just one of
|
| (p), (q), (r), ... is false,
|
| in other words, that just one of
|
| p, q, r, ... is true.

Thus we have an efficient means to express and to enforce
a partition of the space of models, in effect, to maintain
the condition that a number of features or propositions are
to be held in mutually exclusive and exhaustive disjunction.
This supplies a much needed bridge between the binary domain
of two values and any other domain with a finite number of
feature values.
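
For the record, the whole device rests on one connective, and it
reduces to a couple of lines of code.  The following sketch (my own
naming, not the calculator's) evaluates the boundary form
"( x_1 , ..., x_k )" and its negated-argument variant, the form
that enforces a partition.

def lobe(*args):
    """The boundary form "( x_1 , ..., x_k )":
    true if and only if exactly one argument is false."""
    return sum(1 for x in args if not x) == 1

def partition(*args):
    """The form "(( x_1 ),( x_2 ), ..., ( x_k ))":
    true if and only if exactly one argument is true."""
    return lobe(*(lobe(x) for x in args))    # lobe(x) alone is negation

print(partition(True, False, False))         # True:  just one region holds
print(partition(True, True,  False))         # False: two regions overlap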

Another variation on this theme allows one to maintain the
subsumption of many separate species under an explicit genus.
To see this, let us examine the following form of expression:

( q , ( q_1 ) , ( q_2 ) , ( q_3 ) ).

Now consider what it would mean for this to be true.  We see two cases:

1.  If the proposition q is true, then exactly one of the
    propositions (q_1), (q_2), (q_3) must be false, and so
    just one of the propositions q_1, q_2, q_3 must be true.

2.  If the proposition q is false, then every one of the
    propositions (q_1), (q_2), (q_3) must be true, and so
    each one of the propositions q_1, q_2, q_3 must be false.
    In short, if q is false then all of the other q's are also.
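
Those two cases can be checked by brute force.  Here is a short Python
sketch, using the same exactly-one-false reading of the boundary form
as before; the helper is re-declared so the fragment stands alone.

from itertools import product

def lobe(*args):
    """Boundary form: true iff exactly one argument is false."""
    return sum(1 for x in args if not x) == 1

def genus_species(q, q1, q2, q3):
    """The cactus form "( q , ( q_1 ) , ( q_2 ) , ( q_3 ) )"."""
    return lobe(q, not q1, not q2, not q3)

for q, q1, q2, q3 in product([False, True], repeat=4):
    if genus_species(q, q1, q2, q3):
        print(q, q1, q2, q3)

# The satisfying rows are exactly those where q is true and just one of
# q_1, q_2, q_3 is true, plus the single row where q and all the species
# are false, matching cases 1 and 2 above.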

Figures 1 and 2 illustrate this type of situation.

Figure 1 is the venn diagram of a 4-dimensional universe of discourse
X = [q, q_1, q_2, q_3], conventionally named after the gang of four
logical features that generate it.  Strictly speaking, X is made up
of two layers, the position space X of abstract type %B%^4, and the
proposition space X^ = (X -> %B%) of abstract type %B%^4 -> %B%,
but it is commonly lawful enough to sign the signature of both
spaces with the same X, and thus to give the power of attorney
for the propositions to the so-indicted position space thereof.

Figure 1 also makes use of the convention whereby the regions
or the subsets of the universe of discourse that correspond
to the basic features q, q_1, q_2, q_3 are labelled with
the parallel set of upper case letters Q, Q_1, Q_2, Q_3.

|                        o
|                       / \
|                      /   \
|                     /     \
|                    /       \
|                   o         o
|                  /%\       /%\
|                 /%%%\     /%%%\
|                /%%%%%\   /%%%%%\
|               /%%%%%%%\ /%%%%%%%\
|              o%%%%%%%%%o%%%%%%%%%o
|             / \%%%%%%%/ \%%%%%%%/ \
|            /   \%%%%%/   \%%%%%/   \
|           /     \%%%/     \%%%/     \
|          /       \%/       \%/       \
|         o         o         o         o
|        / \       /%\       / \       / \
|       /   \     /%%%\     /   \     /   \
|      /     \   /%%%%%\   /     \   /     \
|     /       \ /%%%%%%%\ /       \ /       \
|    o         o%%%%%%%%%o         o         o
|    ·\       / \%%%%%%%/ \       / \       /·
|    · \     /   \%%%%%/   \     /   \     / ·
|    ·  \   /     \%%%/     \   /     \   /  ·
|    ·   \ /       \%/       \ /       \ /   ·
|    ·    o         o         o         o    ·
|    ·    ·\       / \       / \       /·    ·
|    ·    · \     /   \     /   \     / ·    ·
|    ·    ·  \   /     \   /     \   /  ·    ·
|    · Q  ·   \ /       \ /       \ /   ·Q_3 ·
|    ··········o         o         o··········
|         ·     \       /%\       /     ·
|         ·      \     /%%%\     /      ·
|         ·       \   /%%%%%\   /       ·
|         · Q_1    \ /%%%%%%%\ /    Q_2 ·
|         ··········o%%%%%%%%%o··········
|                    \%%%%%%%/
|                     \%%%%%/
|                      \%%%/
|                       \%/
|                        o
|
| Figure 1.  Genus Q and Species Q_1, Q_2, Q_3

Figure 2 is another form of venn diagram that one often uses,
where one collapses the unindited cells and leaves only the
models of the proposition in question.  Some people would
call the transformation that changes from the first form
to the next form an operation of "taking the quotient",
but I tend to think of it as the "soap bubble picture"
or more exactly the "wire & thread & soap film" model
of the universe of discourse, where one pops out of
consideration the sections of the soap film that
stretch across the anti-model regions of space.

o-------------------------------------------------o
|                                                 |
|  X                                              |
|                                                 |
|                        o                        |
|                       / \                       |
|                      /   \                      |
|                     /     \                     |
|                    /       \                    |
|                   /         \                   |
|                  o    Q_1    o                  |
|                 / \         / \                 |
|                /   \       /   \                |
|               /     \     /     \               |
|              /       \   /       \              |
|             /         \ /         \             |
|            /           Q           \            |
|           /            |            \           |
|          /             |             \          |
|         /       Q_2    |    Q_3       \         |
|        /               |               \        |
|       /                |                \       |
|      o-----------------o-----------------o      |
|                                                 |
|                                                 |
|                                                 |
o-------------------------------------------------o

Figure 2.  Genus Q and Species Q_1, Q_2, Q_3

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Example.  Partition, Genus & Species (cont.)

Last time we considered in general terms how the forms
of complete partition and contingent partition operate
to maintain mutually disjoint and possibly exhaustive
categories of positions in a universe of discourse.

This time we contemplate another concrete Example of
near minimal complexity, designed to demonstrate how
the forms of partition and subsumption can interact
in structuring a space of feature specifications.

In this Example, we describe a universe of discourse
in terms of the following vocabulary of five features:

| L.  living_thing
| 
| N.  non_living
| 
| A.  animal
|
| V.  vegetable
|
| M.  mineral

Let us construe these features as being subject to four constraints:

| 1.  Everything is either a living_thing or non_living, but not both.
|
| 2.  Everything is either animal, vegetable, or mineral,
|     but no two of these together.
|
| 3.  A living_thing is either animal or vegetable, but not both,
|     and everything animal or vegetable is a living_thing.
|
| 4.  Everything mineral is non_living.

These notions and constructions are expressed in the Log file shown below:

Logical Input File
o-------------------------------------------------o
|                                                 |
|   ( living_thing , non_living )                 |
|                                                 |
|   (( animal ),( vegetable ),( mineral ))        |
|                                                 |
|   ( living_thing ,( animal ),( vegetable ))     |
|                                                 |
|   ( mineral ( non_living ))                     |
|                                                 |
o-------------------------------------------------o

The cactus expression in this file is the expression
of a "zeroth order theory" (ZOT), one that can be
paraphrased in more ordinary language to say:

Translation
o-------------------------------------------------o
|                                                 |
|   living_thing  =/=  non_living                 |
|                                                 |
|   par : all -> {animal, vegetable, mineral}     |
|                                                 |
|   par : living_thing -> {animal, vegetable}     |
|                                                 |
|   mineral => non_living                         |
|                                                 |
o-------------------------------------------------o

Here, "par : all -> {p, q, r}" is short for an assertion
that the universe as a whole is partitioned into subsets
that correspond to the features p, q, r.

Also, "par : q -> {r, s}" asserts that "Q partitions into R and S.

It is probably enough just to list the outputs of Model, Tenor, and Sense
when run on the preceding Log file.  Using the same format and labeling as
before, we may note that Model has, from 2^5 = 32 possible interpretations,
made 11 evaluations, and found 3 models answering the generic descriptions
that were imposed by the logical input file.
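
Readers who want to double-check the count can brute-force the same
theory directly.  The sketch below transcribes the four constraints
of the Log file into Python under the exactly-one-false reading of
the boundary form; the helper names are mine, and this is not the
algorithm that Model actually uses, since Model prunes by case
analysis rather than enumerating all 32 interpretations.

from itertools import product

def lobe(*args):
    """Boundary form: true iff exactly one argument is false."""
    return sum(1 for x in args if not x) == 1

features = ["living_thing", "non_living", "animal", "vegetable", "mineral"]

def theory(i):
    """The four constraints of the Log file, transcribed one for one."""
    L, N, A, V, M = (i[f] for f in features)
    return (lobe(L, N)                    # ( living_thing , non_living )
        and lobe(not A, not V, not M)     # (( animal ),( vegetable ),( mineral ))
        and lobe(L, not A, not V)         # ( living_thing ,( animal ),( vegetable ))
        and lobe(M and not N))            # ( mineral ( non_living ))

models = []
for bits in product([False, True], repeat=len(features)):
    interpretation = dict(zip(features, bits))
    if theory(interpretation):
        models.append(interpretation)

print(len(models))                        # 3, as Model reports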

Model Outline
o------------------------o
| living_thing           |
|  non_living -          |  1
|  (non_living )         |
|   mineral -            |  2
|   (mineral )           |
|    animal              |
|     vegetable -        |  3
|     (vegetable ) *     |  4 *
|    (animal )           |
|     vegetable *        |  5 *
|     (vegetable ) -     |  6
| (living_thing )        |
|  non_living            |
|   animal -             |  7
|   (animal )            |
|    vegetable -         |  8
|    (vegetable )        |
|     mineral *          |  9 *
|     (mineral ) -       | 10
|  (non_living ) -       | 11
o------------------------o

Tenor Outline
o------------------------o
| living_thing           |
|  (non_living )         |
|   (mineral )           |
|    animal              |
|     (vegetable ) *     | <1>
|    (animal )           |
|     vegetable *        | <2>
| (living_thing )        |
|  non_living            |
|   (animal )            |
|    (vegetable )        |
|     mineral *          | <3>
o------------------------o

Sense Outline
o------------------------o
| living_thing           |
|  animal                |
|  vegetable             |
| non_living             |
|  mineral               |
o------------------------o

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Example.  Molly's World

I think that we are finally ready to tackle a more respectable example.
The Example known as "Molly's World" is borrowed from the literature on
computational learning theory, adapted with a few changes from the example
called "Molly’s Problem" in the paper "Learning With Hints" by Dana Angluin.
By way of setting up the problem, I quote Angluin's motivational description:

| Imagine that you have become acquainted with an alien named Molly from the
| planet Ornot, who is currently employed in a day-care center.  She is quite
| good at propositional logic, but a bit weak on knowledge of Earth.  So you
| decide to formulate the beginnings of a propositional theory to help her
| label things in her immediate environment.
|
| Angluin, Dana, "Learning With Hints", pages 167-181, in:
| David Haussler & Leonard Pitt (eds.), 'Proceedings of the 1988 Workshop
| on Computational Learning Theory', Morgan Kaufmann, San Mateo, CA, 1989.

The purpose of this quaint pretext is, of course, to make sure that the
reader appreciates the constraints of the problem:  that no extra savvy
is fair, all facts must be presumed or deduced on the immediate premises.

My use of this example is not directly relevant to the purposes of the
discussion from which it is taken, so I simply give my version of it
without comment on those issues.

Here is my rendition of the initial knowledge base delimiting Molly's World:

Logical Input File:  Molly.Log
o---------------------------------------------------------------------o
|                                                                     |
| ( object ,( toy ),( vehicle ))                                      |
| (( small_size ),( medium_size ),( large_size ))                     |
| (( two_wheels ),( three_wheels ),( four_wheels ))                   |
| (( no_seat ),( one_seat ),( few_seats ),( many_seats ))             |
| ( object ,( scooter ),( bike ),( trike ),( car ),( bus ),( wagon )) |
| ( two_wheels    no_seat            ,( scooter ))                    |
| ( two_wheels    one_seat    pedals ,( bike ))                       |
| ( three_wheels  one_seat    pedals ,( trike ))                      |
| ( four_wheels   few_seats   doors  ,( car ))                        |
| ( four_wheels   many_seats  doors  ,( bus ))                        |
| ( four_wheels   no_seat     handle ,( wagon ))                      |
| ( scooter           ( toy  small_size ))                            |
| ( wagon             ( toy  small_size ))                            |
| ( trike             ( toy  small_size ))                            |
| ( bike  small_size  ( toy ))                                        |
| ( bike  medium_size ( vehicle ))                                    |
| ( bike  large_size  )                                               |
| ( car               ( vehicle  large_size ))                        |
| ( bus               ( vehicle  large_size ))                        |
| ( toy               ( object ))                                     |
| ( vehicle           ( object ))                                     |
|                                                                     |
o---------------------------------------------------------------------o

All of the logical forms that are used in the preceding Log file
will probably be familiar from earlier discussions.  The purpose
of one or two constructions may, however, be a little obscure,
so I will insert a few words of additional explanation here:

The rule "( bike large_size )", for example, merely
says that nothing can be both a bike and large_size.

The rule "( three_wheels one_seat pedals ,( trike ))" says that anything
with all the features of three_wheels, one_seat, and pedals is excluded
from being anything but a trike.  In short, anything with just those
three features is equivalent to a trike.

Recall that the form "( p , q )" may be interpreted to assert either
the exclusive disjunction or the logical inequivalence of p and q.

The rules have been stated in this particular way simply
to imitate the style of rules in the reference example.

This last point does bring up an important issue, the question
of "rhetorical" differences in expression and their potential
impact on the "pragmatics" of computation.  Unfortunately,
I will have to abbreviate my discussion of this topic for
now, and only mention in passing the following facts.

Logically equivalent expressions, even though they must lead
to logically equivalent normal forms, may have very different
characteristics when it comes to the efficiency of processing.

For instance, consider the following four forms:

| 1.  (( p , q ))
|
| 2.  ( p ,( q ))
|
| 3.  (( p ), q )
|
| 4.  ((( p ),( q )))

All of these are equally succinct ways of maintaining that
p is logically equivalent to q, yet each can have different
effects on the route that Model takes to arrive at an answer.
Apparently, some equalities are more equal than others.
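
It is easy to confirm that the four variants all express the same
function even though they parse differently.  The sketch below checks
this by brute force, again under the exactly-one-false reading of the
boundary form, with the fourth form read as "((( p ),( q )))", that is,
as a variant distinct from the first.  It says nothing, of course,
about the differing routes that Model takes through them.

from itertools import product

def lobe(*args):
    """Boundary form: true iff exactly one argument is false."""
    return sum(1 for x in args if not x) == 1

forms = [
    lambda p, q: lobe(lobe(p, q)),               # 1.  (( p , q ))
    lambda p, q: lobe(p, lobe(q)),               # 2.  ( p ,( q ))
    lambda p, q: lobe(lobe(p), q),               # 3.  (( p ), q )
    lambda p, q: lobe(lobe(lobe(p), lobe(q))),   # 4.  ((( p ),( q )))
]

for p, q in product([False, True], repeat=2):
    values = [form(p, q) for form in forms]
    assert len(set(values)) == 1                 # all four forms agree
    assert values[0] == (p == q)                 # and assert p <=> q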

These effects occur partly because the algorithm chooses to make cases
of variables on a basis of leftmost shallowest first, but their impact
can be complicated by the interactions that each expression has with
the context that it occupies.  The main lesson to take away from all
of this, at least, for the time being, is that it is probably better
not to bother too much about these problems, but just to experiment
with different ways of expressing equivalent pieces of information
until you get a sense of what works best in various situations.

I think that you will be happy to see only the
ultimate Sense of Molly's World, so here it is:

Sense Outline:  Molly.Sen
o------------------------o
| object                 |
|  two_wheels            |
|   no_seat              |
|    scooter             |
|     toy                |
|      small_size        |
|   one_seat             |
|    pedals              |
|     bike               |
|      small_size        |
|       toy              |
|      medium_size       |
|       vehicle          |
|  three_wheels          |
|   one_seat             |
|    pedals              |
|     trike              |
|      toy               |
|       small_size       |
|  four_wheels           |
|   few_seats            |
|    doors               |
|     car                |
|      vehicle           |
|       large_size       |
|   many_seats           |
|    doors               |
|     bus                |
|      vehicle           |
|       large_size       |
|   no_seat              |
|    handle              |
|     wagon              |
|      toy               |
|       small_size       |
o------------------------o

This outline is not the Sense of the unconstrained Log file,
but the result of running Model with a query on the single
feature "object".  Using this focus helps the Modeler
to make more relevant Sense of Molly's World.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

DM = Douglas McDavid

DM: This, again, is an example of how real issues of ontology are
    so often trivialized at the expense of technicalities.  I just
    had a burger, some fries, and a Coke.  I would say all that was
    non-living and non-mineral.  A virus, I believe is non-animal,
    non-vegetable, but living (and non-mineral).  Teeth, shells,
    and bones are virtually pure mineral, but living.  These are
    the kinds of issues that are truly "ontological," in my
    opinion.  You are not the only one to push them into
    the background as of lesser importance.  See the
    discussion of "18-wheelers" in John Sowa's book.

it's not my example, and from what you say, it's not your example either.
i copied it out of a book or a paper somewhere, too long ago to remember.
i am assuming that the author or tradition from which it came must have
seen some kind of sense in it.  tell you what, write out your own theory
of "what is" in so many variables, more or less, publish it in a book or
a paper, and then folks will tell you that they dispute each and every
thing that you have just said, and it won't really matter all that much
how complex it is or how subtle you are.  that has been the way of all
ontology for about as long as anybody can remember or even read about.
me?  i don't have sufficient arrogance to be an ontologist, and you
know that's saying a lot, as i can't even imagine a way to convince
myself that i believe i know "what is", really and truly for sure
like some folks just seem to do.  so i am working to improve our
technical ability to do logic, which is mostly a job of shooting
down the more serious delusions that we often get ourselves into.
can i be of any use to ontologists?  i dunno.  i guess it depends
on how badly they are attached to some of the delusions of knowing
what their "common" sense tells them everybody ought to already know,
but that every attempt to check that out in detail tells them it just
ain't so.  a problem for which denial was just begging to be invented,
and so it was.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Example.  Molly's World (cont.)

In preparation for a contingently possible future discussion,
I need to attach a few parting thoughts to the case workup
of Molly's World that may not seem terribly relevant to
the present setting, but whose pertinence I hope will
become clearer in time.

The logical paradigm from which this Example was derived is that
of "Zeroth Order Horn Clause Theories".  The clauses at issue
in these theories are allowed to be of just three kinds:

| 1.  p & q & r & ... => z
|
| 2.  z
|
| 3.  ~[p & q & r & ...]

Here, the proposition letters "p", "q", "r", ..., "z"
are restricted to being single positive features, not
themselves negated or otherwise complex expressions.

In the Cactus Language or Existential Graph syntax
these forms would take on the following appearances:

| 1.  ( p q r ... ( z ))
|
| 2.    z
|
| 3.  ( p q r ... )
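
The translation between the three clause shapes is mechanical enough
to write down; here is a small sketch of it in Python.  The function
name and argument conventions are mine, for illustration only.

def horn_to_cactus(premises, conclusion=None):
    """Render a zeroth order Horn clause in cactus syntax.
    premises:   a list of positive feature names, possibly empty;
    conclusion: a positive feature name, or None for a pure denial."""
    body = " ".join(premises)
    if conclusion is None:
        return "( " + body + " )"                        # ~[p & q & r & ...]
    if not premises:
        return conclusion                                # a bare fact z
    return "( " + body + " ( " + conclusion + " ))"      # p & q & r => z

print(horn_to_cactus(["p", "q", "r"], "z"))    # ( p q r ( z ))
print(horn_to_cactus([], "z"))                 # z
print(horn_to_cactus(["p", "q", "r"]))         # ( p q r )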

The style of deduction in Horn clause logics is essentially
proof-theoretic in character, with the main burden of proof
falling on implication relations ("=>") and on "projective"
forms of inference, that is, information-losing inferences
like modus ponens and resolution.  Cf. [Llo], [MaW].

In contrast, the method used here is substantially model-theoretic,
the stress being to start from more general forms of expression for
laying out facts (for example, distinctions, equations, partitions)
and to work toward results that maintain logical equivalence with
their origins.

What all of this has to do with the output above is this:
From the perspective that is adopted in the present work,
almost any theory, for example, the one that is founded
on the postulates of Molly's World, will have far more
models than the implicational and inferential mode of
reasoning is designed to discover.  We will be forced
to confront them, however, if we try to run Model on
a large set of implications.

The typical Horn clause interpreter gets around this
difficulty only by a stratagem that takes clauses to
mean something other than what they say, that is, by
distorting the principles of semantics in practice.
Our Model, on the other hand, has no such finesse.

This explains why it was necessary to impose the
prerequisite "object" constraint on the Log file
for Molly's World.  It supplied no more than what
we usually take for granted, in order to obtain
a set of models that we would normally think of
as being the intended import of the definitions.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Example.  Jets & Sharks

The propositional calculus based on the boundary operator, that is,
the multigrade logical connective of the form "( , , , ... )" can be
interpreted in a way that resembles the logic of activation states and
competition constraints in certain neural network models.  One way to do
this is by interpreting the blank or unmarked state as the resting state
of a neural pool, the bound or marked state as its activated state, and
by representing a mutually inhibitory pool of neurons p, q, r by means
of the expression "( p , q , r )".  To illustrate this possibility,
I transcribe into cactus language expressions a notorious example
from the "parallel distributed processing" (PDP) paradigm [McR]
and work through two of the associated exercises as portrayed
in this format.

Logical Input File:  JAS  =  ZOT(Jets And Sharks)
o----------------------------------------------------------------o
|                                                                |
|  (( art    ),( al   ),( sam  ),( clyde ),( mike  ),            |
|   ( jim    ),( greg ),( john ),( doug  ),( lance ),            |
|   ( george ),( pete ),( fred ),( gene  ),( ralph ),            |
|   ( phil   ),( ike  ),( nick ),( don   ),( ned   ),( karl ),   |
|   ( ken    ),( earl ),( rick ),( ol    ),( neal  ),( dave ))   |
|                                                                |
|  ( jets , sharks )                                             |
|                                                                |
|  ( jets ,                                                      |
|    ( art    ),( al   ),( sam  ),( clyde ),( mike  ),           |
|    ( jim    ),( greg ),( john ),( doug  ),( lance ),           |
|    ( george ),( pete ),( fred ),( gene  ),( ralph ))           |
|                                                                |
|  ( sharks ,                                                    |
|    ( phil ),( ike  ),( nick ),( don ),( ned  ),( karl ),       |
|    ( ken  ),( earl ),( rick ),( ol  ),( neal ),( dave ))       |
|                                                                |
|  (( 20's ),( 30's ),( 40's ))                                  |
|                                                                |
|  ( 20's ,                                                      |
|    ( sam    ),( jim  ),( greg ),( john ),( lance ),            |
|    ( george ),( pete ),( fred ),( gene ),( ken   ))            |
|                                                                |
|  ( 30's ,                                                      |
|    ( al   ),( mike ),( doug ),( ralph ),                       |
|    ( phil ),( ike  ),( nick ),( don   ),                       |
|    ( ned  ),( rick ),( ol   ),( neal  ),( dave ))              |
|                                                                |
|  ( 40's ,                                                      |
|    ( art ),( clyde ),( karl ),( earl ))                        |
|                                                                |
|  (( junior_high ),( high_school ),( college ))                 |
|                                                                |
|  ( junior_high ,                                               |
|    ( art  ),( al    ),( clyde  ),( mike  ),( jim ),            |
|    ( john ),( lance ),( george ),( ralph ),( ike ))            |
|                                                                |
|  ( high_school ,                                               |
|    ( greg ),( doug ),( pete ),( fred ),( nick ),               |
|    ( karl ),( ken  ),( earl ),( rick ),( neal ),( dave ))      |
|                                                                |
|  ( college ,                                                   |
|    ( sam ),( gene ),( phil ),( don ),( ned ),( ol ))           |
|                                                                |
|  (( single ),( married ),( divorced ))                         |
|                                                                |
|  ( single ,                                                    |
|    ( art   ),( sam  ),( clyde ),( mike ),                      |
|    ( doug  ),( pete ),( fred  ),( gene ),                      |
|    ( ralph ),( ike  ),( nick  ),( ken  ),( neal ))             |
|                                                                |
|  ( married ,                                                   |
|    ( al  ),( greg ),( john ),( lance ),( phil ),               |
|    ( don ),( ned  ),( karl ),( earl  ),( ol   ))               |
|                                                                |
|  ( divorced ,                                                  |
|    ( jim ),( george ),( rick ),( dave ))                       |
|                                                                |
|  (( bookie ),( burglar ),( pusher ))                           |
|                                                                |
|  ( bookie ,                                                    |
|    ( sam  ),( clyde ),( mike ),( doug ),                       |
|    ( pete ),( ike   ),( ned  ),( karl ),( neal ))              |
|                                                                |
|  ( burglar ,                                                   |
|    ( al     ),( jim ),( john ),( lance ),                      |
|    ( george ),( don ),( ken  ),( earl  ),( rick ))             |
|                                                                |
|  ( pusher ,                                                    |
|    ( art   ),( greg ),( fred ),( gene ),                       |
|    ( ralph ),( phil ),( nick ),( ol   ),( dave ))              |
|                                                                |
o----------------------------------------------------------------o

We now apply Study to the proposition that
defines the Jets and Sharks knowledge base,
that is to say, the knowledge that we are
given about the Jets and Sharks, not the
knowledge that the Jets and Sharks have.

With a query on the name "ken" we obtain the following
output, giving all of the features associated with Ken:

Sense Outline:  JAS & Ken
o---------------------------------------o
| ken                                   |
|  sharks                               |
|   20's                                |
|    high_school                        |
|     single                            |
|      burglar                          |
o---------------------------------------o

With a query on the two features "college" and "sharks"
we obtain the following outline of all of the features
that satisfy these constraints:

Sense Outline:  JAS & College & Sharks
o---------------------------------------o
| college                               |
|  sharks                               |
|   30's                                |
|    married                            |
|     bookie                            |
|      ned                              |
|     burglar                           |
|      don                              |
|     pusher                            |
|      phil                             |
|      ol                               |
o---------------------------------------o

From this we discover that all college Sharks
are 30-something and married.  Furthermore,
we have a complete listing of their names
broken down by occupation, as I have no
doubt that all of them will be in time.
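
Logically speaking, running Study with a query amounts to conjoining
the queried features with the knowledge base and keeping the models
that survive.  The following sketch paraphrases that effect on an
already-computed list of models; it is a logical equivalent of the
operation, not a description of how the program schedules its work,
and the function names are mine.

def query(models, *features):
    """Restrict a list of models (dicts of feature -> bool) to those in
    which every queried feature holds, e.g. "college" and "sharks"."""
    return [m for m in models if all(m.get(f, False) for f in features)]

def common_features(models):
    """Features true in every surviving model, such as "30's" and
    "married" for the college Sharks."""
    if not models:
        return []
    return [f for f in models[0] if all(m.get(f, False) for m in models)]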

| Reference:
|
| McClelland, James L. & Rumelhart, David E.,
|'Explorations in Parallel Distributed Processing:
| A Handbook of Models, Programs, and Exercises',
| MIT Press, Cambridge, MA, 1988.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

One of the issues that my pondering weak and weary over
has caused me to burn not a few barrels of midnight oil
over the past elventeen years or so is the relationship
among divers and sundry "styles of inference", by which
I mean particular choices of inference paradigms, rules,
or schemata.  The chief breakpoint seems to lie between
information-losing and information-maintaining modes of
inference, also called "implicational" and "equational",
or "projective" and "preservative" brands, respectively.

Since it appears to be mostly the implicational and projective
styles of inference that are more familiar to folks hereabouts,
I will start off this subdiscussion by introducing a number of
risibly simple but reasonably manageable examples of the other
brand of inference, treated as equational reasoning approaches
to problems about satisfying "zeroth order constraints" (ZOC's).

Applications of a Propositional Calculator:
Constraint Satisfaction Problems.
Jon Awbrey, April 24, 1995.

The Four Houses Puzzle

Constructed on the model of the "Five Houses Puzzle" in [VaH, 132-136].

Problem Statement.  Four people with different nationalities live in the
first four houses of a street.  They practice four distinct professions,
and each of them has a favorite animal, all of them different.  The four
houses are painted different colors.  The following facts are known:

|  1.  The Englander lives in the first house on the left.
|  2.  The doctor lives in the second house.
|  3.  The third house is painted red.
|  4.  The zebra is a favorite in the fourth house.
|  5.  The person in the first house has a dog.
|  6.  The Japanese lives in the third house.
|  7.  The red house is on the left of the yellow one.
|  8.  They breed snails in the house to the right of the doctor.
|  9.  The Englander lives next to the green house.
| 10.  The fox is in the house next to the diplomat.
| 11.  The Spaniard likes zebras.
| 12.  The Japanese is a painter.
| 13.  The Italian lives in the green house.
| 14.  The violinist lives in the yellow house.
| 15.  The dog is a pet in the blue house.
| 16.  The doctor keeps a fox.

The problem is to find all of the assignments of
features to houses that satisfy these requirements.

Logical Input File:  House^4.Log
o---------------------------------------------------------------------o
|                                                                     |
|  eng_1  doc_2  red_3  zeb_4  dog_1  jap_3                           |
|                                                                     |
|  (( red_1  yel_2 ),( red_2  yel_3 ),( red_3  yel_4 ))               |
|  (( doc_1  sna_2 ),( doc_2  sna_3 ),( doc_3  sna_4 ))               |
|                                                                     |
|  (( eng_1  gre_2 ),                                                 |
|   ( eng_2  gre_3 ),( eng_2  gre_1 ),                                |
|   ( eng_3  gre_4 ),( eng_3  gre_2 ),                                |
|                    ( eng_4  gre_3 ))                                |
|                                                                     |
|  (( dip_1  fox_2 ),                                                 |
|   ( dip_2  fox_3 ),( dip_2  fox_1 ),                                |
|   ( dip_3  fox_4 ),( dip_3  fox_2 ),                                |
|                    ( dip_4  fox_3 ))                                |
|                                                                     |
|  (( spa_1 zeb_1 ),( spa_2 zeb_2 ),( spa_3 zeb_3 ),( spa_4 zeb_4 ))  |
|  (( jap_1 pai_1 ),( jap_2 pai_2 ),( jap_3 pai_3 ),( jap_4 pai_4 ))  |
|  (( ita_1 gre_1 ),( ita_2 gre_2 ),( ita_3 gre_3 ),( ita_4 gre_4 ))  |
|                                                                     |
|  (( yel_1 vio_1 ),( yel_2 vio_2 ),( yel_3 vio_3 ),( yel_4 vio_4 ))  |
|  (( blu_1 dog_1 ),( blu_2 dog_2 ),( blu_3 dog_3 ),( blu_4 dog_4 ))  |
|                                                                     |
|  (( doc_1 fox_1 ),( doc_2 fox_2 ),( doc_3 fox_3 ),( doc_4 fox_4 ))  |
|                                                                     |
|  ((                                                                 |
|                                                                     |
|  (( eng_1 ),( eng_2 ),( eng_3 ),( eng_4 ))                          |
|  (( spa_1 ),( spa_2 ),( spa_3 ),( spa_4 ))                          |
|  (( jap_1 ),( jap_2 ),( jap_3 ),( jap_4 ))                          |
|  (( ita_1 ),( ita_2 ),( ita_3 ),( ita_4 ))                          |
|                                                                     |
|  (( eng_1 ),( spa_1 ),( jap_1 ),( ita_1 ))                          |
|  (( eng_2 ),( spa_2 ),( jap_2 ),( ita_2 ))                          |
|  (( eng_3 ),( spa_3 ),( jap_3 ),( ita_3 ))                          |
|  (( eng_4 ),( spa_4 ),( jap_4 ),( ita_4 ))                          |
|                                                                     |
|  (( gre_1 ),( gre_2 ),( gre_3 ),( gre_4 ))                          |
|  (( red_1 ),( red_2 ),( red_3 ),( red_4 ))                          |
|  (( yel_1 ),( yel_2 ),( yel_3 ),( yel_4 ))                          |
|  (( blu_1 ),( blu_2 ),( blu_3 ),( blu_4 ))                          |
|                                                                     |
|  (( gre_1 ),( red_1 ),( yel_1 ),( blu_1 ))                          |
|  (( gre_2 ),( red_2 ),( yel_2 ),( blu_2 ))                          |
|  (( gre_3 ),( red_3 ),( yel_3 ),( blu_3 ))                          |
|  (( gre_4 ),( red_4 ),( yel_4 ),( blu_4 ))                          |
|                                                                     |
|  (( pai_1 ),( pai_2 ),( pai_3 ),( pai_4 ))                          |
|  (( dip_1 ),( dip_2 ),( dip_3 ),( dip_4 ))                          |
|  (( vio_1 ),( vio_2 ),( vio_3 ),( vio_4 ))                          |
|  (( doc_1 ),( doc_2 ),( doc_3 ),( doc_4 ))                          |
|                                                                     |
|  (( pai_1 ),( dip_1 ),( vio_1 ),( doc_1 ))                          |
|  (( pai_2 ),( dip_2 ),( vio_2 ),( doc_2 ))                          |
|  (( pai_3 ),( dip_3 ),( vio_3 ),( doc_3 ))                          |
|  (( pai_4 ),( dip_4 ),( vio_4 ),( doc_4 ))                          |
|                                                                     |
|  (( dog_1 ),( dog_2 ),( dog_3 ),( dog_4 ))                          |
|  (( zeb_1 ),( zeb_2 ),( zeb_3 ),( zeb_4 ))                          |
|  (( fox_1 ),( fox_2 ),( fox_3 ),( fox_4 ))                          |
|  (( sna_1 ),( sna_2 ),( sna_3 ),( sna_4 ))                          |
|                                                                     |
|  (( dog_1 ),( zeb_1 ),( fox_1 ),( sna_1 ))                          |
|  (( dog_2 ),( zeb_2 ),( fox_2 ),( sna_2 ))                          |
|  (( dog_3 ),( zeb_3 ),( fox_3 ),( sna_3 ))                          |
|  (( dog_4 ),( zeb_4 ),( fox_4 ),( sna_4 ))                          |
|                                                                     |
|  ))                                                                 |
|                                                                     |
o---------------------------------------------------------------------o

Sense Outline:  House^4.Sen
o-----------------------------o
| eng_1                       |
|  doc_2                      |
|   red_3                     |
|    zeb_4                    |
|     dog_1                   |
|      jap_3                  |
|       yel_4                 |
|        sna_3                |
|         gre_2               |
|          dip_1              |
|           fox_2             |
|            spa_4            |
|             pai_3           |
|              ita_2          |
|               vio_4         |
|                blu_1        |
o-----------------------------o

Table 1.  Solution to the Four Houses Puzzle
o------------o------------o------------o------------o------------o
|            | House 1    | House 2    | House 3    | House 4    |
o------------o------------o------------o------------o------------o
| Nation     | England    | Italy      | Japan      | Spain      |
| Color      | blue       | green      | red        | yellow     |
| Profession | diplomat   | doctor     | painter    | violinist  |
| Animal     | dog        | fox        | snails     | zebra      |
o------------o------------o------------o------------o------------o
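
For anyone who wants to verify Table 1 without the cactus machinery,
the sixteen facts can be brute-forced over permutations of the four
houses in a few dozen lines of Python.  The variable names mirror the
abbreviations in the Log file; the sketch is mine and is independent
of the calculator.

from itertools import permutations

HOUSES = (1, 2, 3, 4)

def next_to(a, b):
    return abs(a - b) == 1

solutions = []
for eng, spa, jap, ita in permutations(HOUSES):              # nationalities
    if eng != 1 or jap != 3:                                 # facts 1, 6
        continue
    for gre, red, yel, blu in permutations(HOUSES):          # colors
        if red != 3 or yel != red + 1:                       # facts 3, 7
            continue
        if not next_to(eng, gre) or ita != gre:              # facts 9, 13
            continue
        for pai, dip, vio, doc in permutations(HOUSES):      # professions
            if doc != 2 or pai != jap or vio != yel:         # facts 2, 12, 14
                continue
            for dog, zeb, fox, sna in permutations(HOUSES):  # animals
                if zeb != 4 or dog != 1:                     # facts 4, 5
                    continue
                if sna != doc + 1 or not next_to(fox, dip):  # facts 8, 10
                    continue
                if zeb != spa or dog != blu or fox != doc:   # facts 11, 15, 16
                    continue
                solutions.append(dict(eng=eng, spa=spa, jap=jap, ita=ita,
                                      gre=gre, red=red, yel=yel, blu=blu,
                                      pai=pai, dip=dip, vio=vio, doc=doc,
                                      dog=dog, zeb=zeb, fox=fox, sna=sna))

print(len(solutions))    # 1 -- the assignment shown in Table 1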

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

First off, I do not trivialize the "real issues of ontology", indeed,
it is precisely my estimate of the non-trivial difficulty of this task,
of formulating the types of "generic ontology" that we propose to do here,
that forces me to choose and to point out the inescapability of the approach
that I am currently taking, which is to enter on the necessary preliminary of
building up the logical tools that we need to tackle the ontology task proper.
And I would say, to the contrary, that it is those who think we can arrive at
a working general ontology by sitting on the porch shooting the breeze about
"what it is" until the cows come home -- that is, the method for which it
has become cliche to indict the Ancient Greeks, though, if truth be told,
we'd have to look to the pre-socratics and the pre-stoics to find a good
match for the kinds of revelation that are common hereabouts -- I would
say that it's those folks who trivialize the "real issues of ontology".

A person, living in our times, who is serious about knowing the being of things,
really only has one choice -- to pick what tiny domain of things he or she just
has to know about the most, thence to hie away to the adept gurus of the matter
in question, forgetting the rest, cause "general ontology" is a no-go these days.
It is presently in a state like astronomy before telescopes, and that means not
entirely able to discern itself from astrology and other psychically projective
exercises of wishful and dreadful thinking like that.

So I am busy grinding lenses ...

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

DM = Douglas McDavid

DM: Thanks for both the original and additional response.  I'm not trying to
    single you out, as I have been picking  on various postings in a similar
    manner ever since I started contributing to this discussion.  I agree with
    you that the task of this working group is non-trivially difficult.  In fact,
    I believe we are still a long way from a clear and useful agreement about what
    constitutes "upper" ontology, and what it would mean to standardize it.  However,
    I don't agree that the only place to make progress is in tiny domains of things.
    I've contributed the thought that a fundamental, upper-level concept is the
    concept of system, and that that would be a good place to begin.  And I'll
    never be able to refrain from evaluating the content as well as the form
    of any examples presented for consideration here.  Probably should
    accompany these comments with a ;-)

There will never be a standard universal ontology
of the absolute essential imperturbable monolithic
variety that some people still dream of in their
fantasies of spectating on and speculating about
a pre-relativistically non-participatory universe
from their singular but isolated gods'eye'views.
The bells tolled for that one many years ago,
but some of the more blithe of the blissful
islanders have just not gotten the news yet.

But there is still a lot to do that would be useful
under the banner of a "standard upper ontology",
if only we stay loose in our interpretation
of what that implies in practical terms.

One likely approach to the problem would be to take
a hint from the afore-allusioned history of physics --
to inquire for whom, else, the bell tolls -- and to
see if there are any bits of wisdom from that prior
round of collective experience that can be adapted
by dint of analogy to our present predicament.
I happen to think that there are.

And there the answer was, not to try and force a return,
though lord knows they all gave it their very best shot,
to an absolute and imperturbable framework of existence,
but to see the reciprocal participant relation that all
partakers have to the constitution of that framing, yes,
even unto those who would abdictators and abstainees be.

But what does that imply about some shred of a standard?
It means that we are better off seeking, not a standard,
one-size-fits-all ontology, but more standard resources
for trying to interrelate diverse points of view and to
transform the data that's gathered from one perspective
in ways that it can most appropriately be compared with
the data that is gathered from other standpoints on the
splendorous observational scenes and theorematic stages.

That is what I am working on.
And it hasn't been merely
for a couple of years.

As to this bit:

o-------------------------------------------------o
|                                                 |
|   ( living_thing , non_living )                 |
|                                                 |
|   (( animal ),( vegetable ),( mineral ))        |
|                                                 |
|   ( living_thing ,( animal ),( vegetable ))     |
|                                                 |
|   ( mineral ( non_living ))                     |
|                                                 |
o-------------------------------------------------o

My 5-dimensional Example, that I borrowed from some indifferent source
of what is commonly recognized as "common sense" -- and I think rather
obviously designed more for the classification of pre-modern species
of whole critters and pure matters of natural substance than the
motley mixture of un/natural and in/organic conglouterites that
we find served up on the menu of modernity -- was not intended
even so much as a toy ontology, but simply as an expository
example, concocted for the sake of illustrating the sorts
of logical interaction that occur among four different
patterns of logical constraint, all of which types
arise all the time no matter what the domain, and
which I believe that my novel forms of expression,
syntactically speaking, express quite succinctly,
especially when you contemplate the complexities
of the computation that may flow and must follow
from even these meagre propositional expressions.

Yes, systems -- but -- even here usage differs in significant ways.
I have spent ten years now trying to integrate my earlier efforts
under an explicit systems banner, but even within the bounds of
a systems engineering programme at one site there is a wide
semantic dispersion that issues from this word "system".
I am committed, and in writing, to taking what we so
glibly and prospectively call "intelligent systems"
seriously as dynamical systems.  That has many
consequences, and I have to pick and choose
which of those I may be suited to follow.

But that is too long a story for now ...

";-)"?

Somehow that has always looked like
the Cheshire Cat's grin to me ...

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

By way of catering to popular demand, I have decided to
render this symposium a bit more à la carte, and thus to
serve up as faster food than heretofore a choice selection
of the more sumptuous bits that I have in my logical larder,
not yet full fare, by any means, but a sample of what might
one day approach to being an abundantly moveable feast of
ontological contents and general metaphysical delights.
I'll leave it to you to name your poison, as it were.

Applications of a Propositional Calculator:
Constraint Satisfaction Problems.
Jon Awbrey, April 24, 1995.

Fabric Knowledge Base
Based on the example in [MaW, pages 8-16].

Logical Input File:  Fab.Log
o---------------------------------------------------------------------o
|                                                                     |
| (has_floats , plain_weave )                                         |
| (has_floats ,(twill_weave ),(satin_weave ))                         |
|                                                                     |
| (plain_weave ,                                                      |
|  (plain_weave  one_color ),                                         |
|  (color_groups  ),                                                  |
|  (grouped_warps ),                                                  |
|  (some_thicker  ),                                                  |
|  (crossed_warps ),                                                  |
|  (loop_threads  ),                                                  |
|  (plain_weave  flannel ))                                           |
|                                                                     |
| (plain_weave  one_color  cotton  balanced  smooth  ,(percale ))     |
| (plain_weave  one_color  cotton            sheer   ,(organdy ))     |
| (plain_weave  one_color  silk              sheer   ,(organza ))     |
|                                                                     |
| (plain_weave  color_groups  warp_stripe  fill_stripe ,(plaid   ))   |
| (plaid        equal_stripe                           ,(gingham ))   |
|                                                                     |
| (plain_weave  grouped_warps ,(basket_weave ))                       |
|                                                                     |
| (basket_weave  typed ,                                              |
|  (type_2_to_1 ),                                                    |
|  (type_2_to_2 ),                                                    |
|  (type_4_to_4 ))                                                    |
|                                                                     |
| (basket_weave  typed  type_2_to_1   thicker_fill   ,(oxford      )) |
| (basket_weave  typed (type_2_to_2 ,                                 |
|                       type_4_to_4 ) same_thickness ,(monks_cloth )) |
| (basket_weave (typed )              rough  open    ,(hopsacking  )) |
|                                                                     |
| (typed (basket_weave ))                                             |
|                                                                     |
| (basket_weave ,(oxford ),(monks_cloth ),(hopsacking ))              |
|                                                                     |
| (plain_weave   some_thicker ,(ribbed_weave ))                       |
|                                                                     |
| (ribbed_weave ,(small_rib ),(medium_rib ),(heavy_rib ))             |
| (ribbed_weave ,(flat_rib  ),(round_rib ))                           |
|                                                                     |
| (ribbed_weave  thicker_fill          ,(cross_ribbed ))              |
| (cross_ribbed  small_rib   flat_rib  ,(faille       ))              |
| (cross_ribbed  small_rib   round_rib ,(grosgrain    ))              |
| (cross_ribbed  medium_rib  round_rib ,(bengaline    ))              |
| (cross_ribbed  heavy_rib   round_rib ,(ottoman      ))              |
|                                                                     |
| (cross_ribbed ,(faille ),(grosgrain ),(bengaline ),(ottoman ))      |
|                                                                     |
| (plain_weave  crossed_warps ,(leno_weave  ))                        |
| (leno_weave   open          ,(marquisette ))                        |
| (plain_weave  loop_threads  ,(pile_weave ))                         |
|                                                                     |
| (pile_weave ,(fill_pile ),(warp_pile ))                             |
| (pile_weave ,(cut ),(uncut ))                                       |
|                                                                     |
| (pile_weave  warp_pile  cut                   ,(velvet    ))        |
| (pile_weave  fill_pile  cut    aligned_pile   ,(corduroy  ))        |
| (pile_weave  fill_pile  cut    staggered_pile ,(velveteen ))        |
| (pile_weave  fill_pile  uncut  reversible     ,(terry     ))        |
|                                                                     |
| (pile_weave  fill_pile  cut ( (aligned_pile , staggered_pile ) ))   |
|                                                                     |
| (pile_weave ,(velvet ),(corduroy ),(velveteen ),(terry ))           |
|                                                                     |
| (plain_weave ,                                                      |
|  (percale    ),(organdy     ),(organza    ),(plaid   ),             |
|  (oxford     ),(monks_cloth ),(hopsacking ),                        |
|  (faille     ),(grosgrain   ),(bengaline  ),(ottoman ),             |
|  (leno_weave ),(pile_weave  ),(plain_weave   flannel ))             |
|                                                                     |
| (twill_weave ,                                                      |
|  (warp_faced ),                                                     |
|  (filling_faced ),                                                  |
|  (even_twill ),                                                     |
|  (twill_weave  flannel ))                                           |
|                                                                     |
| (twill_weave  warp_faced  colored_warp  white_fill ,(denim ))       |
| (twill_weave  warp_faced  one_color                ,(drill ))       |
| (twill_weave  even_twill  diagonal_rib             ,(serge ))       |
|                                                                     |
| (twill_weave  warp_faced (                                          |
|  (one_color ,                                                       |
|   ((colored_warp )(white_fill )) )                                  |
| ))                                                                  |
|                                                                     |
| (twill_weave  warp_faced ,(denim ),(drill ))                        |
| (twill_weave  even_twill ,(serge ))                                 |
|                                                                     |
| ((                                                                  |
|     (  ((plain_weave )(twill_weave ))                               |
|        ((cotton      )(wool        )) napped ,(flannel ))           |
| ))                                                                  |
|                                                                     |
| (satin_weave ,(warp_floats ),(fill_floats ))                        |
|                                                                     |
| (satin_weave ,(satin_weave smooth ),(satin_weave napped ))          |
| (satin_weave ,(satin_weave cotton ),(satin_weave silk   ))          |
|                                                                     |
| (satin_weave  warp_floats  smooth         ,(satin    ))             |
| (satin_weave  fill_floats  smooth         ,(sateen   ))             |
| (satin_weave               napped  cotton ,(moleskin ))             |
|                                                                     |
| (satin_weave ,(satin ),(sateen ),(moleskin ))                       |
|                                                                     |
o---------------------------------------------------------------------o

| Reference [MaW]
|
| Maier, David & Warren, David S.,
|'Computing with Logic:  Logic Programming with Prolog',
| Benjamin/Cummings, Menlo Park, CA, 1988.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

I think that it might be a good idea to go back to a simpler example
of a constraint satisfaction problem, and to discuss the elements of
its expression as a ZOT in a less cluttered setting before advancing
onward once again to problems on the order of the Four Houses Puzzle.

| Applications of a Propositional Calculator:
| Constraint Satisfaction Problems.
| Jon Awbrey, April 24, 1995.

Graph Coloring

Based on the discussion in [Wil, page 196].

One is given three colors, say, orange, silver, indigo,
and a graph on four nodes that has the following shape:

|          1
|          o
|         / \
|        /   \
|     4 o-----o 2
|        \   /
|         \ /
|          o
|          3

The problem is to color the nodes of the graph
in such a way that no pair of nodes that are
adjacent in the graph, that is, linked by
an edge, get the same color.

The objective situation that is to be achieved can be represented
in a so-called "declarative" fashion, in effect, by employing the
cactus language as a very simple sort of declarative programming
language, and depicting the prospective solution to the problem
as a ZOT.

To do this, begin by declaring the following set of
twelve boolean variables or "zeroth order features":

{1_orange, 1_silver, 1_indigo,
 2_orange, 2_silver, 2_indigo,
 3_orange, 3_silver, 3_indigo,
 4_orange, 4_silver, 4_indigo}

The interpretation to keep in mind is that a feature name
of the form "<node i>_<color j>" says that the node i
is assigned the color j.

Logical Input File:  Color.Log
o----------------------------------------------------------------------o
|                                                                      |
|  (( 1_orange ),( 1_silver ),( 1_indigo ))                            |
|  (( 2_orange ),( 2_silver ),( 2_indigo ))                            |
|  (( 3_orange ),( 3_silver ),( 3_indigo ))                            |
|  (( 4_orange ),( 4_silver ),( 4_indigo ))                            |
|                                                                      |
|  ( 1_orange  2_orange )( 1_silver  2_silver )( 1_indigo  2_indigo )  |
|  ( 1_orange  4_orange )( 1_silver  4_silver )( 1_indigo  4_indigo )  |
|  ( 2_orange  3_orange )( 2_silver  3_silver )( 2_indigo  3_indigo )  |
|  ( 2_orange  4_orange )( 2_silver  4_silver )( 2_indigo  4_indigo )  |
|  ( 3_orange  4_orange )( 3_silver  4_silver )( 3_indigo  4_indigo )  |
|                                                                      |
o----------------------------------------------------------------------o

The first stanza of verses declares that
every node is assigned exactly one color.

The second stanza of verses declares that
no adjacent nodes get the very same color.

Each satisfying interpretation of this ZOT
that is also a program corresponds to what
graffitists call a "coloring" of the graph.

Theme One's Model interpreter, when we set
it to work on this ZOT, will array  before
our eyes all of the colorings of the graph.
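
For anyone who wants to check the tally by other means, here is
a minimal brute-force sketch in ordinary Python.  It is not Theme
One, and the names in it are mine:  the exactly-one-color stanza
is built in by giving each node a single color from the product,
while the adjacency stanza appears as the explicit test.

from itertools import product

colors = ["orange", "silver", "indigo"]
edges  = [(1, 2), (1, 4), (2, 3), (2, 4), (3, 4)]   # the graph in the figure

colorings = []
for choice in product(colors, repeat=4):            # node i gets choice[i-1]
    if all(choice[i - 1] != choice[j - 1] for i, j in edges):
        colorings.append(choice)

for c in colorings:
    print(" ".join(f"{node}_{color}" for node, color in enumerate(c, start=1)))
print(len(colorings), "colorings")                  # 6, the same count as the Sense Outline below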

Sense Outline:  Color.Sen
o-----------------------------o
| 1_orange                    |
|  2_silver                   |
|   3_orange                  |
|    4_indigo                 |
|  2_indigo                   |
|   3_orange                  |
|    4_silver                 |
| 1_silver                    |
|  2_orange                   |
|   3_silver                  |
|    4_indigo                 |
|  2_indigo                   |
|   3_silver                  |
|    4_orange                 |
| 1_indigo                    |
|  2_orange                   |
|   3_indigo                  |
|    4_silver                 |
|  2_silver                   |
|   3_indigo                  |
|    4_orange                 |
o-----------------------------o

| Reference [Wil]
|
| Wilf, Herbert S.,
|'Algorithms and Complexity',
| Prentice-Hall, Englewood Cliffs, NJ, 1986.
|
| Nota Bene.  Some printings of the book contain
| an erroneous Figure that does not match the
| description of the Example given in the text.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Let us continue to examine the properties of the cactus language
as a minimal style of declarative programming language.  Even in
the likes of this zeroth order microcosm one can observe, and on
a good day still more clearly for the lack of other distractions,
many of the buzz words that will spring into full bloom, almost
as if from nowhere, to become the first order of business in the
latter day logical organa, plus combinators, plus lambda calculi.

By way of homage to the classics of the art, I can hardly pass
this way without paying my dues to the next sample of examples.

N Queens Problem

I will give the ZOT that describes the N Queens Problem for N = 5,
since that is the most that I and my old 286 could do when last I
wrote up this Example.

The problem is now to write a "zeroth order program" (ZOP) that
describes the following objective:  To place 5 chess queens on
a 5 by 5 chessboard so that no queen attacks any other queen.

It is clear that there can be at most one queen on each row
of the board and so by dint of regal necessity, exactly one
queen in each row of the desired array.  This gambit allows
us to reduce the problem to one of picking a permutation of
five things in five places, and this affords us sufficient
clue to begin down a likely path toward the intended object,
by recruiting the following phalanx of 25 logical variables:

Literal Input File:  Q5.Lit
o---------------------------------------o
|                                       |
|  q1_r1, q1_r2, q1_r3, q1_r4, q1_r5,   |
|  q2_r1, q2_r2, q2_r3, q2_r4, q2_r5,   |
|  q3_r1, q3_r2, q3_r3, q3_r4, q3_r5,   |
|  q4_r1, q4_r2, q4_r3, q4_r4, q4_r5,   |
|  q5_r1, q5_r2, q5_r3, q5_r4, q5_r5.   |
|                                       |
o---------------------------------------o

Thus we seek to define a function, of abstract type f : %B%^25 -> %B%,
whose fibre of truth f^(-1)(%1%) is a set of interpretations, each of
whose elements bears the abstract type of a point in the space %B%^25,
and whose reading will inform us of our desired set of configurations.

Logical Input File:  Q5.Log
o------------------------------------------------------------o
|                                                            |
|  ((q1_r1 ),(q1_r2 ),(q1_r3 ),(q1_r4 ),(q1_r5 ))            |
|  ((q2_r1 ),(q2_r2 ),(q2_r3 ),(q2_r4 ),(q2_r5 ))            |
|  ((q3_r1 ),(q3_r2 ),(q3_r3 ),(q3_r4 ),(q3_r5 ))            |
|  ((q4_r1 ),(q4_r2 ),(q4_r3 ),(q4_r4 ),(q4_r5 ))            |
|  ((q5_r1 ),(q5_r2 ),(q5_r3 ),(q5_r4 ),(q5_r5 ))            |
|                                                            |
|  ((q1_r1 ),(q2_r1 ),(q3_r1 ),(q4_r1 ),(q5_r1 ))            |
|  ((q1_r2 ),(q2_r2 ),(q3_r2 ),(q4_r2 ),(q5_r2 ))            |
|  ((q1_r3 ),(q2_r3 ),(q3_r3 ),(q4_r3 ),(q5_r3 ))            |
|  ((q1_r4 ),(q2_r4 ),(q3_r4 ),(q4_r4 ),(q5_r4 ))            |
|  ((q1_r5 ),(q2_r5 ),(q3_r5 ),(q4_r5 ),(q5_r5 ))            |
|                                                            |
|  ((                                                        |
|                                                            |
|  (q1_r1 q2_r2 )(q1_r1 q3_r3 )(q1_r1 q4_r4 )(q1_r1 q5_r5 )  |
|                (q2_r2 q3_r3 )(q2_r2 q4_r4 )(q2_r2 q5_r5 )  |
|                              (q3_r3 q4_r4 )(q3_r3 q5_r5 )  |
|                                            (q4_r4 q5_r5 )  |
|                                                            |
|  (q1_r2 q2_r3 )(q1_r2 q3_r4 )(q1_r2 q4_r5 )                |
|                (q2_r3 q3_r4 )(q2_r3 q4_r5 )                |
|                              (q3_r4 q4_r5 )                |
|                                                            |
|  (q1_r3 q2_r4 )(q1_r3 q3_r5 )                              |
|                (q2_r4 q3_r5 )                              |
|                                                            |
|  (q1_r4 q2_r5 )                                            |
|                                                            |
|  (q2_r1 q3_r2 )(q2_r1 q4_r3 )(q2_r1 q5_r4 )                |
|                (q3_r2 q4_r3 )(q3_r2 q5_r4 )                |
|                              (q4_r3 q5_r4 )                |
|                                                            |
|  (q3_r1 q4_r2 )(q3_r1 q5_r3 )                              |
|                (q4_r2 q5_r3 )                              |
|                                                            |
|  (q4_r1 q5_r2 )                                            |
|                                                            |
|  (q1_r5 q2_r4 )(q1_r5 q3_r3 )(q1_r5 q4_r2 )(q1_r5 q5_r1 )  |
|                (q2_r4 q3_r3 )(q2_r4 q4_r2 )(q2_r4 q5_r1 )  |
|                              (q3_r3 q4_r2 )(q3_r3 q5_r1 )  |
|                                            (q4_r2 q5_r1 )  |
|                                                            |
|  (q2_r5 q3_r4 )(q2_r5 q4_r3 )(q2_r5 q5_r2 )                |
|                (q3_r4 q4_r3 )(q3_r4 q5_r2 )                |
|                              (q4_r3 q5_r2 )                |
|                                                            |
|  (q3_r5 q4_r4 )(q3_r5 q5_r3 )                              |
|                (q4_r4 q5_r3 )                              |
|                                                            |
|  (q4_r5 q5_r4 )                                            |
|                                                            |
|  (q1_r4 q2_r3 )(q1_r4 q3_r2 )(q1_r4 q4_r1 )                |
|                (q2_r3 q3_r2 )(q2_r3 q4_r1 )                |
|                              (q3_r2 q4_r1 )                |
|                                                            |
|  (q1_r3 q2_r2 )(q1_r3 q3_r1 )                              |
|                (q2_r2 q3_r1 )                              |
|                                                            |
|  (q1_r2 q2_r1 )                                            |
|                                                            |
|  ))                                                        |
|                                                            |
o------------------------------------------------------------o

The vanguard of this logical regiment consists of two
stock'a'block platoons, the pattern of whose features
is the usual sort of array for conveying permutations.
Between the stations of their respective offices they
serve to warrant that all of the interpretations that
are left standing on the field of valor at the end of
the day will be ones that tell of permutations 5 by 5.
The rest of the ruck and the runt of the mill in this
regimental logos are there to cover the diagonal bias
against attacking queens that is our protocol to suit.
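
For a rough cross-check, here is a sketch in ordinary Python,
not Theme One, with helper names of my own choosing.  It
hard-wires the permutation reduction that the first two blocks
of Q5.Log express in clauses, and screens the diagonals the way
the remaining clauses do.

from itertools import permutations

N = 5
placements = [p for p in permutations(range(1, N + 1))      # p[i] = row of queen i+1
              if all(abs(p[i] - p[j]) != j - i              # no shared diagonal
                     for i in range(N) for j in range(i + 1, N))]

for k, p in enumerate(placements, start=1):
    print(f"<{k}>", " ".join(f"q{col}_r{row}" for col, row in enumerate(p, start=1)))
print(len(placements), "solutions")                         # 10, as in the Sense Output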

And here is the issue of the day:

Sense Output:  Q5.Sen
o-------------------o
| q1_r1             |
|  q2_r3            |
|   q3_r5           |
|    q4_r2          |
|     q5_r4         | <1>
|  q2_r4            |
|   q3_r2           |
|    q4_r5          |
|     q5_r3         | <2>
| q1_r2             |
|  q2_r4            |
|   q3_r1           |
|    q4_r3          |
|     q5_r5         | <3>
|  q2_r5            |
|   q3_r3           |
|    q4_r1          |
|     q5_r4         | <4>
| q1_r3             |
|  q2_r1            |
|   q3_r4           |
|    q4_r2          |
|     q5_r5         | <5>
|  q2_r5            |
|   q3_r2           |
|    q4_r4          |
|     q5_r1         | <6>
| q1_r4             |
|  q2_r1            |
|   q3_r3           |
|    q4_r5          |
|     q5_r2         | <7>
|  q2_r2            |
|   q3_r5           |
|    q4_r3          |
|     q5_r1         | <8>
| q1_r5             |
|  q2_r2            |
|   q3_r4           |
|    q4_r1          |
|     q5_r3         | <9>
|  q2_r3            |
|   q3_r1           |
|    q4_r4          |
|     q5_r2         | <A>
o-------------------o

The number at least checks with all of the best authorities,
so I can breathe a sigh of relief on that account.
I am sure that there just has to be a more clever way to do
this, that is to say, within the bounds of ZOT reason alone,
but the above is the best that I could figure out with the
time that I had at the time.

References:  [BaC, 166], [VaH, 122], [Wir, 143].

[BaC]  Ball, W.W. Rouse, & Coxeter, H.S.M.,
      'Mathematical Recreations and Essays',
       13th ed., Dover, New York, NY, 1987.

[VaH]  Van Hentenryck, Pascal,
      'Constraint Satisfaction in Logic Programming',
       MIT Press, Cambridge, MA, 1989.

[Wir]  Wirth, Niklaus,
      'Algorithms + Data Structures = Programs',
       Prentice-Hall, Englewood Cliffs, NJ, 1976.

http://mathworld.wolfram.com/QueensProblem.html
http://www.research.att.com/cgi-bin/access.cgi/as/njas/sequences/eisA.cgi?Anum=000170

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

I turn now to another golden oldie of a constraint satisfaction problem
that I would like to give here a slightly new spin, but not so much for
the sake of these trifling novelties as from a sense of old time's ache
and a duty to -- well, what's the opposite of novelty?

Phobic Apollo

| Suppose Peter, Paul, and Jane are musicians.  One of them plays
| saxophone, another plays guitar, and the third plays drums.  As
| it happens, one of them is afraid of things associated with the
| number 13, another of them is afraid of cats, and the third is
| afraid of heights.  You also know that Peter and the guitarist
| skydive, that Paul and the saxophone player enjoy cats, and
| that the drummer lives in apartment 13 on the 13th floor.
|
| Soon we will want to use these facts to reason
| about whether or not certain identity relations
| hold or are excluded.  Assume X(Peter, Guitarist)
| means "the person who is Peter is not the person who
| plays the guitar".  In this notation, the facts become:
|
| 1.  X(Peter, Guitarist)
| 2.  X(Peter, Fears Heights)
| 3.  X(Guitarist, Fears Heights)
| 4.  X(Paul, Fears Cats)
| 5.  X(Paul, Saxophonist)
| 6.  X(Saxophonist, Fears Cats)
| 7.  X(Drummer, Fears 13)
| 8.  X(Drummer, Fears Heights)
|
| Exercise attributed to Kenneth D. Forbus, pages 449-450 in:
| Patrick Henry Winston, 'Artificial Intelligence', 2nd ed.,
| Addison-Wesley, Reading, MA, 1984.

Here is one way to represent these facts in the form of a ZOT
and use it as a logical program to draw a succinct conclusion:

Logical Input File:  ConSat.Log
o-----------------------------------------------------------------------o
|                                                                       |
|  (( pete_plays_guitar ),( pete_plays_sax ),( pete_plays_drums ))      |
|  (( paul_plays_guitar ),( paul_plays_sax ),( paul_plays_drums ))      |
|  (( jane_plays_guitar ),( jane_plays_sax ),( jane_plays_drums ))      |
|                                                                       |
|  (( pete_plays_guitar ),( paul_plays_guitar ),( jane_plays_guitar ))  |
|  (( pete_plays_sax    ),( paul_plays_sax    ),( jane_plays_sax    ))  |
|  (( pete_plays_drums  ),( paul_plays_drums  ),( jane_plays_drums  ))  |
|                                                                       |
|  (( pete_fears_13 ),( pete_fears_cats ),( pete_fears_height ))        |
|  (( paul_fears_13 ),( paul_fears_cats ),( paul_fears_height ))        |
|  (( jane_fears_13 ),( jane_fears_cats ),( jane_fears_height ))        |
|                                                                       |
|  (( pete_fears_13     ),( paul_fears_13     ),( jane_fears_13     ))  |
|  (( pete_fears_cats   ),( paul_fears_cats   ),( jane_fears_cats   ))  |
|  (( pete_fears_height ),( paul_fears_height ),( jane_fears_height ))  |
|                                                                       |
|  ((                                                                   |
|                                                                       |
|  ( pete_plays_guitar )                                                |
|  ( pete_fears_height )                                                |
|                                                                       |
|  ( pete_plays_guitar  pete_fears_height )                             |
|  ( paul_plays_guitar  paul_fears_height )                             |
|  ( jane_plays_guitar  jane_fears_height )                             |
|                                                                       |
|  ( paul_fears_cats )                                                  |
|  ( paul_plays_sax  )                                                  |
|                                                                       |
|  ( pete_plays_sax  pete_fears_cats )                                  |
|  ( paul_plays_sax  paul_fears_cats )                                  |
|  ( jane_plays_sax  jane_fears_cats )                                  |
|                                                                       |
|  ( pete_plays_drums  pete_fears_13 )                                  |
|  ( paul_plays_drums  paul_fears_13 )                                  |
|  ( jane_plays_drums  jane_fears_13 )                                  |
|                                                                       |
|  ( pete_plays_drums  pete_fears_height )                              |
|  ( paul_plays_drums  paul_fears_height )                              |
|  ( jane_plays_drums  jane_fears_height )                              |
|                                                                       |
|  ))                                                                   |
|                                                                       |
o-----------------------------------------------------------------------o

Sense Outline:  ConSat.Sen
o-----------------------------o
| pete_plays_drums            |
|  paul_plays_guitar          |
|   jane_plays_sax            |
|    pete_fears_cats          |
|     paul_fears_13           |
|      jane_fears_height      |
o-----------------------------o
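
Just to make the logic of the eight exclusion facts easy to
audit, here is a brute-force sketch in ordinary Python.  It is
not Theme One and its variable names are mine; it simply tries
every way of matching the trio with instruments and fears and
keeps the matchings that violate none of the facts.

from itertools import permutations

people = ["pete", "paul", "jane"]
keepers = []
for inst in permutations(["guitar", "sax", "drums"]):
    for fear in permutations(["13", "cats", "height"]):
        plays = dict(zip(people, inst))
        fears = dict(zip(people, fear))
        ok = (plays["pete"] != "guitar"                               # fact 1
              and fears["pete"] != "height"                           # fact 2
              and fears["paul"] != "cats"                             # fact 4
              and plays["paul"] != "sax"                              # fact 5
              and all(not (plays[p] == "guitar" and fears[p] == "height")
                      for p in people)                                # fact 3
              and all(not (plays[p] == "sax" and fears[p] == "cats")
                      for p in people)                                # fact 6
              and all(not (plays[p] == "drums" and fears[p] in ("13", "height"))
                      for p in people))                               # facts 7, 8
        if ok:
            keepers.append((plays, fears))

for plays, fears in keepers:                  # exactly one matching survives
    print({p: (plays[p], fears[p]) for p in people})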

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Phobic Apollo (cont.)

It might be instructive to review various aspects
of how the Theme One Study function actually went
about arriving at its answer to that last problem.
Just to prove that my program and I really did do
our homework on that Phobic Apollo ConSat problem,
and didn't just provoke some Oracle or other data
base server to give it away, here is the middling
output of the Model function as run on ConSat.Log:

Model Outline:  ConSat.Mod
o-------------------------------------------------o
| pete_plays_guitar -                             |
| (pete_plays_guitar )                            |
|  pete_plays_sax                                 |
|   pete_plays_drums -                            |
|   (pete_plays_drums )                           |
|    paul_plays_sax -                             |
|    (paul_plays_sax )                            |
|     jane_plays_sax -                            |
|     (jane_plays_sax )                           |
|      paul_plays_guitar                          |
|       paul_plays_drums -                        |
|       (paul_plays_drums )                       |
|        jane_plays_guitar -                      |
|        (jane_plays_guitar )                     |
|         jane_plays_drums                        |
|          pete_fears_13                          |
|           pete_fears_cats -                     |
|           (pete_fears_cats )                    |
|            pete_fears_height -                  |
|            (pete_fears_height )                 |
|             paul_fears_13 -                     |
|             (paul_fears_13 )                    |
|              jane_fears_13 -                    |
|              (jane_fears_13 )                   |
|               paul_fears_cats -                 |
|               (paul_fears_cats )                |
|                paul_fears_height -              |
|                (paul_fears_height ) -           |
|          (pete_fears_13 )                       |
|           pete_fears_cats -                     |
|           (pete_fears_cats )                    |
|            pete_fears_height -                  |
|            (pete_fears_height ) -               |
|         (jane_plays_drums ) -                   |
|      (paul_plays_guitar )                       |
|       paul_plays_drums                          |
|        jane_plays_drums -                       |
|        (jane_plays_drums )                      |
|         jane_plays_guitar                       |
|          pete_fears_13                          |
|           pete_fears_cats -                     |
|           (pete_fears_cats )                    |
|            pete_fears_height -                  |
|            (pete_fears_height )                 |
|             paul_fears_13 -                     |
|             (paul_fears_13 )                    |
|              jane_fears_13 -                    |
|              (jane_fears_13 )                   |
|               paul_fears_cats -                 |
|               (paul_fears_cats )                |
|                paul_fears_height -              |
|                (paul_fears_height ) -           |
|          (pete_fears_13 )                       |
|           pete_fears_cats -                     |
|           (pete_fears_cats )                    |
|            pete_fears_height -                  |
|            (pete_fears_height ) -               |
|         (jane_plays_guitar ) -                  |
|       (paul_plays_drums ) -                     |
|  (pete_plays_sax )                              |
|   pete_plays_drums                              |
|    paul_plays_drums -                           |
|    (paul_plays_drums )                          |
|     jane_plays_drums -                          |
|     (jane_plays_drums )                         |
|      paul_plays_guitar                          |
|       paul_plays_sax -                          |
|       (paul_plays_sax )                         |
|        jane_plays_guitar -                      |
|        (jane_plays_guitar )                     |
|         jane_plays_sax                          |
|          pete_fears_13 -                        |
|          (pete_fears_13 )                       |
|           pete_fears_cats                       |
|            pete_fears_height -                  |
|            (pete_fears_height )                 |
|             paul_fears_cats -                   |
|             (paul_fears_cats )                  |
|              jane_fears_cats -                  |
|              (jane_fears_cats )                 |
|               paul_fears_13                     |
|                paul_fears_height -              |
|                (paul_fears_height )             |
|                 jane_fears_13 -                 |
|                 (jane_fears_13 )                |
|                  jane_fears_height *            |
|                  (jane_fears_height ) -         |
|               (paul_fears_13 )                  |
|                paul_fears_height -              |
|                (paul_fears_height ) -           |
|           (pete_fears_cats )                    |
|            pete_fears_height -                  |
|            (pete_fears_height ) -               |
|         (jane_plays_sax ) -                     |
|      (paul_plays_guitar )                       |
|       paul_plays_sax -                          |
|       (paul_plays_sax ) -                       |
|   (pete_plays_drums ) -                         |
o-------------------------------------------------o

This is just the traverse of the "arboreal boolean expansion" (ABE) tree
that the Model function germinates from the propositional expression that we
planted in the file Consat.Log, which works to describe the facts of the
situation in question.  Since there are 18 logical feature names in this
propositional expression, we are literally talking about a function that
enjoys the abstract type f : %B%^18 -> %B%.  If I had wanted to evaluate
this function by expressly writing out its truth table, then it would've
required 2^18 = 262144 rows.  Now I didn't bother to count, but I'm sure
that the above output does not have anywhere near that many lines, so it
must be that my program, and maybe even its author, has done a couple of
things along the way that are moderately intelligent.  At least, we hope.
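
The gist of that moderately intelligent behavior can be sketched
apart from the cactus syntax.  The following Python fragment, with
names and conventions of my own devising, does the same style of
depth-first case analysis on one feature at a time, abandoning a
branch the moment any clause is definitely violated, which is what
keeps the expansion tree so much smaller than the 2^18 row table.

def expand(variables, clauses, assignment=None, indent=""):
    # Each clause maps a partial assignment to True (already satisfied),
    # False (already violated), or None (still open on this branch).
    if assignment is None:
        assignment = {}
    if any(clause(assignment) is False for clause in clauses):
        print(indent + "-")                   # dead branch, marked "-" in the outline
        return
    if len(assignment) == len(variables):
        print(indent + "*")                   # a surviving model, marked "*" above
        return
    v = variables[len(assignment)]            # next feature to case on
    for value in (True, False):
        print(indent + (v if value else "(" + v + " )"))
        expand(variables, clauses, {**assignment, v: value}, indent + " ")

A clause such as "( p q )", read existentially as "not both p
and q", would be rendered here as a function that returns False
once both p and q stand assigned true, and None until then.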

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

AK = Antti Karttunen
JA = Jon Awbrey

AK: Am I (and other SeqFanaticians) missing something from this thread?

AK: Your previous message on seqfan (headers below) is a bit of the same topic,
    but does it belong to the same thread?  Where I could obtain the other
    messages belonging to those two threads?  (I'm just now starting to
    study "mathematical logic", and its relations to combinatorics are
    very interesting.)  Is this "cactus" language documented anywhere?

here i was just following a courtesy of copying people
when i reference their works, in this case neil's site:

http://www.research.att.com/cgi-bin/access.cgi/as/njas/sequences/eisA.cgi?Anum=000170

but then i thought that the seqfantasians might be amused, too.

the bit on higher order propositions, in particular,
those of type h : (B^2 -> B) -> B, i sent because
of the significance that 2^2^2^2 = 65536 took on
for us around that time.  & the ho, ho, ho joke.

"zeroth order logic" (zol) is just another name for
the propositional calculus or the sentential logic
that comes before "first order logic" (fol), aka
first intens/tional logic, quantificational logic,
or predicate calculus, depending on who you talk to.

the line of work that i have been doing derives from
the ideas of c.s. peirce (1839-1914), who developed
a couple of systems of "logical graphs", actually,
two variant interpretations of the same abstract
structures, called "entitative" and "existential"
graphs.  he organized his system into "alpha",
"beta", and "gamma" layers, roughly equivalent
to our propositional, quantificational, and
modal levels of logic today.

on the more contemporary scene, peirce's entitative interpretation
of logical graphs was revived and extended by george spencer brown
in his book 'laws of form', while the existential interpretation
has flourished in the development of "conceptual graphs" by
john f sowa and a community of growing multitudes.

a passel of links:

http://members.door.net/arisbe/
http://www.enolagaia.com/GSB.html
http://www.cs.uah.edu/~delugach/CG/
http://www.jfsowa.com/
http://www.jfsowa.com/cg/
http://www.jfsowa.com/peirce/ms514w.htm
http://users.bestweb.net/~sowa/
http://users.bestweb.net/~sowa/peirce/ms514.htm

i have mostly focused on "alpha" (prop calc or zol) --
though the "func conception of quant logic" thread was
a beginning try at saying how the same line of thought
might be extended to 1st, 2nd, & higher order logics --
and i devised a particular graph & string syntax that
is based on a species of cacti, officially described as
the "reflective extension of logical graphs" (ref log),
but more lately just referred to as "cactus language".

it turns out that one can do many interesting things
with prop calc if one has an efficient enough syntax
and a powerful enough interpreter for it, even using
it as a very minimal sort of declarative programming
language, hence, the current thread was directed to
applying "zeroth order theories" (zot's) as brands
of "zeroth order programs" (zop's) to a set of old
constraint satisfaction and knowledge rep examples.

more recent expositions of the cactus language have been directed
toward what some people call "ontology engineering" -- it sounds
so much cooler than "taxonomy" -- and so these are found in the
ieee standard upper ontology working group discussion archives.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Let's now pause and reflect on the mix of abstract and concrete material
that we have cobbled together in this spectacle of the "World Of Zero" (WOZ),
since I believe that we may have seen enough, if we look at it right, to
illustrate a few of the more salient phenomena that would normally begin
to weigh in as a major force only on a much larger scale.  Now, it's not
exactly like this impoverished sample, all by itself, could determine us
to draw just the right generalizations, or force us to see the shape and
flow of its immanent law -- it is much too sparse a scattering of points
to tease out the lines of its up and coming generations quite so clearly --
but it can be seen to exemplify many of the more significant themes that
we know evolve in more substantial environments, that is, On Beyond Zero,
since we have already seen them, "tho' obscur'd", in these higher realms. 

One of the themes that I want to keep an eye on as this discussion
develops is the subject that might be called "computation as semiosis".

In this light, any calculus worth its salt must be capable of helping
us do two things, calculation, of course, but also analysis.  This is
probably one of the reasons why the ordinary sort of differential and
integral calculus over quantitative domains is frequently referred to
as "real analysis", or even just "analysis".  It seems quite clear to
me that any adequate logical calculus, in many ways expected to serve
as a qualitative analogue of analytic geometry in the way that it can
be used to describe configurations in logically circumscribed domains,
ought to qualify in both dimensions, namely, analysis and computation.

With all of these various features of the situation in mind, then, we come
to the point of viewing analysis and computation as just so many different
kinds of "sign transformations in respect of pragmata" (STIROP's).  Taking
this insight to heart, let us next work to assemble a comprehension of our
concrete examples, set in the medium of the abstract calculi that allow us
to express their qualitative patterns, that may hope to be an increment or
two less inchoate than we have seen so far, and that may even permit us to
catch the action of these fading fleeting sign transformations on the wing.

Here is how I picture our latest round of examples
as filling out the framework of this investigation:

o-----------------------------o-----------------------------o
|     Objective Framework     |   Interpretive Framework    |
o-----------------------------o-----------------------------o
|                                                           |
|                               s_1 = Logue(o)       |      |
|                              /                     |      |
|                             /                      |      |
|                            @                       |      |
|                          ·  \                      |      |
|                        ·     \                     |      |
|                      ·        i_1 = Model(o)       v      |
|                    ·          s_2 = Model(o)       |      |
|                  ·           /                     |      |
|                ·            /                      |      |
|     Object = o · · · · · · @                       |      |
|                ·            \                      |      |
|                  ·           \                     |      |
|                    ·          i_2 = Tenor(o)       v      |
|                      ·        s_3 = Tenor(o)       |      |
|                        ·     /                     |      |
|                          ·  /                      |      |
|                            @                       |      |
|                             \                      |      |
|                              \                     |      |
|                               i_3 = Sense(o)       v      |
|                                                           |
o-----------------------------------------------------------o
Figure.  Computation As Semiotic Transformation

The Figure shows three distinct sign triples of the form <o, s, i>, where
o = ostensible objective = the observed, indicated, or intended situation.

| A.  <o, Logue(o), Model(o)>
|
| B.  <o, Model(o), Tenor(o)>
|
| C.  <o, Tenor(o), Sense(o)>

Let us bring these several signs together in one place,
to compare and contrast their common and their diverse
characters, and to think about why we make such a fuss
about passing from one to the other in the first place.

1.  Logue(o)  =  Consat.Log
o-----------------------------------------------------------------------o
|                                                                       |
|  (( pete_plays_guitar ),( pete_plays_sax ),( pete_plays_drums ))      |
|  (( paul_plays_guitar ),( paul_plays_sax ),( paul_plays_drums ))      |
|  (( jane_plays_guitar ),( jane_plays_sax ),( jane_plays_drums ))      |
|                                                                       |
|  (( pete_plays_guitar ),( paul_plays_guitar ),( jane_plays_guitar ))  |
|  (( pete_plays_sax    ),( paul_plays_sax    ),( jane_plays_sax    ))  |
|  (( pete_plays_drums  ),( paul_plays_drums  ),( jane_plays_drums  ))  |
|                                                                       |
|  (( pete_fears_13 ),( pete_fears_cats ),( pete_fears_height ))        |
|  (( paul_fears_13 ),( paul_fears_cats ),( paul_fears_height ))        |
|  (( jane_fears_13 ),( jane_fears_cats ),( jane_fears_height ))        |
|                                                                       |
|  (( pete_fears_13     ),( paul_fears_13     ),( jane_fears_13     ))  |
|  (( pete_fears_cats   ),( paul_fears_cats   ),( jane_fears_cats   ))  |
|  (( pete_fears_height ),( paul_fears_height ),( jane_fears_height ))  |
|                                                                       |
|  ((                                                                   |
|                                                                       |
|  ( pete_plays_guitar )                                                |
|  ( pete_fears_height )                                                |
|                                                                       |
|  ( pete_plays_guitar  pete_fears_height )                             |
|  ( paul_plays_guitar  paul_fears_height )                             |
|  ( jane_plays_guitar  jane_fears_height )                             |
|                                                                       |
|  ( paul_fears_cats )                                                  |
|  ( paul_plays_sax  )                                                  |
|                                                                       |
|  ( pete_plays_sax  pete_fears_cats )                                  |
|  ( paul_plays_sax  paul_fears_cats )                                  |
|  ( jane_plays_sax  jane_fears_cats )                                  |
|                                                                       |
|  ( pete_plays_drums  pete_fears_13 )                                  |
|  ( paul_plays_drums  paul_fears_13 )                                  |
|  ( jane_plays_drums  jane_fears_13 )                                  |
|                                                                       |
|  ( pete_plays_drums  pete_fears_height )                              |
|  ( paul_plays_drums  paul_fears_height )                              |
|  ( jane_plays_drums  jane_fears_height )                              |
|                                                                       |
|  ))                                                                   |
|                                                                       |
o-----------------------------------------------------------------------o

2.  Model(o)  =  Consat.Mod  ><>  http://suo.ieee.org/ontology/msg03718.html

3.  Tenor(o)  =  Consat.Ten (Just The Gist Of It)
o-------------------------------------------------o
| (pete_plays_guitar )                            | <01> -
|  (pete_plays_sax )                              | <02> -
|   pete_plays_drums                              | <03> +
|    (paul_plays_drums )                          | <04> -
|     (jane_plays_drums )                         | <05> -
|      paul_plays_guitar                          | <06> +
|       (paul_plays_sax )                         | <07> -
|        (jane_plays_guitar )                     | <08> -
|         jane_plays_sax                          | <09> +
|          (pete_fears_13 )                       | <10> -
|           pete_fears_cats                       | <11> +
|            (pete_fears_height )                 | <12> -
|             (paul_fears_cats )                  | <13> -
|              (jane_fears_cats )                 | <14> -
|               paul_fears_13                     | <15> +
|                (paul_fears_height )             | <16> -
|                 (jane_fears_13 )                | <17> -
|                  jane_fears_height *            | <18> +
o-------------------------------------------------o

4.  Sense(o)  =  Consat.Sen
o-------------------------------------------------o
| pete_plays_drums                                | <03>
|  paul_plays_guitar                              | <06>
|   jane_plays_sax                                | <09>
|    pete_fears_cats                              | <11>
|     paul_fears_13                               | <15>
|      jane_fears_height                          | <18>
o-------------------------------------------------o

As one proceeds through the subsessions of the Theme One Study session,
the computation transforms its larger "signs", in this case text files,
from one to the next, in the sequence:  Logue, Model, Tenor, and Sense.

Let us see if we can pin down, on sign-theoretic grounds,
why this very sort of exercise is so routinely necessary.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

We were in the middle of pursuing several questions about
sign relational transformations in general, in particular,
the following Example of a sign transformation that arose
in the process of setting up and solving a classical sort
of constraint satisfaction problem.

o-----------------------------o-----------------------------o
|     Objective Framework     |   Interpretive Framework    |
o-----------------------------o-----------------------------o
|                                                           |
|                               s_1 = Logue(o)       |      |
|                              /                     |      |
|                             /                      |      |
|                            @                       |      |
|                          ·  \                      |      |
|                        ·     \                     |      |
|                      ·        i_1 = Model(o)       v      |
|                    ·          s_2 = Model(o)       |      |
|                  ·           /                     |      |
|                ·            /                      |      |
|     Object = o · · · · · · @                       |      |
|                ·            \                      |      |
|                  ·           \                     |      |
|                    ·          i_2 = Tenor(o)       v      |
|                      ·        s_3 = Tenor(o)       |      |
|                        ·     /                     |      |
|                          ·  /                      |      |
|                            @                       |      |
|                             \                      |      |
|                              \                     |      |
|                               i_3 = Sense(o)       v      |
|                                                           |
o-----------------------------------------------------------o
Figure.  Computation As Semiotic Transformation

1.  Logue(o)  =  Consat.Log
o-----------------------------------------------------------------------o
|                                                                       |
|  (( pete_plays_guitar ),( pete_plays_sax ),( pete_plays_drums ))      |
|  (( paul_plays_guitar ),( paul_plays_sax ),( paul_plays_drums ))      |
|  (( jane_plays_guitar ),( jane_plays_sax ),( jane_plays_drums ))      |
|                                                                       |
|  (( pete_plays_guitar ),( paul_plays_guitar ),( jane_plays_guitar ))  |
|  (( pete_plays_sax    ),( paul_plays_sax    ),( jane_plays_sax    ))  |
|  (( pete_plays_drums  ),( paul_plays_drums  ),( jane_plays_drums  ))  |
|                                                                       |
|  (( pete_fears_13 ),( pete_fears_cats ),( pete_fears_height ))        |
|  (( paul_fears_13 ),( paul_fears_cats ),( paul_fears_height ))        |
|  (( jane_fears_13 ),( jane_fears_cats ),( jane_fears_height ))        |
|                                                                       |
|  (( pete_fears_13     ),( paul_fears_13     ),( jane_fears_13     ))  |
|  (( pete_fears_cats   ),( paul_fears_cats   ),( jane_fears_cats   ))  |
|  (( pete_fears_height ),( paul_fears_height ),( jane_fears_height ))  |
|                                                                       |
|  ((                                                                   |
|                                                                       |
|  ( pete_plays_guitar )                                                |
|  ( pete_fears_height )                                                |
|                                                                       |
|  ( pete_plays_guitar  pete_fears_height )                             |
|  ( paul_plays_guitar  paul_fears_height )                             |
|  ( jane_plays_guitar  jane_fears_height )                             |
|                                                                       |
|  ( paul_fears_cats )                                                  |
|  ( paul_plays_sax  )                                                  |
|                                                                       |
|  ( pete_plays_sax  pete_fears_cats )                                  |
|  ( paul_plays_sax  paul_fears_cats )                                  |
|  ( jane_plays_sax  jane_fears_cats )                                  |
|                                                                       |
|  ( pete_plays_drums  pete_fears_13 )                                  |
|  ( paul_plays_drums  paul_fears_13 )                                  |
|  ( jane_plays_drums  jane_fears_13 )                                  |
|                                                                       |
|  ( pete_plays_drums  pete_fears_height )                              |
|  ( paul_plays_drums  paul_fears_height )                              |
|  ( jane_plays_drums  jane_fears_height )                              |
|                                                                       |
|  ))                                                                   |
|                                                                       |
o-----------------------------------------------------------------------o

2.  Model(o)  =  Consat.Mod  ><>  http://suo.ieee.org/ontology/msg03718.html

3.  Tenor(o)  =  Consat.Ten (Just The Gist Of It)
o-------------------------------------------------o
| (pete_plays_guitar )                            | <01> -
|  (pete_plays_sax )                              | <02> -
|   pete_plays_drums                              | <03> +
|    (paul_plays_drums )                          | <04> -
|     (jane_plays_drums )                         | <05> -
|      paul_plays_guitar                          | <06> +
|       (paul_plays_sax )                         | <07> -
|        (jane_plays_guitar )                     | <08> -
|         jane_plays_sax                          | <09> +
|          (pete_fears_13 )                       | <10> -
|           pete_fears_cats                       | <11> +
|            (pete_fears_height )                 | <12> -
|             (paul_fears_cats )                  | <13> -
|              (jane_fears_cats )                 | <14> -
|               paul_fears_13                     | <15> +
|                (paul_fears_height )             | <16> -
|                 (jane_fears_13 )                | <17> -
|                  jane_fears_height *            | <18> +
o-------------------------------------------------o

4.  Sense(o)  =  Consat.Sen
o-------------------------------------------------o
| pete_plays_drums                                | <03>
|  paul_plays_guitar                              | <06>
|   jane_plays_sax                                | <09>
|    pete_fears_cats                              | <11>
|     paul_fears_13                               | <15>
|      jane_fears_height                          | <18>
o-------------------------------------------------o

We can worry later about the proper use of quotation marks
in discussing such a case, where the file name "Yada.Yak"
denotes a piece of text that expresses a proposition that
describes an objective situation or an intentional object,
but whatever the case it is clear that we are knee & neck
deep in a sign relational situation of a modest complexity.

I think that the right sort of analogy might help us
to sort it out, or at least to tell what's important
from the things that are less so.  The paradigm that
comes to mind for me is the type of context in maths
where we talk about the "locus" or the "solution set"
of an equation, and here we think of the equation as
denoting its solution set or describing a locus, say,
a point or a curve or a surface or so on up the scale.

In this figure of speech, we might say for instance:

| o is
| what "x^3 - 3x^2 + 3x - 1 = 0" denotes is
| what "(x-1)^3 = 0" denotes is
| what "1" denotes
| is 1.

Making explicit the assumptive interpretations
that the context probably enfolds in this case,
we assume this description of the solution set:

{x in the Reals : x^3 - 3x^2 + 3x - 1 = 0} = {1}.

In sign relational terms, we have the 3-tuples:

| <o, "x^3 - 3x^2 + 3x - 1 = 0", "(x-1)^3 = 0">
|
| <o, "(x-1)^3 = 0", "1">
|
| <o, "1", "1">

As it turns out we discover that the
object o was really just 1 all along.
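
Purely as a sanity check on the algebra, and assuming the sympy
library (my choice, not anything in the original discussion), the
chain of denotations can be replayed in a couple of lines:

from sympy import symbols, factor, solveset, S

x = symbols('x')
print(factor(x**3 - 3*x**2 + 3*x - 1))                        # (x - 1)**3
print(solveset(x**3 - 3*x**2 + 3*x - 1, x, domain=S.Reals))   # {1}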

But why do we put ourselves through the rigors of these
transformations at all?  If 1 is what we mean, why not
just say "1" in the first place and be done with it?
A person who asks a question like that has forgotten
how we keep getting ourselves into these quandaries,
and who it is that assigns the problems, for it is
Nature herself who is the taskmistress here and the
problems are set in the manner that she determines,
not in the style to which we would like to become
accustomed.  The best that we can demand of our
various and sundry calculi is that they afford
us with the nets and the snares more readily
to catch the shape of the problematic game
as it flies up before us on its own wings,
and only then to tame it to the amenable
demeanors that we find to our liking.

In sum, the first place is not ours to take.
We are but poor second players in this game.

That understood, I can now lay out our present Example
along the lines of this familiar mathematical exercise.

| o is
| what Consat.Log denotes is
| what Consat.Mod denotes is
| what Consat.Ten denotes is
| what Consat.Sen denotes.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

It will be good to keep this picture before us a while longer:

o-----------------------------o-----------------------------o
|     Objective Framework     |   Interpretive Framework    |
o-----------------------------o-----------------------------o
|                                                           |
|                               s_1  =  Logue(o)     |      |
|                              /                     |      |
|                             /                      |      |
|                            @                       |      |
|                          ·  \                      |      |
|                        ·     \                     |      |
|                      ·        i_1  =  Model(o)     v      |
|                    ·          s_2  =  Model(o)     |      |
|                  ·           /                     |      |
|                ·            /                      |      |
|   Object  =  o · · · · · · @                       |      |
|                ·            \                      |      |
|                  ·           \                     |      |
|                    ·          i_2  =  Tenor(o)     v      |
|                      ·        s_3  =  Tenor(o)     |      |
|                        ·     /                     |      |
|                          ·  /                      |      |
|                            @                       |      |
|                             \                      |      |
|                              \                     |      |
|                               i_3  =  Sense(o)     v      |
|                                                           |
o-----------------------------------------------------------o
Figure.  Computation As Semiotic Transformation

The labels that decorate the syntactic plane and indicate
the semiotic transitions in the interpretive panel of the
framework point us to text files whose contents rest here:

http://suo.ieee.org/ontology/msg03722.html

The reason that I am troubling myself -- and no doubt you --
with the details of this Example is because it highlights
a number of the thistles that we will have to grasp if we
ever want to escape from the traps of YARNBOL and YARWARS
in which so many of our fairweather fiends are seeking to
ensnare us, and not just us -- the whole web of the world.

YARNBOL  =  Yet Another Roman Numeral Based Ontology Language.
YARWARS  =  Yet Another Representation Without A Reasoning System.

In order to avoid this, or to reverse the trend once it gets started,
we just have to remember what a dynamic living process a computation
really is, precisely because it is meant to serve as an iconic image
of dynamic, deliberate, purposeful transformations that we are bound
to go through and to carry out in a hopeful pursuit of the solutions
to the many real live problems that life and society place before us.
So I take it rather seriously.

Okay, back to the grindstone.

The question is:  "Why are these trips necessary?"

How come we don't just have one proper expression
for each situation under the sun -- or all possible
suns, I guess, for some -- and just use that on any
appearance, instance, or occasion of that situation?

Why is it ever necessary to begin with an obscure description
of a situation? -- for that is exactly what the propositional
expression called "Logue(o)", for Example, the Consat.Log file,
really is.

Maybe I need to explain that first.

The first three items of syntax -- Logue(o), Model(o), Tenor(o) --
are all just so many different propositional expressions that
denote one and the same logical-valued function p : X -> %B%,
and one whose abstract image we may well enough describe as
a boolean function of the abstract type q : %B%^k -> %B%,
where k happens to be 18 in the present Consat Example.

If we were to write out the truth table for q : %B%^18 -> %B%
it would take 2^18 = 262144 rows.  Using the bold letter #x#
for a coordinate tuple, writing #x# = <x_1, ..., x_18>, each
row of the table would have the form <x_1, ..., x_18, q(#x#)>.
And the function q is such that all rows evaluate to %0% save one.

Each of the four different formats expresses this fact about q
in its own way.  The first three are logically equivalent, and
the last one is the maximally determinate positive implication
of what the others all say.
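
To spell out in running code what that fact about q amounts to, here is
a minimal sketch in Python.  The single satisfying tuple used below is
a stand-in of my own choosing, since the real one is fixed by the
Consat.Log constraints cited above;  everything else is just the
arithmetic of the 2^18 rows.

from itertools import product

K = 18
SAT_ROW = (1,) * K       # hypothetical stand-in for the one satisfying tuple

def q(x):
    """Evaluate to 1 on the single satisfying tuple, 0 on every other row."""
    return 1 if tuple(x) == SAT_ROW else 0

rows = product((0, 1), repeat=K)              # the 2^18 coordinate tuples
count_true = sum(q(x) for x in rows)

assert 2**K == 262144                         # size of the truth table
assert count_true == 1                        # all rows evaluate to 0 save one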

From this point of view, the logical computation that we went through,
in the sequence Logue, Model, Tenor, Sense, was a process of changing
from an obscure sign of the objective proposition to a more organized
arrangement of its satisfying or unsatisfying interpretations, to the
most succinct possible expression of the same meaning, to an adequate
positive projection of it that is useful enough in the proper context.

This is the sort of mill -- it's called "computation" -- that we have
to be able to put our representations through on a recurrent, regular,
routine basis, that is, if we expect them to have any utility at all.
And it is only when we have started to do that in genuinely effective
and efficient ways, that we can even begin to think about facilitating
any bit of qualitative conceptual analysis through computational means.

And as far as the qualitative side of logical computation
and conceptual analysis goes, we have barely even started.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

We are contemplating the sequence of initial and normal forms
for the Consat problem and we have noted the following system
of logical relations, taking the enchained expressions of the
objective situation o in a pairwise associated way, of course:

Logue(o)  <=>  Model(o)  <=>  Tenor(o)  =>  Sense(o).

The specifics of the propositional expressions are cited here:

http://suo.ieee.org/ontology/msg03722.html

If we continue to pursue the analogy that we made with the form
of mathematical activity commonly known as "solving equations",
then there are many salient features of this type of logical
problem solving endeavor that suddenly leap into the light.

First of all, we notice the importance of "equational reasoning"
in mathematics, by which I mean, not just the quantitative type
of equation that forms the matter of the process, but also the
qualitative type of equation, or the "logical equivalence",
that connects each expression along the way, right up to
the penultimate stage, when we are satisfied in a given
context to take a projective implication of the total
knowledge of the situation that we have been taking
some pains to preserve at every intermediate stage
of the game.

This general pattern or strategy of inference, working its way through
phases of "equational" or "total information preserving" inference and
phases of "implicational" or "selective information losing" inference,
is actually very common throughout mathematics, and I have in mind to
examine its character in greater detail and in a more general setting.

Just as the barest hint of things to come along these lines, you might
consider the question of what would constitute the equational analogue
of modus ponens, in other words the scheme of inference that goes from
x and x=>y to y.  Well, the answer is a scheme of inference that passes
from x and x=>y to x&y and then, being reversible, back again.  I will
explore the rationale and the utility of this gambit in future reports.
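
In the meantime, a four-row check suffices to show that the passage from
x and x=>y to x&y loses no information.  Here is one such check, a minimal
sketch in Python:

from itertools import product

def implies(x, y):
    """Material implication x => y."""
    return (not x) or y

# x & (x => y) agrees with x & y on every assignment,
# so the inference from {x, x=>y} to x&y is reversible.
for x, y in product((False, True), repeat=2):
    assert (x and implies(x, y)) == (x and y)

print("x & (x=>y)  <=>  x & y   on all four rows")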

One observation that we can make already at this point,
however, is that these schemes of equational reasoning,
or reversible inference, remain poorly developed among
our currently prevailing styles of inference in logic,
their potentials for applied logical software hardly
being broached in our presently available systems.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

Extra Examples

1.  Propositional logic example.
Files:  Alpha.lex + Prop.log
Ref:    [Cha, 20, Example 2.12]

2.  Chemical synthesis problem.
Files:  Chem.*
Ref:    [Cha, 21, Example 2.13]

3.  N Queens problem.
Files:  Queen*.*,  Q8.*,  Q5.*
Refs:   [BaC, 166], [VaH, 122], [Wir, 143].
Notes:  Only the 5 Queens example will run in 640K memory.
        Use the "Queen.lex" file to load the "Q5.eg*" log files.

4.  Five Houses puzzle.
Files:  House.*
Ref:    [VaH, 132].
Notes:  Will not run in 640K memory.

5.  Graph coloring example.
Files:  Color.*
Ref:    [Wil, 196].

6.  Examples of Cook's Theorem in computational complexity,
    that propositional satisfiability is NP-complete.

Files:  StiltN.* = "Space and Time Limited Turing Machine",
        with N units of space and N units of time.
        StuntN.* = "Space and Time Limited Turing Machine",
        for computing the parity of a bit string,
        with Number of Tape cells of input equal to N.
Ref:    [Wil, 188-201].
Notes:  Can only run Turing machine example for input of size 2.
        Since the last tape cell is used for an end-of-file marker,
        this amounts to only one significant digit of computation.
        Use the "Stilt3.lex" file  to load the "Stunt2.egN" files.
        Their Sense file outputs appear on the "Stunt2.seN" files.

7.  Fabric knowledge base.
Files:  Fabric.*, Fab.*
Ref:    [MaW, 8-16].

8.  Constraint Satisfaction example.
Files:  Consat1.*, Consat2.*
Ref:    [Win, 449, Exercise 3-9].
Notes:  Attributed to Kenneth D. Forbus.

References

| Angluin, Dana,
|"Learning with Hints", in 
|'Proceedings of the 1988 Workshop on Computational Learning Theory',
| edited by D. Haussler & L. Pitt, Morgan Kaufmann, San Mateo, CA, 1989.

| Ball, W.W. Rouse, & Coxeter, H.S.M.,
|'Mathematical Recreations and Essays', 13th ed.,
| Dover, New York, NY, 1987.

| Chang, Chin-Liang & Lee, Richard Char-Tung,
|'Symbolic Logic and Mechanical Theorem Proving',
| Academic Press, New York, NY, 1973.

| Denning, Peter J., Dennis, Jack B., and Qualitz, Joseph E.,
|'Machines, Languages, and Computation',
| Prentice-Hall, Englewood Cliffs, NJ, 1978.

| Edelman, Gerald M.,
|'Topobiology:  An Introduction to Molecular Embryology',
| Basic Books, New York, NY, 1988.

| Lloyd, J.W.,
|'Foundations of Logic Programming',
| Springer-Verlag, Berlin, 1984.

| Maier, David & Warren, David S.,
|'Computing with Logic:  Logic Programming with Prolog',
| Benjamin/Cummings, Menlo Park, CA, 1988.

| McClelland, James L. and Rumelhart, David E.,
|'Explorations in Parallel Distributed Processing:
| A Handbook of Models, Programs, and Exercises',
| MIT Press, Cambridge, MA, 1988.

| Peirce, Charles Sanders,
|'Collected Papers of Charles Sanders Peirce',
| edited by Charles Hartshorne, Paul Weiss, & Arthur W. Burks,
| Harvard University Press, Cambridge, MA, 1931-1960.

| Peirce, Charles Sanders,
|'The New Elements of Mathematics',
| edited by Carolyn Eisele, Mouton, The Hague, 1976.

| Peirce, Charles Sanders,
|'Charles S. Peirce:  Selected Writings;  Values in a Universe of Chance',
| edited by Philip P. Wiener, Dover, New York, NY, 1966.

| Spencer Brown, George,
|'Laws of Form',
| George Allen & Unwin, London, UK, 1969.

| Van Hentenryck, Pascal,
|'Constraint Satisfaction in Logic Programming',
| MIT Press, Cambridge, MA, 1989.

| Wilf, Herbert S.,
|'Algorithms and Complexity',
| Prentice-Hall, Englewood Cliffs, NJ, 1986.

| Winston, Patrick Henry,
|'Artificial Intelligence', 2nd ed.,
| Addison-Wesley, Reading, MA, 1984.

| Wirth, Niklaus,
|'Algorithms + Data Structures = Programs',
| Prentice-Hall, Englewood Cliffs, NJ, 1976.

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o


Fragmata

Cactus Language, Operator Variables, Reflection

NKS Forum (Mar 2005)

Cactus Rules

Inquiry List (Mar 2004)

Ontology List (Mar 2004)

Cactus Town Cartoons

Arisbe List (Dec 2001)

  1. http://stderr.org/pipermail/arisbe/2001-December/001214.html
  2. http://stderr.org/pipermail/arisbe/2001-December/001217.html

Extensions Of Logical Graphs

Ontology List (Dec 2001)

Arisbe List (Dec 2001)

Logic In Graphs

NKS Forum (Feb 2005)

Propositional Equation Reasoning Systems

Arisbe List (Mar–Apr 2001)

  1. http://stderr.org/pipermail/arisbe/2001-March/000380.html
  2. http://stderr.org/pipermail/arisbe/2001-April/000407.html
  3. http://stderr.org/pipermail/arisbe/2001-April/000409.html
  4. http://stderr.org/pipermail/arisbe/2001-April/000411.html
  5. http://stderr.org/pipermail/arisbe/2001-April/000412.html
  6. http://stderr.org/pipermail/arisbe/2001-April/000413.html
  7. http://stderr.org/pipermail/arisbe/2001-April/000416.html
  8. http://stderr.org/pipermail/arisbe/2001-April/000417.html
  9. http://stderr.org/pipermail/arisbe/2001-April/000421.html
  10. http://stderr.org/pipermail/arisbe/2001-April/000427.html
  11. http://stderr.org/pipermail/arisbe/2001-April/000428.html
  12. http://stderr.org/pipermail/arisbe/2001-April/000430.html
  13. http://stderr.org/pipermail/arisbe/2001-April/000432.html
  14. http://stderr.org/pipermail/arisbe/2001-April/000433.html
  15. http://stderr.org/pipermail/arisbe/2001-April/000434.html
  16. http://stderr.org/pipermail/arisbe/2001-April/000435.html

Inquiry List (Mar 2003)

Theme One Program : Logical Cacti

Inquiry List (Mar 2003)

Inquiry List (Feb 2005)

Alternate Version? Editing Fragment?

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o

In this Subsection, I finally bring together many of what may
have appeared to be wholly independent threads of development,
in the hope of paying off a percentage of my promissory notes,
even if a goodly number of my creditors have no doubt long since
forgotten, if not exactly forgiven, the debentures in question.

For ease of reference, I repeat here a couple of the
definitions that are needed again in this discussion.

| A "boolean connection" of degree k, also known as a "boolean function"
| on k variables, is a map of the form F : %B%^k -> %B%.  In other words,
| a boolean connection of degree k is a proposition about things in the
| universe of discourse X = %B%^k.
|
| An "imagination" of degree k on X is a k-tuple of propositions
| about things in the universe X.  By way of displaying the kinds
| of notation that are used to express this idea, the imagination
| #f# = <f_1, ..., f_k> can be given as a sequence of indicator
| functions f_j : X -> %B%, for j = 1 to k.  All of these features
| of the typical imagination #f# can be summed up in either one of
| two ways:  either in the form of a membership statement, stating
| words to the effect that #f# belongs to the space (X -> %B%)^k,
| or in the form of the type declaration that #f# : (X -> %B%)^k,
| though perhaps the latter specification is slightly more precise
| than the former.
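
By way of a concrete if arbitrary illustration -- the particular connection
and the indicator functions below are my own inventions, not items drawn
from the Consat Example -- here is a minimal sketch in Python of a boolean
connection of degree 3 and an imagination of degree 3 on X = %B%^3.

from itertools import product
from typing import Callable, Tuple

Point = Tuple[int, ...]                  # a point of X = B^3

# A boolean connection of degree 3, that is, a map F : B^3 -> B.
def F(x: Point) -> int:
    return x[0] & (x[1] | x[2])

# An imagination of degree 3 on X, that is, a 3-tuple of indicator
# functions f_j : X -> B, in other words an element of (X -> B)^3.
f: Tuple[Callable[[Point], int], ...] = (
    lambda x: x[0],                      # f_1
    lambda x: x[0] ^ x[1],               # f_2
    lambda x: int(x == (1, 1, 1)),       # f_3
)

for x in product((0, 1), repeat=3):
    image = tuple(f_j(x) for f_j in f)   # the imagination's value at x
    print(x, "->", image, "   F =", F(x))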

The definition of the "stretch" operation and the uses of the
various brands of denotational operators can be reviewed here:

055.  http://suo.ieee.org/email/msg07466.html
057.  http://suo.ieee.org/email/msg07469.html

070.  http://suo.ieee.org/ontology/msg03473.html
071.  http://suo.ieee.org/ontology/msg03479.html

o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o~~~~~~~~~o