Differential Analytic Turing Automata

MyWikiBiz, Author Your Legacy — Thursday December 26, 2024
Revision as of 20:44, 1 March 2009 by Jon Awbrey (talk | contribs) (→‎Note 6: update ref to current version)


Note 1

For the purposes of the NKS Forum my aim is to chart a course from general ideas about transformational equivalence classes of graphs to a notion of differential analytic Turing automata (DATA). It may be a while before we get within sight of that goal, but it will provide some measure of motivation to name the thread after the envisioned end rather than the more homely starting place.

The basic idea here is that you have a species of graphs and a set of transformation rules that take you from one graph to another — and back again, as I'm only thinking of equational rules — and this partitions the species of graphs into transformational equivalence classes (TECs).

There are many interesting excursions to be had here, but I will focus mainly on logical applications, and so the TECs I talk about will almost always have the character of logical equivalence classes (LECs).

An example that will figure heavily in the sequel is given by rooted trees as the species of graphs and a pair of equational transformation rules that derive from the graphical calculi of C.S. Peirce, as revived and extended by George Spencer Brown.

Here are the fundamental transformation rules, also referred to as the arithmetic axioms or, more precisely, the arithmetic initials.

Figure 1. Arithmetic Initial 1. (1)
Figure 2. Arithmetic Initial 2. (2)

That should be enough to get started.

Note 2

I will be making use of the cactus language extension of Peirce's Alpha Graphs, so called because it uses a species of graphs that are usually called "cacti" in graph theory. The last exposition of the cactus syntax that I've written can be found here:

The representational and computational efficiency of the cactus language for the tasks that are usually associated with boolean algebra and propositional calculus makes it possible to entertain a further extension, to what we may call differential logic, because it develops this basic level of logic in the same way that differential calculus augments analytic geometry to handle change and diversity. There are several different introductions to differential logic that I have written and distributed across the Internet. You might start with the following couple of treatments:

I am currently rewriting these presentations in hopes of making them as clear as they can be, so please let me know if you have any questions.

Note 3

I will draw on those previously advertised resources of notation and theory as needed, but right now I sense the need for some concrete examples.

Let's say we have a system that is known by the name of its state space \(X\!\) and we have a boolean state variable \(x : X \to \mathbb{B},\) where \(\mathbb{B} = \{ 0, 1 \}.\)

We observe \(X\!\) for a while, relative to a discrete time frame, and we write down the following sequence of values for \(x.\!\)

\(\begin{array}{cc} t & x \\ \\ 0 & 0 \\ 1 & 1 \\ 2 & 0 \\ 3 & 1 \\ 4 & 0 \\ 5 & 1 \\ 6 & 0 \\ 7 & 1 \\ 8 & 0 \\ 9 & \ldots \end{array}\)

"Aha!" we say, and think we see the way of things, writing down the rule \(x' = (x),\!\) where \(x'\!\) is the state that comes next after \(x,\!\) and \((x)\!\) is the negation of \(x\!\) in boolean logic.

Another way to detect patterns is to write out a table of finite differences. For this example, we would get:

\(\begin{array}{ccccc} t & x & dx & d^2 x & \ldots \\ \\ 0 & 0 & 1 & 0 & \ldots \\ 1 & 1 & 1 & 0 & \\ 2 & 0 & 1 & 0 & \\ 3 & 1 & 1 & 0 & \\ 4 & 0 & 1 & 0 & \\ 5 & 1 & 1 & 0 & \\ 6 & 0 & 1 & 0 & \\ 7 & 1 & 1 & 0 & \\ 8 & 0 & 1 & \ldots & \\ 9 & \ldots & \ldots & \ldots & \\ \end{array}\)

And of course, all the higher order differences are zero.

This leads to thinking of \(X\!\) as having an extended state \((x, dx, d^2 x, \ldots, d^k x),\) and this additional language gives us the facility of describing state transitions in terms of the various orders of differences. For example, the rule \(x' = (x)\!\) can now be expressed by the rule \(dx = 1.\!\)
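The table of values and finite differences above can be reproduced mechanically. Here is a minimal sketch of my own (not from the original text) that generates the observed sequence under the rule \(x' = (x)\) and takes differences mod 2:

```python
# Sketch: simulate the system x' = (x) and tabulate its finite differences
# mod 2, reproducing the rule dx = 1 and the vanishing higher differences.

def negate(x):
    """Boolean negation, written (x) in the cactus notation."""
    return 1 - x

# Generate the observed sequence x(t) starting from x(0) = 0.
xs = [0]
for t in range(9):
    xs.append(negate(xs[-1]))

# First differences: dx(t) = x(t+1) + x(t) (mod 2), and so on up the orders.
def differences(seq):
    return [(b + a) % 2 for a, b in zip(seq, seq[1:])]

dx = differences(xs)    # first differences
d2x = differences(dx)   # second differences

print(xs)   # alternating 0, 1, 0, 1, ...
print(dx)   # all 1s: the rule dx = 1
print(d2x)  # all 0s: higher differences vanish
```

Running this confirms that the single first-order statement \(dx = 1\) carries the same information as the transition rule \(x' = (x).\)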

I'll leave you to muse on the possibilities of that.

Note 4

I am preparing a more fleshed-out 1-variable example, but in the meantime, for anybody who's finished all that other reading, there is a more detailed account of differential logic in the following paper:

For future reference, here are a couple of handy rosetta stones for translating back and forth between different notations for the boolean functions \(f : \mathbb{B}^k \to \mathbb{B},\) where \(k = 1, 2.\!\)

Note 5

For a slightly more interesting example, let's suppose that we have a dynamic system that is known by its state space \(X,\!\) and we have a boolean state variable \(x : X \to \mathbb{B}.\) In addition, we are given an initial condition \(x = dx\!\) and a law \(d^2 x = (x).\!\)

The initial condition has two cases: either  \(x = dx = 0\!\)  or  \(x = dx = 1.\!\)

Here is a table of the two trajectories or orbits that we get by starting from each of the two permissible initial states and staying within the constraints of the dynamic law \(d^2 x = (x).\!\)

\(\text{Initial State}\ x \cdot dx\)
\(\begin{array}{cccc} t & d^0 x & d^1 x & d^2 x \\ \\ 0 & 1 & 1 & 0 \\ 1 & 0 & 1 & 1 \\ 2 & 1 & 0 & 0 \\ 3 & 1 & 0 & 0 \\ 4 & 1 & 0 & 0 \\ 5 & '' & '' & '' \\ \end{array}\)


\(\text{Initial State}\ (x) \cdot (dx)\)

\(\begin{array}{cccc} t & d^0 x & d^1 x & d^2 x \\ \\ 0 & 0 & 0 & 1 \\ 1 & 0 & 1 & 1 \\ 2 & 1 & 0 & 0 \\ 3 & 1 & 0 & 0 \\ 4 & 1 & 0 & 0 \\ 5 & '' & '' & '' \\ \end{array}\)

Note that the state \(x\ \underline{(} dx \underline{)}\ \underline{(} d^2 x \underline{)},\) that is, \((x, dx, d^2 x) ~=~ (1, 0, 0),\) is a stable attractor for both orbits.
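The two orbit tables can be checked by a short simulation. This sketch is my own reconstruction, on the assumption that the extended state advances by \(x' = x + dx,\) \(dx' = dx + d^2 x\) (mod 2), with the law then re-imposing \(d^2 x = (x)\) at the new state; that update reproduces both tables above.

```python
# Sketch (assumed update rule, not stated verbatim in the text): run the two
# orbits allowed by the initial condition x = dx under the law d^2 x = (x).

def step(state):
    x, dx, d2x = state
    x_next = (x + dx) % 2
    dx_next = (dx + d2x) % 2
    d2x_next = 1 - x_next          # the law d^2 x = (x), re-imposed
    return (x_next, dx_next, d2x_next)

for x0 in (1, 0):                  # the two cases of the condition x = dx
    state = (x0, x0, 1 - x0)       # d^2 x fixed by the law at the start
    orbit = [state]
    for t in range(4):
        state = step(state)
        orbit.append(state)
    print(orbit)

# Both printed orbits settle into the stable attractor (1, 0, 0).
```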

Further discussion of this example, complete with charts and graphs, can be found at this location:

Note 6

One more example may serve to suggest just how much dynamic complexity can be built on a universe of discourse that has but a single logical feature at its base.

But first, let me introduce a few more elements of general notation that I'll be using to describe finite dimensional universes of discourse and the qualitative dynamics that we envision occurring in them.

Let \(\mathcal{X} = \{ x_1, \ldots, x_n \}\) be the alphabet of logical features or variables that we use to describe the n-dimensional universe of discourse \(X^\circ = [\mathcal{X}] = [ x_1, \ldots, x_n ].\) Picturesquely viewed, one may think of a venn diagram with n overlapping "circles" that are labeled with the feature names in the set \(\mathcal{X}.\) Staying with this picture, one visualizes the universe of discourse \(X^\circ = [\mathcal{X}]\) as having two layers: (1) the set \(X = \langle \mathcal{X} \rangle = \langle x_1, \dots, x_n \rangle\) of points or cells — in another sense of the word than when we speak of cellular automata — and (2) the set \(X^\uparrow = (X \to \mathbb{B})\) of propositions, boolean-valued functions, or maps from \(X\!\) to \(\mathbb{B}.\)

Thus, we may speak of the universe of discourse \(X^\circ\) as being an ordered pair \((X, X^\uparrow),\) with \(2^n\!\) points in the underlying space \(X\!\) and \(2^{2^n}\) propositions in the function space \(X^\uparrow.\)
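Those two counts are easy to verify by brute enumeration. A quick sketch for the 2-feature case:

```python
# Sketch: enumerate the 2**n cells of X and the 2**(2**n) propositions of
# the function space (X -> B), here for n = 2 features.

from itertools import product

n = 2
cells = list(product((0, 1), repeat=n))            # points of X
props = list(product((0, 1), repeat=len(cells)))   # propositions as truth vectors

print(len(cells))   # 4  = 2^2 points
print(len(props))   # 16 = 2^(2^2) propositions
```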

A more complete discussion of these notations can be found here:

Now, to the Example.

Once again, let us begin with a 1-feature alphabet \(\mathcal{X} = \{ x_1 \} = \{ x \}.\) In the discussion that follows I will consider a class of trajectories that are ruled by the constraint that \(d^k x = 0\!\) for all \(k\!\) greater than some fixed \(m,\!\) and I will indulge in the use of some picturesque speech to describe salient classes of such curves. Given this finite order condition, there is a highest order non-zero difference \(d^m x\!\) that is exhibited at each point in the course of any determinate trajectory. Relative to any point of the corresponding orbit or curve, let us call this highest order differential feature \(d^m x\!\) the drive at that point. Curves of constant drive \(d^m x\!\) are then referred to as \(m^\text{th}\!\) gear curves.

One additional piece of notation will be needed here. Starting from the base alphabet \(\mathcal{X} = \{ x \},\) we define and notate \(\operatorname{E}^j \mathcal{X} = \{ x, d^1 x, d^2 x, \ldots, d^j x \}\) as the \(j^\text{th}\!\) order extended alphabet over \(\mathcal{X}.\)

Let us now consider the family of \(4^\text{th}\!\) gear curves through the extended space \(\operatorname{E}^4 X = \langle x, dx, d^2 x, d^3 x, d^4 x \rangle.\) These are the trajectories that are generated subject to the law \(d^4 x = 1,\!\) where it is understood in making such a statement that all higher order differences are equal to \(0.\!\)

Since \(d^4 x\!\) and all higher order \(d^j x\!\) are fixed, the entire dynamics can be plotted in the extended space \(\operatorname{E}^3 X = \langle x, dx, d^2 x, d^3 x \rangle.\) Thus, there is just enough room in a planar venn diagram to plot both orbits and to show how they partition the points of \(\operatorname{E}^3 X.\) As it turns out, there are exactly two possible orbits, of eight points each, as illustrated in Figures 16-a and 16-b. See here:

Note 7

Here are the 4^th gear curves over the 1-feature universe
X = <|x|> arranged in the form of tabular arrays, listing
the extended state vectors <x, dx, d^2.x, d^3.x, d^4.x>
as they occur in one cyclic period of each orbit.

d d d d d
0 1 2 3 4
x x x x x

Orbit 1

0 0 0 0 1
0 0 0 1 1
0 0 1 0 1
0 1 1 1 1
1 0 0 0 1
1 0 0 1 1
1 0 1 0 1
1 1 1 1 1

Orbit 2

1 1 0 0 1
0 1 0 1 1
1 1 1 0 1
0 0 1 1 1
0 1 0 0 1
1 1 0 1 1
0 1 1 0 1
1 0 1 1 1

In this arrangement, the temporal ordering of states
can be reckoned by a kind of "parallel round-up rule".
Specifically, if <a_k, a_[k+1]> is any pair of adjacent
digits in a state vector <a_0, a_1, ..., a_n>, then the
value of a_k in the next state is [a_k]' = a_k + a_[k+1],
the addition being taken mod 2, of course.
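The parallel round-up rule is simple enough to animate in a few lines. Here is a sketch of my own that applies \([a_k]' = a_k + a_{k+1}\) (mod 2), holding the top digit \(d^4 x = 1\!\) constant, and confirms that each of the two orbits above closes up after exactly eight steps:

```python
# Sketch: the "parallel round-up rule" a_k' = a_k + a_[k+1] (mod 2), with
# the top digit of the extended state held fixed, generates the two
# eight-point orbits of the 4th gear curves.

def step(vec):
    """Advance an extended state <a_0, ..., a_n>; the top digit a_n is fixed."""
    body = tuple((vec[k] + vec[k + 1]) % 2 for k in range(len(vec) - 1))
    return body + (vec[-1],)

def orbit(start):
    seen, v = [], start
    while v not in seen:       # collect states until the cycle closes
        seen.append(v)
        v = step(v)
    return seen

orbit1 = orbit((0, 0, 0, 0, 1))
orbit2 = orbit((1, 1, 0, 0, 1))

print(len(orbit1))  # 8
print(len(orbit2))  # 8
```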

A more complete discussion of this arrangement is given here:

DLOG D24.  http://stderr.org/pipermail/inquiry/2003-May/000503.html

Note 8

I am going to tip-toe in silence/consilience past many
questions of a philosophical nature/nurture that might
be asked at this juncture, no doubt to revisit them at
some future opportunity/importunity, however the cases
happen to align in the course of their inevitable fall.

Instead, let's "keep it concrete and simple", taking up the
consideration of an incrementally more complex example, but
having a slightly more general character than the orders of
sequential transformations that we've been discussing up to
this point.

The types of logical transformations that I have in mind can
be thought of as "transformations of discourse" because they
map a universe of discourse into a universe of discourse by
way of logical equations between the qualitative features
or logical variables in the source and target universes.

The sequential transformations or state transitions that we have
been considering so far are actually special cases of these more
general logical transformations, specifically, they are the ones
that have a single universe of discourse, as it happens to exist
at different moments in time, in the role of both the source and
the target universes of the transformation in question.

Onward and upward to Flatland, the differential analysis of
transformations between 2-dimensional universes of discourse.

Consider the transformation from the universe U% = [u, v] to the
universe X% = [x, y] that is defined by this system of equations:

x   =   f<u, v>   =   ((u)(v))

y   =   g<u, v>   =   ((u, v))

The parenthetical expressions on the right are the cactus forms for
the boolean functions that correspond to inclusive disjunction and
logical equivalence, respectively.  By way of a reminder, consult
Table 1 on the page at this location:

DLOG D1.  http://stderr.org/pipermail/inquiry/2003-May/000478.html

The component notation F = <F_1, F_2> = <f, g> : U% -> X% allows
us to give a name and a type to this transformation, and permits
us to define it by means of the compact description that follows:

<x, y>   =   F<u, v>   =   <((u)(v)), ((u, v))>

The information that defines the logical transformation F
can be represented in the form of a truth table, as below.

u v | f g
----+----
0 0 | 0 1
0 1 | 1 0
1 0 | 1 0
1 1 | 1 1
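Reading the cactus forms in ordinary boolean terms, \(((u)(v))\) is inclusive disjunction and \(((u, v))\) is logical equivalence, so the truth table can be regenerated directly. A minimal sketch:

```python
# Sketch: compute the truth table of F = <f, g>, with f = ((u)(v)) read as
# inclusive disjunction (u OR v) and g = ((u, v)) as equivalence (u EQ v).

def f(u, v):
    return u | v            # ((u)(v)) : inclusive disjunction

def g(u, v):
    return 1 - (u ^ v)      # ((u, v)) : logical equivalence

print('u v | f g')
print('----+----')
for u in (0, 1):
    for v in (0, 1):
        print(u, v, '|', f(u, v), g(u, v))
```

The printed rows agree with the table above.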

A more complete framework of discussion and a fuller development of
this example can be found in the neighborhood of the following site:

DLOG D73.  http://stderr.org/pipermail/inquiry/2003-June/000557.html

Note 9

By virtue of Zipf's law -- yes, there is a dynamics to its economics --
I have found myself forced, when it comes to matters that I've been
thinking about for "years and years" (YAY) to make a heavy, if not
to say an overbearing use of acronyms, both for their utility as
mnemonic devices and as a "regime of code compression" (ROCC).
I'm well aware that this can be annoying at times, so please
forgive what I hope will be a transient nuisance and I will
do my level best to unpack these lexical quanta as we go.

A "moral of the episode" (MOTE) on the wing here:
A "transformation of textual elements" (TOTE) is
a "formal or mathematical abstraction" (FOMA) of
a very high order, to wit, a "formal object" (FO).

As an object example, let us consider the TOTE in progress:

<x, y>   =   F<u, v>   =   <((u)(v)), ((u, v))>

Taken as a transformation from the universe U% = [u, v]
to the universe X% = [x, y], this is a particular type
of formal object, and it can be studied at that level
of abstraction until the chickens come home to roost,
as they say, but when the time comes to count those
chickens, if you will, the terms of artifice that
we use to talk about abstract objects, almost as
if we actually knew what we were talking about,
need to be fully fledged or fleshed out with
extra "bits of interpretive data" (BOID's).

And so, to decompress the story, the TOTE
that we use to convey the FOMA has to be
interpreted before it can be applied to
anything that actually puts supper on
the table, so to speak.

What are some of the ways that an abstract logical transformation
like F gets interpreted in the setting of a concrete application?

Mathematical parlance comes part way to the rescue here and
tosses us the line that a transformation of syntactic signs
can be interpreted in either one of two ways, as an "alias"
or as an "alibi".

When we consider a transformation in the alias interpretation,
we are merely changing the terms that we use to describe what
may very well be, to some approximation, the very same things.

For example, in some applications the discursive universes
U% = [u, v] and X% = [x, y] are best understood as diverse
frames, instruments, reticules, scopes, or templates, that
we adopt for the sake of viewing from variant perspectives
what we conceive to be roughly the same underlying objects.

When we consider a transformation in the alibi interpretation,
we are thinking of the objective things as objectively moving
around in space or changing their qualitative characteristics.
There are times when we think of this alibi transformation as
taking place in a dimension of time, and then there are times
when time is not an object.

For example, in some applications the discursive universes
U% = [u, v] and X% = [x, y] are actually the same universe,
and what we have is a frame where x is the next state of u
and y is the next state of v, notated as x = u' and y = v'.
This permits us to rewrite the transformation F as follows:

<u', v'>   =   F<u, v>   =   <((u)(v)), ((u, v))>

All in all, then, we have three different ways in general
of applying or interpreting a transformation of discourse,
that we might sum up as one brand of alias and two brands
of alibi, all together, the Elseword, Elsewhere, Elsewhen.

No more angels on pinheads,
the brass tacks next time.

Note 10

It is time to formulate the differential analysis of
a logical transformation, or a "mapping of discourse".
It is wise to begin with the first order differentials.

We are considering an abstract logical transformation
F = <f, g> : [u, v] -> [x, y] that can be interpreted
in a number of different ways.  Let's fix on a couple
of major variants that might be indicated as follows:

Alias Map.  <x , y >  =  F<u, v>  =  <((u)(v)), ((u, v))>

Alibi Map.  <u', v'>  =  F<u, v>  =  <((u)(v)), ((u, v))>

F is just one example among -- well, now that I think of it --
how many other logical transformations from the same source
to the same target universe?  In the light of that question,
maybe it would be advisable to contemplate the character of
F within the fold of its most closely akin transformations.

Given the alphabets !U! = {u, v} and !X! = {x, y},
along with the corresponding universes of discourse
U% and X% ~=~ [B^2], how many logical transformations
of the general form G = <G_1, G_2> : U% -> X% are there?

Since G_1 and G_2 can be any propositions of the type B^2 -> B,
there are 2^4 = 16 choices for each of the maps G_1 and G_2, and
thus there are 2^4 * 2^4 = 2^8 = 256 different mappings altogether
of the form G : U% -> X%.  The set of all functions of a given type
is customarily denoted by placing its type indicator in parentheses,
in the present instance writing (U% -> X%) = {G : U% -> X%}, and so
the cardinality of this "function space" can be most conveniently
summed up by writing |(U% -> X%)| = |(B^2 -> B^2)| = 4^4 = 256.

Given any transformation of this type, G : U% -> X%, the (first order)
differential analysis of G is based on the definition of a couple of
further transformations, derived by way of operators on G, that ply
between the (first order) extended universes, EU% = [u, v, du, dv]
and EX% = [x, y, dx, dy], of G's own source and target universes.

First, the "enlargement map" (or the "secant transformation")
EG = <EG_1, EG_2> : EU% -> EX% is defined by the following
pair of component equations:

EG_1  =  G_1 <u + du, v + dv>

EG_2  =  G_2 <u + du, v + dv>

Second, the "difference map" (or the "chordal transformation")
DG = <DG_1, DG_2> : EU% -> EX% is defined in a component-wise
fashion as the boolean sum of the initial proposition G_j and
the enlarged or the "shifted" proposition EG_j, for j = 1, 2,
in accord with the following pair of equations:

DG_1  =  G_1 <u, v>  +  EG_1 <u, v, du, dv>

      =  G_1 <u, v>  +  G_1 <u + du, v + dv>

DG_2  =  G_2 <u, v>  +  EG_2 <u, v, du, dv>

      =  G_2 <u, v>  +  G_2 <u + du, v + dv>

Maintaining a strict analogy with ordinary difference calculus
would perhaps have us write DG_j = EG_j - G_j, but the sum and
difference operations are the same thing in boolean arithmetic.
It is more often natural in the logical context to consider an
initial proposition q, then to compute the enlargement Eq, and
finally to determine the difference Dq = q + Eq, so we let the
variant order of terms reflect this sequence of considerations.

Given these general considerations about the operators E and D,
let's return to particular cases, and carry out the first order
analysis of the transformation F<u, v>  =  <((u)(v)), ((u, v))>.
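Before the symbolic analysis, the operators E and D can be tabulated numerically. This sketch of mine just transcribes the definitions above, reading the boolean sum as XOR:

```python
# Sketch: the operators E and D applied componentwise to F = <f, g>,
# following the definitions Eq(u, v, du, dv) = q(u + du, v + dv) and
# Dq = q + Eq, with all sums taken mod 2 (XOR).

from itertools import product

def f(u, v): return u | v           # ((u)(v)) : disjunction
def g(u, v): return 1 - (u ^ v)     # ((u, v)) : equivalence

def E(q):
    """Enlargement (secant) operator: shift the arguments by <du, dv>."""
    return lambda u, v, du, dv: q(u ^ du, v ^ dv)

def D(q):
    """Difference (chordal) operator: Dq = q + Eq (mod 2)."""
    return lambda u, v, du, dv: q(u, v) ^ E(q)(u, v, du, dv)

# Tabulate Df and Dg over the extended universe EU% = [u, v, du, dv].
for u, v, du, dv in product((0, 1), repeat=4):
    print(u, v, du, dv, '|', D(f)(u, v, du, dv), D(g)(u, v, du, dv))
```

One pattern worth noticing in the output: \(\operatorname{D}g\) comes out as \(du + dv\) in every cell, independently of \(u\!\) and \(v.\!\)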

Note 11

By way of getting our feet back on solid ground, let's crank up
our current case of a transformation of discourse, F : U% -> X%,
with concrete type [u, v] -> [x, y] or abstract type B^2 -> B^2,
and let it spin through a sufficient number of turns to see how
it goes, as viewed under the scope of what is probably its most
straightforward view, as an elsewhen map F : [u, v] -> [u', v'].

Elsewhen Map.  <u', v'>  =  <((u)(v)), ((u, v))>

u v

Incipit 1.  <u, v> = <0, 0> 

0 0
0 1
1 0
1 0
" "

Incipit 2.  <u, v> = <1, 1> 

1 1
1 1
" "

In fine, there are two basins of attraction,
the state <1, 0> and the state <1, 1>, with
the orbit <0, 0>, <0, 1>, <1, 0> leading to
the first basin and the orbit <1, 1> making
up an isolated basin.
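The two incipits and their basins can be confirmed by iterating the map from each of the four states. A minimal sketch:

```python
# Sketch: iterate the elsewhen map <u', v'> = <((u)(v)), ((u, v))> from each
# of the four states until a previously seen state recurs, exposing the two
# basins of attraction.

def F(u, v):
    return (u | v, 1 - (u ^ v))    # (u OR v, u EQ v)

for state in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    path = [state]
    while F(*path[-1]) not in path:
        path.append(F(*path[-1]))
    print(path, '-> fixed point', F(*path[-1]))
```

The run from \(\langle 0, 0 \rangle\) traverses \(\langle 0, 1 \rangle\) to reach the fixed point \(\langle 1, 0 \rangle,\) while \(\langle 1, 1 \rangle\) sits alone in its own basin.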

Note 12

Way back in DATA Note 3, we guessed, or "abduced",
as a line of logicians from Aristotle to Peirce
and beyond would say, the form of a rule that
adequately accounts for the finite protocol
of states that we observed the system X
pass through, as spied in the light of
its boolean state variable x : X -> B,
and that rule is well-formulated in
any of these styles of notation:

1.1.  f : B -> B such that f : x ~> (x)

1.2.  x' = (x)

1.3.  x := (x)

1.4.  dx =  1

In the current example, having read the manual first,
I guess, we already know in advance the program that
generates the state transitions, and it is a rule of
the following equivalent and easily derivable forms:

2.1.  F : B^2 -> B^2 such that F : <u, v> ~> <((u)(v)), ((u, v))>

2.2.  u' = ((u)(v)),  v' = ((u, v))

2.3.  u := ((u)(v)),  v := ((u, v))

2.4.  ???

Well, the last one is not such a fall off the log,
but that is exactly the purpose for which we have
been developing all of the foregoing machinations.

Here is what I got when I just went ahead and
calculated the finite differences willy-nilly:

Incipit 1.  <u, v> = <0, 0> 
o-----o-----o-----o-----o-----o-----o
| d d | d d | d d | d d | d d | ... |
| 0 0 | 1 1 | 2 2 | 3 3 | 4 4 | ... |
| u v | u v | u v | u v | u v | ... |
o-----o-----o-----o-----o-----o-----o
| 0 0 | 0 1 | 1 0 | 0 1 | 1 0 | ... |
| 0 1 | 1 1 | 1 1 | 1 1 | 1 1 | ... |
| 1 0 | 0 0 | 0 0 | 0 0 | 0 0 | ... |
| 1 0 | 0 0 | 0 0 | 0 0 | 0 0 | ... |
| " " | " " | " " | " " | " " | """ |
o-----o-----o-----o-----o-----o-----o

Incipit 2.  <u, v> = <1, 1> 
o-----o-----o-----o-----o
| d d | d d | d d | d d |
| 0 0 | 1 1 | 2 2 | 3 3 |
| u v | u v | u v | u v |
o-----o-----o-----o-----o
| 1 1 | 0 0 | 0 0 | = = |
| 1 1 | 0 0 | 0 0 | = = |
| " " | " " | " " | " " |
o-----o-----o-----o-----o

To be honest, I have never thought of trying
to hack the problem in such a brute-force way
until just now, and so I know enough to expect
a not unappreciable probability of error about
all that I've taken the risk to write out here,
but let me forge ahead and see what I can see.

What we are looking for is -- one rule to rule them all,
a rule that applies to every state and works every time.

What we see at first sight in the tables above are patterns
of differential features that attach to the states in each
orbit of the dynamics.  Looked at locally to these orbits,
the isolated fixed point at <1, 1> is no problem, as the
rule du = dv = 0 describes it pithily enough.  When it
comes to the other orbit, the first thing that comes
to mind is to write out the law du = v, dv = (u).
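That guessed law can be machine-checked against the tables. The following is my own sketch: it compares the observed first differences \(du = u + u',\) \(dv = v + v'\) (mod 2) with the proposed law \(du = v,\) \(dv = (u)\!\) on the three transient states, and with \(du = dv = 0\!\) at the isolated fixed point.

```python
# Sketch: verify the locally guessed law du = v, dv = (u) on the 3-state
# orbit of F, and du = dv = 0 at the isolated fixed point <1, 1>.

def F(u, v):
    return (u | v, 1 - (u ^ v))        # the elsewhen map

for u, v in [(0, 0), (0, 1), (1, 0)]:
    u1, v1 = F(u, v)
    du, dv = u ^ u1, v ^ v1            # observed first differences
    assert (du, dv) == (v, 1 - u)      # the law du = v, dv = (u)

u1, v1 = F(1, 1)
assert (1 ^ u1, 1 ^ v1) == (0, 0)      # fixed point: du = dv = 0
print('law checks out on each orbit')
```

As the text notes, the law is only local to each orbit: \(du = v\!\) fails at \(\langle 1, 1 \rangle,\) which is exactly why a single global rule takes more machinery.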

I am going to take a nap to clear my head.

Note 13

I think that it ought to be clear at this point that we
need a more systematic symbolic method for computing the
differentials of logical transformations, using the term
"differential" in a loose way at present for all sorts of
finite differences and derivatives, leaving it to another
discussion to sharpen up its more exact technical senses.

For convenience of reference, let's recast our current
example in the form F = <f, g> = <((u)(v)), ((u, v))>.

In their application to this logical transformation the operators
E and D respectively produce the "enlarged map" EF = <Ef, Eg> and
the "difference map" DF = <Df, Dg>, whose components can be given
as follows, if the reader, in the absence of a special format for
logical parentheses, can forgive syntactically 2-lingual phrases:

Ef  =  ((u + du)(v + dv))

Eg  =  ((u + du, v + dv))

Df  =  ((u)(v))  +  ((u + du)(v + dv))

Dg  =  ((u, v))  +  ((u + du, v + dv))

But these initial formulas are purely definitional,
and help us little to understand either the purpose
of the operators or the significance of the results.
Working symbolically, let's apply a more systematic
method to the separate components of the mapping F.

A sketch of this work is presented in the following series
of Figures, where each logical proposition is expanded over
the basic cells uv, u(v), (u)v, (u)(v) of the 2-dimensional
universe of discourse U% = [u, v].

Computation Summary for f<u, v> = ((u)(v))

Figure 1.1 expands f = ((u)(v)) over [u, v] to produce
the equivalent exclusive disjunction uv + u(v) + (u)v.

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/%\``````````````````|
|`````````````````/%%%\`````````````````|
|````````````````/%%%%%\````````````````|
|```````````````o%%%%%%%o```````````````|
|``````````````/%\%%%%%/%\``````````````|
|`````````````/%%%\%%%/%%%\`````````````|
|````````````/%%%%%\%/%%%%%\````````````|
|```````````o%%%%%%%o%%%%%%%o```````````|
|``````````/%\%%%%%/%\%%%%%/%\``````````|
|`````````/%%%\%%%/%%%\%%%/%%%\`````````|
|````````/%%%%%\%/%%%%%\%/%%%%%\````````|
|```````o%%%%%%%o%%%%%%%o%%%%%%%o```````|
|``````/%\%%%%%/%\%%%%%/%\%%%%%/%\``````|
|`````/%%%\%%%/%%%\%%%/%%%\%%%/%%%\`````|
|````/%%%%%\%/%%%%%\%/%%%%%\%/%%%%%\````|
|```o%%%%%%%o%%%%%%%o%%%%%%%o%%%%%%%o```|
|```|\%%%%%/%\%%%%%/`\%%%%%/%\%%%%%/|```|
|```|`\%%%/%%%\%%%/```\%%%/%%%\%%%/`|```|
|```|``\%/%%%%%\%/`````\%/%%%%%\%/``|```|
|```|```o%%%%%%%o```````o%%%%%%%o```|```|
|```|```|\%%%%%/`\`````/`\%%%%%/|```|```|
|```|```|`\%%%/```\```/```\%%%/`|```|```|
|```|`u`|``\%/`````\`/`````\%/``|`v`|```|
|```o---+---o```````o```````o---+---o```|
|```````|````\`````/`\`````/````|```````|
|```````|`````\```/```\```/`````|```````|
|```````|`du```\`/`````\`/```dv`|```````|
|```````o-------o```````o-------o```````|
|````````````````\`````/````````````````|
|`````````````````\```/`````````````````|
|``````````````````\`/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 1.1.  f = ((u)(v))

Figure 1.2 expands Ef = ((u + du)(v + dv)) over [u, v] to give:

uv.(du dv) + u(v).(du (dv)) + (u)v.((du) dv) + (u)(v).((du)(dv))

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/%\``````````````````|
|`````````````````/%%%\`````````````````|
|````````````````/%%%%%\````````````````|
|```````````````o%%%%%%%o```````````````|
|``````````````/%\%%%%%/%\``````````````|
|`````````````/%%%\%%%/%%%\`````````````|
|````````````/%%%%%\%/%%%%%\````````````|
|```````````o%%%%%%%o%%%%%%%o```````````|
|``````````/%\%%%%%/`\%%%%%/%\``````````|
|`````````/%%%\%%%/```\%%%/%%%\`````````|
|````````/%%%%%\%/`````\%/%%%%%\````````|
|```````o%%%%%%%o```````o%%%%%%%o```````|
|``````/%\%%%%%/%\`````/%\%%%%%/%\``````|
|`````/%%%\%%%/%%%\```/%%%\%%%/%%%\`````|
|````/%%%%%\%/%%%%%\`/%%%%%\%/%%%%%\````|
|```o%%%%%%%o%%%%%%%o%%%%%%%o%%%%%%%o```|
|```|\%%%%%/`\%%%%%/%\%%%%%/`\%%%%%/|```|
|```|`\%%%/```\%%%/%%%\%%%/```\%%%/`|```|
|```|``\%/`````\%/%%%%%\%/`````\%/``|```|
|```|```o```````o%%%%%%%o```````o```|```|
|```|```|\`````/%\%%%%%/%\`````/|```|```|
|```|```|`\```/%%%\%%%/%%%\```/`|```|```|
|```|`u`|``\`/%%%%%\%/%%%%%\`/``|`v`|```|
|```o---+---o%%%%%%%o%%%%%%%o---+---o```|
|```````|````\%%%%%/`\%%%%%/````|```````|
|```````|`````\%%%/```\%%%/`````|```````|
|```````|`du```\%/`````\%/```dv`|```````|
|```````o-------o```````o-------o```````|
|````````````````\`````/````````````````|
|`````````````````\```/`````````````````|
|``````````````````\`/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 1.2.  Ef = ((u + du)(v + dv))

Figure 1.3 expands Df = f + Ef over [u, v] to produce:

uv.du dv + u(v).du(dv) + (u)v.(du)dv + (u)(v).((du)(dv))

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/`\``````````````````|
|`````````````````/```\`````````````````|
|````````````````/`````\````````````````|
|```````````````o```````o```````````````|
|``````````````/`\`````/`\``````````````|
|`````````````/```\```/```\`````````````|
|````````````/`````\`/`````\````````````|
|```````````o```````o```````o```````````|
|``````````/`\`````/%\`````/`\``````````|
|`````````/```\```/%%%\```/```\`````````|
|````````/`````\`/%%%%%\`/`````\````````|
|```````o```````o%%%%%%%o```````o```````|
|``````/`\`````/`\%%%%%/`\`````/`\``````|
|`````/```\```/```\%%%/```\```/```\`````|
|````/`````\`/`````\%/`````\`/`````\````|
|```o```````o```````o```````o```````o```|
|```|\`````/%\`````/%\`````/%\`````/|```|
|```|`\```/%%%\```/%%%\```/%%%\```/`|```|
|```|``\`/%%%%%\`/%%%%%\`/%%%%%\`/``|```|
|```|```o%%%%%%%o%%%%%%%o%%%%%%%o```|```|
|```|```|\%%%%%/%\%%%%%/%\%%%%%/|```|```|
|```|```|`\%%%/%%%\%%%/%%%\%%%/`|```|```|
|```|`u`|``\%/%%%%%\%/%%%%%\%/``|`v`|```|
|```o---+---o%%%%%%%o%%%%%%%o---+---o```|
|```````|````\%%%%%/`\%%%%%/````|```````|
|```````|`````\%%%/```\%%%/`````|```````|
|```````|`du```\%/`````\%/```dv`|```````|
|```````o-------o```````o-------o```````|
|````````````````\`````/````````````````|
|`````````````````\```/`````````````````|
|``````````````````\`/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 1.3.  Df = f + Ef
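Anyone wary of my venn diagrams can confirm Figure 1.3 by brute force. This sketch checks that \(\operatorname{D}f = f + \operatorname{E}f\) agrees, cell by cell, with the expansion \(uv \cdot du\, dv + u(v) \cdot du(dv) + (u)v \cdot (du)dv + (u)(v) \cdot ((du)(dv)):\)

```python
# Sketch: verify that Df = f + Ef (mod 2) matches the cell-by-cell
# expansion read off from Figure 1.3, over all 16 points of EU%.

from itertools import product

def f(u, v):
    return u | v                               # ((u)(v))

def Df(u, v, du, dv):
    return f(u, v) ^ f(u ^ du, v ^ dv)         # f + Ef, sums mod 2

def expansion(u, v, du, dv):
    return (u & v & du & dv                    # uv . du dv
            | u & (1 - v) & du & (1 - dv)      # u(v) . du(dv)
            | (1 - u) & v & (1 - du) & dv      # (u)v . (du)dv
            | (1 - u) & (1 - v) & (du | dv))   # (u)(v) . ((du)(dv))

assert all(Df(*p) == expansion(*p) for p in product((0, 1), repeat=4))
print('Df expansion confirmed')
```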

I'll break this here in case anyone wants
to try and do the work for g on their own.

Note 14

No doubt everybody who's still awake sacrificed
the few spare moments of their sleep last night
that it took to Figure all this out already,
but just for the record here's what I got:

Computation Summary for g<u, v> = ((u, v))

Figure 2.1 expands g = ((u, v)) over [u, v] into
the equivalent exclusive disjunction uv + (u)(v).

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/%\``````````````````|
|`````````````````/%%%\`````````````````|
|````````````````/%%%%%\````````````````|
|```````````````o%%%%%%%o```````````````|
|``````````````/%\%%%%%/%\``````````````|
|`````````````/%%%\%%%/%%%\`````````````|
|````````````/%%%%%\%/%%%%%\````````````|
|```````````o%%%%%%%o%%%%%%%o```````````|
|``````````/`\%%%%%/%\%%%%%/`\``````````|
|`````````/```\%%%/%%%\%%%/```\`````````|
|````````/`````\%/%%%%%\%/`````\````````|
|```````o```````o%%%%%%%o```````o```````|
|``````/`\`````/`\%%%%%/`\`````/`\``````|
|`````/```\```/```\%%%/```\```/```\`````|
|````/`````\`/`````\%/`````\`/`````\````|
|```o```````o```````o```````o```````o```|
|```|\`````/`\`````/%\`````/`\`````/|```|
|```|`\```/```\```/%%%\```/```\```/`|```|
|```|``\`/`````\`/%%%%%\`/`````\`/``|```|
|```|```o```````o%%%%%%%o```````o```|```|
|```|```|\`````/%\%%%%%/%\`````/|```|```|
|```|```|`\```/%%%\%%%/%%%\```/`|```|```|
|```|`u`|``\`/%%%%%\%/%%%%%\`/``|`v`|```|
|```o---+---o%%%%%%%o%%%%%%%o---+---o```|
|```````|````\%%%%%/%\%%%%%/````|```````|
|```````|`````\%%%/%%%\%%%/`````|```````|
|```````|`du```\%/%%%%%\%/```dv`|```````|
|```````o-------o%%%%%%%o-------o```````|
|````````````````\%%%%%/````````````````|
|`````````````````\%%%/`````````````````|
|``````````````````\%/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 2.1.  g = ((u, v))

Figure 2.2 expands Eg = ((u + du, v + dv)) over [u, v] to give:

uv.((du, dv)) + u(v).(du, dv) + (u)v.(du, dv) + (u)(v).((du, dv))

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/%\``````````````````|
|`````````````````/%%%\`````````````````|
|````````````````/%%%%%\````````````````|
|```````````````o%%%%%%%o```````````````|
|``````````````/`\%%%%%/`\``````````````|
|`````````````/```\%%%/```\`````````````|
|````````````/`````\%/`````\````````````|
|```````````o```````o```````o```````````|
|``````````/%\`````/%\`````/%\``````````|
|`````````/%%%\```/%%%\```/%%%\`````````|
|````````/%%%%%\`/%%%%%\`/%%%%%\````````|
|```````o%%%%%%%o%%%%%%%o%%%%%%%o```````|
|``````/`\%%%%%/`\%%%%%/`\%%%%%/`\``````|
|`````/```\%%%/```\%%%/```\%%%/```\`````|
|````/`````\%/`````\%/`````\%/`````\````|
|```o```````o```````o```````o```````o```|
|```|\`````/%\`````/%\`````/%\`````/|```|
|```|`\```/%%%\```/%%%\```/%%%\```/`|```|
|```|``\`/%%%%%\`/%%%%%\`/%%%%%\`/``|```|
|```|```o%%%%%%%o%%%%%%%o%%%%%%%o```|```|
|```|```|\%%%%%/`\%%%%%/`\%%%%%/|```|```|
|```|```|`\%%%/```\%%%/```\%%%/`|```|```|
|```|`u`|``\%/`````\%/`````\%/``|`v`|```|
|```o---+---o```````o```````o---+---o```|
|```````|````\`````/%\`````/````|```````|
|```````|`````\```/%%%\```/`````|```````|
|```````|`du```\`/%%%%%\`/```dv`|```````|
|```````o-------o%%%%%%%o-------o```````|
|````````````````\%%%%%/````````````````|
|`````````````````\%%%/`````````````````|
|``````````````````\%/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 2.2.  Eg = ((u + du, v + dv))
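The expansion of Eg just pictured can be checked mechanically. Here is a minimal sketch in Python, assuming the usual reading of the forms over B = {0, 1}: the sum "+" is exclusive-or, ((x, y)) says x equals y, and (x, y) says x differs from y.

```python
from itertools import product

g  = lambda u, v: int(u == v)                     # g  = ((u, v))
Eg = lambda u, v, du, dv: g(u ^ du, v ^ dv)       # Eg = ((u + du, v + dv))

# The cell-by-cell expansion over [u, v] quoted above:
# uv.((du, dv)) + u(v).(du, dv) + (u)v.(du, dv) + (u)(v).((du, dv))
def Eg_expanded(u, v, du, dv):
    if (u, v) in [(1, 1), (0, 0)]:
        return int(du == dv)                      # ((du, dv))
    return int(du != dv)                          # (du, dv)

assert all(Eg(u, v, du, dv) == Eg_expanded(u, v, du, dv)
           for u, v, du, dv in product((0, 1), repeat=4))
```

The single assertion ranges over all sixteen points of the extended universe [u, v, du, dv], so it confirms the expansion in full.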

Figure 2.3 expands Dg = g + Eg over [u, v] to yield the form:

uv.(du, dv) + u(v).(du, dv) + (u)v.(du, dv) + (u)(v).(du, dv)

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/`\``````````````````|
|`````````````````/```\`````````````````|
|````````````````/`````\````````````````|
|```````````````o```````o```````````````|
|``````````````/%\`````/%\``````````````|
|`````````````/%%%\```/%%%\`````````````|
|````````````/%%%%%\`/%%%%%\````````````|
|```````````o%%%%%%%o%%%%%%%o```````````|
|``````````/%\%%%%%/`\%%%%%/%\``````````|
|`````````/%%%\%%%/```\%%%/%%%\`````````|
|````````/%%%%%\%/`````\%/%%%%%\````````|
|```````o%%%%%%%o```````o%%%%%%%o```````|
|``````/`\%%%%%/`\`````/`\%%%%%/`\``````|
|`````/```\%%%/```\```/```\%%%/```\`````|
|````/`````\%/`````\`/`````\%/`````\````|
|```o```````o```````o```````o```````o```|
|```|\`````/%\`````/`\`````/%\`````/|```|
|```|`\```/%%%\```/```\```/%%%\```/`|```|
|```|``\`/%%%%%\`/`````\`/%%%%%\`/``|```|
|```|```o%%%%%%%o```````o%%%%%%%o```|```|
|```|```|\%%%%%/%\`````/%\%%%%%/|```|```|
|```|```|`\%%%/%%%\```/%%%\%%%/`|```|```|
|```|`u`|``\%/%%%%%\`/%%%%%\%/``|`v`|```|
|```o---+---o%%%%%%%o%%%%%%%o---+---o```|
|```````|````\%%%%%/`\%%%%%/````|```````|
|```````|`````\%%%/```\%%%/`````|```````|
|```````|`du```\%/`````\%/```dv`|```````|
|```````o-------o```````o-------o```````|
|````````````````\`````/````````````````|
|`````````````````\```/`````````````````|
|``````````````````\`/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 2.3.  Dg = g + Eg
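The difference map Dg = g + Eg can likewise be checked by machine. A sketch, again reading "+" as exclusive-or over B, shows that every cell of [u, v] carries the same pattern (du, dv), just as the four identical terms of the expansion suggest:

```python
from itertools import product

g  = lambda u, v: int(u == v)                     # g  = ((u, v))
Eg = lambda u, v, du, dv: g(u ^ du, v ^ dv)       # Eg = ((u + du, v + dv))
Dg = lambda u, v, du, dv: g(u, v) ^ Eg(u, v, du, dv)

# Dg collapses to (du, dv) = du XOR dv at every cell of [u, v].
assert all(Dg(u, v, du, dv) == (du ^ dv)
           for u, v, du, dv in product((0, 1), repeat=4))
```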

Note 15

| 'Tis a derivative from me to mine,
|  And only that I stand for.
|
| Winter's Tale, 3.2.43-44

We've talked about differentials long enough
that I think it's past time we met with some.

When the term is used in its more exact sense,
a "differential" is a locally linear approximation
to a function; in the context of this logical
discussion, then, a locally linear approximation
to a proposition.

I think that it would be best to just go ahead and
exhibit the simplest form of a differential dF for
the current example of a logical transformation F,
after which the majority of the easiest questions
will've been answered in visually intuitive terms.

For F = <f, g> we have dF = <df, dg>, and so we can proceed
componentwise, patching the pieces back together at the end.

We have prepared the ground already by computing these terms:

Ef  =  ((u + du)(v + dv))

Eg  =  ((u + du, v + dv))

Df  =  ((u)(v))  +  ((u + du)(v + dv))

Dg  =  ((u, v))  +  ((u + du, v + dv))

As a matter of fact, computing the symmetric differences
Df = f + Ef and Dg = g + Eg has already taken care of the
"localizing" part of the task by subtracting out the forms
of f and g from the forms of Ef and Eg, respectively.  Thus
all we have left to do is to decide what linear propositions
best approximate the difference maps Df and Dg, respectively.

This raises the question:  What is a linear proposition?

The answer that makes the most sense in this context is this:
A proposition is just a boolean-valued function, so a linear
proposition is a linear function into the boolean space B.

In particular, the linear functions that we want will be
linear functions in the differential variables du and dv.

As it turns out, there are just four linear propositions
in the associated "differential universe" dU% = [du, dv],
and these are the propositions that are commonly denoted:
0, du, dv, du + dv, in other words, (), du, dv, (du, dv).
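The count of four can be confirmed by brute force. A sketch, assuming linearity over GF(2) is taken in the standard sense, h(x + y) = h(x) + h(y) with "+" as exclusive-or, enumerates all sixteen boolean functions on dU = [du, dv] and keeps the linear ones:

```python
from itertools import product

points = list(product((0, 1), repeat=2))          # the four points of dU

linear = []
for table in product((0, 1), repeat=4):           # all 16 functions on dU
    h = dict(zip(points, table))
    # h is linear iff it respects pointwise XOR of arguments.
    if all(h[x1 ^ y1, x2 ^ y2] == h[x1, x2] ^ h[y1, y2]
           for (x1, x2), (y1, y2) in product(points, repeat=2)):
        linear.append(table)

# Exactly four survive: 0, du, dv, and du + dv = (du, dv).
assert len(linear) == 4
```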

Note 16

| for equalities are so weighed
| that curiosity in neither can
| make choice of either's moiety.
|
| King Lear, Sc.1.5-7, (Quarto)

| for qualities are so weighed
| that curiosity in neither can
| make choice of either's moiety.
|
| King Lear, 1.1.5-6, (Folio)

Justifying a notion of approximation is a little more
involved in general, and especially in these discrete
logical spaces, than it would be expedient for people
in a hurry to tangle with right now.  I will just say
that there are "naive" or "obvious" notions and there
are "sophisticated" or "subtle" notions that we might
choose among.  The latter would engage us in trying to
construct proper logical analogues of Lie derivatives,
and so let's save that for when we have become subtle
or sophisticated or both.  Against or toward that day,
as you wish, let's begin with an option in plain view.

Figure 1.4 illustrates one way of ranging over the cells of the
underlying universe U% = [u, v] and selecting at each cell the
linear proposition in dU% = [du, dv] that best approximates
the patch of the difference map Df that is located there,
yielding the following formula for the differential df.

df  =  uv.0 + u(v).du + (u)v.dv + (u)(v).(du, dv)

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/`\``````````````````|
|`````````````````/```\`````````````````|
|````````````````/`````\````````````````|
|```````````````o```````o```````````````|
|``````````````/`\`````/`\``````````````|
|`````````````/```\```/```\`````````````|
|````````````/`````\`/`````\````````````|
|```````````o```````o```````o```````````|
|``````````/`\`````/`\`````/`\``````````|
|`````````/```\```/```\```/```\`````````|
|````````/`````\`/`````\`/`````\````````|
|```````o```````o```````o```````o```````|
|``````/`\`````/%\`````/%\`````/`\``````|
|`````/```\```/%%%\```/%%%\```/```\`````|
|````/`````\`/%%%%%\`/%%%%%\`/`````\````|
|```o```````o%%%%%%%o%%%%%%%o```````o```|
|```|\`````/%\%%%%%/`\%%%%%/%\`````/|```|
|```|`\```/%%%\%%%/```\%%%/%%%\```/`|```|
|```|``\`/%%%%%\%/`````\%/%%%%%\`/``|```|
|```|```o%%%%%%%o```````o%%%%%%%o```|```|
|```|```|\%%%%%/%\`````/%\%%%%%/|```|```|
|```|```|`\%%%/%%%\```/%%%\%%%/`|```|```|
|```|`u`|``\%/%%%%%\`/%%%%%\%/``|`v`|```|
|```o---+---o%%%%%%%o%%%%%%%o---+---o```|
|```````|````\%%%%%/`\%%%%%/````|```````|
|```````|`````\%%%/```\%%%/`````|```````|
|```````|`du```\%/`````\%/```dv`|```````|
|```````o-------o```````o-------o```````|
|````````````````\`````/````````````````|
|`````````````````\```/`````````````````|
|``````````````````\`/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 1.4.  df = linear approx to Df
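One concrete reading of "best approximates", offered here as an assumption rather than a settled definition, is to expand the patch of Df at each cell in algebraic normal form over GF(2) and keep only the part that is linear in du and dv. A sketch along those lines reproduces the formula for df given above:

```python
f  = lambda u, v: int(u | v)                      # f = ((u)(v))
Df = lambda u, v, du, dv: f(u, v) ^ f(u ^ du, v ^ dv)

def linear_part(u, v):
    """Coefficients (a, b) with df = a.du + b.dv at the cell (u, v)."""
    a = Df(u, v, 0, 0) ^ Df(u, v, 1, 0)           # coefficient of du
    b = Df(u, v, 0, 0) ^ Df(u, v, 0, 1)           # coefficient of dv
    return a, b

# Matches df = uv.0 + u(v).du + (u)v.dv + (u)(v).(du, dv):
assert linear_part(1, 1) == (0, 0)                # 0
assert linear_part(1, 0) == (1, 0)                # du
assert linear_part(0, 1) == (0, 1)                # dv
assert linear_part(0, 0) == (1, 1)                # du + dv = (du, dv)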

Figure 2.4 illustrates one way of ranging over the cells of the
underlying universe U% = [u, v] and selecting at each cell the
linear proposition in dU% = [du, dv] that best approximates
the patch of the difference map Dg that is located there,
yielding the following formula for the differential dg.

dg  =  uv.(du, dv) + u(v).(du, dv) + (u)v.(du, dv) + (u)(v).(du, dv)

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/`\``````````````````|
|`````````````````/```\`````````````````|
|````````````````/`````\````````````````|
|```````````````o```````o```````````````|
|``````````````/%\`````/%\``````````````|
|`````````````/%%%\```/%%%\`````````````|
|````````````/%%%%%\`/%%%%%\````````````|
|```````````o%%%%%%%o%%%%%%%o```````````|
|``````````/%\%%%%%/`\%%%%%/%\``````````|
|`````````/%%%\%%%/```\%%%/%%%\`````````|
|````````/%%%%%\%/`````\%/%%%%%\````````|
|```````o%%%%%%%o```````o%%%%%%%o```````|
|``````/`\%%%%%/`\`````/`\%%%%%/`\``````|
|`````/```\%%%/```\```/```\%%%/```\`````|
|````/`````\%/`````\`/`````\%/`````\````|
|```o```````o```````o```````o```````o```|
|```|\`````/%\`````/`\`````/%\`````/|```|
|```|`\```/%%%\```/```\```/%%%\```/`|```|
|```|``\`/%%%%%\`/`````\`/%%%%%\`/``|```|
|```|```o%%%%%%%o```````o%%%%%%%o```|```|
|```|```|\%%%%%/%\`````/%\%%%%%/|```|```|
|```|```|`\%%%/%%%\```/%%%\%%%/`|```|```|
|```|`u`|``\%/%%%%%\`/%%%%%\%/``|`v`|```|
|```o---+---o%%%%%%%o%%%%%%%o---+---o```|
|```````|````\%%%%%/`\%%%%%/````|```````|
|```````|`````\%%%/```\%%%/`````|```````|
|```````|`du```\%/`````\%/```dv`|```````|
|```````o-------o```````o-------o```````|
|````````````````\`````/````````````````|
|`````````````````\```/`````````````````|
|``````````````````\`/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 2.4.  dg = linear approx to Dg

Well, g, that was easy, seeing as how Dg
is already linear at each locus, dg = Dg.

Note 17

We have been conducting the differential analysis
of the logical transformation F : [u, v] -> [u, v]
defined as F : <u, v> ~> <((u)(v)), ((u, v))>, and
this means starting with the extended transformation
EF : [u, v, du, dv] -> [u, v, du, dv] and breaking it
into an analytic series, EF = F + dF + d^2.F + ..., and
so on until there is nothing left to analyze any further.

As a general rule, one proceeds by way of the following stages:

1.  EF      =  [d^0]F + [r^0]F

2.  [r^0]F  =  [d^1]F + [r^1]F

3.  [r^1]F  =  [d^2]F + [r^2]F

4.  ...

In our analysis of the current transformation F,
we carried out Step 1 in the more familiar form
EF = F + DF, and we have just reached Step 2 in
the form DF = dF + rF, where rF is the residual
term that remains for us to examine next.
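With only two differential variables the series terminates at second order, so the whole tower can be checked by machine for the component f. A sketch, assuming the stages amount to picking off the parts of Ef that are constant, linear, and quadratic in du and dv at each cell:

```python
from itertools import product

f  = lambda u, v: int(u | v)                      # f  = ((u)(v))
Ef = lambda u, v, du, dv: f(u ^ du, v ^ dv)       # Ef = ((u + du)(v + dv))

def anf(u, v):
    """ANF coefficients (c0, c1, c2, c3) of the patch of Ef at cell (u, v),
    where Ef = c0 + c1.du + c2.dv + c3.du.dv over GF(2)."""
    p00, p10, p01, p11 = (Ef(u, v, a, b)
                          for a, b in [(0, 0), (1, 0), (0, 1), (1, 1)])
    return p00, p00 ^ p10, p00 ^ p01, p00 ^ p10 ^ p01 ^ p11

for u, v in product((0, 1), repeat=2):
    c0, c1, c2, c3 = anf(u, v)
    assert c0 == f(u, v)          # zeroth stage recovers f itself
    assert c3 == 1                # second stage is du dv in every cell
```

The constant coefficient recovers f, the linear coefficients give df, and the quadratic coefficient, equal to 1 in every cell, is the residual term rf = du dv that we meet below.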

NB.  I'm trying to give a quick overview here,
and this forces me to omit many picky details.
The picky reader may wish to consult the more
detailed presentation of this material in the
following ur-neighborhoods:

Jon Awbrey, "Differential Logic and Dynamic Systems"

DLOG D.  http://stderr.org/pipermail/inquiry/2003-May/thread.html#478
DLOG D.  http://stderr.org/pipermail/inquiry/2003-June/thread.html#553

Especially:

DLOG D40.  http://stderr.org/pipermail/inquiry/2003-May/000521.html
DLOG D71.  http://stderr.org/pipermail/inquiry/2003-June/000554.html

Take your pick, Gimli ...

Note 18

Let's push on with the analysis of the transformation:

F : <u, v> ~> <f<u, v>, g<u, v>> = <((u)(v)), ((u, v))>

For ease of comparison and computation, I will collect
the Figures that we need for the remainder of the work
together on one page.

Computation Summary for f<u, v> = ((u)(v))

Figure 1.1 expands f = ((u)(v)) over [u, v] to produce
the equivalent exclusive disjunction uv + u(v) + (u)v.

Figure 1.2 expands Ef = ((u + du)(v + dv)) over [u, v] to arrive at
Ef = uv (du dv) + u(v) (du (dv)) + (u)v ((du) dv) + (u)(v)((du)(dv)).

Ef tells you what you would have to do, from where you are in the
universe [u, v], if you want to end up in a place where f is true.
In this case, where the prevailing proposition f is ((u)(v)), the
indication uv (du dv) of Ef tells you this:  If u and v are both
true where you are, then just don't change both u and v at once,
and you will end up in a place where ((u)(v)) is true.
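The term-by-term expansion of Ef can be verified mechanically. A sketch, assuming the usual reading of the cactus forms over B, with (x y) = not(x and y), (x (y)) = not(x and not y), ((x) y) = not(not x and y), and ((x)(y)) = x or y:

```python
from itertools import product

f  = lambda u, v: int(u | v)                      # f  = ((u)(v))
Ef = lambda u, v, du, dv: f(u ^ du, v ^ dv)       # Ef = ((u + du)(v + dv))

# uv (du dv) + u(v) (du (dv)) + (u)v ((du) dv) + (u)(v)((du)(dv))
def Ef_expanded(u, v, du, dv):
    if u and v:        return int(not (du and dv))          # (du dv)
    if u and not v:    return int(not (du and not dv))      # (du (dv))
    if v and not u:    return int(not (not du and dv))      # ((du) dv)
    return int(du or dv)                                    # ((du)(dv))

assert all(Ef(u, v, du, dv) == Ef_expanded(u, v, du, dv)
           for u, v, du, dv in product((0, 1), repeat=4))
```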

Figure 1.3 expands Df over [u, v] to end up with the formula:
Df = uv du dv + u(v) du(dv) + (u)v (du)dv + (u)(v)((du)(dv)).

Df tells you what you would have to do, from where you are in the
universe [u, v], if you want to bring about a change in the value
of f, that is, if you want to get to a place where the value of f
is different from what it is where you are.  In the present case,
where the reigning proposition f is ((u)(v)), the term uv du dv
of Df tells you this:  If u and v are both true where you are,
then you would have to change both u and v in order to reach
a place where the value of f is different from what it is
where you are.

Figure 1.4 approximates Df by the linear form
df = uv 0 + u(v) du + (u)v dv + (u)(v)(du, dv).

Figure 1.5 shows what remains of the difference map Df
when the first order linear contribution df is removed:
rf = uv du dv + u(v) du dv + (u)v du dv + (u)(v) du dv.
This form can be written more succinctly as rf = du dv.

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/%\``````````````````|
|`````````````````/%%%\`````````````````|
|````````````````/%%%%%\````````````````|
|```````````````o%%%%%%%o```````````````|
|``````````````/%\%%%%%/%\``````````````|
|`````````````/%%%\%%%/%%%\`````````````|
|````````````/%%%%%\%/%%%%%\````````````|
|```````````o%%%%%%%o%%%%%%%o```````````|
|``````````/%\%%%%%/%\%%%%%/%\``````````|
|`````````/%%%\%%%/%%%\%%%/%%%\`````````|
|````````/%%%%%\%/%%%%%\%/%%%%%\````````|
|```````o%%%%%%%o%%%%%%%o%%%%%%%o```````|
|``````/%\%%%%%/%\%%%%%/%\%%%%%/%\``````|
|`````/%%%\%%%/%%%\%%%/%%%\%%%/%%%\`````|
|````/%%%%%\%/%%%%%\%/%%%%%\%/%%%%%\````|
|```o%%%%%%%o%%%%%%%o%%%%%%%o%%%%%%%o```|
|```|\%%%%%/%\%%%%%/`\%%%%%/%\%%%%%/|```|
|```|`\%%%/%%%\%%%/```\%%%/%%%\%%%/`|```|
|```|``\%/%%%%%\%/`````\%/%%%%%\%/``|```|
|```|```o%%%%%%%o```````o%%%%%%%o```|```|
|```|```|\%%%%%/`\`````/`\%%%%%/|```|```|
|```|```|`\%%%/```\```/```\%%%/`|```|```|
|```|`u`|``\%/`````\`/`````\%/``|`v`|```|
|```o---+---o```````o```````o---+---o```|
|```````|````\`````/`\`````/````|```````|
|```````|`````\```/```\```/`````|```````|
|```````|`du```\`/`````\`/```dv`|```````|
|```````o-------o```````o-------o```````|
|````````````````\`````/````````````````|
|`````````````````\```/`````````````````|
|``````````````````\`/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 1.1.  f = ((u)(v))

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/%\``````````````````|
|`````````````````/%%%\`````````````````|
|````````````````/%%%%%\````````````````|
|```````````````o%%%%%%%o```````````````|
|``````````````/%\%%%%%/%\``````````````|
|`````````````/%%%\%%%/%%%\`````````````|
|````````````/%%%%%\%/%%%%%\````````````|
|```````````o%%%%%%%o%%%%%%%o```````````|
|``````````/%\%%%%%/`\%%%%%/%\``````````|
|`````````/%%%\%%%/```\%%%/%%%\`````````|
|````````/%%%%%\%/`````\%/%%%%%\````````|
|```````o%%%%%%%o```````o%%%%%%%o```````|
|``````/%\%%%%%/%\`````/%\%%%%%/%\``````|
|`````/%%%\%%%/%%%\```/%%%\%%%/%%%\`````|
|````/%%%%%\%/%%%%%\`/%%%%%\%/%%%%%\````|
|```o%%%%%%%o%%%%%%%o%%%%%%%o%%%%%%%o```|
|```|\%%%%%/`\%%%%%/%\%%%%%/`\%%%%%/|```|
|```|`\%%%/```\%%%/%%%\%%%/```\%%%/`|```|
|```|``\%/`````\%/%%%%%\%/`````\%/``|```|
|```|```o```````o%%%%%%%o```````o```|```|
|```|```|\`````/%\%%%%%/%\`````/|```|```|
|```|```|`\```/%%%\%%%/%%%\```/`|```|```|
|```|`u`|``\`/%%%%%\%/%%%%%\`/``|`v`|```|
|```o---+---o%%%%%%%o%%%%%%%o---+---o```|
|```````|````\%%%%%/`\%%%%%/````|```````|
|```````|`````\%%%/```\%%%/`````|```````|
|```````|`du```\%/`````\%/```dv`|```````|
|```````o-------o```````o-------o```````|
|````````````````\`````/````````````````|
|`````````````````\```/`````````````````|
|``````````````````\`/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 1.2.  Ef = ((u + du)(v + dv))

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/`\``````````````````|
|`````````````````/```\`````````````````|
|````````````````/`````\````````````````|
|```````````````o```````o```````````````|
|``````````````/`\`````/`\``````````````|
|`````````````/```\```/```\`````````````|
|````````````/`````\`/`````\````````````|
|```````````o```````o```````o```````````|
|``````````/`\`````/%\`````/`\``````````|
|`````````/```\```/%%%\```/```\`````````|
|````````/`````\`/%%%%%\`/`````\````````|
|```````o```````o%%%%%%%o```````o```````|
|``````/`\`````/`\%%%%%/`\`````/`\``````|
|`````/```\```/```\%%%/```\```/```\`````|
|````/`````\`/`````\%/`````\`/`````\````|
|```o```````o```````o```````o```````o```|
|```|\`````/%\`````/%\`````/%\`````/|```|
|```|`\```/%%%\```/%%%\```/%%%\```/`|```|
|```|``\`/%%%%%\`/%%%%%\`/%%%%%\`/``|```|
|```|```o%%%%%%%o%%%%%%%o%%%%%%%o```|```|
|```|```|\%%%%%/%\%%%%%/%\%%%%%/|```|```|
|```|```|`\%%%/%%%\%%%/%%%\%%%/`|```|```|
|```|`u`|``\%/%%%%%\%/%%%%%\%/``|`v`|```|
|```o---+---o%%%%%%%o%%%%%%%o---+---o```|
|```````|````\%%%%%/`\%%%%%/````|```````|
|```````|`````\%%%/```\%%%/`````|```````|
|```````|`du```\%/`````\%/```dv`|```````|
|```````o-------o```````o-------o```````|
|````````````````\`````/````````````````|
|`````````````````\```/`````````````````|
|``````````````````\`/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 1.3.  Difference Map Df = f + Ef

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/`\``````````````````|
|`````````````````/```\`````````````````|
|````````````````/`````\````````````````|
|```````````````o```````o```````````````|
|``````````````/`\`````/`\``````````````|
|`````````````/```\```/```\`````````````|
|````````````/`````\`/`````\````````````|
|```````````o```````o```````o```````````|
|``````````/`\`````/`\`````/`\``````````|
|`````````/```\```/```\```/```\`````````|
|````````/`````\`/`````\`/`````\````````|
|```````o```````o```````o```````o```````|
|``````/`\`````/%\`````/%\`````/`\``````|
|`````/```\```/%%%\```/%%%\```/```\`````|
|````/`````\`/%%%%%\`/%%%%%\`/`````\````|
|```o```````o%%%%%%%o%%%%%%%o```````o```|
|```|\`````/%\%%%%%/`\%%%%%/%\`````/|```|
|```|`\```/%%%\%%%/```\%%%/%%%\```/`|```|
|```|``\`/%%%%%\%/`````\%/%%%%%\`/``|```|
|```|```o%%%%%%%o```````o%%%%%%%o```|```|
|```|```|\%%%%%/%\`````/%\%%%%%/|```|```|
|```|```|`\%%%/%%%\```/%%%\%%%/`|```|```|
|```|`u`|``\%/%%%%%\`/%%%%%\%/``|`v`|```|
|```o---+---o%%%%%%%o%%%%%%%o---+---o```|
|```````|````\%%%%%/`\%%%%%/````|```````|
|```````|`````\%%%/```\%%%/`````|```````|
|```````|`du```\%/`````\%/```dv`|```````|
|```````o-------o```````o-------o```````|
|````````````````\`````/````````````````|
|`````````````````\```/`````````````````|
|``````````````````\`/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 1.4.  Linear Proxy df for Df

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/`\``````````````````|
|`````````````````/```\`````````````````|
|````````````````/`````\````````````````|
|```````````````o```````o```````````````|
|``````````````/`\`````/`\``````````````|
|`````````````/```\```/```\`````````````|
|````````````/`````\`/`````\````````````|
|```````````o```````o```````o```````````|
|``````````/`\`````/%\`````/`\``````````|
|`````````/```\```/%%%\```/```\`````````|
|````````/`````\`/%%%%%\`/`````\````````|
|```````o```````o%%%%%%%o```````o```````|
|``````/`\`````/%\%%%%%/%\`````/`\``````|
|`````/```\```/%%%\%%%/%%%\```/```\`````|
|````/`````\`/%%%%%\%/%%%%%\`/`````\````|
|```o```````o%%%%%%%o%%%%%%%o```````o```|
|```|\`````/`\%%%%%/%\%%%%%/`\`````/|```|
|```|`\```/```\%%%/%%%\%%%/```\```/`|```|
|```|``\`/`````\%/%%%%%\%/`````\`/``|```|
|```|```o```````o%%%%%%%o```````o```|```|
|```|```|\`````/`\%%%%%/`\`````/|```|```|
|```|```|`\```/```\%%%/```\```/`|```|```|
|```|`u`|``\`/`````\%/`````\`/``|`v`|```|
|```o---+---o```````o```````o---+---o```|
|```````|````\`````/`\`````/````|```````|
|```````|`````\```/```\```/`````|```````|
|```````|`du```\`/`````\`/```dv`|```````|
|```````o-------o```````o-------o```````|
|````````````````\`````/````````````````|
|`````````````````\```/`````````````````|
|``````````````````\`/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 1.5.  Remainder rf = Df + df

Computation Summary for g<u, v> = ((u, v))

Exercise for the Reader.

Note 19

I'd never rob readers of exercise ...
but for my ain sense of an ending ---

Computation Summary for g<u, v> = ((u, v))

Figure 2.1 expands g = ((u, v)) over [u, v] to get
the equivalent exclusive disjunction u v + (u)(v).

Figure 2.2 expands Eg = ((u + du, v + dv)) over [u, v] to arrive at
Eg = uv((du, dv)) + u(v)(du, dv) + (u)v (du, dv) + (u)(v)((du, dv)).

Eg tells you what you would have to do, from where you are in the
universe [u, v], if you want to end up in a place where g is true.
In this case, where the prevailing proposition g is ((u, v)), the
component uv((du, dv)) of Eg tells you this:  If u and v are both
true where you are, then change either both or neither u and v at
the same time, and you will attain a place where ((u, v)) is true.

Figure 2.3 expands Dg over [u, v] to obtain the following formula:
Dg = uv (du, dv) + u(v)(du, dv) + (u)v (du, dv) + (u)(v) (du, dv).

Dg tells you what you would have to do, from where you are in the
universe [u, v], if you want to bring about a change in the value
of g, that is, if you want to get to a place where the value of g
is different from what it is where you are.  In the present case,
where the ruling proposition g is ((u, v)), the term uv (du, dv)
of Dg tells you this:  If u and v are both true where you are,
then you would have to change one or the other but not both
u and v in order to reach a place where the value of g is
different from what it is where you are.

Figure 2.4 approximates Dg in the proxy of the linear proposition
dg = uv (du, dv) + u(v)(du, dv) + (u)v (du, dv) + (u)(v) (du, dv).
Noting the cast of the constant factor (du, dv) distributed over
the expansion of a tautology, dg may be digested as dg = (du, dv).

Figure 2.5 shows what remains of the difference map Dg
when the first order linear contribution dg is removed,
and this is nothing but nothing at all, leaving rg = 0.

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/%\``````````````````|
|`````````````````/%%%\`````````````````|
|````````````````/%%%%%\````````````````|
|```````````````o%%%%%%%o```````````````|
|``````````````/%\%%%%%/%\``````````````|
|`````````````/%%%\%%%/%%%\`````````````|
|````````````/%%%%%\%/%%%%%\````````````|
|```````````o%%%%%%%o%%%%%%%o```````````|
|``````````/`\%%%%%/%\%%%%%/`\``````````|
|`````````/```\%%%/%%%\%%%/```\`````````|
|````````/`````\%/%%%%%\%/`````\````````|
|```````o```````o%%%%%%%o```````o```````|
|``````/`\`````/`\%%%%%/`\`````/`\``````|
|`````/```\```/```\%%%/```\```/```\`````|
|````/`````\`/`````\%/`````\`/`````\````|
|```o```````o```````o```````o```````o```|
|```|\`````/`\`````/%\`````/`\`````/|```|
|```|`\```/```\```/%%%\```/```\```/`|```|
|```|``\`/`````\`/%%%%%\`/`````\`/``|```|
|```|```o```````o%%%%%%%o```````o```|```|
|```|```|\`````/%\%%%%%/%\`````/|```|```|
|```|```|`\```/%%%\%%%/%%%\```/`|```|```|
|```|`u`|``\`/%%%%%\%/%%%%%\`/``|`v`|```|
|```o---+---o%%%%%%%o%%%%%%%o---+---o```|
|```````|````\%%%%%/%\%%%%%/````|```````|
|```````|`````\%%%/%%%\%%%/`````|```````|
|```````|`du```\%/%%%%%\%/```dv`|```````|
|```````o-------o%%%%%%%o-------o```````|
|````````````````\%%%%%/````````````````|
|`````````````````\%%%/`````````````````|
|``````````````````\%/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 2.1.  g = ((u, v))

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/%\``````````````````|
|`````````````````/%%%\`````````````````|
|````````````````/%%%%%\````````````````|
|```````````````o%%%%%%%o```````````````|
|``````````````/`\%%%%%/`\``````````````|
|`````````````/```\%%%/```\`````````````|
|````````````/`````\%/`````\````````````|
|```````````o```````o```````o```````````|
|``````````/%\`````/%\`````/%\``````````|
|`````````/%%%\```/%%%\```/%%%\`````````|
|````````/%%%%%\`/%%%%%\`/%%%%%\````````|
|```````o%%%%%%%o%%%%%%%o%%%%%%%o```````|
|``````/`\%%%%%/`\%%%%%/`\%%%%%/`\``````|
|`````/```\%%%/```\%%%/```\%%%/```\`````|
|````/`````\%/`````\%/`````\%/`````\````|
|```o```````o```````o```````o```````o```|
|```|\`````/%\`````/%\`````/%\`````/|```|
|```|`\```/%%%\```/%%%\```/%%%\```/`|```|
|```|``\`/%%%%%\`/%%%%%\`/%%%%%\`/``|```|
|```|```o%%%%%%%o%%%%%%%o%%%%%%%o```|```|
|```|```|\%%%%%/`\%%%%%/`\%%%%%/|```|```|
|```|```|`\%%%/```\%%%/```\%%%/`|```|```|
|```|`u`|``\%/`````\%/`````\%/``|`v`|```|
|```o---+---o```````o```````o---+---o```|
|```````|````\`````/%\`````/````|```````|
|```````|`````\```/%%%\```/`````|```````|
|```````|`du```\`/%%%%%\`/```dv`|```````|
|```````o-------o%%%%%%%o-------o```````|
|````````````````\%%%%%/````````````````|
|`````````````````\%%%/`````````````````|
|``````````````````\%/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 2.2.  Eg = ((u + du, v + dv))

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/`\``````````````````|
|`````````````````/```\`````````````````|
|````````````````/`````\````````````````|
|```````````````o```````o```````````````|
|``````````````/%\`````/%\``````````````|
|`````````````/%%%\```/%%%\`````````````|
|````````````/%%%%%\`/%%%%%\````````````|
|```````````o%%%%%%%o%%%%%%%o```````````|
|``````````/%\%%%%%/`\%%%%%/%\``````````|
|`````````/%%%\%%%/```\%%%/%%%\`````````|
|````````/%%%%%\%/`````\%/%%%%%\````````|
|```````o%%%%%%%o```````o%%%%%%%o```````|
|``````/`\%%%%%/`\`````/`\%%%%%/`\``````|
|`````/```\%%%/```\```/```\%%%/```\`````|
|````/`````\%/`````\`/`````\%/`````\````|
|```o```````o```````o```````o```````o```|
|```|\`````/%\`````/`\`````/%\`````/|```|
|```|`\```/%%%\```/```\```/%%%\```/`|```|
|```|``\`/%%%%%\`/`````\`/%%%%%\`/``|```|
|```|```o%%%%%%%o```````o%%%%%%%o```|```|
|```|```|\%%%%%/%\`````/%\%%%%%/|```|```|
|```|```|`\%%%/%%%\```/%%%\%%%/`|```|```|
|```|`u`|``\%/%%%%%\`/%%%%%\%/``|`v`|```|
|```o---+---o%%%%%%%o%%%%%%%o---+---o```|
|```````|````\%%%%%/`\%%%%%/````|```````|
|```````|`````\%%%/```\%%%/`````|```````|
|```````|`du```\%/`````\%/```dv`|```````|
|```````o-------o```````o-------o```````|
|````````````````\`````/````````````````|
|`````````````````\```/`````````````````|
|``````````````````\`/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 2.3.  Difference Map Dg = g + Eg

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/`\``````````````````|
|`````````````````/```\`````````````````|
|````````````````/`````\````````````````|
|```````````````o```````o```````````````|
|``````````````/%\`````/%\``````````````|
|`````````````/%%%\```/%%%\`````````````|
|````````````/%%%%%\`/%%%%%\````````````|
|```````````o%%%%%%%o%%%%%%%o```````````|
|``````````/%\%%%%%/`\%%%%%/%\``````````|
|`````````/%%%\%%%/```\%%%/%%%\`````````|
|````````/%%%%%\%/`````\%/%%%%%\````````|
|```````o%%%%%%%o```````o%%%%%%%o```````|
|``````/`\%%%%%/`\`````/`\%%%%%/`\``````|
|`````/```\%%%/```\```/```\%%%/```\`````|
|````/`````\%/`````\`/`````\%/`````\````|
|```o```````o```````o```````o```````o```|
|```|\`````/%\`````/`\`````/%\`````/|```|
|```|`\```/%%%\```/```\```/%%%\```/`|```|
|```|``\`/%%%%%\`/`````\`/%%%%%\`/``|```|
|```|```o%%%%%%%o```````o%%%%%%%o```|```|
|```|```|\%%%%%/%\`````/%\%%%%%/|```|```|
|```|```|`\%%%/%%%\```/%%%\%%%/`|```|```|
|```|`u`|``\%/%%%%%\`/%%%%%\%/``|`v`|```|
|```o---+---o%%%%%%%o%%%%%%%o---+---o```|
|```````|````\%%%%%/`\%%%%%/````|```````|
|```````|`````\%%%/```\%%%/`````|```````|
|```````|`du```\%/`````\%/```dv`|```````|
|```````o-------o```````o-------o```````|
|````````````````\`````/````````````````|
|`````````````````\```/`````````````````|
|``````````````````\`/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 2.4.  Linear Proxy dg for Dg

o---------------------------------------o
|```````````````````````````````````````|
|```````````````````o```````````````````|
|``````````````````/`\``````````````````|
|`````````````````/```\`````````````````|
|````````````````/`````\````````````````|
|```````````````o```````o```````````````|
|``````````````/`\`````/`\``````````````|
|`````````````/```\```/```\`````````````|
|````````````/`````\`/`````\````````````|
|```````````o```````o```````o```````````|
|``````````/`\`````/`\`````/`\``````````|
|`````````/```\```/```\```/```\`````````|
|````````/`````\`/`````\`/`````\````````|
|```````o```````o```````o```````o```````|
|``````/`\`````/`\`````/`\`````/`\``````|
|`````/```\```/```\```/```\```/```\`````|
|````/`````\`/`````\`/`````\`/`````\````|
|```o```````o```````o```````o```````o```|
|```|\`````/`\`````/`\`````/`\`````/|```|
|```|`\```/```\```/```\```/```\```/`|```|
|```|``\`/`````\`/`````\`/`````\`/``|```|
|```|```o```````o```````o```````o```|```|
|```|```|\`````/`\`````/`\`````/|```|```|
|```|```|`\```/```\```/```\```/`|```|```|
|```|`u`|``\`/`````\`/`````\`/``|`v`|```|
|```o---+---o```````o```````o---+---o```|
|```````|````\`````/`\`````/````|```````|
|```````|`````\```/```\```/`````|```````|
|```````|`du```\`/`````\`/```dv`|```````|
|```````o-------o```````o-------o```````|
|````````````````\`````/````````````````|
|`````````````````\```/`````````````````|
|``````````````````\`/``````````````````|
|```````````````````o```````````````````|
|```````````````````````````````````````|
o---------------------------------------o
Figure 2.5.  Remainder rg = Dg + dg

| Have I carved enough, my lord --
| Child, you are a bone.
|
| Leonard Cohen, "Teachers" (1967)

Note 20

In my work on "Differential Logic and Dynamic Systems",
I found it useful to develop several different ways of
visualizing logical transformations, indeed, I devised
four distinct styles of picture for the job.  Thus far
in our work on the mapping F : [u, v] -> [u, v], we've
been making use of what I call the "areal view" of the
extended universe of discourse, [u, v, du, dv], but as
the number of dimensions climbs beyond four, it's time
to bid this genre adieu, and look for a style that can
scale a little better.  At any rate, before we proceed
any further, let's first assemble the information that
we have gathered about F from several different angles,
and see if it can be fitted into a coherent picture of
the transformation F : <u, v> ~> <((u)(v)), ((u, v))>.

In our first crack at the transformation F, we simply
plotted the state transitions and applied the utterly
stock technique of calculating the finite differences.

Orbit 1.  u v
o-----o-----o
| ` ` | d d |
| u v | u v |
o=====o=====o
| 1 1 | 0 0 |
| " " | " " |
o-----o-----o

A quick inspection of the first Table suggests a rule
to cover the case when u = v = 1, namely, du = dv = 0.
To put it another way, the Table characterizes Orbit 1
by means of the data:  <u, v, du, dv>  =  <1, 1, 0, 0>.
Last but not least, yet another way to convey the same
information is by means of the (first order) extended
proposition:  u v (du)(dv).

Orbit 2.  (u v)
o-----o-----o-----o
| ` ` | ` ` | d d |
| ` ` | d d | 2 2 |
| u v | u v | u v |
o=====o=====o=====o
| 0 0 | 0 1 | 1 0 |
| 0 1 | 1 1 | 1 1 |
| 1 0 | 0 0 | 0 0 |
| " " | " " | " " |
o-----o-----o-----o

A finer combing of the second Table brings to mind
a rule that partly covers the remaining cases, that is,
du = v, dv = (u).  To vary the formulation, this Table
characterizes Orbit 2 by means of the following vector
equation:  <du, dv> = <v, (u)>.  This much information
about Orbit 2 is also encapsulated by the (first order)
extended proposition, (uv)((du, v))(dv, u), which says
that u and v are not both true at the same time, while
du is equal in value to v, and dv is the opposite of u.
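The orbits and finite differences in the two Tables can be
checked mechanically.  Here is a minimal sketch in Python,
assuming the boolean reading ((u)(v)) = "u or v" and
((u, v)) = "u equals v", with du, dv taken as exclusive-or
differences between a state and its successor; the function
names F and differences are mine:

```python
def F(u, v):
    # F<u, v> = <((u)(v)), ((u, v))> : <u or v, u equals v>
    return (u | v, int(u == v))

def differences(u, v):
    # du, dv are the exclusive-or differences to the next state
    u1, v1 = F(u, v)
    return (u ^ u1, v ^ v1)

# Orbit 1:  u v = 1 1 is a fixed point, so du = dv = 0.
# Orbit 2:  the three states satisfying (u v) step as in the Table,
# ending in the fixed point u v = 1 0.
```

Running differences over the four states reproduces the du, dv
columns of the two Tables above.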

Note 21

By way of providing a simple illustration of Cook's Theorem,
namely, that "Propositional Satisfiability is NP-Complete",
I will describe one way to translate finite approximations
of turing machines into propositional expressions, using
the cactus language syntax for propositional calculus,
which I will explain in more detail as we proceed.

Notation:

  Stilt(k)  =
  space and time limited turing machine,
  with k units of space and k units of time.

  Stunt(k)  =
  space and time limited turing machine,
  for computing the parity of a bit string,
  with number of tape cells of input equal to k.

I will follow the pattern of discussion in the book
by Herbert Wilf, 'Algorithms and Complexity' (1986),
pp. 188-201, but translate his logical formalism into
cactus language, which is more efficient in regard to
the number of propositional clauses that are required.

A turing machine for computing the parity of a bit string
is described by means of the following Figure and Table.

o-------------------------------------------------o
|                                                 |
|                     1/1/+1                      |
|                    -------->                    |
|                /\ /         \ /\                |
|       0/0/+1  ^  0           1  ^  0/0/+1       |
|                \/|\         /|\/                |
|                  | <-------- |                  |
|          #/#/-1  |  1/1/+1   |  #/#/-1          |
|                  |           |                  |
|                  v           v                  |
|                  #           *                  |
|                                                 |
o-------------------------------------------------o
Figure 21-a.  Parity Machine

Table 21-b.  Parity Machine
o-------o--------o-------------o---------o------------o
| State | Symbol | Next Symbol | Ratchet | Next State |
|   Q   |   S    |     S'      |   dR    |     Q'     |
o-------o--------o-------------o---------o------------o
|   0   |   0    |     0       |   +1    |     0      |
|   0   |   1    |     1       |   +1    |     1      |
|   0   |   #    |     #       |   -1    |     #      |
|   1   |   0    |     0       |   +1    |     1      |
|   1   |   1    |     1       |   +1    |     0      |
|   1   |   #    |     #       |   -1    |     *      |
o-------o--------o-------------o---------o------------o

The TM has a "finite automaton" (FA) as one component.
Let us refer to this particular FA by the name of "M".

The "tape-head" (that is, the "read-unit") will be called "H".
The "registers" are also called "tape-cells" or "tape-squares".
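As a check on the Figure and Table, here is a minimal sketch
of the parity machine in Python.  The dictionary mirrors
Table 21-b row for row; the function name, and the convention
of reading the resting state q_# as even parity and q_* as
odd, are my own glosses (for a one-bit input this agrees with
the symbol resting under the tape head H):

```python
# Transition table from Table 21-b:
# (state, symbol) -> (next symbol, head move, next state)
TABLE = {
    ('0', '0'): ('0', +1, '0'),
    ('0', '1'): ('1', +1, '1'),
    ('0', '#'): ('#', -1, '#'),
    ('1', '0'): ('0', +1, '1'),
    ('1', '1'): ('1', +1, '0'),
    ('1', '#'): ('#', -1, '*'),
}

def parity_machine(bits):
    """Run the machine of Table 21-b on the tape '#' + bits + '#'."""
    tape = list('#' + bits + '#')
    head, state = 1, '0'
    while state in ('0', '1'):
        symbol, move, state = TABLE[(state, tape[head])]
        tape[head] = symbol
        head += move
    # Resting state q_# marks even parity, q_* marks odd parity.
    return 0 if state == '#' else 1
```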

Note 22

To see how each finite approximation to a given turing machine
can be given a purely propositional description, one fixes the
parameter k and limits the rest of the discussion to describing
Stilt(k), which is not really a full-fledged TM anymore but just
a finite automaton in disguise.

In this example, for the sake of a minimal illustration,
we choose k = 2, and discuss Stunt(2).  Since the zeroth
tape cell and the last tape cell are occupied with the
bof and eof marks "#", this amounts to only one digit
of significant computation.

To translate Stunt(2) into propositional form we
use the following collection of basic propositions,
boolean variables, or logical features, depending on
what one prefers to call them:

The basic propositions for describing the
"present state function" QF : P -> Q are
these:

   p0_q#, p0_q*, p0_q0, p0_q1,
   p1_q#, p1_q*, p1_q0, p1_q1,
   p2_q#, p2_q*, p2_q0, p2_q1,
   p3_q#, p3_q*, p3_q0, p3_q1.

The proposition of the form pi_qj says:

   At the point-in-time p_i,
   the finite machine M is in the state q_j.

The basic propositions for describing the
"present register function" RF : P -> R
are these:

   p0_r0, p0_r1, p0_r2, p0_r3,
   p1_r0, p1_r1, p1_r2, p1_r3,
   p2_r0, p2_r1, p2_r2, p2_r3,
   p3_r0, p3_r1, p3_r2, p3_r3.

The proposition of the form pi_rj says:

   At the point-in-time p_i,
   the tape-head H is on the tape-cell r_j.

The basic propositions for describing the
"present symbol function" SF : P -> (R -> S)
are these:

   p0_r0_s#, p0_r0_s*, p0_r0_s0, p0_r0_s1,
   p0_r1_s#, p0_r1_s*, p0_r1_s0, p0_r1_s1,
   p0_r2_s#, p0_r2_s*, p0_r2_s0, p0_r2_s1,
   p0_r3_s#, p0_r3_s*, p0_r3_s0, p0_r3_s1,
   p1_r0_s#, p1_r0_s*, p1_r0_s0, p1_r0_s1,
   p1_r1_s#, p1_r1_s*, p1_r1_s0, p1_r1_s1,
   p1_r2_s#, p1_r2_s*, p1_r2_s0, p1_r2_s1,
   p1_r3_s#, p1_r3_s*, p1_r3_s0, p1_r3_s1,
   p2_r0_s#, p2_r0_s*, p2_r0_s0, p2_r0_s1,
   p2_r1_s#, p2_r1_s*, p2_r1_s0, p2_r1_s1,
   p2_r2_s#, p2_r2_s*, p2_r2_s0, p2_r2_s1,
   p2_r3_s#, p2_r3_s*, p2_r3_s0, p2_r3_s1,
   p3_r0_s#, p3_r0_s*, p3_r0_s0, p3_r0_s1,
   p3_r1_s#, p3_r1_s*, p3_r1_s0, p3_r1_s1,
   p3_r2_s#, p3_r2_s*, p3_r2_s0, p3_r2_s1,
   p3_r3_s#, p3_r3_s*, p3_r3_s0, p3_r3_s1.

The proposition of the form pi_rj_sk says:

   At the point-in-time p_i,
   the tape-cell r_j bears the mark s_k.
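The three families of basic propositions are plain Cartesian
products of their index sets, so they can be generated rather
than listed by hand.  A sketch, assuming nothing beyond the
naming scheme above:

```python
from itertools import product

P = range(4)              # points in time p0 .. p3
Q = ['#', '*', '0', '1']  # states of the finite machine M
R = range(4)              # tape cells r0 .. r3
S = ['#', '*', '0', '1']  # tape symbols

state_vars  = [f"p{i}_q{j}"      for i, j    in product(P, Q)]
head_vars   = [f"p{i}_r{j}"      for i, j    in product(P, R)]
symbol_vars = [f"p{i}_r{j}_s{k}" for i, j, k in product(P, R, S)]
```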

Note 23

Given but a single free square on the tape, there are just
two different sets of initial conditions for Stunt(2), the
finite approximation to the parity turing machine that we
are presently considering.

Initial Conditions for Tape Input "0"

The following conjunction of 5 basic propositions
describes the initial conditions when Stunt(2) is
started with an input of "0" in its free square:

   p0_q0

   p0_r1

   p0_r0_s#
   p0_r1_s0
   p0_r2_s#

This conjunction of basic propositions may be read as follows:

   At time p_0, M is in the state q_0, and 
   At time p_0, H is reading cell r_1, and
   At time p_0, cell r_0 contains "#", and
   At time p_0, cell r_1 contains "0", and
   At time p_0, cell r_2 contains "#".

Initial Conditions for Tape Input "1"

The following conjunction of 5 basic propositions
describes the initial conditions when Stunt(2) is
started with an input of "1" in its free square:

   p0_q0

   p0_r1

   p0_r0_s#
   p0_r1_s1
   p0_r2_s#

This conjunction of basic propositions may be read as follows:

   At time p_0, M is in the state q_0, and
   At time p_0, H is reading cell r_1, and
   At time p_0, cell r_0 contains "#", and
   At time p_0, cell r_1 contains "1", and
   At time p_0, cell r_2 contains "#".

Note 24

A complete description of Stunt(2) in propositional form is obtained by
conjoining one of the above choices for initial conditions with all of
the following sets of propositions, which serve in effect as a simple
type of "declarative program", telling us all that we need to know
about the anatomy and behavior of the truncated TM in question.

Mediate Conditions:

   ( p0_q# ( p1_q# ))
   ( p0_q* ( p1_q* ))

   ( p1_q# ( p2_q# ))
   ( p1_q* ( p2_q* ))

Terminal Conditions:

   (( p2_q# )( p2_q* ))

State Partition:

   (( p0_q0 ),( p0_q1 ),( p0_q# ),( p0_q* ))
   (( p1_q0 ),( p1_q1 ),( p1_q# ),( p1_q* ))
   (( p2_q0 ),( p2_q1 ),( p2_q# ),( p2_q* ))

Register Partition:

   (( p0_r0 ),( p0_r1 ),( p0_r2 ))
   (( p1_r0 ),( p1_r1 ),( p1_r2 ))
   (( p2_r0 ),( p2_r1 ),( p2_r2 ))

Symbol Partition:

   (( p0_r0_s0 ),( p0_r0_s1 ),( p0_r0_s# ))
   (( p0_r1_s0 ),( p0_r1_s1 ),( p0_r1_s# ))
   (( p0_r2_s0 ),( p0_r2_s1 ),( p0_r2_s# ))

   (( p1_r0_s0 ),( p1_r0_s1 ),( p1_r0_s# ))
   (( p1_r1_s0 ),( p1_r1_s1 ),( p1_r1_s# ))
   (( p1_r2_s0 ),( p1_r2_s1 ),( p1_r2_s# ))

   (( p2_r0_s0 ),( p2_r0_s1 ),( p2_r0_s# ))
   (( p2_r1_s0 ),( p2_r1_s1 ),( p2_r1_s# ))
   (( p2_r2_s0 ),( p2_r2_s1 ),( p2_r2_s# ))

Interaction Conditions:

   (( p0_r0 ) p0_r0_s0 ( p1_r0_s0 ))
   (( p0_r0 ) p0_r0_s1 ( p1_r0_s1 ))
   (( p0_r0 ) p0_r0_s# ( p1_r0_s# ))

   (( p0_r1 ) p0_r1_s0 ( p1_r1_s0 ))
   (( p0_r1 ) p0_r1_s1 ( p1_r1_s1 ))
   (( p0_r1 ) p0_r1_s# ( p1_r1_s# ))

   (( p0_r2 ) p0_r2_s0 ( p1_r2_s0 ))
   (( p0_r2 ) p0_r2_s1 ( p1_r2_s1 ))
   (( p0_r2 ) p0_r2_s# ( p1_r2_s# ))

   (( p1_r0 ) p1_r0_s0 ( p2_r0_s0 ))
   (( p1_r0 ) p1_r0_s1 ( p2_r0_s1 ))
   (( p1_r0 ) p1_r0_s# ( p2_r0_s# ))

   (( p1_r1 ) p1_r1_s0 ( p2_r1_s0 ))
   (( p1_r1 ) p1_r1_s1 ( p2_r1_s1 ))
   (( p1_r1 ) p1_r1_s# ( p2_r1_s# ))

   (( p1_r2 ) p1_r2_s0 ( p2_r2_s0 ))
   (( p1_r2 ) p1_r2_s1 ( p2_r2_s1 ))
   (( p1_r2 ) p1_r2_s# ( p2_r2_s# ))

Transition Relations:

   ( p0_q0  p0_r1  p0_r1_s0  ( p1_q0  p1_r2  p1_r1_s0 ))
   ( p0_q0  p0_r1  p0_r1_s1  ( p1_q1  p1_r2  p1_r1_s1 ))
   ( p0_q0  p0_r1  p0_r1_s#  ( p1_q#  p1_r0  p1_r1_s# ))
   ( p0_q0  p0_r2  p0_r2_s#  ( p1_q#  p1_r1  p1_r2_s# ))

   ( p0_q1  p0_r1  p0_r1_s0  ( p1_q1  p1_r2  p1_r1_s0 ))
   ( p0_q1  p0_r1  p0_r1_s1  ( p1_q0  p1_r2  p1_r1_s1 ))
   ( p0_q1  p0_r1  p0_r1_s#  ( p1_q*  p1_r0  p1_r1_s# ))
   ( p0_q1  p0_r2  p0_r2_s#  ( p1_q*  p1_r1  p1_r2_s# ))

   ( p1_q0  p1_r1  p1_r1_s0  ( p2_q0  p2_r2  p2_r1_s0 ))
   ( p1_q0  p1_r1  p1_r1_s1  ( p2_q1  p2_r2  p2_r1_s1 ))
   ( p1_q0  p1_r1  p1_r1_s#  ( p2_q#  p2_r0  p2_r1_s# ))
   ( p1_q0  p1_r2  p1_r2_s#  ( p2_q#  p2_r1  p2_r2_s# ))

   ( p1_q1  p1_r1  p1_r1_s0  ( p2_q1  p2_r2  p2_r1_s0 ))
   ( p1_q1  p1_r1  p1_r1_s1  ( p2_q0  p2_r2  p2_r1_s1 ))
   ( p1_q1  p1_r1  p1_r1_s#  ( p2_q*  p2_r0  p2_r1_s# ))
   ( p1_q1  p1_r2  p1_r2_s#  ( p2_q*  p2_r1  p2_r2_s# ))
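Each Transition Relation clause instantiates one row of
Table 21-b at a time step and a head configuration, so the
sixteen clauses can be generated from the machine table.
A sketch; the helper names are mine, and the choice of head
configurations is read off from the clauses above:

```python
# Table 21-b, repeated here so the sketch is self-contained:
# (state, symbol) -> (next symbol, head move, next state)
TABLE = {
    ('0', '0'): ('0', +1, '0'),
    ('0', '1'): ('1', +1, '1'),
    ('0', '#'): ('#', -1, '#'),
    ('1', '0'): ('0', +1, '1'),
    ('1', '1'): ('1', +1, '0'),
    ('1', '#'): ('#', -1, '*'),
}

def transition(i, q, r, s):
    """Clause: if at time pi the machine is in state q with the head
    on cell r bearing symbol s, then at time p(i+1) the outcome of
    the corresponding table row holds."""
    s1, move, q1 = TABLE[(q, s)]
    return (f"( p{i}_q{q}  p{i}_r{r}  p{i}_r{r}_s{s}  "
            f"( p{i+1}_q{q1}  p{i+1}_r{r + move}  p{i+1}_r{r}_s{s1} ))")

# Times 0 and 1, states 0 and 1, and the head configurations that
# can actually arise:  cell r1 under any symbol, cell r2 under "#".
transition_relations = [transition(i, q, r, s)
                        for i in (0, 1)
                        for q in ('0', '1')
                        for r, symbols in ((1, '01#'), (2, '#'))
                        for s in symbols]
```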

Note 25

Interpretation of the Propositional Program

Let us now run through the propositional specification of Stunt(2),
our truncated TM, and paraphrase what it says in ordinary language.

Mediate Conditions:

   ( p0_q# ( p1_q# ))
   ( p0_q* ( p1_q* ))

   ( p1_q# ( p2_q# ))
   ( p1_q* ( p2_q* ))

In the interpretation of the cactus language for propositional logic
that we are using here, an expression of the form "(p (q))" expresses
a conditional, an implication, or an if-then proposition, commonly read
as:  "not p without q", "if p then q", "p implies q", "p => q", and so on.

A text string expression of the form "(p (q))" corresponds
to a graph-theoretic data-structure of the following form:

o---------------------------------------o
| ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` p ` q ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` o---o ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` | ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` @ ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
o---------------------------------------o
| ` ` ` ` ` ` ` ( p ( q ))` ` ` ` ` ` ` |
o---------------------------------------o

Taken together, the Mediate Conditions state the following:

   If M at p_0 is in state q_#, then M at p_1 is in state q_#, and
   If M at p_0 is in state q_*, then M at p_1 is in state q_*, and
   If M at p_1 is in state q_#, then M at p_2 is in state q_#, and
   If M at p_1 is in state q_*, then M at p_2 is in state q_*.

Note 26

Interpretation of the Propositional Program (cont.)

Terminal Conditions:

   (( p2_q# )( p2_q* ))

In cactus syntax, an expression of the form "((p)(q))"
expresses the disjunction "p or q".  The corresponding
cactus graph, here just a tree, has the following shape:

o---------------------------------------o
| ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` p ` q ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` o ` o ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` `\`/` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` ` o ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` ` | ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` ` @ ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
o---------------------------------------o
| ` ` ` ` ` ` ` ((p) (q)) ` ` ` ` ` ` ` |
o---------------------------------------o

In effect, the Terminal Conditions state the following:

   At time p_2, M is in state q_#, or
   At time p_2, M is in state q_*.

Note 27

Interpretation of the Propositional Program (cont.)

State Partition:

   (( p0_q0 ),( p0_q1 ),( p0_q# ),( p0_q* ))
   (( p1_q0 ),( p1_q1 ),( p1_q# ),( p1_q* ))
   (( p2_q0 ),( p2_q1 ),( p2_q# ),( p2_q* ))

In cactus syntax, an expression of the form "((e_1),(e_2),(...),(e_k))"
expresses the fact that "exactly one of the e_j is true, for j = 1 to k".
Expressions of this form are called "universal partition" expressions, and
the corresponding "painted and rooted cactus" (PARC) has the following shape:

o---------------------------------------o
| ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` e_1 ` e_2 ` ... ` e_k ` ` ` ` |
| ` ` ` ` `o` ` `o` ` ` ` ` `o` ` ` ` ` |
| ` ` ` ` `|` ` `|` ` ` ` ` `|` ` ` ` ` |
| ` ` ` ` `o-----o--- ... ---o` ` ` ` ` |
| ` ` ` ` ` \ ` ` ` ` ` ` ` / ` ` ` ` ` |
| ` ` ` ` ` `\` ` ` ` ` ` `/` ` ` ` ` ` |
| ` ` ` ` ` ` \ ` ` ` ` ` / ` ` ` ` ` ` |
| ` ` ` ` ` ` `\` ` ` ` `/` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` \ ` ` ` / ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` `\` ` `/` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` \ ` / ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` `\`/` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` ` @ ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
o---------------------------------------o
| ` ` ` ((e_1),(e_2),(...),(e_k)) ` ` ` |
o---------------------------------------o

The State Partition segment of the propositional program
consists of three universal partition expressions which,
taken in conjunction, express the condition that M has to
be in one and only one of its states at each point in time
under consideration.  In short, we have the constraint:

   At each of the points in time p_i, for i in the set {0, 1, 2},
   M can be in exactly one state q_j, for j in the set {0, 1, #, *}.
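All three connective shapes met so far, the implication
"(p (q))", the disjunction "((p)(q))", and the universal
partition "((e_1),(e_2),(...),(e_k))", fall out of one
uniform rule:  expressions written in series denote their
conjunction, and a lobe "(e_1, ..., e_k)" is true just in
case exactly one of its comma-separated arguments is false.
Here is a minimal sketch of an evaluator built on that
single rule; the parser and all function names are mine:

```python
import re

def tokenize(text):
    # Variable names may contain letters, digits, '_', '#', '*'.
    return re.findall(r"[(),]|[A-Za-z0-9_#*]+", text)

def parse(tokens, pos=0):
    """Return (sequence, next_pos).  A sequence is a list whose
    items are variable names or lobes; a lobe is a tagged list
    of argument sequences."""
    seq = []
    while pos < len(tokens) and tokens[pos] not in (")", ","):
        if tokens[pos] == "(":
            args, pos = [], pos + 1
            while True:
                arg, pos = parse(tokens, pos)
                args.append(arg)
                if tokens[pos] == ",":
                    pos += 1
                else:
                    break
            assert tokens[pos] == ")"
            seq.append(("lobe", args))
            pos += 1
        else:
            seq.append(tokens[pos])
            pos += 1
    return seq, pos

def holds(seq, env):
    """A sequence in series denotes the conjunction of its items."""
    return all(item(x, env) for x in seq)

def item(x, env):
    if isinstance(x, str):
        return bool(env[x])
    # Lobe rule: true iff exactly one argument sequence is false.
    _, args = x
    return sum(not holds(a, env) for a in args) == 1

def cactus(text, env):
    seq, _ = parse(tokenize(text))
    return holds(seq, env)
```

Under this rule "(q)" is negation, "(p (q))" is implication,
"(u, v)" is inequality (exclusive disjunction), and the
partition form says that exactly one of the e_j is true.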

Note 28

Interpretation of the Propositional Program (cont.)

Register Partition:

   (( p0_r0 ),( p0_r1 ),( p0_r2 ))
   (( p1_r0 ),( p1_r1 ),( p1_r2 ))
   (( p2_r0 ),( p2_r1 ),( p2_r2 ))

The Register Partition segment of the propositional program
consists of three universal partition expressions which,
taken in conjunction, say that the read head H must be
reading one and only one of the registers or tape cells
available to it at each point in time under consideration.
In sum:

   At each of the points in time p_i, for i = 0, 1, 2,
   H is reading exactly one cell r_j, for j = 0, 1, 2.

Note 29

Interpretation of the Propositional Program (cont.)

Symbol Partition:

   (( p0_r0_s0 ),( p0_r0_s1 ),( p0_r0_s# ))
   (( p0_r1_s0 ),( p0_r1_s1 ),( p0_r1_s# ))
   (( p0_r2_s0 ),( p0_r2_s1 ),( p0_r2_s# ))

   (( p1_r0_s0 ),( p1_r0_s1 ),( p1_r0_s# ))
   (( p1_r1_s0 ),( p1_r1_s1 ),( p1_r1_s# ))
   (( p1_r2_s0 ),( p1_r2_s1 ),( p1_r2_s# ))

   (( p2_r0_s0 ),( p2_r0_s1 ),( p2_r0_s# ))
   (( p2_r1_s0 ),( p2_r1_s1 ),( p2_r1_s# ))
   (( p2_r2_s0 ),( p2_r2_s1 ),( p2_r2_s# ))

The Symbol Partition segment of the propositional program for Stunt(2)
consists of nine universal partition expressions which, taken in
conjunction, stipulate that there has to be one and only one symbol
in each of the registers at each point in time under consideration.
In short, we have:

   At each of the points in time p_i, for i in {0, 1, 2},
   in each of the tape registers r_j, for j in {0, 1, 2}, 
   there can be exactly one sign s_k, for k in {0, 1, #}.
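The partition clauses are regular enough to generate
verbatim.  Here is a sketch that reproduces the State,
Register, and Symbol Partition segments character for
character; the helper name is mine:

```python
def partition(terms):
    # ((t1),(t2),...,(tk)) : exactly one of the terms is true
    return "(" + ",".join(f"( {t} )" for t in terms) + ")"

state_partition    = [partition(f"p{i}_q{j}" for j in "01#*")
                      for i in range(3)]
register_partition = [partition(f"p{i}_r{j}" for j in range(3))
                      for i in range(3)]
symbol_partition   = [partition(f"p{i}_r{j}_s{k}" for k in "01#")
                      for i in range(3) for j in range(3)]
```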

Note 30

Interpretation of the Propositional Program (cont.)

Interaction Conditions:

   (( p0_r0 ) p0_r0_s0 ( p1_r0_s0 ))
   (( p0_r0 ) p0_r0_s1 ( p1_r0_s1 ))
   (( p0_r0 ) p0_r0_s# ( p1_r0_s# ))

   (( p0_r1 ) p0_r1_s0 ( p1_r1_s0 ))
   (( p0_r1 ) p0_r1_s1 ( p1_r1_s1 ))
   (( p0_r1 ) p0_r1_s# ( p1_r1_s# ))

   (( p0_r2 ) p0_r2_s0 ( p1_r2_s0 ))
   (( p0_r2 ) p0_r2_s1 ( p1_r2_s1 ))
   (( p0_r2 ) p0_r2_s# ( p1_r2_s# ))

   (( p1_r0 ) p1_r0_s0 ( p2_r0_s0 ))
   (( p1_r0 ) p1_r0_s1 ( p2_r0_s1 ))
   (( p1_r0 ) p1_r0_s# ( p2_r0_s# ))

   (( p1_r1 ) p1_r1_s0 ( p2_r1_s0 ))
   (( p1_r1 ) p1_r1_s1 ( p2_r1_s1 ))
   (( p1_r1 ) p1_r1_s# ( p2_r1_s# ))

   (( p1_r2 ) p1_r2_s0 ( p2_r2_s0 ))
   (( p1_r2 ) p1_r2_s1 ( p2_r2_s1 ))
   (( p1_r2 ) p1_r2_s# ( p2_r2_s# ))

In briefest terms, the Interaction Conditions merely express
the circumstance that the mark on a tape cell cannot change
between two points in time unless the tape head is over the
cell in question at the initial one of those points in time.
All that we have to do is to see how they manage to say this.

Consider a cactus expression of the following form:

   (( p<i>_r<j> ) p<i>_r<j>_s<k> ( p<i+1>_r<j>_s<k> ))

This expression has the corresponding cactus graph:

o---------------------------------------o
| ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` p<i>_r<j> ` p<i+1>_r<j>_s<k>` |
| ` ` ` ` ` ` ` ` o ` o ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` `\`/` ` ` ` ` ` ` ` ` |
| ` `p<i>_r<j>_s<k> o ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` ` | ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` ` @ ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
o---------------------------------------o

A propositional expression of this form can be read as follows:

   IF:

   At the time p<i>, the tape cell r<j> bears the mark s<k>,

   BUT it is NOT the case that:

   At the time p<i>, the tape head is on the tape cell r<j>,

   THEN:

   At the time p<i+1>, the tape cell r<j> bears the mark s<k>.

The eighteen clauses of the Interaction Conditions simply impose
one such constraint on symbol changes for each combination of the
times p_0, p_1, registers r_0, r_1, r_2, and symbols s_0, s_1, s_#.
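Since the eighteen clauses all follow this one pattern, they
too can be generated verbatim.  A sketch; the helper name is
mine:

```python
def interaction(i, j, k):
    # (( pi_rj ) pi_rj_sk ( p(i+1)_rj_sk )) :
    # if cell rj bears sk at time pi and the head is elsewhere,
    # then rj still bears sk at time p(i+1).
    return f"(( p{i}_r{j} ) p{i}_r{j}_s{k} ( p{i+1}_r{j}_s{k} ))"

interaction_conditions = [interaction(i, j, k)
                          for i in range(2)
                          for j in range(3)
                          for k in "01#"]
```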

Note 31

Interpretation of the Propositional Program (cont.)

Transition Relations:

   ( p0_q0  p0_r1  p0_r1_s0  ( p1_q0  p1_r2  p1_r1_s0 ))
   ( p0_q0  p0_r1  p0_r1_s1  ( p1_q1  p1_r2  p1_r1_s1 ))
   ( p0_q0  p0_r1  p0_r1_s#  ( p1_q#  p1_r0  p1_r1_s# ))
   ( p0_q0  p0_r2  p0_r2_s#  ( p1_q#  p1_r1  p1_r2_s# ))

   ( p0_q1  p0_r1  p0_r1_s0  ( p1_q1  p1_r2  p1_r1_s0 ))
   ( p0_q1  p0_r1  p0_r1_s1  ( p1_q0  p1_r2  p1_r1_s1 ))
   ( p0_q1  p0_r1  p0_r1_s#  ( p1_q*  p1_r0  p1_r1_s# ))
   ( p0_q1  p0_r2  p0_r2_s#  ( p1_q*  p1_r1  p1_r2_s# ))

   ( p1_q0  p1_r1  p1_r1_s0  ( p2_q0  p2_r2  p2_r1_s0 ))
   ( p1_q0  p1_r1  p1_r1_s1  ( p2_q1  p2_r2  p2_r1_s1 ))
   ( p1_q0  p1_r1  p1_r1_s#  ( p2_q#  p2_r0  p2_r1_s# ))
   ( p1_q0  p1_r2  p1_r2_s#  ( p2_q#  p2_r1  p2_r2_s# ))

   ( p1_q1  p1_r1  p1_r1_s0  ( p2_q1  p2_r2  p2_r1_s0 ))
   ( p1_q1  p1_r1  p1_r1_s1  ( p2_q0  p2_r2  p2_r1_s1 ))
   ( p1_q1  p1_r1  p1_r1_s#  ( p2_q*  p2_r0  p2_r1_s# ))
   ( p1_q1  p1_r2  p1_r2_s#  ( p2_q*  p2_r1  p2_r2_s# ))

The Transition Relation segment of the propositional program
for Stunt(2) consists of sixteen implication statements with
complex antecedents and consequents.  Taken together, these
give propositional expression to the TM Figure and Table
that were given at the outset.

Just by way of a single example, consider the clause:

   ( p0_q0  p0_r1  p0_r1_s1  ( p1_q1  p1_r2  p1_r1_s1 ))

This complex implication statement can be read to say:

   IF:

   At time p_0, M is in the state q_0, and
   At time p_0, H is reading cell r_1, and
   At time p_0, cell r_1 contains "1",

   THEN:

   At time p_1, M is in the state q_1, and
   At time p_1, H is reading cell r_2, and
   At time p_1, cell r_1 contains "1".

Note 32

Interpretation of the Propositional Program (cont.)

The propositional program for Stunt(2) uses the following set
of (9 + 12 + 36) = 57 basic propositions or boolean variables:

   p0_r0, p0_r1, p0_r2,
   p1_r0, p1_r1, p1_r2,
   p2_r0, p2_r1, p2_r2.

   p0_q#, p0_q*, p0_q0, p0_q1,
   p1_q#, p1_q*, p1_q0, p1_q1,
   p2_q#, p2_q*, p2_q0, p2_q1.

   p0_r0_s#, p0_r0_s*, p0_r0_s0, p0_r0_s1,
   p0_r1_s#, p0_r1_s*, p0_r1_s0, p0_r1_s1,
   p0_r2_s#, p0_r2_s*, p0_r2_s0, p0_r2_s1,

   p1_r0_s#, p1_r0_s*, p1_r0_s0, p1_r0_s1,
   p1_r1_s#, p1_r1_s*, p1_r1_s0, p1_r1_s1,
   p1_r2_s#, p1_r2_s*, p1_r2_s0, p1_r2_s1,

   p2_r0_s#, p2_r0_s*, p2_r0_s0, p2_r0_s1,
   p2_r1_s#, p2_r1_s*, p2_r1_s0, p2_r1_s1,
   p2_r2_s#, p2_r2_s*, p2_r2_s0, p2_r2_s1.

This means that the propositional program itself is nothing more or
less than a single proposition or a boolean function P : B^57 -> B.

An assignment of boolean values to the above set of boolean variables
is called an "interpretation" of P, and any interpretation of P that
makes the proposition P : B^57 -> B evaluate to 1 is referred to as
a "satisfying interpretation" of the proposition P.  Another way to
specify interpretations, instead of giving them as bit vectors in
B^57 and trying to remember some arbitrary ordering of variables,
is to give them in the form of "singular propositions", that is,
a conjunction of the form "e_1 & ... & e_57" where each e_j is
either "v_j" or "(v_j)", that is, either the assertion or the
negation of the boolean variable v_j, as j runs from 1 to 57.
Even more briefly, the same information can be communicated
simply by giving the conjunction of the asserted variables,
with the understanding that each of the others is negated.

A satisfying interpretation of the proposition P supplies us
with all the information of a complete execution history for
the corresponding program, and so all we have to do in order
to get the output of the program P is to read off the proper
part of the data from the expression of this interpretation.

Note 33

Interpretation of the Propositional Program (concl.)

One component of the Theme One program that I wrote some years ago
finds all the satisfying interpretations of propositions expressed
in cactus syntax.  It's not a polynomial time algorithm, as you may
guess, but it was just barely efficient enough to do this example
in the 500 K of spare memory that I had on an old 286 PC in about
1989, so I will give you the actual outputs from those trials.

Output Conditions for Tape Input "0"

Let P_0 be the proposition that we get by conjoining
the proposition that describes the initial conditions
for tape input "0" with the proposition that describes
the truncated turing machine Stunt(2).  As it turns out,
P_0 has a single satisfying interpretation, and this is
represented as a singular proposition in terms of its
positive logical features in the following display:

o-------------------------------------------------o
| ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| p0_q0 ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| `p0_r1` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` p0_r0_s#` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` `p0_r1_s0 ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` p0_r2_s#` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` `p1_q0` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` p1_r2 ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` `p1_r2_s# ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` p1_r0_s#` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` `p1_r1_s0 ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` p2_q# ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` `p2_r1` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` p2_r0_s#` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` `p2_r1_s0 ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` p2_r2_s#` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
o-------------------------------------------------o

The Output Conditions for Tape Input "0" can be read as follows:

   At the time p_0, M is in the state q_0, and
   At the time p_0, H is reading cell r_1, and
   At the time p_0, cell r_0 contains "#", and
   At the time p_0, cell r_1 contains "0", and
   At the time p_0, cell r_2 contains "#", and

   At the time p_1, M is in the state q_0, and
   At the time p_1, H is reading cell r_2, and
   At the time p_1, cell r_0 contains "#", and
   At the time p_1, cell r_1 contains "0", and
   At the time p_1, cell r_2 contains "#", and

   At the time p_2, M is in the state q_#, and
   At the time p_2, H is reading cell r_1, and
   At the time p_2, cell r_0 contains "#", and
   At the time p_2, cell r_1 contains "0", and
   At the time p_2, cell r_2 contains "#".

The output of Stunt(2) being the symbol that rests under
the tape head H if and when the machine M reaches one of
its resting states, we get the result that Parity(0) = 0.

Output Conditions for Tape Input "1"

Let P_1 be the proposition that we get by conjoining
the proposition that describes the initial conditions
for tape input "1" with the proposition that describes
the truncated turing machine Stunt(2).  As it turns out,
P_1 has a single satisfying interpretation, and this is
represented as a singular proposition in terms of its
positive logical features in the following display:

o-------------------------------------------------o
| ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| p0_q0 ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| `p0_r1` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` p0_r0_s#` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` `p0_r1_s1 ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` p0_r2_s#` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` `p1_q1` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` p1_r2 ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` `p1_r2_s# ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` p1_r0_s#` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` `p1_r1_s1 ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` p2_q* ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` `p2_r1` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` p2_r0_s#` ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` `p2_r1_s1 ` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` p2_r2_s#` ` ` ` ` ` ` ` ` ` ` ` ` |
| ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` ` |
o-------------------------------------------------o

The Output Conditions for Tape Input "1" can be read as follows:

   At the time p_0, M is in the state q_0, and
   At the time p_0, H is reading cell r_1, and
   At the time p_0, cell r_0 contains "#", and
   At the time p_0, cell r_1 contains "1", and
   At the time p_0, cell r_2 contains "#", and

   At the time p_1, M is in the state q_1, and
   At the time p_1, H is reading cell r_2, and
   At the time p_1, cell r_0 contains "#", and
   At the time p_1, cell r_1 contains "1", and
   At the time p_1, cell r_2 contains "#", and

   At the time p_2, M is in the state q_*, and
   At the time p_2, H is reading cell r_1, and
   At the time p_2, cell r_0 contains "#", and
   At the time p_2, cell r_1 contains "1", and
   At the time p_2, cell r_2 contains "#".

The output of Stunt(2) being the symbol that rests under
the tape head H when and if the machine M reaches one of
its resting states, we get the result that Parity(1) = 1.
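Reading the output off a satisfying interpretation can
itself be mechanized:  find the cell that H rests on at the
final time p_2, then the symbol that cell bears.  A sketch;
the function name is mine, and the feature list is
transcribed from the first display above:

```python
def read_output(features):
    """Given the positive literals of a satisfying interpretation,
    return the symbol under the tape head H at the final time p2."""
    feats = set(features)
    cell = next(j for j in range(3) if f"p2_r{j}" in feats)
    return next(k for k in "01#" if f"p2_r{cell}_s{k}" in feats)

# Positive features of the satisfying interpretation for input "0":
features_0 = ["p0_q0", "p0_r1", "p0_r0_s#", "p0_r1_s0", "p0_r2_s#",
              "p1_q0", "p1_r2", "p1_r2_s#", "p1_r0_s#", "p1_r1_s0",
              "p2_q#", "p2_r1", "p2_r0_s#", "p2_r1_s0", "p2_r2_s#"]
```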

Work Area

DATA 20.  http://forum.wolframscience.com/showthread.php?postid=791#post791

Let's see how this information about the transformation F,
arrived at by eyeballing the raw data, comports with what
we derived through a more systematic symbolic computation.

The results of the various operator actions that we have just
computed are summarized in Tables 66-i and 66-ii from my paper,
and I have attached these as a text file below.

Table 66-i.  Computation Summary for f<u, v> = ((u)(v))
o--------------------------------------------------------------------------------o
|                                                                                |
| !e!f  =  uv.    1      + u(v).    1      + (u)v.    1      + (u)(v).    0      |
|                                                                                |
|   Ef  =  uv. (du  dv)  + u(v). (du (dv)) + (u)v.((du) dv)  + (u)(v).((du)(dv)) |
|                                                                                |
|   Df  =  uv.  du  dv   + u(v).  du (dv)  + (u)v. (du) dv   + (u)(v).((du)(dv)) |
|                                                                                |
|   df  =  uv.    0      + u(v).  du       + (u)v.      dv   + (u)(v). (du, dv)  |
|                                                                                |
|   rf  =  uv.  du  dv   + u(v).  du  dv   + (u)v.  du  dv   + (u)(v).  du  dv   |
|                                                                                |
o--------------------------------------------------------------------------------o

Table 66-ii.  Computation Summary for g<u, v> = ((u, v))
o--------------------------------------------------------------------------------o
|                                                                                |
| !e!g  =  uv.    1      + u(v).    0      + (u)v.    0      + (u)(v).    1      |
|                                                                                |
|   Eg  =  uv.((du, dv)) + u(v). (du, dv)  + (u)v. (du, dv)  + (u)(v).((du, dv)) |
|                                                                                |
|   Dg  =  uv. (du, dv)  + u(v). (du, dv)  + (u)v. (du, dv)  + (u)(v). (du, dv)  |
|                                                                                |
|   dg  =  uv. (du, dv)  + u(v). (du, dv)  + (u)v. (du, dv)  + (u)(v). (du, dv)  |
|                                                                                |
|   rg  =  uv.    0      + u(v).    0      + (u)v.    0      + (u)(v).    0      |
|                                                                                |
o--------------------------------------------------------------------------------o
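Tables 66-i and 66-ii can be spot-checked by brute force
over B^4, reading f = ((u)(v)) as "u or v", g = ((u, v)) as
"u equals v", the enlargement operator E as evaluation at
the displaced point <u + du, v + dv>, and the difference
operator D as Eh + h for a map h, with + taken as addition
mod 2 (exclusive-or).  A sketch under those readings:

```python
def f(u, v): return u | v         # ((u)(v)) : u or v
def g(u, v): return int(u == v)   # ((u, v)) : u equals v

def E(h):
    # Enlargement operator: evaluate h at the displaced point.
    return lambda u, v, du, dv: h(u ^ du, v ^ dv)

def D(h):
    # Difference operator: Dh = Eh + h (addition mod 2).
    return lambda u, v, du, dv: E(h)(u, v, du, dv) ^ h(u, v)

# Spot checks against the Tables:  on the cell u v, Table 66-i
# gives Df = du dv, and Table 66-ii gives Dg = (du, dv), that is,
# du unequal to dv, on every cell of the underlying universe.
```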


o---------------------------------------o
|                                       |
|                   o                   |
|                  / \                  |
|                 /   \                 |
|                /     \                |
|               o       o               |
|              / \     / \              |
|             /   \   /   \             |
|            /     \ /     \            |
|           o       o       o           |
|          / \     / \     / \          |
|         /   \   /   \   /   \         |
|        /     \ /     \ /     \        |
|       o       o       o       o       |
|      / \     / \     / \     / \      |
|     /   \   /   \   /   \   /   \     |
|    /     \ /     \ /     \ /     \    |
|   o       o       o       o       o   |
|   |\     / \     / \     / \     /|   |
|   | \   /   \   /   \   /   \   / |   |
|   |  \ /     \ /     \ /     \ /  |   |
|   |   o       o       o       o   |   |
|   |   |\     / \     / \     /|   |   |
|   |   | \   /   \   /   \   / |   |   |
|   | u |  \ /     \ /     \ /  | v |   |
|   o---+---o       o       o---+---o   |
|       |    \     / \     /    |       |
|       |     \   /   \   /     |       |
|       | du   \ /     \ /   dv |       |
|       o-------o       o-------o       |
|                \     /                |
|                 \   /                 |
|                  \ /                  |
|                   o                   |
|                                       |
o---------------------------------------o

Discussion

PD = Philip Dutton

PD: I've been watching your posts.

PD: I am not an expert on logic infrastructures but I find the posts
    interesting (despite not understanding much of it).  I am like
    the diagrams.  I have recently been trying to understand CA's
    using a particular perspective:  sinks and sources.  I think
    that all CA's are simply combinations of sinks and sources.
    How they interact (or intrude into each other's domains)
    would most likely be a result of the rules (and initial
    configuration of on or off cells).

PD: Anyway, to be short, I "see" diamond shapes quite often in
    your diagrams.  Triangles (either up or down) or diamonds
    (combination of an up and down triangle) make me think
    solely of sinks and sources.  I think of the diamond to
    be a source which, during the course of progression,
    is expanding (because it is producing) and then starts
    to act as a sink  (because it consumes) -- and hence the
    diamond.  I can't stop thinking about sinks and sources in
    CA's and so I thought I would ask you if there is some way
    to tie the two worlds together (CA's of sinks and sources
    together with your differential constructs).

PD: Any thoughts?

Yes, I'm hoping that there's a lot of stuff analogous to
R-world dynamics to be discovered in this B-world variety,
indeed, that's kind of why I set out on this investigation --
oh, gee, has it been that long? -- I guess about 1989 or so,
when I started to see this "differential logic" angle on what
I had previously studied in systems theory as the "qualitative
approach to differential equations".  I think we used to use the
words "attractor" and "basin" more often than "sink", but a source
is still a source as time goes by, and I do remember using the word
"sink" a lot when I was a freshperson in physics, before I got logic.
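To give the sink/attractor talk above one concrete toy instance: a boolean update rule can be read as a map on the state space {0,1}^n, and each state traced forward until it recurs, which sorts the states into basins around their sinks.  The update rule below is hypothetical, chosen only for illustration, and is not a rule from this thread.

```python
# Toy sketch: treat a boolean update rule as a dynamical system and
# sort every state into the basin of the recurrent state it falls into
# (a fixed point, i.e. a "sink", when the dynamics has no longer cycles).
from itertools import product

def update(state):
    """One step of a toy dynamics on (u, v): u' = u AND v, v' = u OR v."""
    u, v = state
    return (u & v, u | v)

def find_basins(step, n=2):
    """Map each recurrent state to the list of states that reach it."""
    basins = {}
    for start in product((0, 1), repeat=n):
        s, seen = start, set()
        while s not in seen:      # follow the trajectory until it repeats
            seen.add(s)
            s = step(s)
        basins.setdefault(s, []).append(start)
    return basins

basins = find_basins(update)
for sink, basin in sorted(basins.items()):
    print(sink, "<-", sorted(basin))
```

For this particular rule, (0,0) and (1,1) attract only themselves, while (1,0) drains into the fixed point (0,1), so the "sink" (0,1) has a two-state basin.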

I have spent the last 15 years doing a funny mix of practice in stats
and theory in math, but I did read early works by Von Neumann, Burks,
Ulam, and later stuff by Holland on CA's.  Still, it may be a while
before I have re-heated my concrete intuitions about them in the
NKS way of thinking.

There are some fractal-looking pictures that emerge when
I turn to "higher order propositional expressions" (HOPE's).
I have discussed this topic elsewhere on the web and can look
it up now if you are interested, but I am trying to make my
e-positions somewhat clearer for the NKS forum than I have
tried to do before.

But do not hesitate to dialogue all this stuff on the boards,
as that's what always seems to work the best.  What I've found
works best for me, as I can hardly remember what I was writing
last month without Google, is to archive a copy at one of the
other Google-visible discussion lists that I'm on at present.

Document History

DATA.  Differential Analytic Turing Automata

Ontology List

01.  http://suo.ieee.org/ontology/msg05457.html
02.  http://suo.ieee.org/ontology/msg05458.html
03.  http://suo.ieee.org/ontology/msg05459.html
04.  http://suo.ieee.org/ontology/msg05460.html
05.  http://suo.ieee.org/ontology/msg05461.html
06.  http://suo.ieee.org/ontology/msg05462.html
07.  http://suo.ieee.org/ontology/msg05463.html
08.  http://suo.ieee.org/ontology/msg05464.html
09.  http://suo.ieee.org/ontology/msg05465.html
10.  http://suo.ieee.org/ontology/msg05466.html
11.  http://suo.ieee.org/ontology/msg05467.html
12.  http://suo.ieee.org/ontology/msg05469.html
13.  http://suo.ieee.org/ontology/msg05470.html
14.  http://suo.ieee.org/ontology/msg05471.html
15.  http://suo.ieee.org/ontology/msg05472.html
16.  http://suo.ieee.org/ontology/msg05473.html
17.  http://suo.ieee.org/ontology/msg05474.html
18.  http://suo.ieee.org/ontology/msg05475.html
19.  http://suo.ieee.org/ontology/msg05476.html
20.  http://suo.ieee.org/ontology/msg05479.html

Inquiry List

00.  http://stderr.org/pipermail/inquiry/2004-February/thread.html#1228
00.  http://stderr.org/pipermail/inquiry/2004-March/thread.html#1235
00.  http://stderr.org/pipermail/inquiry/2004-March/thread.html#1240
00.  http://stderr.org/pipermail/inquiry/2004-June/thread.html#1630

01.  http://stderr.org/pipermail/inquiry/2004-February/001228.html
02.  http://stderr.org/pipermail/inquiry/2004-February/001230.html
03.  http://stderr.org/pipermail/inquiry/2004-February/001231.html
04.  http://stderr.org/pipermail/inquiry/2004-February/001232.html
05.  http://stderr.org/pipermail/inquiry/2004-February/001233.html
06.  http://stderr.org/pipermail/inquiry/2004-February/001234.html
07.  http://stderr.org/pipermail/inquiry/2004-March/001235.html
08.  http://stderr.org/pipermail/inquiry/2004-March/001236.html
09.  http://stderr.org/pipermail/inquiry/2004-March/001237.html
10.  http://stderr.org/pipermail/inquiry/2004-March/001238.html
11.  http://stderr.org/pipermail/inquiry/2004-March/001240.html
12.  http://stderr.org/pipermail/inquiry/2004-March/001242.html
13.  http://stderr.org/pipermail/inquiry/2004-March/001243.html
14.  http://stderr.org/pipermail/inquiry/2004-March/001244.html
15.  http://stderr.org/pipermail/inquiry/2004-March/001245.html
16.  http://stderr.org/pipermail/inquiry/2004-March/001246.html
17.  http://stderr.org/pipermail/inquiry/2004-March/001247.html
18.  http://stderr.org/pipermail/inquiry/2004-March/001248.html
19.  http://stderr.org/pipermail/inquiry/2004-March/001249.html
20.  http://stderr.org/pipermail/inquiry/2004-March/001255.html
21.  http://stderr.org/pipermail/inquiry/2004-June/001630.html
22.  http://stderr.org/pipermail/inquiry/2004-June/001631.html
23.  http://stderr.org/pipermail/inquiry/2004-June/001632.html
24.  http://stderr.org/pipermail/inquiry/2004-June/001633.html
25.  http://stderr.org/pipermail/inquiry/2004-June/001634.html
26.  http://stderr.org/pipermail/inquiry/2004-June/001635.html
27.  http://stderr.org/pipermail/inquiry/2004-June/001636.html
28.  http://stderr.org/pipermail/inquiry/2004-June/001637.html
29.  http://stderr.org/pipermail/inquiry/2004-June/001638.html
30.  http://stderr.org/pipermail/inquiry/2004-June/001639.html
31.  http://stderr.org/pipermail/inquiry/2004-June/001640.html
32.  http://stderr.org/pipermail/inquiry/2004-June/001641.html
33.  http://stderr.org/pipermail/inquiry/2004-June/001642.html

NKS Forum

00.  http://forum.wolframscience.com/showthread.php?threadid=228
01.  http://forum.wolframscience.com/showthread.php?postid=664#post664
02.  http://forum.wolframscience.com/showthread.php?postid=666#post666
03.  http://forum.wolframscience.com/showthread.php?postid=677#post677
04.  http://forum.wolframscience.com/showthread.php?postid=684#post684
05.  http://forum.wolframscience.com/showthread.php?postid=689#post689
06.  http://forum.wolframscience.com/showthread.php?postid=697#post697
07.  http://forum.wolframscience.com/showthread.php?postid=708#post708
08.  http://forum.wolframscience.com/showthread.php?postid=721#post721
09.  http://forum.wolframscience.com/showthread.php?postid=722#post722
10.  http://forum.wolframscience.com/showthread.php?postid=725#post725
11.  http://forum.wolframscience.com/showthread.php?postid=733#post733
12.  http://forum.wolframscience.com/showthread.php?postid=756#post756
13.  http://forum.wolframscience.com/showthread.php?postid=759#post759
14.  http://forum.wolframscience.com/showthread.php?postid=764#post764
15.  http://forum.wolframscience.com/showthread.php?postid=766#post766
16.  http://forum.wolframscience.com/showthread.php?postid=767#post767
17.  http://forum.wolframscience.com/showthread.php?postid=773#post773
18.  http://forum.wolframscience.com/showthread.php?postid=775#post775
19.  http://forum.wolframscience.com/showthread.php?postid=777#post777
20.  http://forum.wolframscience.com/showthread.php?postid=791#post791
21.  http://forum.wolframscience.com/showthread.php?postid=1458#post1458
22.  http://forum.wolframscience.com/showthread.php?postid=1461#post1461
23.  http://forum.wolframscience.com/showthread.php?postid=1463#post1463
24.  http://forum.wolframscience.com/showthread.php?postid=1464#post1464
25.  http://forum.wolframscience.com/showthread.php?postid=1467#post1467
26.  http://forum.wolframscience.com/showthread.php?postid=1469#post1469
27.  http://forum.wolframscience.com/showthread.php?postid=1470#post1470
28.  http://forum.wolframscience.com/showthread.php?postid=1471#post1471
29.  http://forum.wolframscience.com/showthread.php?postid=1473#post1473
30.  http://forum.wolframscience.com/showthread.php?postid=1475#post1475
31.  http://forum.wolframscience.com/showthread.php?postid=1479#post1479
32.  http://forum.wolframscience.com/showthread.php?postid=1489#post1489
33.  http://forum.wolframscience.com/showthread.php?postid=1490#post1490