{{DISPLAYTITLE:Semiotic Information}}
'''Author: [[User:Jon Awbrey|Jon Awbrey]]'''
  
'''Semiotic information''' is the information content of signs as conceived within the [[semeiotic]] or [[sign-relational]] framework developed by [[Charles Sanders Peirce]].
  
 
==Once over quickly==
 
===What's it good for?===
 
The good of information is its use in reducing our uncertainty about some issue that comes before us.  Generally speaking, uncertainty comes in several flavors, and so the information that serves to reduce uncertainty can be applied in several different ways.  The situations of uncertainty that human agents commonly find themselves facing have been investigated under many headings, literally for ages, and the classifications that subtle thinkers arrived at long before the dawn of modern information theory still have their uses in setting the stage of an introduction.
 
  
Picking an example of a subtle thinker almost at random, the philosopher-scientist Immanuel Kant divided the principal questions of human existence into three parts:
  
 
:* What's true?
 
:* What's to do?
 
:* What's to hope?
 
  
The third question is a bit too subtle for the present frame of discussion, but the first and second are easily recognizable as staking out the two main axes of information theory, namely, the dual dimensions of ''information'' and ''control''.  Roughly the same space of concerns is elsewhere spanned by the dual axes of ''competence'' and ''performance'', ''specification'' and ''optimization'', or just plain ''knowledge'' and ''skill''.
  
 
A question of ''what's true'' is a ''descriptive question'', and there exist what are called ''[[descriptive science]]s'' devoted to answering descriptive questions about any domain of phenomena that one might care to name.
 
 
A question of ''what's to do'', in other words, what must be done by way of achieving a given aim, is a ''normative question'', and there exist what are called ''[[normative science]]s'' devoted to answering normative questions about any domain of problems that one might care to address.
 
  
Since information plays its role on a stage set by uncertainty, a big part of saying what information is will necessarily involve saying what uncertainty is.  There is little chance that the vagueness of a word like ''uncertainty'', given the nuances of its ordinary, poetic, and technical uses, can be corralled by a single pen, but there do exist established models and formal theories that address definable aspects of uncertainty, and these have enough uses to make them worth looking into.
  
 
===What is information that a sign may bear it?===
 
Three more questions arise at this juncture:

:# How is a sign empowered to contain information?

:# What is the practical context of communication?

:# Why do we care about these bits of information?

A very rough answer to these questions might begin as follows:
 
Human beings are initially concerned solely with their own lives, but then a world obtrudes on their subjective existence, and so they find themselves forced to take an interest in the objective realities of its nature.
 
  
In pragmatic terms our initial aim, concern, interest, object, or ''pragma'' is expressed by the verbal infinitive ''to live'', but the infinitive is soon reified into the derivative substantial forms of ''nature'', ''reality'', ''the world'', and so on.  Against this backdrop we find ourselves cast as the protagonists on a ''scene of uncertainty''.  The situation may be pictured as a juncture from which a manifold of options fan out before us.  It may be an issue of ''truth'', ''duty'', or ''hope'', the last codifying a special type of uncertainty as to ''what regulative principle has any chance of success'', but the chief uncertainty is that we are called on to make a choice and find that we all too often have almost no clue as to which of the options is most fit to pick.
  
 
Just to make up a discrete example, let us suppose that the cardinality of this choice is a finite ''n'', and just to make it fully concrete let us say that ''n''&nbsp;=&nbsp;5.  Figure 1 affords a rough picture of the situation.
 
  
{| align="center" cellspacing="6" style="text-align:center; width:60%"
|
<pre>
o-------------------------------------------------o
|                                                 |
|            ?     ?     ?     ?     ?            |
|            o     o     o     o     o            |
|                                                 |
|              o    o    o    o    o              |
|                                                 |
|                o   o   o   o   o                |
|                                                 |
|                  o  o  o  o  o                  |
|                                                 |
|                    o o o o o                    |
|                                                 |
|                      ooooo                      |
|                                                 |
|                        O                 n = 5  |
|                                                 |
o-------------------------------------------------o
Figure 1.  Juncture of Degree 5
</pre>
|}
  
This pictures a juncture, represented by &ldquo;O&rdquo;, where there are ''n'' options for the outcome of a conduct, and we have no clue as to which it must be.  In a sense, the degree of this node, in this case ''n''&nbsp;=&nbsp;5, measures the uncertainty that we have at this point.
  
 
This is the minimal sort of setting in which a sign can make any sense at all.  A sign has significance for an agent, interpreter, or observer because its actualization, its being given or its being present, serves to reduce the uncertainty of a decision that the agent has to make, whether it concerns the actions that the agent ought to take in order to achieve some objective of interest, or whether it concerns the predicates that the agent ought to treat as being true of some object in the world.
 
 
The way that signs enter the scene is shown in Figure 2.
 
  
{| align="center" cellspacing="6" style="text-align:center; width:60%"
|
<pre>
o-------------------------------------------------o
|                                                 |
|               k_1 = 3        k_2 = 2            |
|            o-----o-----o     o-----o            |
|                 "A"           "B"               |
|              o----o----o    o----o              |
|                                                 |
|                o---o---o   o---o                |
|                                                 |
|                  o--o--o  o--o                  |
|                                                 |
|                    o-o-o o-o                    |
|                                                 |
|                      ooooo                      |
|                                                 |
|                        O                 n = 5  |
|                                                 |
o-------------------------------------------------o
Figure 2.  Partition of Degrees 3 and 2
</pre>
|}
  
 
This illustrates a situation of uncertainty that has been augmented by a classification.
 
  
In the particular pattern of classification that is shown here, the first three outcomes fall under the sign &ldquo;A&rdquo;, and the next two outcomes fall under the sign &ldquo;B&rdquo;.  If the outcomes make up a set of ''things that might be true about an object'', then the signs could be read as nomens (terms) or notions (concepts) of a relevant empirical, ontological, taxonomical, or theoretical scheme, that is, as predicates and predictions of the outcomes.  If the outcomes make up a set of ''things that might be good to do in order to achieve an objective'', then the signs could be read as bits of advice or other sorts of indicators that tell us what to do in the situation, relative to our active goals.
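
By way of a concrete illustration, the classification can be encoded in a few lines of Python (a hypothetical sketch; the outcome labels and variable names are ours, not part of the example itself):

<pre>
outcomes = [1, 2, 3, 4, 5]                      # the five options at the juncture
sign_of = {1: "A", 2: "A", 3: "A", 4: "B", 5: "B"}

def consistent_with(sign):
    """Return the options still open once the given sign is received."""
    return [o for o in outcomes if sign_of[o] == sign]

print(consistent_with("A"))                     # [1, 2, 3] -- three options remain
print(consistent_with("B"))                     # [4, 5]    -- two options remain
</pre>

Receiving a sign thus narrows the field of outcomes, which is exactly the sense in which it reduces uncertainty.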
  
 
This is the basic framework for talking about information and signs in regard to communication, decision, and the uncertainties thereof.
 
 
Just to unpack some of the many things that may be getting glossed over in this little word ''sign'', it encompasses all of the ''data of the senses'' (DOTS) that we take as informing us about inner and outer worlds, plus all of the concepts and terms that we use to argue, to communicate, to inquire, or even to speculate, both about our ontologies for beings in the worlds and about our policies for action in the world.
 
  
Here is one of the places where it is tempting to try to collapse the 3-adic sign relation into a 2-adic relation.  For if these DOTS are so closely identified with objects that we can scarcely imagine how they might be discrepant, then it will appear to us that one role of beings can be eliminated from our picture of the world.  In this event, the only things that we are required to inform ourselves about, via the inspection of these DOTS, are yet more DOTS, whether past, or present, or prospective, just more DOTS.  This is the special form to which we frequently find the idea of an information channel being reduced, namely, to a ''source'' that has nothing more to tell us about than its own conceivable conducts or its own potential issues.
  
 
As a matter of fact, at least in this discrete type of case, it would be possible to use the degree of the node as a measure of uncertainty, but it would operate as a multiplicative measure rather than the sort of additive measure that we would normally prefer.  To illustrate how this would work out, let us consider an easier example, one where the degree of the choice point is 4.
 
  
{| align="center" cellspacing="6" style="text-align:center; width:60%"
|
<pre>
o-------------------------------------------------o
|                                                 |
|            ?     ?           ?     ?            |
|            o     o           o     o            |
|                                                 |
|              o    o         o    o              |
|                                                 |
|                o   o       o   o                |
|                                                 |
|                  o  o     o  o                  |
|                                                 |
|                    o o   o o                    |
|                                                 |
|                      oo oo                      |
|                                                 |
|                        O                 n = 4  |
|                                                 |
o-------------------------------------------------o
Figure 3.  Juncture of Degree 4
</pre>
|}
  
Suppose that we contemplate making another decision after the present issue has been decided, one that has a degree of 2 in every case.  The compound situation is depicted in Figure&nbsp;4.
  
{| align="center" cellspacing="6" style="text-align:center; width:60%"
|
<pre>
o-------------------------------------------------o
|                                                 |
|          o   o o   o       o   o o   o          |
|           \ /   \ /         \ /   \ /           |
|            o     o           o     o   n_2 = 2  |
|                                                 |
|              o    o         o    o              |
|                                                 |
|                o   o       o   o                |
|                                                 |
|                  o  o     o  o                  |
|                                                 |
|                    o o   o o                    |
|                                                 |
|                      oo oo                      |
|                                                 |
|                        O               n_1 = 4  |
|                                                 |
o-------------------------------------------------o
Figure 4.  Compound Junctures of Degrees 4 and 2
</pre>
|}
  
This illustrates the fact that the compound uncertainty, 8, is the product of the two component uncertainties, 4&nbsp;times&nbsp;2.  To convert this to an additive measure, one simply takes the logarithms to a convenient base, say 2, and thus arrives at the not too astounding fact that the uncertainty of the first choice is 2 bits, the uncertainty of the next choice is 1 bit, and the compound uncertainty is 2&nbsp;+&nbsp;1&nbsp;=&nbsp;3 bits.
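
The conversion from the multiplicative to the additive measure is easy to check mechanically.  Here is a minimal sketch in Python (the variable names are ours, chosen for the example):

<pre>
from math import log2

n1, n2 = 4, 2                # degrees of the two successive junctures

compound = n1 * n2           # multiplicative measure: 4 * 2 = 8 options
bits = log2(n1) + log2(n2)   # additive measure: 2 bits + 1 bit

# The additive measure of the compound choice is the log of the product.
assert compound == 8
assert bits == log2(compound) == 3.0
</pre>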
  
 
In many ways, the provision of information, a process that reduces uncertainty, is the inverse process to the kind of uncertainty augmentation that occurs in compound decisions.  By way of illustrating this relationship, let us return to our initial example.
 
 
A set of signs enters on a setup like this as a system of ''middle terms'', a collection of signs that one may regard, aptly enough, as constellating a ''medium''.
 
  
{| align="center" cellspacing="6" style="text-align:center; width:60%"
|
<pre>
o-------------------------------------------------o
|                                                 |
|               k_1 = 3        k_2 = 2            |
|            o-----o-----o     o-----o            |
|                 "A"           "B"               |
|              o----o----o    o----o              |
|                                                 |
|                o---o---o   o---o                |
|                                                 |
|                  o--o--o  o--o                  |
|                                                 |
|                    o-o-o o-o                    |
|                                                 |
|                      ooooo                      |
|                                                 |
|                        O                 n = 5  |
|                                                 |
o-------------------------------------------------o
Figure 5.  Partition of Degrees 3 and 2
</pre>
|}
  
The ''language'' or ''medium'' here is the set of signs {&ldquo;A&rdquo;,&nbsp;&ldquo;B&rdquo;}.  On the assumption that the initial 5 outcomes are equally likely, one may associate with this language a frequency distribution (''k''<sub>1</sub>, ''k''<sub>2</sub>) = (3, 2), hence a probability distribution (''p''<sub>1</sub>, ''p''<sub>2</sub>) = (3/5, 2/5) = (0.6, 0.4), and thus define a communication ''channel''.
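
By way of illustration, the passage from frequencies to probabilities can be sketched in a few lines of Python (a hypothetical rendering, with names of our own choosing):

<pre>
from math import isclose

freq = {"A": 3, "B": 2}                     # frequencies (k_1, k_2) = (3, 2)
n = sum(freq.values())                      # 5 equally likely outcomes

prob = {s: k / n for s, k in freq.items()}  # probabilities (p_1, p_2)
assert isclose(sum(prob.values()), 1.0)     # a proper probability distribution
print(prob)                                 # {'A': 0.6, 'B': 0.4}
</pre>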
  
The most important thing here is really just to get a handle on the ''conditions for the possibility of signs making sense'', but once we have this much of a setup we find that we can begin to construct some rough and ready bits of information-theoretic furniture, like measures of uncertainty, channel capacity, and the amount of information that can be associated with the reception or the recognition of a single sign.  Still, before we get into all of this, it needs to be emphasized that, even when these measures are too ad&nbsp;hoc and insufficient to be of much use per&nbsp;se, the significance of the setup that it takes to support them is not at all diminished.
  
Consider the classification-augmented or sign-enhanced situation of uncertainty that was depicted above.  What happens if one or the other of the two signs, &ldquo;A&rdquo; or &ldquo;B&rdquo;, is observed or received, on the constant assumption that its significance is recognized on receipt?

:* If we receive &ldquo;A&rdquo; our uncertainty is reduced from <math>\log 5</math> to <math>\log 3.</math>

:* If we receive &ldquo;B&rdquo; our uncertainty is reduced from <math>\log 5</math> to <math>\log 2.</math>

It is from these characteristics that the ''information capacity'' of a communication channel can be defined, specifically, as the ''average uncertainty reduction on receiving a sign'', a formula with the splendid mnemonic &ldquo;AURORAS&rdquo;.

:* On receiving the message &ldquo;A&rdquo;, the additive measure of uncertainty is reduced from <math>\log 5</math> to <math>\log 3</math>, so the net reduction is <math>(\log 5 - \log 3).</math>

:* On receiving the message &ldquo;B&rdquo;, the additive measure of uncertainty is reduced from <math>\log 5</math> to <math>\log 2</math>, so the net reduction is <math>(\log 5 - \log 2).</math>

The average uncertainty reduction per sign of the language is computed by taking a ''weighted average'' of the reductions that occur in the channel, where the weight of each reduction is the number of options or outcomes that fall under the associated sign.

:* The uncertainty reduction of <math>(\log 5 - \log 3)\!</math> gets a weight of 3.

:* The uncertainty reduction of <math>(\log 5 - \log 2)\!</math> gets a weight of 2.

Finally, the weighted average of these two reductions is:

: <math>{1 \over {2 + 3}}(3(\log 5 - \log 3) + 2(\log 5 - \log 2))\!</math>
  
Extracting the general pattern of this calculation yields the following worksheet for computing the capacity of a 2-symbol channel with frequencies that partition as <math>n = k_1 + k_2.\!</math>
  
 
{| cellspacing="6"
| Capacity
| of a channel {&ldquo;A&rdquo;, &ldquo;B&rdquo;} that bears the odds of 60 &ldquo;A&rdquo; to 40 &ldquo;B&rdquo;
|-
| &nbsp;
| <math>=\quad {1 \over n}(k_1(\log n - \log k_1) + k_2(\log n - \log k_2))\!</math>
|-
| &nbsp;
| <math>=\quad {k_1 \over n}(\log n - \log k_1) + {k_2 \over n}(\log n - \log k_2)\!</math>
|-
| &nbsp;
| <math>=\quad -{k_1 \over n}(\log k_1 - \log n) - {k_2 \over n}(\log k_2 - \log n)\!</math>
|-
| &nbsp;
| <math>=\quad -{k_1 \over n}(\log {k_1 \over n}) - {k_2 \over n}(\log {k_2 \over n})\!</math>
|-
| &nbsp;
| <math>=\quad -(p_1 \log p_1 + p_2 \log p_2)\!</math>
|-
| &nbsp;
| <math>=\quad - (0.6 \log 0.6 + 0.4 \log 0.4)\!</math>
|-
| &nbsp;
| <math>=\quad 0.971\!</math>
|}
  
In other words, the capacity of this channel is slightly under 1 bit.  This makes intuitive sense, since  3 against 2 is a near-even split of 5, and the measure of the channel capacity or the ''entropy'' is supposed to attain its maximum of 1 bit whenever a two-way partition is 50-50, that is to say, when it's as ''uniform'' a distribution as it can be.
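
The figure of 0.971 bits is easy to verify computationally.  The following sketch in Python (our own illustration, not part of the original worksheet) computes the capacity both as the weighted average of uncertainty reductions and in the entropy form, confirming that the two agree:

<pre>
from math import log2, isclose

freq = {"A": 3, "B": 2}    # (k_1, k_2) = (3, 2)
n = sum(freq.values())     # n = 5

# Average uncertainty reduction on receiving a sign (AURORAS):
# a sign covering k outcomes reduces log2(n) to log2(k),
# and each reduction is weighted by its probability k/n.
auroras = sum((k / n) * (log2(n) - log2(k)) for k in freq.values())

# Entropy form of the same quantity: -(p_1 log p_1 + p_2 log p_2).
entropy = -sum((k / n) * log2(k / n) for k in freq.values())

assert isclose(auroras, entropy)
print(round(auroras, 3))   # 0.971
</pre>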
  
 
==Bibliography==
 
 
* [[Charles Sanders Peirce (Bibliography)]]
 
  
* Peirce, C.S. (1867), &ldquo;Upon Logical Comprehension and Extension&rdquo;, [http://www.iupui.edu/~peirce/writings/v2/w2/w2_06/v2_06.htm Online].
  
 
==See also==
 
===Related topics===

{{col-begin}}
 
* [[Boolean function]]
* [[Boolean-valued function]]
{{col-break}}
* [[Logical graph]]
* [[Logical matrix]]
* [[Propositional calculus]]
* [[Semeiotic]]
{{col-break}}
* [[Sign relation]]
* [[Triadic relation]]
{{col-end}}
  
===Related articles===

{{col-begin}}
{{col-break}}
* [http://intersci.ss.uci.edu/wiki/index.php/Peirce%27s_Logic_Of_Information Peirce's Logic Of Information]
* [http://intersci.ss.uci.edu/wiki/index.php/Information_%3D_Comprehension_%C3%97_Extension Information = Comprehension × Extension]
* [http://intersci.ss.uci.edu/wiki/index.php/Differential_Logic_:_Introduction Differential Logic : Introduction]
* [http://intersci.ss.uci.edu/wiki/index.php/Differential_Propositional_Calculus Differential Propositional Calculus]
* [http://intersci.ss.uci.edu/wiki/index.php/Differential_Logic_and_Dynamic_Systems_2.0 Differential Logic and Dynamic Systems]
{{col-break}}
* [http://intersci.ss.uci.edu/wiki/index.php/Futures_Of_Logical_Graphs Futures Of Logical Graphs]
* [http://intersci.ss.uci.edu/wiki/index.php/Propositional_Equation_Reasoning_Systems Propositional Equation Reasoning Systems]
* [http://intersci.ss.uci.edu/wiki/index.php/Prospects_for_Inquiry_Driven_Systems Prospects for Inquiry Driven Systems]
* [http://intersci.ss.uci.edu/wiki/index.php/Introduction_to_Inquiry_Driven_Systems Introduction to Inquiry Driven Systems]
* [http://intersci.ss.uci.edu/wiki/index.php/Inquiry_Driven_Systems Inquiry Driven Systems : Inquiry Into Inquiry]
{{col-end}}
  
[[Category:Charles Sanders Peirce]]
[[Category:Information Theory]]
[[Category:Inquiry]]
[[Category:Inquiry Driven Systems]]
[[Category:Logic]]
[[Category:Pragmatism]]
[[Category:Scientific Method]]
[[Category:Semiotics]]
[[Category:Sign Relations]]
