Laws of Thought

February 2, 2008

The "laws of thought" are said to comprise the following three: the law of identity, the law of non-contradiction, and the law of the excluded middle.

In philosophy, the law [of identity] is often mistakenly attributed to Aristotle,[1] who actually wrote:
"Now 'why a thing is itself' is a meaningless inquiry (for—to give meaning to the question 'why'—the fact or the existence of the thing must already be evident—e.g., that the moon is eclipsed—but the fact that a thing is itself is the single reason and the single cause to be given in answer to all such questions as why the man is man, or the musician musical, unless one were to answer, 'because each thing is inseparable from itself, and its being one just meant this.' This, however, is common to all things and is a short and easy way with the question.)" Metaphysics, Book VII, Part 17 (1)

In logic, the law of non-contradiction ... states, in the words of Aristotle, that
"one cannot say of something that it is and that it is not in the same respect and at the same time". (2)
[Note Aristotle's use of indices (respect & time).]

Aristotle wrote that ambiguity can arise from the use of ambiguous names, but cannot exist in the "facts" themselves:
It is impossible, then, that 'being a man' should mean precisely 'not being a man', if 'man' not only signifies something about one subject but also has one significance. … And it will not be possible to be and not to be the same thing, except in virtue of an ambiguity, just as if one whom we call 'man', and others were to call 'not-man'; but the point in question is not this, whether the same thing can at the same time be and not be a man in name, but whether it can be in fact. (Metaphysics 4.4, W.D. Ross (trans.), GBWW 8, 525–526). (3)

Of special note is the fact that Aristotle did not have language that distinguished between the use and mention of terms. The best he could do was to use a phrase like "said of something". But in his categorization system he talks about what is and what is not of a thing. In Aristotle, logic and semantics are conjoined and spoken of at once.

Since then, the distinction between use and mention, along with other developments, gives us a paradigm in which we can separate syntax, logic, and semantics. With this distinction in place, these so-called "laws" apply within the logical categorization scheme, and they apply when determining the consistency of the theory part of a model. The context of this activity is logical consistency.

When it comes to applying the theory to a domain, the combination becomes a model. This introduces a semantic context where a relation is presumed between names in the theory and objects in the domain.

The so-called "law of identity" applies to the relation of symbols to objects; in a semantic context it means that we presume a relatively invariant (over time) ability to use a symbol to designate an object. Class names, however, apply to "all objects", and we cannot verify that all objects presumed to fall under a class name satisfy the model relations (except in formal model theory). For this reason we must say that, for each x, the relation between x and r(x) remains unchanged from time1 to time2 within the period of relative invariance. This, however, applies to the individuals over which x ranges. For the class X, of which x is an arbitrary member, r(x) must be examined to determine whether or not it remains within the period of relative invariance.

In language use we presume that the symbol x refers to the object r(x) until such time as we discover otherwise. Consequently the "law of identity" in the semantic context holds conditionally on the period of relative invariance of the use of names. Without this assumption, names could never be used to perform indications. We point at something and say "moon", and the listener can repeat both the action and the verbalization. This propagates through the culture, and a relation (reference) is established between 'moon' and that which we point at when we say 'moon'. The word 'moon' (mentioned) comes to indicate the moon (used) ["disquotation", as developed by Quine]. r('moon') -> the moon, the "thing" we use the word 'moon' to indicate.
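The reference relation described above can be sketched in code. This is a minimal toy illustration, not anything from the essay itself: the dictionary stands in for the culturally established relation r between names (mentioned) and objects (used), and the object representation is an assumption chosen purely for illustration.

```python
# A toy stand-in for the culturally established reference relation r
# between mentioned symbols and the objects they are used to indicate.

# An object in the domain (just a tagged placeholder here).
the_moon = {"kind": "celestial body", "indicated_by": "pointing"}

# r maps a symbol (mentioned) to the object it indicates (used).
r = {"moon": the_moon}

def indicate(symbol):
    """Disquotation: from the word 'moon' (mentioned) to the moon (used)."""
    return r[symbol]

# r('moon') -> the moon
assert indicate("moon") is the_moon
```

The point of the sketch is only that the quoted word and the object it picks out live at different levels: the string "moon" is a key, the moon itself is a value, and r is the learned relation between them.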

Here's an example of the recent evolution of a long-standing category. From the time of its discovery until quite recently, Pluto was a member of the class planet. Recently this classification has been challenged based on examination of some of its properties. So the class planet, applied to the 9 previously-designated-as-planets bodies in this solar system, is beginning to show some variation in its "relative invariance". But we would properly express that as: planet1967 is not planet2007. It would be an incoherent abuse of language to say planet is not planet. But that is exactly what a denial of "A is A" would result in. "A is not A" is contradictory and incoherent semantics. We assign the proposition "A = not A" the value "F" in propositional calculus. Consequently we cannot, in either an "Aristotelian" or a "non-Aristotelian" frame of reference, assert "A is not A". It is incomplete, and therefore nonsense. "A is A" means the symbol is constant, and it means that we can use the symbol reliably (within the period of relative invariance) to perform a repeatable indication.
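The "planet1967 is not planet2007" point can be made concrete with time-indexed extensions. A hedged sketch, assuming a set-of-names representation of each dated class (the membership lists are the familiar historical ones):

```python
# Time-indexed extensions of the class name 'planet'. Same name,
# different dated classes, so no incoherent "planet is not planet".

planet_1967 = {"Mercury", "Venus", "Earth", "Mars", "Jupiter",
               "Saturn", "Uranus", "Neptune", "Pluto"}

# After the 2006 IAU reclassification of Pluto:
planet_2007 = planet_1967 - {"Pluto"}

assert "Pluto" in planet_1967
assert "Pluto" not in planet_2007
assert planet_1967 != planet_2007  # planet1967 is not planet2007
```

The dated subscripts do the work: each extension is internally stable, and the change shows up as a difference between two dated classes rather than as a self-contradiction within one.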

Now, let's look at the other two so-called laws: the law of non-contradiction and the law of the excluded middle. These do not apply to sets when the symbols refer to things or objects. In a semantic context, we can create some sets by denotation that satisfy these constraints, but sets in general do not. Whenever we pick out sets of objects or things from the "real world" or from our semantic reactions, they may overlap, and they may exhibit contrary characteristics.
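The contrast between the two contexts can be sketched briefly. In two-valued propositional calculus the two laws hold for every valuation; sets picked out from experience need not sort objects into exclusive bins. The color sets below are purely illustrative assumptions, with one deliberately borderline member:

```python
# Logical level: non-contradiction and excluded middle hold for
# every truth value in two-valued propositional calculus.
for p in (True, False):
    assert not (p and not p)   # non-contradiction
    assert p or not p          # excluded middle

# Semantic level: sets picked out from the world may overlap.
warm_colors = {"red", "orange", "yellow", "chartreuse"}
cool_colors = {"green", "blue", "chartreuse"}  # borderline case

# 'chartreuse' sits in both classes at once.
assert warm_colors & cool_colors == {"chartreuse"}
```

The exhaustive check over truth values is the logical-category context where the laws hold absolutely; the overlapping sets show why they cannot be assumed to carry over to objects under the reference relation.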

In a semantic context involving a relation between linguistic categories where names represent objects or "things" covered by "concepts by intuition", we cannot depend on the logical relations between the categories to hold between the objects the names refer to. This does not warrant rejection of these relations at the category level; it only warrants rejecting them as "always" applying in the context of semantic reference. All three "laws" hold absolutely in a logical category context. Identity holds during periods of relative invariance in a semantic context, but may fail when the period of relative invariance is exceeded. The excluded middle and non-contradiction technically do not apply in the world of objects under the usual "reference" relation, because they are strictly category concepts, and they are empirically disconfirmed by many different sets of examples.

A so-called "Aristotelian" approach applies at the logical levels, and these are used as part of the "non-Aristotelian" approach when semantic levels are added. The Aristotelian approach is used to ensure the map is self-consistent. The non-Aristotelian approach is used when the determined-to-be-consistent-by-Aristotelian-means map is applied in navigation, and when we go about using the navigation errors to update the map.

If we take the view that our category structure is absolute and applies always to the things in the world, then we are attempting to force our intensional structure onto reality, and this is called "intensional orientation". If we always look to the experiences and give them priority, this is called an "extensional orientation". As cultural systems, religion exhibits intensional orientation, and (empirical) science exhibits extensional orientation. Persons who insist that everything is good or bad, for or against, black or white, with us or against us, etc., exhibit a two-valued orientation, and this is consistent with applying binary categories in many (or all) aspects of life. But I would not call that an "Aristotelian" orientation. Aristotle himself had category structures as a way of organizing, and they included more than two values.

Suggesting that accepting or rejecting the so-called three laws of thought distinguishes between "Aristotelian" and "non-Aristotelian" amounts to flat or single-level thinking. That is, it applies category structures without regard for the distinction between predicate calculus and semantic models. We have multi-level thinking, with each level having its own way of handling these so-called "laws of thought". The combined multi-level process uses the rules for theory and categories, but remains prepared for them to fail at semantic levels; a consciousness of the abstracting process, which enables us to handle both levels differently at the same time, gives us our entry to the so-called "non-Aristotelian" approach.

It just happens that this approach is, and has been, the approach of modern science and the scientist in his laboratory, both with and without the benefit of general semantics terminology. Xenophanes, in the sixth century BCE, noted the uncertain relation between language and the world. Aristotle codified a system of categorization as well as a basic system of reasoning that supported categorization and allowed for strict consistency. The ancient Greeks knew that any theory must "save the appearances"; that is, it must agree with observations. But "The Church", after the fall of the Roman empire, dominated thought and suppressed science for nearly a thousand years. In the thirteenth century Roger Bacon (a Franciscan monk) emphasized the need for "experience" to "validate" what one might conclude from argument, and may be credited with initiating empiricism.


This page was updated by Ralph Kenyon on 2009/11/16 at 00:27.