# Four Forms Make a Universe

How could I not love a paper with this title? I’ve struggled with it for a while, and so far I’ve only managed a couple of diagrams relating to the author’s LICO (Linear Iconic) alphabet of 16 letters. However, I see that there are a few other papers by Schmeikal available on ResearchGate that look easier to understand; then again, the first of those says to read the “Four Forms” paper first!

At any rate, I present a sixteen-fold of the LICO alphabet, and another of the binary Boolean operators that are in a one-to-one mapping with LICO. There is much to understand from these papers, including much syncretism between various mathematical sixteen-folds, so please forgive me if I don’t explain it all with immediate ease. However, I believe it is well worth the effort to understand.

(Please note that the characters of the LICO alphabet are oriented so that the bottoms of the letters are downward, but the Boolean operators are oriented so that the bottoms of the equations are towards the right angles of the triangles.)

The title comes from the result that four elements of LICO can reproduce the other twelve via linear combinations. These four forms are 1) Boolean True (A or ~A), 2) A, 3) B, and 4) A=B. These are within the interior right-hand triangles in the LICO diagram. Of course, it is well known from Computer Science that the NAND operator (~A or ~B) can also generate the other fifteen operators, but it does so by multiple nested operations instead of simple Boolean arithmetic. There are several other “universal” binary gates that can do this as well.
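The spanning claim can be checked directly with ordinary 0/1 truth vectors over the four input rows (FF, FT, TF, TT). This is my own simplified encoding, not the paper’s Clifford-algebra representation, and the coefficients below are worked out by hand; the final assertion shows the contrast with NAND, which reaches the other operators only by nesting:

```python
# Truth vectors over the four input rows (FF, FT, TF, TT).
# Encoding and helper names are mine, not the paper's.
from fractions import Fraction
from itertools import product

rows = list(product([0, 1], repeat=2))

def vec(op):
    """Truth vector of a binary Boolean operator over the four rows."""
    return [op(a, b) for a, b in rows]

TRUE = vec(lambda a, b: 1)            # A or not A
A    = vec(lambda a, b: a)
B    = vec(lambda a, b: b)
EQV  = vec(lambda a, b: int(a == b))  # A = B

def combo(cT, cA, cB, cE):
    """Linear combination of the four generating forms."""
    return [cT*t + cA*a + cB*b + cE*e
            for t, a, b, e in zip(TRUE, A, B, EQV)]

half = Fraction(1, 2)
# Three of the remaining twelve operators as linear combinations:
assert combo(1, -1, 0, 0) == vec(lambda a, b: 1 - a)              # not A
assert combo(-half, half, half, half) == vec(lambda a, b: a & b)  # A and B
assert combo(half, half, half, -half) == vec(lambda a, b: a | b)  # A or B

# By contrast, NAND generates AND only by nested operations:
def nand(a, b):
    return 1 - (a & b)

assert vec(lambda a, b: a & b) == vec(lambda a, b: nand(nand(a, b), nand(a, b)))
```

The other nine operators fall out the same way, since the four vectors above form a basis of the four-dimensional space of truth vectors.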

Two other representations whose four elements can generate the other twelve via linear combinations come from Cl(3,1), the Minkowski algebra. These representations are called “Idempotents” and “Colorspace vectors”. Because of this algebra’s association with space and time in relativity, Schmeikal claims that LICO has ramifications in many far-ranging conceptualizations.

Bernd Schmeikal / Four Forms Make a Universe, in Advances in Applied Clifford Algebras (2015), Springer Basel (DOI 10.1007/s00006-015-0551-z)

Bernd Schmeikal / Free Linear Iconic Calculus – AlgLog Part 1: Adjunction, Disconfirmation and Multiplication Tables

Bernd Schmeikal / LICO a Reflexive Domain in Space-time (AlgLog Part 3)

https://en.wikipedia.org/wiki/Spacetime_algebra

[*9.145, *11.50]

<>

# The Marriage of Opposites, Part 3

Everything is dual; everything has poles; everything has its pair of opposites; like and unlike are the same; opposites are identical in nature; but different in degree.

— From The Kybalion by The Three Initiates

There are trivial truths and the great truths. The opposite of a trivial truth is plainly false. The opposite of a great truth is also true.

— Niels Bohr

I have mentioned the alchemical notion of the “Marriage of Opposites” several times (here and here). When opposites marry, what happens as a result? Do they cancel one another out, leaving just a boring average as result? Do they explode in a fiery conflagration, like matter and anti-matter releasing energy? Or do they create a new thing, something that is greater than the sum of their parts?

If opposites annihilate each other, what is the result, emptiness or a void? It is often said that nature abhors a vacuum (“horror vacui”), but I think it is far more true that the mind does. In dualistic thinking, everything that is not one thing must be its opposite. Not good is bad, not happy is sad, not black is white.

In classical logic, the Law of the Excluded Middle says that for any proposition “p”, either it is true or its negation “not p” is true. Thus, “p or not p” is necessarily true, a tautology. Similarly, their combination “p and not p”, cannot ever be true, and so is necessarily false. If one can assume “not p” and derive a contradiction, then “p” must be true (reductio ad absurdum).
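These classical claims can be checked mechanically by enumerating truth values, keeping in mind that truth tables capture only the classical reading (the helper names are my own):

```python
# Classical two-valued checks: "p or not p" is a tautology,
# "p and not p" a contradiction. (Truth tables capture only the
# classical case; intuitionistic provability is not truth-functional.)

def is_tautology(formula):
    """True when the formula holds under every assignment to p."""
    return all(formula(p) for p in (True, False))

def is_contradiction(formula):
    """True when the formula holds under no assignment to p."""
    return not any(formula(p) for p in (True, False))

excluded_middle = lambda p: p or not p
non_contradiction = lambda p: p and not p

assert is_tautology(excluded_middle)
assert is_contradiction(non_contradiction)
```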

In intuitionistic logic, one cannot deduce “p” simply from the falsity of “not p” (that is, from “not not p”); one must actually construct a proof of “p”. So “p or not p” may remain unproven, if we don’t know how to prove either disjunct. However, “p and not p” is still refutable, since proofs of both conjuncts together would yield a contradiction.

Dialetheism holds that there are truths whose opposites are also true, called “true contradictions”. Such contradictions cannot exist in classical formal logics because of the principle of explosion: if “p and not p” is true, then you can deduce anything you want and your logic breaks down. Paraconsistent logics are designed to tolerate contradictions without exploding, and much thought through the years has been dedicated to dialetheisms and their ilk. Please see the recent work by philosopher Graham Priest.

When one considers something and its opposite at the same time, how can one reach an agreement between them? In magnetism, opposite poles attract and like poles repel. All too often, opposite viewpoints vigorously repel each other instead of reaching a happy medium. Each viewpoint considers the other “false” and so they push away at each other, instead of meeting halfway in compromise.

If there is empirical evidence supporting one viewpoint and not the other, and both parties can agree to it, then problem solved. But if viewpoints are more like ideologies, and one side shows evidence that the other side dismisses, what then? Are we only left to agree to disagree? That doesn’t seem like a long term solution.

In this blog I have insinuated but not stated explicitly that a marriage of opposites can often be achieved by combining the pair with another pair of opposites. Rather than meeting in the middle at a void or an annihilation, one can reach the other side by “going around” the danger, by way of intermediates. Much like Winter reaches Summer by passing through Spring and Summer reaches Winter via Fall, this type of structure is found everywhere in human thinking.

In fact, many systems of pluralistic philosophies are built on fourfolds instead of dualities. For example, see the work of Richard McKeon, specifically this paper.

https://en.wikipedia.org/wiki/The_Kybalion

https://en.wikipedia.org/wiki/Horror_vacui_(physics)

https://en.wikipedia.org/wiki/Law_of_excluded_middle

https://en.wikipedia.org/wiki/Consequentia_mirabilis

https://en.wikipedia.org/wiki/Intuitionistic_logic

https://en.wikipedia.org/wiki/Dialetheism

https://plato.stanford.edu/entries/dialetheism/

https://en.wikipedia.org/wiki/Graham_Priest

http://www.richardmckeon.org/

[*10.32]

<>

# A Rosetta Stone

Abstract of Physics, Topology, Logic and Computation: A Rosetta Stone by John Baez and Michael Stay:

In physics, Feynman diagrams are used to reason about quantum processes. In the 1980s, it became clear that underlying these diagrams is a powerful analogy between quantum physics and topology: namely, a linear operator behaves very much like a “cobordism”. Similar diagrams can be used to reason about logic, where they represent proofs, and computation, where they represent programs. With the rise of interest in quantum cryptography and quantum computation, it became clear that there is an extensive network of analogies between physics, topology, logic and computation. In this expository paper, we make some of these analogies precise using the concept of “closed symmetric monoidal category”. We assume no prior knowledge of category theory, proof theory or computer science.

• Physics
• Logic
• Topology
• Computation

Perhaps Category Theory is a “Fifth Essence”?

https://arxiv.org/abs/0903.0340

[*9.168, *10.50]

<>

# Laws of Form

George Spencer-Brown, author of Laws of Form, recently passed away.

I’ve tried to appreciate this work in the past, but couldn’t really get started. I recently ran across the following four terms associated with the work,

• Compensation: -> (())
• Cancellation: (()) ->
• Condensation: ()() -> ()
• Confirmation: () -> ()()

Compensation and Cancellation are both considered Order, and Condensation and Confirmation are both considered Number. Number and Order are distinguished by Distinction, and the pairs of the two distinctions are distinguished by Direction.
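The two contracting rules, Cancellation and Condensation, can be sketched as string rewriting on parenthesis expressions. This is a toy model with an encoding of my own, not Spencer-Brown’s notation:

```python
# Toy string-rewriting model of the two "contracting" rules of
# Laws of Form, using plain parenthesis strings (encoding is mine).

def cancel(expr):
    """Cancellation: (()) -> nothing."""
    while "(())" in expr:
        expr = expr.replace("(())", "")
    return expr

def condense(expr):
    """Condensation: ()() -> ()."""
    while "()()" in expr:
        expr = expr.replace("()()", "()")
    return expr

def simplify(expr):
    """Alternate the two rules until a fixed point is reached."""
    prev = None
    while expr != prev:
        prev = expr
        expr = condense(cancel(expr))
    return expr

# Every expression of marks reduces to either "" (the unmarked
# state) or "()" (the marked state):
assert simplify("(())") == ""
assert simplify("()()") == "()"
assert simplify("((()))") == "()"
```

The expanding rules, Compensation and Confirmation, run the same rewrites in the opposite direction, which is why the pairs are distinguished by Direction.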

I understand Laws of Form starts with “Draw a distinction.” Perhaps I would say “Draw a distinction, then draw a distinction of that distinction.”

References:

http://www.telegraph.co.uk/obituaries/2016/09/13/george-spencer-brown-polymath-who-wrote-the-landmark-maths-book/

https://en.wikipedia.org/wiki/G._Spencer-Brown

https://en.wikipedia.org/wiki/Laws_of_Form

https://en.wikipedia.org/wiki/Distinction_(philosophy)

For my further reading:

An Esoteric Guide to Spencer Brown’s Laws of Form #4

https://larvalsubjects.wordpress.com/2011/03/19/distinction-on-deconstruction/

Notes:

Compensation (+2) (Pairs of parentheses)
Cancellation (-2) (Involutory?)
Condensation (-1) (Idempotence)
Confirmation (+1)

[*9.158]

<>

# The Semiotic Square

From Wikipedia:

The semiotic square, also known as the Greimas square, is a tool used in structural analysis of the relationships between semiotic signs through the opposition of concepts, such as feminine-masculine or beautiful-ugly, and of extending the relevant ontology.

Notes:

In an earlier post I combined an unusual representation of the semiotic square with that of the Tetralemma. Instead of that one, please use this one.

References:

https://en.wikipedia.org/wiki/Semiotic_square

[*4.84]

<>

# Propositions as Types

For almost 100 years, there have been linkages forged between certain notions of logic and of computation. As more associations have been discovered, the bonds between the two have grown stronger and richer.

• Propositions in logic can be considered equivalent to types in programming languages.
• Proofs of propositions in logic can be considered equivalent to programs of given type in computation.
• The simplification of proofs of propositions in logic can be considered equivalent to the evaluation of programs of types in computation.
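The first two correspondences can be sketched even in Python’s type-hint notation, where a pair type plays the role of conjunction and a function type plays implication. The encoding and names are my own illustration, not Wadler’s:

```python
# Curry-Howard sketch: a well-typed total function is a "proof"
# of the proposition its type expresses. (Encoding is mine.)
from typing import Callable, Tuple, TypeVar

A = TypeVar("A")
B = TypeVar("B")

def swap(proof: Tuple[A, B]) -> Tuple[B, A]:
    """A proof of (A and B) -> (B and A): evidence for both
    conjuncts, with the conjuncts exchanged."""
    a, b = proof
    return (b, a)

def modus_ponens(implication: Callable[[A], B], premise: A) -> B:
    """From proofs of A -> B and of A, a proof of B; at the
    program level this is just function application."""
    return implication(premise)

# Running the proof-term corresponds to simplifying the proof:
assert swap((1, "x")) == ("x", 1)
assert modus_ponens(lambda n: n + 1, 41) == 42
```

The third correspondence shows up in the assertions: evaluating these programs is the computational face of proof simplification.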

The separate work of various logicians and computer scientists (and their precursors) can be paired:

• Gerhard Gentzen’s work on proofs in intuitionistic natural deduction and Alonzo Church’s work on the simply typed lambda calculus.
• J. Roger Hindley’s and Robin Milner’s work on type systems for combinatory logic and programming languages, respectively.
• J.-Y. Girard’s and John Reynolds’s work on the second-order lambda calculus and parametric polymorphic programs, respectively.
• Haskell Curry’s and W. A. Howard’s work on the overall correspondence between these notions of proofs as programs and propositions as types.

Logic and computation are the sequential chains of efficient causation and actions. Propositions and types are the abstract grids of formal causation and structures. Proofs and programs are the normative cycles of final causation and functions. Simplification and evaluation are the reductive solids of material causation and parts.

References:

Philip Wadler / Propositions as Types, in Communications of the ACM, Vol. 58 No. 12 (Dec 2015) Pages 75-85.

http://cacm.acm.org/magazines/2015/12/194626-propositions-as-types/fulltext

Preprint at

Also see:

https://en.wikipedia.org/wiki/Hindley%E2%80%93Milner_type_system

https://en.wikipedia.org/wiki/System_F

https://en.wikipedia.org/wiki/Curry%E2%80%93Howard_correspondence

[*9.92-9.94]

<>

# J.-Y. Girard’s Transcendental Syntax, V2

Meaning is use.

– Ludwig Wittgenstein

If people never did silly things nothing intelligent would ever get done.

– Ludwig Wittgenstein

The latest two preprints by logician J.-Y. Girard continue his program for transcendental syntax, divided into the deterministic and non-deterministic cases. He defines transcendental syntax as the study of the conditions of the possibility of language: to begin by discovering the preliminary suppositions in the creation of a logical sentence such as a proposition or deduction.

What are the presuppositions for using propositions? Girard claims the main one is the balance between the creation and the use of words, which is at the heart of meaning. But the notion that a proposition has a well-defined meaning is a prejudice, albeit one that allows us to identify the terms of a sentence and thus to perform deductions.

Girard wants instead to find inner explanations of logical rules: explanations based on syntax instead of a semantics that correlates to a mandated “reality”. To emphasize this, he gives the term Derealism as another expression for transcendental syntax. Logical rules should have a normative aspect because of their utility, so his project appears to be one of pragmatism. Others have said that Linear Logic is the logic of the radical anti-realist.

Girard divides all of logical activity into four blocks that weave together: the Constat, the Performance (please forgive my shortening on the diagram above), L’usine (factory), and L’usage (use). These four blocks are partitioned by Kant’s analytic-synthetic and a priori-a posteriori distinctions. The analytic is said to have “no meaning”, that is, “locative”. The synthetic is said to have “meaning”, that is, “spiritual”. The a priori is said to be “implicit”, and the a posteriori is said to be “explicit”.

Can we find all the explanations we need to create logic internally? If so, perhaps it is only because of how the brain works, like how John Bolender posits that social relations described by the Relational Models Theory are created out of symmetry breaking structures of our nervous systems, which are in turn generated by our DNA. A realist would certainly say that our understanding of logical rules is enabled but also limited by our brains, whereas an idealist would say that our minds could “transcend” those limits. But it seems pragmatic to say that the mind is what the brain does.

I believe a closer analogy for the fourfold of Transcendental Syntax is to Hjelmslev’s Net than to Kant’s Analytic-Synthetic Distinction. If so, then Performance and L’usage are Content (Implicit), whereas Constat and L’usine are Expression (Explicit). Performance and Constat are Substance (Locative), and L’usine and L’usage are Form (Spiritual). Hjelmslev was a linguist who developed a theory of language as consisting of only internal rules.

Or even an analogy with Aristotle’s Four Causes, which is how I’ve arranged the first diagram: the Constat is the Material cause, the Performance is the Efficient cause, L’usine is the Formal cause, and L’usage is the Final cause. Material and Efficient causes are often considered mere matter in motion, which could be Locative, or meaningless (physical). Formal and Final could be Spiritual, or meaningful, as patterns of matter and motion, respectively.

Notes:

How can we know that a given named term is the same as another one in a different part of our formula? Rather than using names, or linking them through semantics or a well-defined meaning, we can tie terms together by their locations in our sentences and deductions.

References:

J.-Y. Girard / Transcendental syntax 1: deterministic case (January 2015 Preprint)

J.-Y. Girard / Transcendental syntax 2: non deterministic case (February 2015 Preprint)

http://iml.univ-mrs.fr/~girard/Articles.html

V. Michele Abrusci, Paolo Pistone / On Transcendental Syntax: a Kantian Program for Logic?