One of computer scientist and Professor Emeritus Vaughan Pratt’s most recent conference papers is on “linear process algebra,” which relates several of his previous interests: linear logic, Chu spaces, concurrent processes, events and states, and more.

The paper opens with a nice overview of computer science research concerned with concurrent processes. Computation itself divides into logical and algorithmic aspects, formal methods into logical and algebraic approaches, and concurrent computation into operational and denotational models; the author then gives a brief list of models of processes as various mathematical structures before arriving at his own theme of Chu spaces.

As an example, he presents processes as Chu spaces over the set K = {0, T, 1, X}, with names and meanings:

0: Ready

T: Transition

1: Done

X: Cancelled

and then details four binary operations on Chu spaces over K:
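As a rough sketch of the basic idea (my own minimal encoding, not the paper’s), a process can be represented as a Chu space over K: a matrix whose rows are events, whose columns are states, and whose entries record each event’s status in each state.

```python
# K, with the names and meanings from the post.
K = {"0": "ready", "T": "transition", "1": "done", "X": "cancelled"}

class ChuProcess:
    """A process as a Chu space over K: events x states -> K (sketch only)."""
    def __init__(self, events, states, entry):
        # entry: dict mapping (event, state) -> an element of K
        self.events = list(events)
        self.states = list(states)
        self.entry = dict(entry)
        assert all(v in K for v in self.entry.values())

    def matrix(self):
        # The Chu space as a matrix, rows = events, columns = states.
        return [[self.entry[(e, s)] for s in self.states] for e in self.events]

# A one-event process: ready in s0, mid-transition in s1, done in s2.
p = ChuProcess(["a"], ["s0", "s1", "s2"],
               {("a", "s0"): "0", ("a", "s1"): "T", ("a", "s2"): "1"})
```

The binary operations of the paper would then act on such matrices; I haven’t tried to reproduce them here.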

“How to Solve It” (first published in 1945) is a small volume by mathematician George Pólya describing methods of problem solving. The book suggests the following steps when solving a mathematical problem:

1. Understand the problem.
2. Devise a plan to solve it.
3. Carry out the plan.
4. Revise and extend: look back on your work.

There is a somewhat different and older classification based on the Chomsky Hierarchy (where ND stands for non-deterministic), pairing each kind of machine with the associated language or grammar it recognizes:
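The classical correspondence the diagram refers to can be tabulated as plain data (standard facts; the arrangement is mine):

```python
# Chomsky hierarchy: grammar type, recognizing machine, language class.
# (ND = non-deterministic.)
chomsky = [
    ("Type 0", "Turing machine",                "recursively enumerable"),
    ("Type 1", "linear-bounded automaton (ND)", "context-sensitive"),
    ("Type 2", "pushdown automaton (ND)",       "context-free"),
    ("Type 3", "finite automaton",              "regular"),
]
```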

My previous post on Wolfram’s physics mentioned the pi-calculus, but I liked this little diagram so much I decided to give it its own mention. The rules aren’t really four in number, but oh well.
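The heart of the calculus is its communication rule: a send x⟨z⟩.P and a receive x(y).Q on the same channel x react, with z substituted for y in Q. Here is a toy rendering of that one rule (my own encoding, not from the diagram), with receive continuations as Python closures:

```python
def step(procs):
    # procs: a list of ("send", chan, payload) and ("recv", chan, k) tuples,
    # where k maps the received name to a list of continuation processes.
    for i, p in enumerate(procs):
        for j, q in enumerate(procs):
            if p[0] == "send" and q[0] == "recv" and p[1] == q[1]:
                # The matched pair reacts; everything else runs in parallel.
                rest = [r for k, r in enumerate(procs) if k not in (i, j)]
                return rest + q[2](p[2])
    return procs  # no communication possible

# x<z>.0 | x(y).y<done>.0  -->  z<done>.0
k = lambda y: [("send", y, "done")]
result = step([("send", "x", "z"), ("recv", "x", k)])
```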

When I first started looking at Stephen Wolfram’s latest proposal to solve physics, I was somewhat disappointed. I was rather fond of his previous “New Kind of Science” based on the structural rigidity of cellular automata. However, I am now intrigued by his latest ideas, based on the looser but more flexible basis of networks.

And once you have pithy statements with space, time, energy, and matter (as momenta), you catch my attention:

Energy is flux of causal edges through spacelike hypersurfaces

Momentum is flux of causal edges through timelike hypersurfaces

I confess I haven’t read much about the project yet, but it seems to use rewriting rules, perhaps similar to the notion of rewriting in Wolfram’s previous framework, cellular automata. Of course, cellular automata and rewriting rule systems can both be computationally universal, or Turing complete.

Another idea might be to try some sort of computational metaphysics between nodes, like the pi-calculus (or some other process calculus). After all, you have to support quantum entanglement! However, if you can encode everything with simpler structures, then do it!

The primary classifications of cellular automata, as outlined by Wolfram, are numbered one to four. They are, in order: automata in which patterns generally stabilize into homogeneity; automata in which patterns evolve into mostly stable or oscillating structures; automata in which patterns evolve in a seemingly chaotic fashion; and automata in which patterns become extremely complex and may last for a long time, with stable local structures. This last class is thought to be computationally universal, or capable of simulating a Turing machine.
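For a quick taste of class four, here is a minimal elementary cellular automaton stepper (standard textbook material, my own code): Rule 110 is the classic class-four rule, proven computationally universal by Matthew Cook.

```python
def step(cells, rule=110):
    """One step of an elementary CA on a circular row of 0/1 cells."""
    n = len(cells)
    return [
        # Read the 3-cell neighborhood as a number 0..7, then look up
        # that bit of the rule number.
        (rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1)
                  | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start from a single live cell and evolve a few generations.
row = [0] * 10 + [1]
for _ in range(3):
    row = step(row)
```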

In his science-fictional “Foundation Trilogy”, Isaac Asimov famously hypothesized a future science called “psychohistory”, a mathematically grounded theory of generalized and predictive human action, based on an amalgamation of psychology, history, and sociology. The future galactic empire was managed by this theory and practice (look out – almost-seventy-year-old spoilers!) except for an exceptional character who was not anticipated and essentially unpredictable.

Asimov had in mind well-validated continuous and statistical theories of physics, for example those for idealized gases and their laws. I was struck by an image from an explanation of turbulence that highlighted the key elements of velocity, density, pressure, and viscosity, and how it was (in my mind) analogous to antagonistic individuals, dominating leaders, submissive society, and affiliated coteries. Of course, an article below states that turbulence is still too complicated to provably model correctly at this point in time.

I had no idea that psychohistory was claimed to be an actual field of study these days, albeit somewhat controversial in its authenticity. And it doesn’t seem to have any mathematical basis yet, as far as I know. Mathematician Dan Crisan gave an inaugural talk a few years ago hypothesizing the use of heat equations instead of fluid dynamics as a basis. Even so, we can’t seem to properly model any sort of social action, so how could psychohistory be within our grasp?

In these turbulent times perhaps we should make an effort to understand ourselves a bit better, as we hope to navigate between the Charybdisian whirlpool of civil discord and environmental collapse and the Scyllaian rocks of fascism, authoritarianism, and/or totalitarianism. But hey, isn’t Apple doing an Apple TV+ series based on Asimov’s books? Let’s all tune in!

I’ve been interested in Category Theory for a substantial number of years, but have never dedicated the time necessary to learn it properly. But it seems to me that these are great days (salad days?) for learning at least the rudiments of the subject. There is now a wealth of material available online for free for one’s self-study and enrichment, as well as new and classic treatises on categories and their theory.

Category Theory has been called “abstract nonsense”, but while it is very abstract, it is hardly nonsense. Like set theory, it can be used to study the foundations of mathematics. Like algebra, it can be used to study generalized structure and relationships in math. Like the assortment of tools that is computer science, it can be used to study the essence of logic and computation. And like calculus, it can be used for all sorts of applied and scientific purposes.

Like all good math, Category Theory (CT) generalizes concepts that lie at the heart of many branches of mathematics. These concepts allow the mathematician to see similarities between these different branches, and carry them over into others as well. You might think “maths” is all one thing, and it is, roughly, but like science, it has evolved into a myriad of subjects and specialities. Ontologically (or perhaps even categorically), CT is listed as a topic under algebra, and it in turn has its own distinct branches.

And like all good math, CT benefits from a judicious choice of definitions and properties that balance generality with precision to great expressive advantage. This balance between abstraction and concreteness gives it the power and utility that it has enjoyed for the better part of seventy-five years. This makes CT rather a newcomer to mathematics, but please don’t mistake youth for lack of expertise.

If I wanted to represent CT emblematically, I might suggest the diagram above. At root, a category merely consists of a collection of objects and the pairwise morphisms (or arrows) between them. But additionally, the arrows must also obey a small set of conditions, so the objects usually have a similar nature. This nature can be quite abstract though, and one of the most familiar “concrete” categories is that of sets and the functions between them.
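Those “small set of conditions” on the arrows are just associativity of composition and identity laws. A tiny concrete check (my own illustration) in the familiar category of sets and functions:

```python
# In the category of sets, objects are sets, arrows are functions,
# composition is ordinary function composition.
identity = lambda x: x

def compose(g, f):
    """g after f, i.e. the arrow g . f."""
    return lambda x: g(f(x))

f = lambda n: n + 1
g = lambda n: 2 * n
h = lambda n: n * n

for x in range(5):
    # Associativity: h . (g . f) == (h . g) . f
    assert compose(h, compose(g, f))(x) == compose(compose(h, g), f)(x)
    # Identity laws: f . id == f == id . f
    assert compose(f, identity)(x) == f(x) == compose(identity, f)(x)
```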

If arrows are the mappings between objects, then “functors” are mappings between categories, which once again must obey some rules to maintain structure. So you can think analogically that arrows are to objects as functors are to categories, and you wouldn’t be too wrong. Next, you can imagine generalizing one level higher to mappings between functors; “natural transformations” are indeed defined to be just that.
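In programming terms (my own example, not from any particular text), the list construction is a functor: it sends each type A to lists of A, and each function f: A → B to a function on lists, preserving identities and composition — the two functor laws checked below.

```python
def fmap(f):
    """Lift a function on elements to a function on lists (the list functor)."""
    return lambda xs: [f(x) for x in xs]

def compose(g, f):
    return lambda x: g(f(x))

xs = [1, 2, 3]
inc = lambda n: n + 1
dbl = lambda n: 2 * n

# Functor laws:
assert fmap(lambda x: x)(xs) == xs                                   # identity
assert fmap(compose(dbl, inc))(xs) == compose(fmap(dbl), fmap(inc))(xs)  # composition
```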

There are many types of entities and characters that inhabit the theory that serve to propel the plot threads along. Some turn out more important than others, but most are essential to the overarching tale. Mathematics as storytelling? What an interesting and novel concept!

In mathematical logic and type theory, the λ-cube is a framework introduced by Henk Barendregt to investigate the different dimensions in which the calculus of constructions is a generalization of the simply typed λ-calculus. Each dimension of the cube corresponds to a new way of making objects depend on other objects, namely

terms allowed to depend on types, corresponding to polymorphism;

types allowed to depend on terms, corresponding to dependent types;

types allowed to depend on types, corresponding to type operators.
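The polymorphism dimension — a term depending on a type — can be loosely illustrated in Python’s gradual typing (my own example): the identity term depends on a type parameter T and works uniformly at every instantiation.

```python
from typing import TypeVar

# T is a type parameter; identity is one term that works at every type T.
T = TypeVar("T")

def identity(x: T) -> T:
    return x

assert identity(3) == 3
assert identity("cube") == "cube"
```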

Guided only by their feelings for symmetry, simplicity, and generality, and an indefinable sense of the fitness of things, creative mathematicians now, as in the past, are inspired by the art of mathematics rather than by any prospect of ultimate usefulness.

— E. T. Bell

A smile fell on the grass.
Irretrievable!

And how will your night dances
Lose themselves. In mathematics?

Why Evolution is True is a blog written by Jerry Coyne, centered on evolution and biology but also dealing with diverse topics like politics, culture, and cats.