Emily Riehl and Terence Tao on The Future of Mathematics
On invisible mathematics, see Paulson's Mechanizing Coinduction and Corecursion in Higher-order Logic (1998). Mike Gordon was also thinking about these things, and his piece Corecursion and coinduction: what they are and how they relate to recursion and induction gives a nice example of Tao's idea of "post-rigorous" mathematics. The "illuminating blog post" he refers to on page 38 is here on the Wayback Machine.
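For readers who have not met these notions, here is a minimal Haskell sketch of what corecursion and coinduction look like in practice. It is illustrative only, not taken from Paulson's or Gordon's papers; the names nats and natsFrom are mine.

```haskell
-- Corecursion: define an infinite stream by saying how to produce its head
-- and (corecursively) its tail; no base case is needed.
nats :: [Integer]
nats = 0 : map (+ 1) nats          -- 0, 1, 2, 3, ...

natsFrom :: Integer -> [Integer]
natsFrom n = n : natsFrom (n + 1)  -- n, n+1, n+2, ...

-- Coinduction is the matching proof principle: to show two streams are equal,
-- exhibit a bisimulation containing them. Here the relation pairing nats with
-- natsFrom 0, and map (+ k) nats with natsFrom k for k >= 1, works: related
-- streams have equal heads and related tails. At run time we can only
-- spot-check finite prefixes:
main :: IO ()
main = print (take 10 nats == take 10 (natsFrom 0))  -- True
```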
24:07 See Vladimir Voevodsky on the concept of mathematical structure in his letter exchange with Andrei Rodin.
35:03 On using different formal systems, see the interview About Logic with Graham Priest.
39:51 On embedding Homotopy Type Theory in Lean, see https://github.com/sinhp/HoTTLean. And in answer to that question, why not try to produce embeddings in, say, three different systems and then see how much you can say that is mutually consistent across all three? See On the Origins of Bisimulation and Coinduction by Davide Sangiorgi:
In Computer Science, and in Philosophical Logic, and in Set Theory, bisimulation has been derived through refinements of notions of morphism between algebraic structures. Roughly, morphisms are maps that are “structure-preserving”. The notion is therefore fundamental in all mathematical theories in which the objects of study have some kind of structure, or algebra. The most basic forms of morphism are the homomorphisms. These essentially give us a way of embedding a structure (the source) into another one (the target), so that all the relations in the source are present in the target. The converse however, need not be true; for this, stronger notions of morphism are needed. One such notion is isomorphism, which is however extremely strong—isomorphic structures must be essentially the same, i.e., “algebraically identical”. It is a quest for notions in between homomorphism and isomorphism that led to the discovery of bisimulation.
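To make the transfer condition behind bisimulation concrete, here is a small Haskell sketch: a naive fixpoint computation of the greatest bisimulation between two toy labelled transition systems. The types and the example are illustrative assumptions of mine, not drawn from Sangiorgi's paper.

```haskell
import Data.List (nub)

-- A finite labelled transition system as a list of (state, label, successor) triples.
type LTS = [(Int, Char, Int)]

succs :: LTS -> Int -> Char -> [Int]
succs lts p a = [q | (p', a', q) <- lts, p' == p, a' == a]

labels :: LTS -> [Char]
labels lts = nub [a | (_, a, _) <- lts]

-- Greatest bisimulation between two finite LTSs: start from all pairs of
-- states and repeatedly discard pairs violating the transfer condition
-- (every a-step on one side must be matched by a related a-step on the
-- other) until nothing changes.
bisim :: LTS -> [Int] -> LTS -> [Int] -> [(Int, Int)]
bisim lts1 sts1 lts2 sts2 = go [(p, q) | p <- sts1, q <- sts2]
  where
    as = nub (labels lts1 ++ labels lts2)
    ok rel (p, q) =
      and [ any (\q' -> (p', q') `elem` rel) (succs lts2 q a)
          | a <- as, p' <- succs lts1 p a ] &&
      and [ any (\p' -> (p', q') `elem` rel) (succs lts1 p a)
          | a <- as, q' <- succs lts2 q a ]
    go rel = let rel' = filter (ok rel) rel
             in if rel' == rel then rel else go rel'

-- Example: a one-state clock ticking forever vs. a two-state clock.
main :: IO ()
main = print ((0, 0) `elem` bisim [(0,'t',0)] [0] [(0,'t',1),(1,'t',0)] [0,1])  -- True
```

The example makes the point of the quotation: the two clocks are bisimilar, hence behaviourally indistinguishable, yet they are not isomorphic.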
See https://www.math.inc/vision.
Jacobs and Rutten's "Introduction to (co)algebra and (co)induction" is hard to find online now, since it was published by Cambridge University Press in Advanced Topics in Bisimulation and Coinduction, pp. 38–99; a small code sketch of the (co)algebra/(co)induction duality it develops appears after the excerpt below. The preface to Jacobs' book Introduction to Coalgebra: Towards Mathematics of States and Observation is interesting reading:
Computer Science Is about Generated Behaviour
What is the essence of computing? What is the topic of the discipline of computer science? Answers that are often heard are ‘data processing’ or ‘symbol manipulation’. Here we follow a more behaviouristic approach and describe the subject of computer science as generated behaviour. This is the behaviour that can be observed on the outside of a computer, for instance via a screen or printer. It arises in interaction with the environment, as a result of the computer executing instructions, laid down in a computer program. The aim of computer programming is to make a computer do certain things, i.e. to generate behaviour. By executing a program a computer displays behaviour that is ultimately produced by humans, as programmers. This behaviouristic view allows us to understand the relation between computer science and the natural sciences: biology is about ‘spontaneous’ behaviour, and physics concentrates on lifeless natural phenomena, without autonomous behaviour. Behaviour of a system in biology or physics is often described as evolution, where evolutions in physics are transformational changes according to the laws of physics. Evolutions in biology seem to lack inherent directionality and predictability. Does this mean that behaviour is deterministic in (classical) physics, and non-deterministic in biology? And that coalgebras of corresponding kinds capture the situation? At this stage the coalgebraic theory of modelling has not yet demonstrated its usefulness in those areas. Therefore this text concentrates on coalgebras in mathematics and computer science.
What he means by 'spontaneous' behaviour is not what most physicists would mean by spontaneous, I think. In physics spontaneous behaviour is the norm, the laws of thermodynamics being the canonical example: heat flows spontaneously from hot to cold. In biology the behaviour is not spontaneous in that sense, because it is produced by the conditions of the context in which it is established, thermodynamic flux being the most general example.
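Returning to Jacobs and Rutten's tutorial, the following is a minimal Haskell sketch of the duality it develops: initial algebras support induction (functions defined by fold), final coalgebras support coinduction (elements defined by unfold, i.e. by their generated behaviour). The names foldNat, unfoldStream and so on are mine, not the tutorial's.

```haskell
-- Initial algebra side: Nat, with induction appearing as fold (catamorphism).
data Nat = Zero | Succ Nat

foldNat :: r -> (r -> r) -> Nat -> r
foldNat z _ Zero     = z
foldNat z s (Succ n) = s (foldNat z s n)

-- Final coalgebra side: streams, with corecursion appearing as unfold
-- (anamorphism); a stream is known only through its observations (head, tail).
data Stream a = Cons a (Stream a)

unfoldStream :: (s -> (a, s)) -> s -> Stream a
unfoldStream step s = let (a, s') = step s in Cons a (unfoldStream step s')

takeS :: Int -> Stream a -> [a]
takeS n (Cons a s)
  | n <= 0    = []
  | otherwise = a : takeS (n - 1) s

-- Induction defines functions *out of* the inductive type; coinduction defines
-- elements *of* the coinductive type by their observable behaviour, which is
-- the "generated behaviour" reading of computing in Jacobs' preface.
toInt :: Nat -> Integer
toInt = foldNat 0 (+ 1)

naturals :: Stream Integer
naturals = unfoldStream (\n -> (n, n + 1)) 0

main :: IO ()
main = print (toInt (Succ (Succ Zero)), takeS 5 naturals)  -- (2,[0,1,2,3,4])
```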