Curt Jaimungal and Jonathan Gorard talking about Peer Reviewed Journals and Funding in Academia

This system wasn't designed; it's just what grew organically out of learned societies and was then developed and preserved by vested interests, sustained by the likes of Robert Maxwell.


Here's the full interview:

7:16 To me, the interesting thing about the ideas of computation and order theory is that they indicate that it is possible to represent a certain class of systems very precisely, yet completely independently of any underlying concrete representation. So, for example, you can establish a class of effective methods for describing an abstract process of 'computation' just by showing that three quite different formalisms (Turing machines, the λ-calculus, and the general recursive functions) are all capable of describing exactly the same class of computations, thereby abstracting a general (i.e. Universal) class of computation. If the same idea were applied in physics, then you would be looking for a completely Universal description of physical systems, one independent of any particular underlying (i.e. fundamental) material constituents. But the particular kinds of process that classical computation deals with are not necessarily the same class as those which are physical, principally because of the deterministic (i.e. functional) nature of the types of process classical computing represents (see the discussion at 13:42 and at 51:33 on "computational irreducibility", and A Functorial Perspective on (Multi)computational Irreducibility by Jonathan Gorard). See also the discussion of Turing machines at 1:45:28.
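To make that representation-independence point concrete, here is a minimal, purely illustrative sketch in Python (mine, not anything from the interview): the same arithmetic function is computed once in an imperative, machine-like style and once as pure λ-terms over Church numerals. The helper names (add_machine, church, add_church, to_int) are invented for the example.

```python
# Two formalisms, one function: 3 + 4 computed two ways.

# 1. An imperative, register-machine-style addition: repeatedly
#    decrement one counter and increment the other.
def add_machine(a, b):
    while a > 0:
        a -= 1
        b += 1
    return b

# 2. The same function expressed as pure lambda terms over
#    Church numerals: the numeral n is the higher-order function f -> f^n.
def church(n):
    return lambda f: lambda x: x if n == 0 else f(church(n - 1)(f)(x))

add_church = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(c):
    return c(lambda k: k + 1)(0)

assert add_machine(3, 4) == 7
assert to_int(add_church(church(3))(church(4))) == 7
```

Both definitions pick out the same computation; nothing in the result depends on whether registers or λ-terms were the concrete representation, which is what makes an abstract, representation-independent notion of 'computation' possible.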

From Robin Milner's The Polyadic π-Calculus: a Tutorial:

Any model of the world, or of computation, which is part of the world, makes some ontological commitment. I mean this in the loose sense of a commitment as to which phenomena it will try to capture and which mental constructions are seen to fit these phenomena best. This is obvious for the "denotational" models of computing; for example, the set-theoretic notion of function is chosen as the essence or abstract content of the deterministic sequential process by which a result is computed from arguments. But mathematical operations --- adding, taking square-roots --- existed long before set theory; and it seems that Church in creating the λ-calculus had "algorithm" more in mind than "function" in the abstract sense of the word.

Nevertheless, the λ-calculus makes some ontological commitment about computation. It emphasizes the view of computation as taking arguments and yielding results.  By contrast, it gives no direct representation of a heterarchical family of agents, each with its changing state and an identity which persists from one computation to another. One may say that the λ-calculus owes its very success to its quite special focus upon argument-result computations.
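To illustrate the contrast Milner is drawing, here is a small hypothetical Python sketch (not from the tutorial): the first definition is a pure argument-result computation in the spirit of the λ-calculus; the second is an agent with a persistent identity and a changing state, exactly the kind of thing the λ-calculus gives no direct way to represent.

```python
# Argument-result computation: the entire behaviour is a mapping from
# input to output; there is no identity and no history.
square = lambda x: x * x

# An agent with persistent identity and changing state: each call
# interacts with the *same* counter, and the result depends on the
# history of previous interactions, not just on the current argument.
class Counter:
    def __init__(self):
        self.count = 0          # state that persists between interactions

    def tick(self, step):
        self.count += step      # state changes as a side effect
        return self.count

c = Counter()
assert square(3) == square(3) == 9   # same argument, same result, always
assert c.tick(3) == 3
assert c.tick(3) == 6                # same argument, different result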

If you read Turing's 1936 paper "On Computable Numbers" you see that his interests were wider than just deterministic computability as embodied by automatic machines (a-machines). As well as these he considered choice-machines (c-machines), whose course of computation depends in part on choices made by an external operator, and later, in his doctoral thesis, he introduced the idea of oracle-machines (o-machines), which can consult an external oracle for answers to questions that the machine itself cannot compute. In the 1936 paper he focused on the Entscheidungsproblem, Hilbert's decision problem, and there considered only automatic computations: i.e. those performed by deterministic algorithms such as are represented in the λ-calculus.
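As a rough illustration of an automatic machine in Turing's sense, here is a hypothetical Python sketch of a deterministic transition table driving a tape. The particular machine (a unary successor) and all names are invented for the example, and the formulation is simplified relative to Turing's own quintuple presentation.

```python
# A deterministic (automatic) Turing machine: the transition table
# alone determines every step, with no external choices or oracles.

def run_a_machine(table, tape, state="start", head=0, max_steps=1000):
    tape = dict(enumerate(tape))                 # sparse tape, blank = "_"
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")
        write, move, state = table[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# Example machine: walk right over a block of 1s and append one more,
# i.e. compute the successor of a number written in unary.
successor = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}

assert run_a_machine(successor, "111") == "1111"
```

A c-machine would replace some table lookups with a choice supplied by an external operator, and an o-machine would allow the table to query an oracle; both go beyond the purely deterministic, argument-result computations above.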

1:38:48 Discussion of the second law of thermodynamics applied to the universe as a whole.

2:07:42 Grothendieck’s hypothesis.

