Gabriele Carcassi on Physical Convergence
This looks interesting. See the article here https://assumptionsofphysics.org/essays/2025-05-15-convergence. Reading it, I realised that Claude AI is presumably named after Claude Shannon, who wrote a 1948 paper in the Bell System Technical Journal titled "A Mathematical Theory of Communication". In it he proves the source coding theorem:
In information theory, Shannon's source coding theorem (or noiseless coding theorem) establishes the statistical limits to possible data compression for data whose source is an independent identically-distributed random variable, and the operational meaning of the Shannon entropy.
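To make the theorem concrete, here is a minimal sketch of my own (not from Shannon's paper or from Carcassi's essay): the entropy of an i.i.d. source sets the average number of bits per symbol below which lossless compression is impossible, and a general-purpose compressor such as zlib should come in at or above that bound.

    import math
    import random
    import zlib

    def shannon_entropy(probs):
        # H(X) = -sum p log2 p, in bits per symbol
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A hypothetical i.i.d. source over four symbols with a skewed distribution.
    symbols = ['a', 'b', 'c', 'd']
    probs = [0.5, 0.25, 0.125, 0.125]

    h = shannon_entropy(probs)  # 1.75 bits/symbol for this distribution
    n = 100_000
    stream = ''.join(random.choices(symbols, weights=probs, k=n))

    # The source coding theorem says roughly n*H bits is the best any lossless
    # code can achieve on average; zlib should land at or above that bound.
    compressed_bits = 8 * len(zlib.compress(stream.encode('ascii'), 9))
    print(f"entropy bound: {h:.3f} bits/symbol")
    print(f"zlib achieves: {compressed_bits / n:.3f} bits/symbol")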
Following Ernst Mach, one might think of any physical laboratory apparatus as a device for producing streams of symbols; a physical theory is then a formal mechanism for reproducing any experimentally observed sequence from the parameters characterising the particular experimental setup (there is a small sketch of this after the quotation below):
"The goal which it (physical science) has set itself is the simplest and most economical abstract expression of facts.
When the human mind, with its limited powers, attempts to mirror in itself the rich life of the world, of which it itself is only a small part, and which it can never hope to exhaust, it has every reason for proceeding economically.
In reality, the law always contains less than the fact itself, because it does not reproduce the fact as a whole but only in that aspect of it which is important for us, the rest being intentionally or from necessity omitted.
In mentally separating a body from the changeable environment in which it moves, what we really do is to extricate a group of sensations on which our thoughts are fastened and which is of relatively greater stability than the others, from the stream of all our sensations. Suppose we were to attribute to nature the property of producing like effects in like circumstances; just these like circumstances we should not know how to find. Nature exists once only. Our schematic mental imitation alone produces like events."
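To illustrate Mach's "economy" in the same spirit (a toy of my own devising, not something from the essay): a law reproduces a long stream of readings from a handful of setup parameters, which is far more economical than storing the stream itself.

    def free_fall_theory(g, dt, steps):
        # Hypothetical "theory": predicted fall distances at times k*dt,
        # rounded to the centimetre the apparatus is assumed to report.
        return [round(0.5 * g * (k * dt) ** 2, 2) for k in range(steps)]

    # Stand-in for the symbol stream the apparatus printed out.
    observed = free_fall_theory(g=9.81, dt=0.1, steps=50)

    # The "theory" reproduces all 50 readings from just three parameters.
    assert free_fall_theory(9.81, 0.1, 50) == observed
    print(f"{len(observed)} readings reproduced from 3 parameters")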
[Update: Curt Jaimungal just posted this relevant clip of his interview with Gabriele Carcassi last year.]
The significance of the entropy is explained here:
And there is some more about Lagrangian, Hamiltonian and Newtonian formulations:
And some more about dissipative systems here:
It's all very complicated! See also the Routhian formulation:
Routhian mechanics is a hybrid formulation of Lagrangian mechanics and Hamiltonian mechanics developed by Edward John Routh. Correspondingly, the Routhian is the function which replaces both the Lagrangian and Hamiltonian functions. Although Routhian mechanics is equivalent to Lagrangian mechanics and Hamiltonian mechanics, and introduces no new physics, it offers an alternative way to solve mechanical problems.
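Concretely, and stating the standard textbook convention rather than anything in the linked material (sign conventions vary between authors): split the generalised coordinates into a set $q_j$ treated à la Lagrange and a set $\zeta_i$ treated à la Hamilton, with conjugate momenta $p_{\zeta_i} = \partial L / \partial \dot{\zeta}_i$. The Routhian is the partial Legendre transform

    R(q, \dot{q}, \zeta, p_\zeta, t) = \sum_i p_{\zeta_i}\,\dot{\zeta}_i - L(q, \dot{q}, \zeta, \dot{\zeta}, t),

and the equations of motion are Lagrangian in the $q_j$ and Hamiltonian in the $(\zeta_i, p_{\zeta_i})$ pairs:

    \frac{d}{dt}\frac{\partial R}{\partial \dot{q}_j} = \frac{\partial R}{\partial q_j}, \qquad
    \dot{\zeta}_i = \frac{\partial R}{\partial p_{\zeta_i}}, \qquad
    \dot{p}_{\zeta_i} = -\frac{\partial R}{\partial \zeta_i}.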
For more on the Routhian formulation see Modeling Dynamics of Dissipative Systems.