Multiple realizability of replicator dynamics

Abstraction is my favorite part of mathematics. I find a certain beauty in seeing structures without their implementations, or structures that are preserved across various implementations. And although it seems possible to reason by analogy without (explicit) abstraction, I would not enjoy being restricted in such a way. In biology and medicine, however, I often find that one can get caught up in the concrete and particular. This makes it harder to remember that certain macro-dynamical properties can be abstracted and made independent of particular micro-dynamical implementations. In this post, I want to focus on a particular pet peeve of mine: accounts of the replicator equation.

I will start with a brief philosophical detour through multiple realizability, and discuss the popular analogy of temperature. Then I will move on to the phenomenological definition of the replicator equation, and a few of its realizations. A particular target will be a statement I’ve been hearing too often recently: that replicator dynamics hold only for a very large but fixed-size, well-mixed population.
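To keep the target concrete: the replicator equation says that the per-capita growth rate of each strategy is its fitness minus the population-average fitness. The sketch below integrates that macro-level equation for a toy two-strategy anti-coordination game; the payoff numbers, the Euler step size, and the number of steps are arbitrary choices of mine for illustration, not anything specific to the realizations discussed in the post.

    import numpy as np

    # Toy payoff matrix for a two-strategy anti-coordination game
    # (made-up numbers, purely for illustration).
    A = np.array([[0.0, 3.0],
                  [1.0, 2.0]])

    def replicator_step(x, dt=0.01):
        """One Euler step of the replicator equation x_i' = x_i * (f_i(x) - <f>)."""
        f = A @ x          # fitness of each strategy against the current population
        avg = x @ f        # population-average fitness
        return x + dt * x * (f - avg)

    x = np.array([0.9, 0.1])   # initial strategy frequencies (sum to 1)
    for _ in range(2000):
        x = replicator_step(x)
    print(x)  # approaches the interior equilibrium (0.5, 0.5) for this game

The point of the post is that this macro-level equation can arise from many different micro-level models, not just from the infinite well-mixed limit that the deterministic integration above mimics.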

Read more of this post

Ecology of cancer: mimicry, eco-engineers, morphostats, and nutrition

One of my favorite parts of mathematical modeling is the opportunities it provides to carefully explore metaphors and analogies between disciplines. The connection most carefully explored at the MBI Workshop on the Ecology and Evolution of Cancer was, as you can guess from the name, between ecology and oncology. Looking at cancer from the perspective of evolutionary ecology can offer us several insights that the standard hallmarks of cancer approach (Hanahan & Weinberg, 2000) hides. I will start with some definitions for this view, discuss ecological concepts like mimicry and ecological engineers in the context of cancer, unify these concepts through the idea of morphostatic maintenance of tissue microarchitecture, and finish with the practical importance of diet to cancer.
Read more of this post

Transcendental idealism and Post’s variant of the Church-Turing thesis

One of the exciting things in reading philosophy, its history in particular, is experiencing the tension between different schools of thought. This excitement turns to beauty if a clear synthesis emerges to reconcile the conflicting ideas. In the middle to late 18th century, as the Age of Enlightenment was giving way to the Romantic era, the tension was between rationalism and empiricism and the synthesis came from Immanuel Kant. His thought went on to influence or directly shape much of modern philosophy, and if you browse the table of contents of philosophical journals today then you will regularly encounter hermeneutic titles like “Kant on <semi-obscure modern topic>”. In this regard, my post is in keeping with modern practice because it could have very well been titled “Kant on computability”.

As stressed before, I think that it is productive to look at important concepts from multiple philosophical perspectives. The exercise can provide us with increased insight into both the school of thought that serves as our eyes, and the concept that we behold. In this case, the concept is the Church-Turing thesis, which states that anything that is computable is computable by a Turing machine. The perspective will be that of (a kind of) cognitivism — thought consists of algorithmic manipulation of mental states. This perspective can often be read directly into Turing, although Copeland & Shagrir (2013) better described him as a pragmatic noncognitivist. Hence, I prefer to attribute this view to Emil Post. Also, it would simply be too much of a mouthful to call it the Post-Turing variant of the Church-Turing thesis.
Read more of this post

Falsifiability and Gandy’s variant of the Church-Turing thesis

In 1936, two years after Karl Popper published the first German version of The Logic of Scientific Discovery and introduced falsifiability; Alonzo Church, Alan Turing, and Emil Post each published independent papers on the Entscheidungsproblem, introducing the lambda calculus, Turing machines, and Post-Turing machines as mathematical models of computation. The years after saw many more models, all of which were shown to be equivalent to each other in what they could compute. This was summarized in the Church-Turing thesis: anything that is computable is computable by a Turing machine. An almost universally accepted, but also incredibly vague, statement. Of course, such an important thesis has developed many variants, and exploring or contrasting their formulations can be a very insightful way to understand and contrast different philosophies.

I believe that the original and most foundational version of the thesis is what I called Kleene’s purely mathematical formulation. Delving into this variant allowed us to explore the philosophy of mathematics; Platonism; and the purpose, power and limitations of proof. However, because of the popularity of physicalism and the authority of science, I doubt that Kleene’s is the most popular variant. Instead, when people think of the Church-Turing thesis, they often think of what is computable in the world around them. I like to associate this variant with Turing’s long-time friend and student — Robin Gandy. I want to explore Gandy’s physical variant of the Church-Turing thesis to better understand the philosophy of science, theory-based conceptions, and the limits of falsifiability. In particular, I want to address what seems to me like the common misconception that the Church-Turing thesis is falsifiable.
Read more of this post

Kleene’s variant of the Church-Turing thesis

In 1936, Alonzo Church, Alan Turing, and Emil Post each published independent papers on the Entscheidungsproblem, introducing the lambda calculus, Turing machines, and Post-Turing machines as mathematical models of computation. A myriad of other models followed, many of them taking seemingly unrelated approaches to the computable: algebraic, combinatorial, linguistic, logical, mechanistic, etc. Of course, all of these models were shown to be equivalent in what they could compute, and this great heuristic coherence led mathematicians to formulate the Church-Turing thesis. As with many important philosophical notions, over the last three-quarters of a century, the thesis has gradually changed. In a semi-historic style, I will identify three progressively more empirical formulations with Kleene, Post, and Gandy. For this article, I will focus on the purely mathematical formulation by Kleene, and reserve the psychological and physical variants for next time.

Mathematicians and logicians begat the Church-Turing thesis, so at its inception it was a hypothesis about the Platonic world of mathematical ideas and not about the natural world. There are those who follow Russell (and to some extent Hilbert) and identify mathematics with tautologies. This view is not typically held among mathematicians, who, following in the footsteps of Gödel, know how important it is to distinguish between the true and the provable. Here I side with Lakatos in viewing logic and formal systems as tools to verify and convince others about our intuitions of the mathematical world. Due to Gödel’s incompleteness theorems and decades of subsequent results, we know that no single formal system will be a perfect lens on the world of mathematics, but we do have preferred ones, like ZFC.
Read more of this post

Evolution is a special kind of (machine) learning

Theoretical computer science has a long history of peering through the algorithmic lens at the brain, mind, and learning. In fact, I would argue that the field was born from the epistemological question of what our minds can learn of mathematical truth through formal proofs. The perspective became more scientific with McCulloch & Pitts’ (1943) introduction of finite state machines as models of neural networks and Turing’s B-type neural networks, paving the way for our modern treatment of artificial intelligence and machine learning. The connections to biology, unfortunately, are less pronounced. Turing ventured into the field with his important work on morphogenesis, and I believe that he could have contributed to the study of evolution but did not get the chance. This work was followed up with the use of computers in biology, and with heuristic ideas from evolution entering computer science in the form of genetic algorithms. However, these areas remained non-mathematical, with few provable statements and little non-heuristic reasoning. The task of making strong connections between theoretical computer science and evolutionary biology has been left to our generation.

Although the militia of cstheorists reflecting on biology is small, Leslie Valiant is their standard-bearer for the steady march of theoretical computer science into both learning and evolution. Due in part to his efforts, artificial intelligence and machine learning are such well developed fields that their theory branch has its own name and conferences: computational learning theory (CoLT). Much of CoLT rests on Valiant’s (1984) introduction of probably-approximately correct (PAC) learning which — in spite of its name — is one of the most formal and careful ways to understand learnability. The importance of this model cannot be overstated, and it resulted in Valiant receiving (among many other distinctions) the 2010 Turing award (i.e. the Nobel prize of computer science). Most importantly, his attention was not confined to pure cstheory; he took his algorithmic insights into biology, specifically computational neuroscience (see Valiant (1994; 2006) for examples), to understand human thought and learning.
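For readers who haven’t met PAC learning before, here is a minimal sketch of the guarantee in the simplest setting I know: learning a threshold on the unit interval from random examples. The concept class, the target value 0.37, and the sample-size bound are my own toy choices for illustration; this is not Valiant’s evolvability construction.

    import math
    import random

    def pac_learn_threshold(epsilon, delta, theta_true=0.37):
        """Learn the concept c(x) = [x >= theta_true] from uniform samples on [0, 1].

        With probability at least 1 - delta over the random sample, the returned
        threshold misclassifies at most an epsilon mass of the distribution.
        """
        m = math.ceil((1 / epsilon) * math.log(1 / delta))  # sample size that suffices here
        sample = [random.random() for _ in range(m)]
        positives = [x for x in sample if x >= theta_true]  # examples labelled 1 by the target
        # tightest hypothesis consistent with the data (errs only on [theta_true, h))
        return min(positives) if positives else 1.0

    print(pac_learn_threshold(epsilon=0.05, delta=0.01))

The “probably” is the 1 - delta over the random sample and the “approximately” is the epsilon of allowed error; roughly, Valiant’s (2009) move is to ask which concept classes remain learnable when the only access to data is through the aggregate performance of a hypothesis and its mutants — that is, through something that looks like natural selection.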

Like any good thinker reflecting on biology, Valiant understands the importance of Dobzhansky’s observation that “nothing in biology makes sense except in the light of evolution”. Even for the algorithmic lens it helps to have this illumination. Any understanding of learning mechanisms like the brain is incomplete without an examination of the evolutionary dynamics that shaped these organs. In the mid-2000s, Valiant embarked on the quest of formalizing some of the insights cstheory can offer evolution, culminating in his PAC-based model of evolvability (Valiant, 2009). Although this paper is one of the most frequently cited on TheEGG, I’ve waited until today to give it a dedicated post.
Read more of this post

Computational complexity of evolutionary equilibria

The first half of the 20th century is famous for revolutionary changes — paradigm shifts as Kuhn would say — across the sciences and other systems of thought. Disheartened by the scars of the First World War, European thinkers sought refuge by shifting their worldviews away from those of their fathers. In fields like biology, this meant looking to your grandfathers for guidance, instead. The founders of the modern synthesis reconciled the fading ideas of Wallace’s natural selection with Mendelian genetics. In the process, they unified many branches of biology that had been diverging at the dawn of the 20th century into a single paradigm that persists today. A return to evolution by natural selection illuminated their field and ended the eclipse of Darwinism. At the same time, mathematicians questioned Hilbert’s formalism and Russell’s logicism as valid foundations for their own field. As a result, they formalized mechanistic calculation and logical thought as computation and founded theoretical computer science to study its power and limitations. Even though some pioneers — like Alan Turing — kept an eye on biology, the two streams of thought did not converge. The coming of the Second World War pushed both fields away from each other and from deep foundational questions, entrenching them in applied and technological work.

For the rest of the 20th century, the fields remained largely independent. Computer science only looked to biology for vague inspiration for heuristics in the form of evolutionary computing (Holland, 1975), and biologists looked at computer science as an engineering or technical field that could only provide them with applied tools like bioinformatics. Neither field saw in the other a partner for addressing deep theoretical questions. As I mentioned previously, my goal is to remedy this by applying ideas from analysis of algorithms and computational complexity to fundamental questions in biology. Sometime in late October, I tweeted my joy at seeing evolution clearly through the algorithmic lens. On Friday the 23rd, after much delay, I turned this into an arXiv preprint ambitiously titled “Complexity of evolutionary equilibria in static fitness landscapes”. The paper combines the discussion of evolution from my previous few blog posts with three simple technical results. Although it was written for theoretical computer scientists, I tried to make it accessible to mathematical biologists as well; the hope is to serve as a launching point for discussion between the two disciplines.
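As a rough picture of the kind of object the preprint studies, think of evolution as fitter-mutant hill climbing on a static fitness landscape over binary genotypes, and ask how many steps it takes to reach a genotype with no fitter point-mutant. The uncorrelated random landscape and the fittest-mutant update rule below are a toy stand-in of my own, not the construction from the paper.

    import random

    L = 12                      # genome length (an arbitrary toy choice)
    random.seed(1)
    _fitness_cache = {}

    def fitness(genotype):
        """Memoized random fitness: a stand-in for an arbitrary static landscape."""
        if genotype not in _fitness_cache:
            _fitness_cache[genotype] = random.random()
        return _fitness_cache[genotype]

    def point_mutants(genotype):
        """All genotypes at Hamming distance one from the current genotype."""
        return [genotype[:i] + (1 - genotype[i],) + genotype[i + 1:] for i in range(L)]

    genotype = tuple(random.randint(0, 1) for _ in range(L))
    steps = 0
    while True:
        fitter = [g for g in point_mutants(genotype) if fitness(g) > fitness(genotype)]
        if not fitter:
            break               # a local evolutionary equilibrium: no fitter point-mutant
        genotype = max(fitter, key=fitness)  # fittest-mutant (strong selection) dynamics
        steps += 1
    print(steps, fitness(genotype))

On an uncorrelated random landscape like this one such adaptive walks are short; the preprint asks, roughly, whether every static landscape allows some reasonable evolutionary dynamic to find a local equilibrium in polynomial time.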
Read more of this post

Microscopic computing in cells and with self-assembling DNA tiles

One of the three goals of natural algorithms is to implement computers in non-electronic media. In cases like quantum computing, the goal is to achieve a qualitatively different form of computing, but other times (as with most biological computing) the goal is just to recreate normal computation (or a subset of it) at a different scale or in more natural ways. Of course, these two approaches aren’t mutually exclusive! Imagine how great it would be if we could grow computers on the level of cells, or smaller. For starters, this approach could revolutionize health-care: you could program some of your own cells to sense and record your internal environment and release drugs only when necessary. It could also alter how we manufacture things; if you thought 3D printers were cool, what if you could program nanoscale assemblies?
Read more of this post

Algorithmic view of historicity and separation of scales in biology

A Science publication is one of the best ways to launch your career, especially if it is based on your undergraduate work, part of which you carried out with makeshift equipment in your dorm! That is the story of Thomas M.S. Chang, who in 1956 started experiments (partially carried out in his residence room in McGill’s Douglas Hall) that led to the creation of the first artificial cell (Chang, 1964). This was — in the words of a 1989 New Scientist article — an “elegantly simple and intellectually ambitious” idea that “has grown into a dynamic field of biomedical research and development.” A field that promises to connect biology and computer science by physically realizing John von Neumann’s dream of a self-replicating machine.

Read more of this post

Natural algorithms and the sciences

Today, I am passing through New York City on my way to Princeton’s Center for Computational Intractability for a workshop on Natural Algorithms and the Sciences (NA&S). The two-day meeting will cover everything from molecular algorithms for learning and experiments on artificial cells to bounded rationality in decision-making and the effects of network topology on the evolution of collective migration. In other words, right at home for TheEGG blog. The full mission statement:

The workshop will bring together researchers from computer science, mathematics, physics, biology, and engineering to explore interactions among algorithms, dynamical systems, statistical physics, and complexity theory (in all senses of the term).

Read more of this post