Cataloging a year of blogging: the algorithmic world

Today is the last day of the Julian year, and tomorrow is Old New Year, so it is a great time to finish our overview of the three themes of TheEGG articles in 2013. We already looked at established applications of evolutionary game theory, and at extending from behavior to society and mind; now, we will be envisioning the algorithmic world. It is fitting that we end the Orthodox calendar with a discussion of the year's most unorthodox articles.

Although I spend most of my time relaxing in a comfortable office in the Stewart Biology building, my official position is in the computer science department. Thus, when I can't just call myself a theorist, but must specify a discipline, I say that I am a theoretical computer scientist. However, I am a cstheorist who dislikes computing machines, engineering, and technology, and has an unreasonable fondness for philosophy and fundamental science. Unfortunately, most of the theoretical branches of science, when they try to be rigorous and mathematical, tend to borrow their tools from physics rather than from the mathematics underlying theoretical computer science. In undergrad, I received enough exposure to physics to understand the limits of these tools, and in the years since I have grown convinced that they are not sufficient for building the theoretical edifice of biology and psychology.

You might object that computers are used all the time in biology and psychology, and so cstheory must be prevalent.

Right?

Not so: these computing machines are largely used for simulation, working with data, and testing. They are experimental, not theoretical, tools; it is just easier to do experiments on code than in the lab, and sometimes not all that different. Theoretical computer science, on the other hand, is a branch of mathematics and relies on rigorous mathematical proof (much more so than physics ever does). Unlike physics, it is also a robust and qualitative field, not an inherently quantitative one. In the social sciences, for some reason (probably physics envy), there is this idea that quantitative is better than qualitative. However, that is not the case if you have rigor. A rigorous qualitative statement is more general and stronger than a rigorous quantitative statement because it applies to more settings. Physics is seldom able to make such general and robust statements, because as a field it has benefited greatly from a highly functional reductionism that usually allowed physicists to have exact quantitative parameters and a full mechanistic description of causal mechanisms. Cstheory, on the other hand, provides tools for making conclusions that hold for any possible mechanism, and thus is better suited for settings where we expect implementation independence (as is popular in some philosophies of mind) or where overdetermination or other roadblocks stop us from putting together a specific microdynamics. In other words, whenever I get my feet off the table, I aim to introduce and use the tools of cstheory to answer questions relevant to biologists, psychologists, and the social sciences.

Of course, this is an incredibly ambitious and high-risk goal, which is why I balanced it with the other two themes of more orthodox theory. I saved these most speculative approaches for last.

Proof, automata, and physics

Ignore E.O. Wilson: to be a theorist, it is important to love and understand mathematics. I originally intended to have a separate blog for my purely mathematical and computer science musings, but during the last year decided to concentrate everything at TheEGG. Given my background, this has mostly meant thoughts on physics, machine learning, and how we approach proof. Of course, there was still the occasional tie to evolution and cognition.

  1. Mathematical Turing test: Readable proofs from your computer
  2. Four color problem, odd Goldbach conjecture, and the curse of computing
  3. Evolution explains the fundamental constants of physics
  4. Minimizing finite state automata
  5. How teachers help us learn deterministic finite automata
  6. Limits on efficient minimization and the helpfulness of teachers
  7. Parallelism, products, and automata
  8. Quantum query complexity
  9. Lower bounds by negative adversary method
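
The automata posts above (4-6) revolve around the classic idea that two states of a deterministic finite automaton are equivalent if no input word can distinguish them. For readers who haven't seen it, here is a minimal sketch of Moore-style partition refinement in Python; the representation and function name are my own choices, not code from the posts:

```python
def minimize_dfa(states, alphabet, delta, accepting):
    """Partition refinement: merge states that no word can distinguish.

    delta maps (state, symbol) pairs to successor states.
    Returns the blocks of equivalent states.
    """
    # Start by separating accepting from non-accepting states.
    partition = [set(accepting), set(states) - set(accepting)]
    partition = [block for block in partition if block]
    changed = True
    while changed:
        changed = False
        new_partition = []
        for block in partition:
            # Group states by which block each input symbol sends them to.
            groups = {}
            for s in block:
                key = tuple(
                    next(i for i, b in enumerate(partition) if delta[s, a] in b)
                    for a in alphabet
                )
                groups.setdefault(key, set()).add(s)
            new_partition.extend(groups.values())
            if len(groups) > 1:
                changed = True
        partition = new_partition
    return partition
```

The refinement loop is the easy part; the posts above are about what happens around it, like the difficulty of minimization for non-deterministic machines and how a helpful teacher changes the learning problem.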

During the year, I also wrote a number of less speculative and non-technical articles for TruthIsCool, like an early post on the resolution of the odd Goldbach conjecture, MIT's open source code for building special relativity games, and a discussion of the potential consequences of improvements to classical discrete log algorithms. These explorations helped me integrate a little more into the mathematical blogging community: the AMS Blog on Math Blogs mentioned my discussion of the odd Goldbach conjecture, and featured my introduction to learning DFAs and discussion of bounded rationality. Too bad they had trouble with my last name.

I also published a commentary on recent approaches to quantum models of mind (Kaznatcheev & Shultz, 2013), but have not had a chance to discuss it (or really even think about it) on the blog. It is a difficult topic to talk about, since whenever people hear 'quantum' and 'mind' together, they tend to assume that the discussion is about Penrose & Hameroff's silly conviction that quantum effects in microtubules explain consciousness. This is hokum that is popular on the internet, and I have a draft post expanding on why it's wrong, but no motivation to finish it. My commentary was on a very different use of quantum mechanics: phenomenological modeling of decision making as a way to overcome some of the failings of rationality that Bayesian models struggle with. I've written a paper-length expansion of the commentary, but I am not sure if I will bother to publish it because I have not been keeping up with the field.

I’ve also prepared a paper with Prakash Panangaden and Doina Precup on weighted automata. I hope to use it as a technical bridge between more standard computational learning theory and some of my speculations on biology and philosophy of science.

This category's corpus was around 12.6 thousand words and garnered around 11.1 thousand views.

Natural algorithms and biology

In late May, I traveled to Princeton for the second workshop on Natural Algorithms and the Sciences. This was probably my most exciting academic experience of 2013, although it is neck-and-neck with the mathematical oncology visits to Florida. The experience also gave me confidence that there really is a future in pushing an algorithmic biology, with interest from both biologists and computer scientists. I wrote extensively about the trip, and about other existing connections between theoretical biology and computer science:

  1. Programming playground: Cells as (quantum) computers?
  2. Natural algorithms and the sciences
  3. Algorithmic view of historicity and separation of scales in biology
  4. Distributed computation in foraging desert ants
  5. Mathematical models of running cockroaches and scale-invariance in cells
  6. Microscopic computing in cells and with self-assembling DNA tiles
  7. Toward an algorithmic theory of biology
  8. What can theoretical computer science offer biology?
  9. Programming language for biochemistry
  10. Software through the lens of evolutionary biology

I was very happy that the workshop organizers linked to my summary of the conference, and that WordPress editors featured my post about Feinerman & Korman’s (2012) distributed computation in ants on Freshly Pressed.

This category's corpus was around 12.1 thousand words and garnered around 20.1 thousand views.

Fitness landscapes and evolutionary equilibria

I could only spend so much time vicariously admiring others' work on bringing cstheory to biology. I had started looking seriously at evolution through the algorithmic lens after reading Chaitin's attempt, and in November 2012 I found the tools I wanted to use. My biggest concern with Valiant's (2009) and Chaitin's (2009) earlier approaches to understanding evolution was that they lacked a direct connection to questions and mental models biologists were already comfortable with. Such a disconnect is an easy way to catch interdisciplinitis, so I made sure to read the biological literature heavily and overviewed the relevant parts here:

  1. Fitness landscapes as mental and mathematical models of evolution
  2. Epistasis and empirical fitness landscapes
  3. Black swans and Orr-Gillespie theory of evolutionary adaptation
  4. NK and block models of fitness landscapes
  5. Computational complexity of evolutionary equilibria
  6. Semi-smooth fitness landscapes and the simplex algorithm
  7. John Maynard Smith: games animals play
  8. Computational complexity of evolutionary stable strategies
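
To make the NK model from the list above concrete: each of the N loci contributes a random fitness component that depends on its own allele and the alleles of K other loci, and the complexity question is how long fitter-mutant adaptive walks can take before reaching a local optimum. Here is a minimal simulation sketch, assuming the common variant where each locus interacts with its K cyclic neighbours; the function names and greedy mutation rule are illustrative choices of mine, not the constructions from my paper:

```python
import itertools
import random

def nk_landscape(n, k, seed=0):
    """Random NK fitness landscape: locus i's contribution depends on
    itself and its K cyclic neighbours. K = 0 gives a smooth landscape;
    larger K gives a more rugged, epistatic one."""
    rng = random.Random(seed)
    # One random lookup table per locus, indexed by its (K+1)-bit neighbourhood.
    tables = [
        {bits: rng.random() for bits in itertools.product((0, 1), repeat=k + 1)}
        for _ in range(n)
    ]
    def fitness(genome):
        return sum(
            tables[i][tuple(genome[(i + j) % n] for j in range(k + 1))]
            for i in range(n)
        ) / n
    return fitness

def adaptive_walk(fitness, genome):
    """Take fitter single-point mutations until none exists, i.e. until
    the population sits at a local fitness optimum."""
    genome = list(genome)
    steps = 0
    improved = True
    while improved:
        improved = False
        for i in range(len(genome)):
            mutant = genome.copy()
            mutant[i] ^= 1  # flip one locus
            if fitness(mutant) > fitness(genome):
                genome, improved = mutant, True
                steps += 1
                break
    return genome, steps
```

Simulations like this always terminate at some local optimum; the point of the complexity-theoretic posts above is to ask whether, in the worst case, the number of steps (or the difficulty of even finding a local optimum) can be prohibitive.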

I also benefited greatly from discussions with Julian Xue, and gave my first formal talk on this at Moffitt's Integrated Mathematical Oncology Department while visiting David Basanta and Jacob Scott in July. By the end of August I had my first draft on the arXiv (Kaznatcheev, 2013), and after some great feedback I am getting the second draft prepped for submission. In May of 2014, I hope to attend the first workshop on computational theories of evolution and expand on the consequences of the algorithmic approach to biology.

This category's corpus was around 12.3 thousand words and garnered around 4.3 thousand views.

Metamodeling and the (algorithmic) philosophy of science

My conviction that the algorithmic lens offers a promising perspective on science stems not only from pragmatic considerations of my talents and interests but from a personal philosophy of mind and science. In particular, I believe that the physics approach to theory is fundamentally rooted in logical positivism — a dated philosophy that I've rejected in favor of post-positivism. At its root, theoretical computer science is the study of the ideal mind; it is an extension of analytic philosophy from logic to computation. The basic tenet of the algorithmic philosophy I have come to advocate is that any conceivable and communicable theory is an algorithmic object that we can examine with the mathematical tools of computer science. Any non-algorithmic aspects of the world are beyond our (even idealized) scientific understanding and thus irrelevant.

The second thread of my philosophical musings is a more practical concern. When I first started research in 2008, I came to it from a simulation and computational perspective. Over the years, I've become disillusioned with this approach, and I am committed to understanding its shortcomings and limits. For some of my orthodox theory, I still rely on it as a tool, but I try to acknowledge its essence as a rhetorical tool, not a method of establishing truth.

My treatment of science as a structured narrative has also resulted in intersections with the humanities, and I was happy to have my old undergrad roommate Forrest Barnum write about mathematical approaches to history. He had to put up with three years of my logical positivism — I didn't see the error of my ways until recently — and it was great to have his contribution as I shifted toward a more reasonable philosophy.

  1. Will the droids take academic jobs?
  2. Computer science on prediction and the edge of chaos
  3. Machine learning and prediction without understanding
  4. Cliodynamics: A Future for History? by Forrest Barnum
  5. Micro-vs-macro evolution is a purely methodological distinction
  6. Monoids, weighted automata and algorithmic philosophy of science
  7. Infographic history of evolutionary thought
  8. Three types of mathematical models
  9. Randomness, necessity, and non-determinism
  10. What is the algorithmic lens?
  11. Are all models wrong?
  12. Three goals for computational models
  13. Sherlock Holmes and the Case of the Missing Replication

Although I suspect that he would not be interested in algorithmic philosophy, Tim Johnson's Magic, Maths, and Money blog has been helpful to my thinking; in particular, it showed me that ethics can be an interesting branch of philosophy with an intricate relationship to science and mathematics.

This category's corpus was around 16.4 thousand words and garnered around 29.9 thousand views.

References

Chaitin, G. (2009). Evolution of Mutating Software. EATCS Bulletin, 97: 157-164.

Feinerman, O., & Korman, A. (2012). Memory lower bounds for randomized collaborative search and implications to biology. 26th International Symposium on Distributed Computing (DISC).

Kaznatcheev, A. (2013). Complexity of evolutionary equilibria in static fitness landscapes. arXiv: 1308.5094v1.

Kaznatcheev, A., & Shultz, T. R. (2013). Limitations of the Dirac formalism as a descriptive framework for cognition. Behavioral and Brain Sciences, 36(3): 292-293.

Valiant, L.G. (2009) Evolvability. Journal of the ACM 56(1): 3.

About Artem Kaznatcheev
From the Department of Computer Science at Oxford University and Department of Translational Hematology & Oncology Research at Cleveland Clinic, I marvel at the world through algorithmic lenses. My mind is drawn to evolutionary dynamics, theoretical computer science, mathematical oncology, computational learning theory, and philosophy of science. Previously I was at the Department of Integrated Mathematical Oncology at Moffitt Cancer Center, and the School of Computer Science and Department of Psychology at McGill University. In a past life, I worried about quantum queries at the Institute for Quantum Computing and Department of Combinatorics & Optimization at University of Waterloo and as a visitor to the Centre for Quantum Technologies at National University of Singapore. Meander with me on Google+ and Twitter.
