From H. pylori to Spanish colonialism: the scales of cancer.

Yesterday was the first day of the 4th Integrated Mathematical Oncology Workshop here at Moffitt. This year, it is run jointly with the Center for Infection Research in Cancer and is thus focused on the interaction of infectious disease and cancer. This is a topic that I have not focused much attention on — except for the post on canine transmissible venereal tumor and passing mentions of Human papillomavirus (HPV) — so I am excited for the opportunity to learn. The workshop opened with a half-day focused on getting to know the external visitors, Alexander Anderson’s introduction, and our team assignments. I will be teammates with Heiko Enderling, Domenico Coppola, Jose M. Pimiento, and others. We will be looking at Helicobacter pylori. Go team blue! If you are curious, the more popularly known HPV went to David Basanta’s team; it will be great to compete against my team leader from last year. As you can expect, the friendly trash talking and subtle intimidation have already begun.

To be frank, before yesterday, I had only ever heard of H. pylori once and knew nothing of its links to stomach cancer. The story I heard was associated with Barry J. Marshall and J. Robin Warren’s award of the 2005 Nobel Prize in Physiology or Medicine “for their discovery of the bacterium Helicobacter pylori and its role in gastritis and peptic ulcer disease”. In 1984, Marshall was confident in the connection between H. pylori, inflammation, and ulcers, but the common knowledge of the day was that ulcers were caused by things like stress and smoking, not bacteria. The drug companies even happened to have an expensive drug that could manage the associated stomach inflammation, and given the money it was bringing in, nobody was concerned with finding some bacterium that could be cured with cheap antibiotics. Having difficulty convincing his colleagues (apart from Warren), Marshall decided to drink a Petri dish of cultured H. pylori, and within a few days grew sick, developing severe inflammation of the stomach before finally (two weeks after the ingestion) going on antibiotics and curing himself. This dramatic display was sufficient to push for bigger studies that eventually led to the Nobel prize; I recommend listening to Warren’s podcast with Nobel Prize Talks or his acceptance speech for the whole story.

This is a fascinating tale, but from the modeling perspective, the real excitement of H. pylori and its role in stomach cancer is the multitude of scales that are central to the development of disease. We see important players from the scale of molecules involved in changing stomach acidity, to the single-cell scale of the bacteria and stomach lining, to the changes across the stomach as a whole organ, and the role of the individual patient’s lifestyle and nutrition. These are the usual scales we see when modeling cancer, and dovetail nicely with Anderson’s opening remarks on the centrality of mathematics in helping us bridge the gaps. However, in the case of H. pylori, the scales go beyond the single individual at which Anderson stops and extend to the level of populations of humans in the co-evolution of host and pathogen, and even populations of groups of humans in a speculative connection to a topic familiar to TheEGG readers — the evolution of ethnocentrism. In preparation for the second half of the second day and the intense task of finding a specific question for team blue to focus on, I wanted to give a quick overview of these scales.
Read more of this post

Bernstein polynomials and non-linear public goods in tumours

By analogy, or maybe homage, to standard game theory, when we discuss the payoffs of an evolutionary game, we usually tell the story of two prototype agents representing their respective strategies meeting at random and interacting. For my stories of yarn, knitting needles, and clandestine meetings in the dark of night, I even give these players names like Alice and Bob. However, it is important to remember that these are merely stories, and plenty of other scenarios could take their place. In the case of replicator dynamics there is so much averaging going on that it is often just better to talk about the payoffs as feedback between same-strategy sub-populations of agents. The benefit of this abstraction — or vagueness, if you prefer — is that you don’t get overwhelmed by details — that you probably don’t have justification for, anyway — and focus on the essential differences between different types of dynamics. For example, the prisoner’s dilemma (PD) and public goods (PG) games tell very different stories, but in many cases the PD and linear PG are equivalent. Of course, ‘many’ is not ‘all’ and my inclusion of ‘linear’ should prompt you to ask about non-linear public goods. So, in this post I want to provide a general analysis of replicator dynamics for non-linear public goods games following the method of Bernstein polynomials recently used by Archetti (2013, 2014). At the end, I will quickly touch on the two applications to mathematical oncology that Archetti considers. Since I am providing a more general analysis, I will use notation inspired by Archetti, but defined more precisely and at times slightly differently — some symbols will be the same in name but not in value, so if you’re following along with the paper then pay close attention.
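As a rough illustration of the Bernstein-polynomial structure (this is my own toy notation, not Archetti's exact symbols): for a group of size n whose focal player samples n - 1 co-players from a population with producer fraction x, the expected payoff difference between producing and not producing is a degree n - 1 Bernstein polynomial in x, with coefficients set by the benefit function b(j). A minimal sketch:

```python
from math import comb

def bernstein_drift(x, n, benefit, cost):
    """Expected payoff difference (producer minus non-producer) for an
    n-player public good; the sum over j is a Bernstein polynomial of
    degree n-1 in the producer fraction x."""
    diff = 0.0
    for j in range(n):  # j = producers among the other n - 1 members
        weight = comb(n - 1, j) * x**j * (1 - x)**(n - 1 - j)
        # a producer experiences benefit(j + 1) and pays the cost,
        # while a non-producer experiences benefit(j)
        diff += weight * (benefit(j + 1) - benefit(j))
    return diff - cost

def replicator_step(x, n, benefit, cost, dt=0.01):
    """One Euler step of replicator dynamics: dx/dt = x(1-x) * drift."""
    return x + dt * x * (1 - x) * bernstein_drift(x, n, benefit, cost)
```

As a sanity check on the equivalence mentioned above: with a linear benefit b(j) = b*j/n, the drift collapses to b/n - c for every x, so the linear public good behaves like a prisoner's dilemma; a non-linear (say, sigmoid) benefit makes the drift genuinely x-dependent and can create interior fixed points.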
Read more of this post

From realism to interfaces and rationality in evolutionary games

As I was preparing some reading assignments, I realized that I don’t have a single resource available that covers the main ideas of the interface theory of perception, objective versus subjective rationality, and their relationship to evolutionary game theory. I wanted to correct this oversight and use it as opportunity to comment on the philosophy of mind. In this post I will quickly introduce naive realism, critical realism, and the interface theory of perception and sketch how we can use evolutionary game theory to study them. The interface theory of perception will also give me an opportunity to touch on the difference between subjective and objective rationality. Unfortunately, I am trying to keep this entry short, so we will only skim the surface and I invite you to click links aggressively and follow the references papers if something catches your attention — this annotated list of links might be of particular interest for further exploration.
Read more of this post

Critical thinking and philosophy

Regular readers of TheEGG might have noticed that I have a pretty positive disposition toward philosophy. I wouldn’t call myself a philosopher — at least not a professional one, since I don’t think I get paid to sit around loving wisdom — but I definitely greatly enjoy philosophy and think it is a worthwhile pursuit. As a mathematician or theoretical computer scientist, I am also a fan of rational argument. You could even say I am a proponent of critical thinking. At the very least, this blog offers a lot of — sometimes too much — criticism; although that isn’t what is really meant by ‘critical thinking’, but I’ll get back to that can of worms later. As such, you might expect that I would be supportive of Olly’s (of Philosophy Tube) recent episode on ‘Why We Need Philosophy’. I’ll let you watch:

I am in complete support of defending philosophy, but I am less keen on limiting or boxing philosophy into a simple category. I think the biggest issue with Olly’s defense is that he equates philosophy with critical thinking. I don’t think this identity is justified, and it is false in both directions: there is philosophy that doesn’t fall under critical thinking, and there is critical thinking that is not philosophy. As such, I wanted to unpack some of these concepts with a series of annotated links.

Read more of this post

Stem cells, branching processes and stochasticity in cancer

When you were born, you probably had 270 bones in your body. Unless you’ve experienced some very drastic traumas, and assuming that you are fully grown, then you probably have 206 bones now. Much like the number and types of internal organs, we can call this question of science solved. Unfortunately, it isn’t always helpful to think of you as made of bones and other organs. For medical purposes, it is often better to think of you as made of cells. It becomes natural to ask how many cells you are made of, and then maybe classify them into cell types. Of course, you wouldn’t expect this number to be as static as the number of bones or organs, as individual cells constantly die and are replaced, but you’d expect the approximate number to be relatively constant. This number is surprisingly difficult to measure, and our best current estimate is around 3.72 \times 10^{13} (Bianconi et al., 2013).

Both 206 and 3.72 \times 10^{13} are just numbers, but to a modeler they suggest a very important distinction over which tools we should use. Suppose that my bones and cells randomly popped in and out of existence with about equal probability (thus keeping the average number constant). In that case I wouldn’t expect to see exactly 206 bones, or exactly 37200000000000 cells; if I do a quick back-of-the-envelope calculation then I’d expect to see somewhere between 191 and 220 bones, and between 37199994000000 and 37200006000000 cells. Unsurprisingly, the spread in the number of bones is only around 29 bones, while the number of cells varies by around 12 million. However, in terms of percentage, that is a 14% spread for the bones and only a 0.00003% spread in the cell count. This means that in terms of dynamic models, I would be perfectly happy to model the cell population by its average, since the stochastic fluctuations are irrelevant, but — for the bones — a 14% fluctuation is noticeable, so I would need to worry about the individual bones (and we do; we even give them names!) instead of approximating them by an average. Such small numbers are a case where results can depend heavily on whether one picks a discrete or continuous model.
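That back-of-the-envelope estimate is just the square-root law for fluctuations: a population of average size n that gains and loses members at roughly equal rates wanders over a band of width about 2√n. A quick sketch (using the bone and cell counts from above):

```python
from math import sqrt

def fluctuation_range(n):
    """Back-of-the-envelope band for a population of average size n
    whose members pop in and out of existence at about equal rates:
    the count typically wanders between n - sqrt(n) and n + sqrt(n).
    Returns (low, high, relative width of the band)."""
    s = sqrt(n)
    return n - s, n + s, 2 * s / n

# Bones: roughly 206 +/- 14, a ~14% band, so individual identities matter.
# Cells: roughly 3.72e13 +/- 6.1e6, a ~0.00003% band, so averages suffice.
```

The relative width shrinks like 1/√n, which is exactly why the deterministic average is fine at 10^13 cells but misleading at 206 bones.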

In ecology, evolution, and cancer, we are often dealing with huge populations closer to the number of cells than the number of bones. In this case, it is common practice to keep track of the averages and not worry too much about the stochastic fluctuations. A standard example of this is replicator dynamics — a deterministic differential equation governing the dynamics of average population sizes. However, this is not always a reasonable assumption. Some special cell types, like stem cells, are often found in very low quantities in any given tissue but are of central importance to cancer progression. When we are modeling such low quantities — just like in the cartoon example of disappearing bones — it becomes necessary to explicitly track the stochastic effects, although we don’t necessarily have to name each stem cell. In these cases we switch to modeling techniques like branching processes. I want to use this post to highlight the many great examples of models based on branching processes that we saw at the MBI Workshop on the Ecology and Evolution of Cancer.
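To make the distinction concrete, here is a minimal discrete-time branching-process simulation (a toy Galton-Watson sketch of my own, not any specific model from the workshop): each cell independently divides, dies, or persists each step, and a small population can fluctuate to extinction even when division and death are perfectly balanced.

```python
import random

def branching_process(n0, p_divide, p_die, steps, rng=None):
    """Discrete-time Galton-Watson branching process.

    Each step, every cell independently becomes two cells with
    probability p_divide, dies with probability p_die, and otherwise
    persists.  Returns the population size at every step; note that
    0 is absorbing, so extinction is forever."""
    rng = rng if rng is not None else random.Random(0)
    n, history = n0, [n0]
    for _ in range(steps):
        nxt = 0
        for _ in range(n):
            r = rng.random()
            if r < p_divide:
                nxt += 2              # division: one cell becomes two
            elif r >= p_divide + p_die:
                nxt += 1              # persistence
            # else: death contributes nothing
        n = nxt
        history.append(n)
    return history
```

With p_divide equal to p_die the expected size stays flat at n0, so a deterministic average-based model predicts a constant population, yet individual runs of a small population (a handful of stem cells, say) routinely hit zero. That gap between the average and the realizations is precisely what the branching-process machinery captures.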
Read more of this post

Personification and pseudoscience

If you study the philosophy of science — and sometimes even if you just study science — then at some point you might get the urge to figure out what you mean when you say ‘science’. Can you distinguish the scientific from the non-scientific or the pseudoscientific? If you can then how? Does science have a defining method? If it does, then does following the steps of that method guarantee science, or are some cases just rhetorical performances? If you cannot distinguish science and pseudoscience then why do some fields seem clearly scientific and others clearly non-scientific? If you believe that these questions have simple answers then I would wager that you have not thought carefully enough about them.

Karl Popper did think very carefully about these questions, and in the process introduced the problem of demarcation:

The problem of finding a criterion which would enable us to distinguish between the empirical sciences on the one hand, and mathematics and logic as well as ‘metaphysical’ systems on the other

Popper believed that his falsification criterion solved (or was an important step toward solving) this problem. Unfortunately, due to Popper’s discussion of Freud and Marx as examples of the non-scientific, many now misread the demarcation problem as a quest to separate epistemologically justifiable science from epistemologically unjustifiable pseudoscience, with a moral judgement of Good associated with the former and Bad with the latter. Toward this goal, I don’t think falsifiability makes much headway. In this (mis)reading, falsifiability excludes too many reasonable perspectives like mathematics or even non-mathematical beliefs like Gandy’s variant of the Church-Turing thesis, while including much in-principle-testable pseudoscience. Hence — on this version of the demarcation problem — I would side with Feyerabend and argue that a clear separation between science and pseudoscience is impossible.

However, this does not mean that I don’t find certain traditions of thought to be pseudoscientific. In fact, I think there is a lot to be learned from thinking about features of pseudoscience. A particular question that struck me as interesting was: What makes people easily subscribe to pseudoscientific theories? Why are some kinds of pseudoscience so much easier or more tempting to believe than science? I think that answering these questions can teach us something not only about culture and the human mind, but also about how to do good science. Here, I will repost (with some expansions) my answer to this question.
Read more of this post

Ecology of cancer: mimicry, eco-engineers, morphostats, and nutrition

One of my favorite parts of mathematical modeling is the opportunities it provides to carefully explore metaphors and analogies between disciplines. The connection most carefully explored at the MBI Workshop on the Ecology and Evolution of Cancer was, as you can guess from the name, between ecology and oncology. Looking at cancer from the perspective of evolutionary ecology can offer us several insights that the standard hallmarks of cancer approach (Hanahan & Weinberg, 2000) hides. I will start with some definitions for this view, discuss ecological concepts like mimicry and ecological engineers in the context of cancer, unify these concepts through the idea of morphostatic maintenance of tissue microarchitecture, and finish with the practical importance of diet to cancer.
Read more of this post

Models and metaphors we live by

George Lakoff and Mark Johnson’s Metaphors we live by is a classic that has had a huge influence on parts of linguistics and cognitive science, and some influence — although less so, in my opinion — on philosophy. It is structured around the thought that “[m]etaphor is one of our most important tools for trying to comprehend partially what cannot be comprehended totally”.

The authors spend the first part of the book giving a very convincing argument that “even our deepest and most abiding concepts — time, events, causation, morality, and mind itself — are understood and reasoned about via multiple metaphors.” These conceptual metaphors structure our reality, and are fundamentally grounded in our sensory-motor experience. For them, metaphors are not just aspects of speech but windows into our mind and conceptual system:

Our ordinary conceptual system, in terms of which we both think and act, is fundamentally metaphorical in nature. … Our concepts structure what we perceive, how we get around the world, and how we relate to others. Our conceptual system thus plays a central role in defining our everyday realities. … Since communication is based on the same conceptual system that we use in thinking and acting, language is an important source of evidence for what that system is like.

I found the book incredibly insightful, and in large agreement with many of my recent thoughts on the philosophies of mind and science. After taking a few flights to finish the book, I wanted to take a moment to provide a mini-review. The hope is to convince you to make the time for reading this short volume.
Read more of this post

Limits of prediction: stochasticity, chaos, and computation

Some of my favorite conversations are about prediction and its limits. For some, this is purely a practical topic, but for me it is a deeply philosophical discussion. Understanding the limits of prediction can inform the philosophies of science and mind, and even questions of free-will. As such, I wanted to share with you a World Science Festival video that THEREALDLB recently posted on /r/math. This is a selected five minute clip called “What Can’t We Predict With Math?” from a longer one and a half hour discussion called “Your Life By The Numbers: ‘Go Figure’” between Steven Strogatz, Seth Lloyd, Andrew Lo, and James Fowler. My post can be read without watching the panel discussion or even the clip, but watching the clip does make my writing slightly less incoherent.

I want to give you a summary of the clip that focuses on some specific points, bring in some of the discussions from elsewhere in the panel, and add some of my commentary. My intention is to be relevant to metamodeling and the philosophy of science, but I will touch on the philosophy of mind and free-will in the last two paragraphs. This is not meant as a comprehensive overview of the limits of prediction, just some points to get you as excited as I am about this conversation.

Read more of this post

Philosophy of Science and an analytic index for Feyerabend

Throughout my formal education, the history of science has been presented as a series of anecdotes and asides. The philosophy of science, encountered even less, was passed down not as a rich debate and on-going inquiry but as a set of rules that had best be followed. To paraphrase Gregory Radick, this presentation is mere propaganda; it is akin to learning the history of a nation from its travel brochures. Thankfully, my schooling did not completely derail my learning, and I’ve had an opportunity to make up for some of the lost time since.

One of the philosophers of science that I’ve enjoyed reading the most has been Paul Feyerabend. His provocative writing in Against Method and advocacy of what others have called epistemological anarchism — the rejection of any rules of scientific methodology — has been influential to my conception of the role of theorists. Although I’ve been meaning to write down my thoughts on Feyerabend for a while now, I doubt that I will bring myself to do it anytime soon. In the meantime, dear reader, I will leave you with an analytic index consisting of links to the thoughts of others (interspersed with my typical self-links) that discuss Feyerabend, Galileo (his preferred historic case study), and consistency in science.
Read more of this post
