The end of the holidays is drawing closer, and as a warm-up I was thinking about what central theme I would like to cover in my first blog posts in 2014. Then an old colleague shared a joke from/about physicists with me. You know, one of those slightly nerdy jokes of the kind: "An electron and a positron walk into a bar…".
But the following joke encouraged me to think (once more) about models, model reduction, and parameter identification.
A group of wealthy investors wanted to be able to predict the outcome of a horse race. So they hired a group of biologists, a group of statisticians, and a group of physicists. Each group was given a year to research the issue. After one year, the groups all reported to the investors. The biologists said that they could genetically engineer an unbeatable racehorse, but it would take 200 years and $100bn. The statisticians reported next. They said that they could predict the outcome of any race, at a cost of $100m per race, and they would only be right 10% of the time. Finally, the physicists reported that they could also predict the outcome of any race, and that their process was cheap and simple. The investors listened eagerly to this proposal. The head physicist reported, "We have made several simplifying assumptions: first, let each horse be a perfect rolling sphere… "
Doesn't this joke contain a grain of truth, or is it rather the other way round? Do we tend to add complexity to our models, either by adding parameters or by combining different forms of propagation? In the next blog posts we will recap some of the most common models, try to show their (dis)advantages, and compare them to each other.