
# Bayesian Brain: Is Your Brain a Data Scientist?

### Nikola O. (@nikolao)

Combines ideas from data science, humanities and social sciences. Enjoys thinking, science fiction and design.

Is your Brain a Data Scientist? Yes, according to the Bayesian Brain Hypothesis, your brain is a Bayesian statistician. Let me explain.

## Bayes’ Theorem and Fear of Birds

Before we get to the nitty-gritty of Bayes’ Theorem, I want to tell you why I love the idea of the Bayesian Brain and how it helped me hack my brain.

I’m afraid of birds. By saying afraid, I mean terrified. When I see pigeons or seagulls, I just …

Gif by birdboxmovies

I tend to think that they can murder me or cause a severe injury, at least. Thankfully, I can manage it now with a bit of help from the Bayes’ theorem. First, let’s revise the theorem. (If you have no idea what I’m talking about, please bear with me; it will make sense in a minute. I promise.)
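To recap, Bayes’ theorem gives the probability of a hypothesis A given evidence B:

```latex
P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}
```

Here P(A) is the prior, P(B | A) is the likelihood of the evidence under the hypothesis, P(B) is the overall probability of the evidence, and P(A | B) is the posterior we are after.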

Now, let me show you how this equation helped me with my fear of being murdered by a pigeon or any other bird (we will work with pigeons here). The first thing I realized, thanks to this theorem, is that I only considered the probability of a flock of pigeons murdering me or any other humans. In my head, this probability is very high, around 90%. Instead, I should ask myself what is the probability of pigeons murdering me given the evidence. With this in mind, the equation would look like this:
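Written out for our case, with “Pigeon” standing for the hypothesis that pigeons murder people:

```latex
P(\text{Pigeon} \mid \text{Evidence}) =
  \frac{P(\text{Evidence} \mid \text{Pigeon})\, P(\text{Pigeon})}{P(\text{Evidence})}
```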

First, we need to fill in the numbers. I already mentioned that P(Pigeon) is around 90% for me, so P(Pigeon) = 0.90. Then I tried to find real-world evidence of pigeons murdering people, and I couldn’t find anything. So the likelihood of observing such evidence if pigeons really were murderers, P(Evidence | Pigeon), must be vanishingly small — essentially zero.

Here comes my second realization: multiplying my 90% prior by a near-zero likelihood leaves a near-zero numerator. Thus, without going into much detail, the answer to my question is that the probability of pigeons murdering me, given the evidence, approaches zero.
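Plugging illustrative numbers into the theorem (all values below are assumptions for demonstration, not real statistics) shows why the posterior collapses toward zero no matter how large the prior is:

```python
def posterior(prior, likelihood, marginal):
    """Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)."""
    return likelihood * prior / marginal

# Hypothesis H: a flock of pigeons will murder me.
# prior = 0.90 (my irrational gut feeling), likelihood = 1e-9 (no
# documented cases, so assumed tiny rather than exactly zero),
# marginal = 0.5 (an arbitrary placeholder for P(Evidence)).
p = posterior(prior=0.90, likelihood=1e-9, marginal=0.5)
print(p)  # on the order of 1e-9: effectively zero
```

However you tweak the placeholder values, a near-zero likelihood drags the posterior to near zero.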

My internal world model estimates that being murdered by a flock of birds is a highly likely event. However, that's not the case.

Bayes’ Theorem helped me recognize that real-world evidence is what matters.

## Bayesian Brain Hypothesis

This theory assumes that our brain is a self-organizing system that constantly computes Bayes' theorem.

There are two concepts to consider:

1. Prior belief: a person's belief system, or internal world model
2. Posterior perception: what we actually perceive once the prior is combined with the evidence from sensory experience

In any circumstance, the Bayesian Brain wants to minimize the difference between the prediction and the perception. Our brain is surprised by a discrepancy between our expectations and the evidence we receive through our sensory experience. This surprise is captured in the form of an error, analogous to the prediction error of a machine learning algorithm. Going back to my example, when I see a pigeon, my brain predicts an attack, but my perception tells me that there is no physical pain.
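As a toy illustration (the encoding below is an assumption, not how neurons actually represent pain): if we code "pain" as 1 and "no pain" as 0, the surprise is just the gap between prediction and sensation:

```python
# Toy encoding (assumed): 1.0 = pain predicted/felt, 0.0 = none.
prediction = 1.0  # internal model: "pigeon nearby, attack incoming"
sensation = 0.0   # actual sensory input: no pain at all
prediction_error = sensation - prediction
print(prediction_error)  # -1.0: the surprise the brain tries to minimize
```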

Image adapted from this research article.

How does the brain deal with such an error?

As a self-organizing system, the brain has one priority - homeostasis. In other words, the brain wants to be stable, and it has two options - it can fulfill the prediction by action or by learning.

The action can be something that will lead to the expected input. In my example, when my internal model predicts a pigeon attack, it increases my anxiety, which could be a substitute for the physical pain that didn't occur.

Learning would be a better strategy. In the same way that algorithms learn from data, the brain can update its prior beliefs about the world to align with reality. Unfortunately, my brain doesn't seem to be able to learn this. Hence, I need to hack my brain to manage the expectations about possible scenarios when I see a flock of pigeons flying by.

But why can't the brain update its model?

There is one more dimension I didn't mention - uncertainty. Our internal model updates dynamically based on the prediction error, which is weighted by the precision ratio.

The precision ratio captures the uncertainty of both the prior belief and the sensory input. Learning is more likely when we are more certain about what we perceive than about what we believe a priori. Back to the pigeons: if the brain is confident in its internal prediction of pain and unsure about the sensory input due to the fear reaction, the update of the internal model doesn't occur.

This is quite a complex topic, so this version of the hypothesis is simplified. If you are interested in learning more, I highly recommend this research paper where the authors discuss the implications of the Bayesian Brain for Autism Spectrum Disorder. This article covers the academic debate about Bayesian Brains. In short, not everyone agrees that we have Bayesian Brains.

For more insight into the neurobiology of this process, check Machine Learning in the Brain.

For more about Bayes' theorem, see Solving a Problem with Bayes’ Theorem and Decision Tree.

By the way, you should be a little bit afraid of birds too.
