
Probability and Information:

     From Priors to Posteriors

Chapter 3

Here we take a look at the structure of probability distributions, which allows us to identify the information encoded by a distribution and its stable characteristics. This in turn tells us how to assign sampling and likelihood distributions to describe data, as well as how to assign prior distributions that have desired behavior when combined with likelihood distributions (e.g., using the Jeffreys rule, which yields a 'maximally uninformed' prior relative to a specific parameter of the likelihood). We see that combining different priors with a likelihood function affects the character of the resulting posterior, and how that posterior changes as new data arrive.
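As a minimal sketch of the prior-to-posterior idea described above (not taken from the book), the Python snippet below combines the same Bernoulli likelihood with two different Beta priors, a uniform Beta(1, 1) prior and the Jeffreys Beta(1/2, 1/2) prior, and reports how the resulting posteriors differ. The data values and variable names are hypothetical placeholders chosen for illustration.

    # Sketch: Beta priors are conjugate to the Bernoulli likelihood,
    # so the posterior is Beta(a + successes, b + failures).
    from scipy import stats

    successes, n = 3, 10          # hypothetical data: 3 successes in 10 trials
    failures = n - successes

    priors = {"uniform":  (1.0, 1.0),    # flat prior, Beta(1, 1)
              "Jeffreys": (0.5, 0.5)}    # Jeffreys prior for the Bernoulli rate

    for name, (a, b) in priors.items():
        post = stats.beta(a + successes, b + failures)
        print(f"{name:8s} prior -> posterior mean = {post.mean():.3f}, "
              f"95% interval = {post.interval(0.95)}")

With only ten trials the two posteriors differ noticeably; rerunning the sketch with a larger hypothetical sample shows the likelihood increasingly dominating either prior.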

Programming Asides:
