Martyn Rhisiart Jones

Bayes’ theorem (alternatively Bayes’ law or Bayes’ rule, after Thomas Bayes) gives a mathematical rule for inverting conditional probabilities, allowing us to find the probability of a cause given its effect.[1] For example, if the risk of developing health problems is known to increase with age, Bayes’ theorem allows the risk to an individual of a known age to be assessed more accurately by conditioning it relative to their age, rather than assuming the individual is typical of the population as a whole. Based on Bayes’ law, both the prevalence of a disease in a given population and the error rate of an infectious disease test must be taken into account to evaluate the meaning of a positive test result correctly and avoid the base-rate fallacy.
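To make the base-rate point concrete, here is a minimal sketch in Python using made-up figures (a 1% prevalence and a test that is 95% sensitive and 95% specific); none of these numbers come from the excerpt above, they simply illustrate the calculation.

```python
# Illustrative (assumed) figures, not taken from the quoted text.
prevalence = 0.01        # P(disease): the base rate in the population
sensitivity = 0.95       # P(positive | disease)
false_positive = 0.05    # P(positive | no disease) = 1 - specificity

# Total probability of a positive result, summed over both possible causes.
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)

# Bayes' theorem: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"P(disease | positive test) = {p_disease_given_positive:.3f}")  # about 0.161
```

Even with a fairly accurate test, the posterior probability of disease given a positive result is only about 16%, because the low base rate dominates; ignoring it is exactly the base-rate fallacy.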
One of the many applications of Bayes’ theorem is Bayesian inference, a particular approach to statistical inference, where it is used to invert the probability of observations given a model configuration (i.e., the likelihood function) to obtain the probability of the model configuration given the observations (i.e., the posterior probability).
http://en.wikipedia.org/wiki/Bayes%27_theorem
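As a rough illustration of that inversion, the sketch below estimates a coin’s bias by grid approximation: the likelihood of the observations is evaluated at each candidate configuration and combined with a prior to give the posterior. The data (7 heads in 10 flips) and the flat prior are assumptions chosen purely for the example.

```python
import numpy as np

theta = np.linspace(0.0, 1.0, 201)            # candidate model configurations (coin bias)
prior = np.ones_like(theta) / len(theta)      # flat prior over the configurations

heads, flips = 7, 10                          # assumed observations for illustration

# Likelihood: probability of the observations given each configuration.
likelihood = theta**heads * (1 - theta)**(flips - heads)

# Bayes' theorem: multiply likelihood by prior, then renormalise to get the
# posterior probability of each configuration given the observations.
unnormalised = likelihood * prior
posterior = unnormalised / unnormalised.sum()

print(f"Posterior mean of the coin bias: {np.dot(theta, posterior):.3f}")  # about 0.667
```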
Bayesian statistics (/ˈbeɪziən/ BAY-zee-ən or /ˈbeɪʒən/ BAY-zhən)[1] is a theory in the field of statistics based on the Bayesian interpretation of probability, where probability expresses a degree of belief in an event. The degree of belief may be based on prior knowledge about the event, such as the results of previous experiments, or on personal beliefs about the event. This differs from a number of other interpretations of probability, such as the frequentist interpretation, which views probability as the limit of the relative frequency of an event after many trials.[2] More concretely, analysis in Bayesian methods codifies prior knowledge in the form of a prior distribution.
http://en.wikipedia.org/wiki/Bayesian_statistics
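To see how a prior distribution codifying earlier results interacts with new data, and how this differs from the frequentist relative-frequency view, here is a small sketch using a conjugate Beta-Binomial update. The Beta(8, 2) prior (standing in for the results of previous experiments) and the new data (3 successes in 10 trials) are invented for illustration.

```python
# Prior belief expressed as Beta(8, 2) pseudo-counts: 8 prior "successes", 2 "failures".
a_prior, b_prior = 8, 2

# New, assumed data.
successes, trials = 3, 10

# Frequentist point estimate: the relative frequency observed in the new trials alone.
freq_estimate = successes / trials

# Bayesian update: a Beta prior with binomial data gives a Beta posterior (conjugacy).
a_post = a_prior + successes
b_post = b_prior + (trials - successes)
posterior_mean = a_post / (a_post + b_post)

print(f"Relative frequency:     {freq_estimate:.2f}")   # 0.30
print(f"Posterior mean (Bayes): {posterior_mean:.2f}")  # 0.55
```

The two answers differ because the Bayesian estimate blends the new observations with the prior degree of belief, whereas the frequentist estimate depends only on the observed frequency.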