Bayes' theorem

A neon sign showing the statement of Bayes' theorem

Bayes' theorem, in probability theory, is a proposition formulated by the English mathematician Thomas Bayes (1702-1761) and published posthumously in 1763, which expresses the probability of a random event $A$ given $B$ in terms of the conditional probability of the event $B$ given $A$ and the marginal probability of $A$ alone.

In more general and less mathematical terms, Bayes' theorem is of enormous relevance because it links the probability of $A$ given $B$ with the probability of $B$ given $A$. For example, knowing the probability of having a headache given that one has the flu, one can (with some additional data) work out the probability of having the flu given that one has a headache. This simple example shows the theorem's broad relevance for science in all its branches, since it is intimately connected with understanding the probability of causes given their observed effects.

One of the many applications of Bayes' theorem is Bayesian inference, a particular approach to statistical inference. When it is applied, the probabilities involved in the theorem can carry different interpretations. Under the Bayesian interpretation of probability, the theorem expresses how a degree of belief, expressed as a probability, should rationally change to account for the availability of related evidence. Bayesian inference is fundamental to Bayesian statistics and has been described by one authority as being "to probability theory what the Pythagorean theorem is to geometry."

History

Bayes' theorem is named after the Rev. Thomas Bayes (/beɪz/), also a statistician and philosopher. Bayes used conditional probability to provide an algorithm (his Proposition 9) that uses evidence to calculate limits on an unknown parameter. His work was published in 1763 under the title An Essay towards solving a Problem in the Doctrine of Chances. Bayes studied how to compute a distribution for the probability parameter of a binomial distribution (in modern terminology). On Bayes's death, his family transferred his papers to a friend, the minister, philosopher, and mathematician Richard Price.

Over two years Richard Price significantly edited the unpublished manuscript before sending it to a friend, who read it aloud at the Royal Society on December 23, 1763. Price edited Bayes's major work "An Essay towards solving a Problem in the Doctrine of Chances" (1763), which appeared in Philosophical Transactions and contains Bayes' theorem. Price wrote an introduction to the paper that provides some of the philosophical foundations of Bayesian statistics, and he chose one of the two solutions offered by Bayes. In 1765 Price was elected a Fellow of the Royal Society in recognition of his work on Bayes's legacy. On 27 April a letter sent to his friend Benjamin Franklin was read at the Royal Society, and subsequently published, in which Price applied this work to populations and to the calculation of "life annuities".

Independently of Bayes, Pierre-Simon Laplace in 1774, and later in his Théorie analytique des probabilités of 1812, used conditional probability to formulate the relation of an updated posterior probability to a prior probability, given evidence. He reproduced and extended Bayes's results in 1774, apparently without knowledge of Bayes's work. Laplace refined Bayes' theorem over several decades:

  • Laplace announced his independent discovery of Bayes' theorem in: Laplace (1774) "Mémoire sur la probabilité des causes par les événements", Mémoires de l'Académie royale des Sciences de Paris (Savants étrangers), 4: 621-656. Reprinted in: Laplace, Oeuvres complètes (Paris, France: Gauthier-Villars et fils, 1841), vol. 8, pp. 27-65. Available online at Gallica. Bayes' theorem appears on p. 29.
  • Laplace presented a refinement of Bayes' theorem in: Laplace (read 1783, published 1785) "Mémoire sur les approximations des formules qui sont fonctions de très grands nombres", Mémoires de l'Académie royale des Sciences de Paris, 423-467. Reprinted in: Laplace, Oeuvres complètes (Paris, France: Gauthier-Villars et fils, 1844), vol. 10, pp. 295-338. Available online at Gallica. Bayes' theorem appears on p. 301.
  • See also: Laplace, Essai philosophique sur les probabilités (Paris, France: Mme. Ve. Courcier [Madame veuve (i.e., widow) Courcier], 1814), p. 10. English translation: Pierre Simon, Marquis de Laplace, trans. F. W. Truscott and F. L. Emory, A Philosophical Essay on Probabilities (New York: John Wiley & Sons, 1902), p. 15. The Bayesian interpretation of probability was developed mainly by Laplace.

Some 200 years later, Sir Harold Jeffreys put Bayes's algorithm and Laplace's formulation on an axiomatic basis, writing in a 1973 book that Bayes' theorem "is to the theory of probability what the Pythagorean theorem is to geometry".

Stephen Stigler used a Bayesian argument to conclude that Bayes's theorem was discovered by Nicholas Saunderson, a blind English mathematician, some time before Bayes; this interpretation, however, has been disputed. Martyn Hooper and Sharon McGrayne have argued that Richard Price's contribution was substantial:

In modern terms, we should speak of the Bayes-Price rule. Price discovered Bayes's work, recognized its importance, corrected it, contributed to the article, and found a use for it. The modern convention of using Bayes's name alone is unfair, but so entrenched that anything else makes little sense.

Theorem

Let $\{A_1, A_2, \ldots, A_i, \ldots, A_n\}$ be a set of mutually exclusive and exhaustive events such that the probability of each of them is nonzero ($P[A_i] \neq 0$ for $i = 1, 2, \ldots, n$). If $B$ is an event for which the conditional probabilities $P(B \mid A_i)$ are known, then the probability $P(A_i \mid B)$ is given by the expression:

$$P(A_i \mid B) = \frac{P(B \mid A_i)\, P(A_i)}{P(B)}$$

where:

  • $P(A_i)$ are the prior probabilities,
  • $P(B \mid A_i)$ is the probability of $B$ under the hypothesis $A_i$,
  • $P(A_i \mid B)$ are the posterior probabilities.
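
To make the expression concrete, here is a minimal numeric sketch in Python, echoing the headache/flu example above; all the numbers are invented for illustration, not real epidemiological values.

    # Bayes' theorem with two hypotheses (hypothetical numbers).
    # A1 = "has the flu", A2 = "does not have the flu"; B = "has a headache".
    p_a1 = 0.10          # prior P(A1): assumed flu prevalence
    p_a2 = 1 - p_a1      # prior P(A2)
    p_b_given_a1 = 0.90  # assumed P(headache | flu)
    p_b_given_a2 = 0.20  # assumed P(headache | no flu)

    # Marginal probability P(B) via the law of total probability.
    p_b = p_b_given_a1 * p_a1 + p_b_given_a2 * p_a2

    # Posterior P(A1 | B) = P(B | A1) P(A1) / P(B)
    p_a1_given_b = p_b_given_a1 * p_a1 / p_b
    print(f"P(flu | headache) = {p_a1_given_b:.3f}")  # 0.09 / 0.27 = 0.333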

Bayes' Formula

Visualization of Bayes' theorem by the superposition of two decision trees

From the definition of conditional probability, Bayes' formula, also known as Bayes' rule, is obtained:

$$P(A_i \mid B) = \frac{P(B \mid A_i)\, P(A_i)}{\sum_{k=1}^{n} P(B \mid A_k)\, P(A_k)} \qquad [1]$$

This formula allows us to calculate the probability $P(A_i \mid B)$ of any of the events $A_i$ given $B$. Formula $[1]$ "has caused many philosophical speculations and disputes."
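
The following short Python sketch evaluates formula [1] over a hypothetical three-event partition (all numbers are invented); it also illustrates that the posteriors sum to 1, as noted in the observation further below.

    # Bayes' rule over a partition A_1, ..., A_n (illustrative numbers only).
    priors = [0.5, 0.3, 0.2]       # P(A_k); they must sum to 1
    likelihoods = [0.1, 0.4, 0.7]  # P(B | A_k)

    # Denominator of [1]: P(B) by the law of total probability.
    evidence = sum(l * p for l, p in zip(likelihoods, priors))
    posteriors = [l * p / evidence for l, p in zip(likelihoods, priors)]

    print(posteriors)       # P(A_i | B) for each i
    print(sum(posteriors))  # 1.0, as the observation below states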

Applications

Bayes' theorem is valid in all applications of probability theory. However, there is controversy about the kind of probabilities it employs. In essence, followers of traditional (frequentist) statistics admit only probabilities based on repeatable experiments with empirical confirmation, while so-called Bayesian statistics also allows subjective probabilities. The theorem can then serve to indicate how we should modify our subjective probabilities when we receive additional information from an experiment. Bayesian statistics is proving its usefulness in estimates based on subjective prior knowledge, and the fact that it allows these estimates to be revised in the light of empirical evidence is opening new ways of building knowledge. One application is the Bayesian classifiers frequently used in junk-mail or spam filters, which adapt with use, as sketched below. Another application is found in data fusion, where information expressed in terms of probability densities from different sensors is combined.
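
As a rough illustration of how such a classifier works (this is a toy sketch, not the implementation of any particular filter, and the word statistics are invented), a naive Bayes spam score combines a prior over the two classes with per-word likelihoods:

    # Toy naive Bayes spam score (hypothetical word statistics).
    p_spam = 0.4       # assumed prior P(spam)
    p_ham = 1 - p_spam
    # Assumed per-word likelihoods P(word | class):
    p_word_spam = {"offer": 0.30, "meeting": 0.02}
    p_word_ham = {"offer": 0.05, "meeting": 0.25}

    def spam_posterior(words):
        """P(spam | words), assuming the words are conditionally independent."""
        like_spam, like_ham = p_spam, p_ham
        for w in words:
            like_spam *= p_word_spam.get(w, 0.01)  # small default for unseen words
            like_ham *= p_word_ham.get(w, 0.01)
        return like_spam / (like_spam + like_ham)

    print(spam_posterior(["offer"]))    # 0.8  -> likely spam
    print(spam_posterior(["meeting"]))  # 0.05 -> likely legitimate

A filter of this kind "adapts with use" by re-estimating the word likelihoods from the messages the user marks as spam or legitimate.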

As an observation, the identity $\sum_{i=1}^{n} P(A_i \mid B) = 1$ is obtained, and its proof is trivial, as the one-line verification below shows.
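
Using formula [1] and the law of total probability:

$$\sum_{i=1}^{n} P(A_i \mid B) = \frac{\sum_{i=1}^{n} P(B \mid A_i)\, P(A_i)}{\sum_{k=1}^{n} P(B \mid A_k)\, P(A_k)} = 1.$$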

Some specific applications:

  1. The diagnosis of cancer.
  2. Probability evaluation during the play of a bridge game, by Dan F. Waugh and Frederick V. Waugh.
  3. A priori and a posteriori probabilities.
  4. A controversial use in Laplace's rule of succession.
  5. Hypothesis testing in political science when the process-tracing methodology is used.

Use in genetics

In genetics, Bayes' theorem can be used to calculate the probability that an individual has a specific genotype. Many people seek to approximate their chances of being affected by a genetic disease or their probability of carrying a recessive gene of interest. A Bayesian analysis based on family history or genetic testing can be performed to predict whether an individual will develop a disease or pass it on to their children. Genetic testing and prediction is common practice among couples who plan to have children but are concerned that both may be carriers of a disease, especially in communities with low genetic variance.

The first step in a Bayesian analysis for genetics is to propose mutually exclusive hypotheses: for a specific allele, an individual either is or is not a carrier. Next, four probabilities are calculated: the prior probability (the probability of each hypothesis given information such as family history or predictions based on Mendelian inheritance), the conditional probability (of a certain outcome under each hypothesis), the joint probability (the product of the first two), and the posterior probability (a weighted product, calculated by dividing the joint probability of each hypothesis by the sum of both joint probabilities).
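
A minimal Python sketch of these four steps, using a textbook-style scenario with hypothetical numbers: an individual whose prior probability of carrying an X-linked recessive allele is 1/2 and who has three unaffected sons.

    # Four-step Bayesian genetics sketch (hypothetical pedigree numbers).
    # Hypotheses: the individual is a carrier (H1) or is not (H2).
    prior_carrier = 0.5     # assumed prior from family history
    prior_noncarrier = 0.5

    # Conditional probability of the observed outcome under each hypothesis:
    cond_carrier = 0.5 ** 3  # each son of a carrier is unaffected with prob. 1/2
    cond_noncarrier = 1.0    # a non-carrier's sons are all unaffected

    # Joint probability: product of prior and conditional.
    joint_carrier = prior_carrier * cond_carrier
    joint_noncarrier = prior_noncarrier * cond_noncarrier

    # Posterior: each joint probability divided by the sum of both joints.
    posterior_carrier = joint_carrier / (joint_carrier + joint_noncarrier)
    print(f"P(carrier | 3 unaffected sons) = {posterior_carrier:.3f}")  # 1/9 = 0.111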

Additional bibliography

  • Grunau, Hans-Christoph (24 January 2014). "Preface Issue 3/4-2013". Jahresbericht der Deutschen Mathematiker-Vereinigung. 115 (3–4): 127–128. doi:10.1365/s13291-013-0077-z.
  • Gelman, A, Carlin, JB, Stern, HS, and Rubin, DB (2003), "Bayesian Data Analysis," Second Edition, CRC Press.
  • Grinstead, CM and Snell, JL (1997), "Introduction to Probability (2nd edition)," American Mathematical Society (free pdf available) [1].
  • "Bayes formula", Encyclopedia of Mathematics, EMS Press, 2001 [1994]
  • McGrayne, SB (2011). The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines & Emerged Triumphant from Two Centuries of Controversy. Yale University Press. ISBN 978-0-300-18822-6.
  • Laplace, Pierre Simon (1986). "Memoir on the Probability of the Causes of Events". Statistical Science. 1 (3): 364-378. doi:10.1214/ss/1177013621. JSTOR 2245476.
  • Lee, Peter M (2012), "Bayesian Statistics: An Introduction," 4th edition. Wiley. ISBN 978-1-118-33257-3.
  • Puga JL, Krzywinski M, Altman N (31 March 2015). "Bayes' theorem." Nature Methods. 12 (4): 277–278. doi:10.1038/nmeth.3335. PMID 26005726.
  • Rosenthal, Jeffrey S (2005), "Struck by Lightning: The Curious World of Probabilities". HarperCollins. (Granta, 2008. ISBN 9781862079960).
  • Stigler, Stephen M. (August 1986). "Laplace's 1774 Memoir on Inverse Probability." Statistical Science. 1 (3): 359–363. doi:10.1214/ss/1177013620.
  • Stone, JV (2013), "Bayes' Rule: A Tutorial Introduction to Bayesian Analysis", Sebtel Press, England (chapter 1 available for free download).
  • Bayesian Reasoning for Intelligent People, An introduction and tutorial to the use of Bayes' theorem in statistics and cognitive science.
  • Morris, Dan (2016), "Bayes' Theorem Examples: A Visual Introduction For Beginners", Blue Windmill, ISBN 978-1549761744 (first six chapters free to read). A short tutorial on how to understand problem scenarios and find P(B), P(A), and P(B|A).

External links

  • Visual explanation of Bayes using trees on YouTube.
  • Bayes' frequentist interpretation explained visually on YouTube.
  • Earliest Known Uses of Some of the Words of Mathematics (B). Contains origins of "Bayesian", "Bayes' Theorem", "Bayes Estimate/Risk/Solution", "Empirical Bayes", and "Bayes Factor".
  • A tutorial on probability and Bayes' theorem devised for Oxford University psychology students
  • An Intuitive Explanation of Bayes' Theorem by Eliezer S. Yudkowsky
  • Bayesian Clinical Diagnostic Model
