Probability theory


Probability theory is a branch of mathematics that studies random and stochastic phenomena. Random phenomena are contrasted with deterministic phenomena, in which an experiment carried out under the same fixed conditions always produces a unique, predictable result: for example, if water is heated to 100 °C at sea level, steam is obtained. Random phenomena, by contrast, are those obtained from experiments carried out, again, under the same fixed conditions, but whose possible result is one of a set of alternatives, for example, the roll of a die or the toss of a coin.

Probability theory is concerned with assigning a certain number to each possible outcome that may occur in a random experiment, in order to quantify those outcomes and find out if one event is more likely than another.

Many natural phenomena are random, but in some, such as the roll of a die, the experiment is not truly repeated under identical conditions: imperfections of the material mean that the die is not perfectly symmetric, so repetition alone does not guarantee a well-defined probability. Real processes modeled with probability distributions correspond to complex models in which not all of the parameters involved are known a priori; this is one of the reasons why statistics, which seeks to estimate these parameters, does not reduce immediately to probability theory itself.

In 1933, the Soviet mathematician Andrei Kolmogorov proposed a system of axioms for probability theory, based on set theory and on measure theory, which had been developed a few years earlier by Lebesgue, Borel, and Fréchet, among others.

This axiomatic approach, which generalizes the classical framework of probability (governed by the rule of favorable cases over possible cases), made it possible to put on a rigorous footing many arguments already in use, as well as to study problems outside the classical frameworks. Today probability theory finds application in the most varied branches of knowledge, such as physics (where the development of diffusions and Brownian motion is worth mentioning) and economics (where the Black-Scholes model for the valuation of financial assets stands out).

Definition of probability

History

The theory of probability originally developed from certain problems posed in the context of gambling. Initially there was no well-defined axiomatic theory, and the first definitions of probability were based on the intuitive idea of a ratio of occurrences:

(1)   $\mathrm{Prob}(A) = \lim_{N \to \infty} \frac{n_A}{N}$

where A is any event and:

$N$ is the number of times the action or observation has been repeated, each repetition resulting in either A or not-A.
$n_A$ is the number of times A is observed among those repetitions.

Although this type of definition allowed the development of a large number of properties, it did not make it possible to deduce all the important theorems and results that now form part of probability theory. In fact, the above result can be rigorously proved within the axiomatic approach to probability theory, under certain conditions.
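
As a rough illustration of this frequency-based idea, the following minimal Python sketch (not part of the original text; the die experiment and the event "the result is even" are chosen arbitrarily) simulates repeated trials and shows the ratio n_A/N settling near the expected value:

    import random

    def empirical_frequency(num_trials, is_event, experiment):
        """Estimate Prob(A) as the ratio n_A / N over num_trials repetitions."""
        n_a = sum(1 for _ in range(num_trials) if is_event(experiment()))
        return n_a / num_trials

    def roll_die():
        return random.randint(1, 6)

    def is_even(outcome):
        return outcome % 2 == 0

    # The estimate approaches the theoretical value 1/2 as N grows.
    for n in (100, 10_000, 1_000_000):
        print(n, empirical_frequency(n, is_even, roll_die))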

The first complete axiomatization was due to Andrei Kolmogorov (who used this approach, for example, to deduce his zero-one law for tail events and other results on the convergence of sequences of random variables). The axiomatic definition of probability is based on results from measure theory and on formalizations of the idea of probabilistic independence. In this approach one starts from a normalized measure space $(\Omega, \mathcal{M}, \mu_P)$, where $\Omega$ is a set called the sample space (which, depending on the type of problem, may be finite, countable, or uncountable), $\mathcal{M} \subset \mathcal{P}(\Omega)$ is a σ-algebra of subsets of $\Omega$, and $\mu_P : \mathcal{M} \to \mathbb{R}$ is a normalized measure (that is, $\mu_P(\Omega) = 1$). Possible events are taken to be subsets S of elementary events, $S \in \mathcal{M}$, $S \subset \Omega$, and the probability of each event is given by the measure of that set:

$\mathrm{Prob}(S) = \mu_P(S) \in [0, 1]$

The interpretation of this probability is the average frequency with which the event occurs when elements of $\Omega$ are chosen at random.

The previous definition is difficult to represent mathematically, since $\Omega$ would have to be infinite. Another way to define probability is axiomatically, that is, by establishing the relations and properties that hold among the concepts and operations that compose it.
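
For a finite sample space the triple $(\Omega, \mathcal{M}, \mu_P)$ can be written out explicitly. The sketch below is an illustration under two assumptions not taken from the text: the σ-algebra is the full power set of $\Omega$, and every elementary outcome of a die roll receives the same weight 1/6:

    from fractions import Fraction
    from itertools import chain, combinations

    # Sample space for one die roll; each elementary outcome gets weight 1/6.
    omega = {1, 2, 3, 4, 5, 6}
    weights = {w: Fraction(1, 6) for w in omega}

    def power_set(s):
        """The sigma-algebra M: for a finite Omega we may take all subsets."""
        s = list(s)
        return [frozenset(c) for c in
                chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))]

    def mu_p(event):
        """Normalized measure: the probability of an event S, a subset of Omega."""
        return sum(weights[w] for w in event)

    sigma_algebra = power_set(omega)
    assert mu_p(frozenset(omega)) == 1        # mu_P(Omega) = 1
    print(mu_p(frozenset({5, 6})))            # Prob("roll at least 5") = 1/3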

Classical definition of probability

Probability is a characteristic of an event indicating that there are grounds to believe it will take place.

The probability p of an event S occurring, out of a total of n equally likely possible cases, is the ratio of the number of occurrences h of that event (favorable cases) to the total number of possible cases n.

$p = \mathrm{Prob}\{S\} = \frac{h}{n}$

Probability is a number (value) that ranges between 0 and 1. When the event is impossible its probability is 0; when the event is certain and must always occur its probability is 1.

The probability of an event not occurring is given by q, where:

$q = \mathrm{Prob}\{\mathrm{not}\ S\} = 1 - \frac{h}{n}$

We know that p is the probability that an event will occur and q is the probability that it will not occur, so p + q = 1.

Symbolically, the sample space, normally denoted by $\Omega$, is the set consisting of all possible outcomes. The outcomes, denoted $\omega_1, \omega_2$, etc., are the elements of the space $\Omega$.
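
As a small worked example of the classical rule (the specific die and event are hypothetical choices, not taken from the text), consider the sample space $\Omega = \{1, \dots, 6\}$ of a fair die and the event S = "roll a number greater than 4", so that h = 2 favorable cases out of n = 6 possible cases:

    from fractions import Fraction

    n = 6                    # equally likely possible cases (faces of the die)
    favorable = {5, 6}       # outcomes for which the event S occurs
    h = len(favorable)

    p = Fraction(h, n)       # Prob{S} = h / n
    q = 1 - p                # Prob{not S} = 1 - h / n
    print(p, q, p + q)       # 1/3 2/3 1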

Axiomatic definition

As mentioned above, the axiomatic definition of probability is an extension of measure theory in which the notion of probabilistic independence is introduced. This approach makes it possible to reproduce the results of classical probability theory as well as new results concerning the convergence of random variables, stochastic processes, the Itô calculus, and stochastic differential equations.

Within the axiomatic approach it can be shown, via Chebyshev's inequality, that the weak law of large numbers holds:

$\mathrm{Prob}\left( \left| \frac{S_n - \mathbb{E}(S_n)}{n} \right| \geq \varepsilon \right) \leq \frac{\mathrm{Var}(S_n)}{n^2 \varepsilon^2} \to 0$

This makes it possible to rigorously justify equation (1), assuming that:

$S_n = X_1 + \dots + X_n, \quad \mathbb{E}(S_n) = np, \qquad X_i \in \{0, 1\}$

where $X_i = 1$ is interpreted as the event occurring, with probability p, and $X_i = 0$ as it not occurring, with probability 1 - p.
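
A minimal simulation (with an arbitrarily chosen success probability p; the parameter values are assumptions made only for illustration) can compare the observed frequency of large deviations of S_n/n with the bound Var(S_n)/(n²ε²) stated above:

    import random

    p, n, eps, reps = 0.3, 2_000, 0.02, 5_000

    def deviation_exceeds(n, p, eps):
        """One experiment: is |S_n / n - p| >= eps for n Bernoulli(p) trials?"""
        s_n = sum(random.random() < p for _ in range(n))
        return abs(s_n / n - p) >= eps

    observed = sum(deviation_exceeds(n, p, eps) for _ in range(reps)) / reps
    bound = (n * p * (1 - p)) / (n ** 2 * eps ** 2)   # Var(S_n) / (n^2 eps^2)
    print(f"observed: {observed:.4f}   Chebyshev bound: {bound:.4f}")

The observed frequency is typically well below the bound, which is loose but requires only finite variance.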

Random variables

A random variable is a measurable function

$X : \Omega \to \mathbb{R}$

that assigns a numerical value to each elementary event $\omega \in \Omega$.
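
Quite literally, a random variable can be coded as a function on the sample space. In the sketch below (the two-dice setting is an illustrative assumption), X maps each elementary event ω, an ordered pair of faces, to the real number given by their sum:

    from fractions import Fraction
    from itertools import product

    # Sample space: all ordered pairs of faces of two fair dice.
    omega = list(product(range(1, 7), repeat=2))

    def X(w):
        """Random variable X: Omega -> R, the sum of the two faces."""
        return w[0] + w[1]

    # Prob(X = 7) is the measure of the subset {w in Omega | X(w) = 7}.
    prob_seven = Fraction(sum(1 for w in omega if X(w) == 7), len(omega))
    print(prob_seven)   # 1/6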

Discrete probability

This type of probability is the one in which the variable can take only certain distinct values, obtained by counting some characteristic of interest. More precisely, a discrete probability problem is one defined by a set of random variables that can take only a finite or countably infinite set of distinct values:

$\mathrm{card}[X(\Omega)] \leq \aleph_0$

where:

$\mathrm{card}[\cdot]$ denotes the cardinality or "number of elements" of a set.
$X(\Omega) = \{ x \in \mathbb{R} \mid \exists\, \omega \in \Omega : x = X(\omega) \}$ is the set of all possible values taken by the random variable.
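
A discrete random variable need not take finitely many values; a countably infinite set of values also satisfies the condition above. As an illustrative example (the geometric distribution is an assumption, not taken from the text), the number of Bernoulli trials up to the first success takes values in {1, 2, 3, …}, and its probabilities sum to 1:

    # Geometric random variable: X(Omega) = {1, 2, 3, ...} is countably infinite.
    p = 0.25

    def pmf(k):
        """Prob(X = k): first success on trial k of independent Bernoulli(p) trials."""
        return (1 - p) ** (k - 1) * p

    partial_sum = sum(pmf(k) for k in range(1, 200))
    print(partial_sum)   # approaches 1 as more terms are included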

Continuous probability

A continuous probability problem is one in which the random variables can take values in some interval of real numbers (and therefore in an uncountable set of values); continuing with the previous notation:

$\mathrm{card}[X(\Omega)] = \aleph_1 > \aleph_0$

Probability distribution function

The probability distribution function can be defined for any random variable X, whether continuous or discrete, by the following relation:

$F_X(x) = \mathrm{Prob}(X \leq x) = \mu_P\{\omega \in \Omega \mid X(\omega) \leq x\}$

For a discrete random variable this function is not continuous but piecewise constant (being continuous on the right but not on the left). For a general random variable the distribution function can be decomposed into a continuous part and a discrete part:

$F_X(x) = F_X^c(x) + F_X^d(x)$

where $F_X^c(x)$ is an absolutely continuous function and $F_X^d(x)$ is a piecewise constant function.
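
For a concrete discrete case (a fair die, chosen here only as an illustration), the distribution function is a right-continuous step function that is constant between the possible values and jumps by 1/6 at each of them:

    import math
    from fractions import Fraction

    def F_die(x):
        """Distribution function of a fair die: F(x) = Prob(X <= x)."""
        return Fraction(min(max(math.floor(x), 0), 6), 6)

    # F is piecewise constant and continuous from the right.
    for x in (0.5, 1, 1.5, 3, 5.999, 6):
        print(x, F_die(x))   # 0, 1/6, 1/6, 1/2, 5/6, 1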

Probability density function

The density function, or probability density, of an absolutely continuous random variable is the function from which the probability associated with each region of values of the variable is obtained; it is defined as:

$f_X(x) = \frac{dF_X(x)}{dx}$

That is, in the case of continuous random variables the distribution function is the integral of the density; in the case of discrete random variables the distribution function is obtained by summing the probability mass function. The notion can be generalized to several random variables.
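
The relationship between density and distribution function can be checked numerically. The sketch below (using the standard exponential distribution as an arbitrary example) approximates F_X by integrating f_X with a simple midpoint rule and compares the result with the closed form 1 − e^(−x):

    import math

    def f(x):
        """Density of an Exponential(1) random variable (for x >= 0)."""
        return math.exp(-x)

    def F(x):
        """Its distribution function in closed form: 1 - exp(-x)."""
        return 1 - math.exp(-x)

    def integrate(g, a, b, steps=10_000):
        """Midpoint rule: approximates the integral of g over [a, b]."""
        h = (b - a) / steps
        return sum(g(a + (i + 0.5) * h) for i in range(steps)) * h

    for x in (0.5, 1.0, 3.0):
        print(x, integrate(f, 0.0, x), F(x))   # the two values agree closely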

Probability theory

Modern probability theory includes topics from the following areas:

  • σ-algebras, measure theory, product measures and measurable functions
  • Random variables and distribution functions
  • Convergence of measurable functions and weak convergence
  • Probabilistic independence
  • Conditional probability
  • Martingales and stopping times
  • Laws of large numbers
  • Characteristic functions
  • Central limit theorem and extreme value theorem
  • Stochastic processes
