Differential calculus

The logarithmic spiral of the Nautilus shell is a classic image used to represent (continuous) growth, a key concept in calculus.

Differential calculus is a branch of infinitesimal calculus and mathematical analysis that studies how continuous functions change as their variables change. The main object of study in differential calculus is the derivative. A closely related notion is the differential of a function.

The study of change in a function is of particular interest to differential calculus, especially the case in which the change in the variables is infinitesimal, that is, when the change tends to zero (becomes as small as desired). Differential calculus therefore rests constantly on the basic concept of the limit: the passage to the limit is the main tool that allows the theory of differential calculus to be developed and that clearly distinguishes it from algebra. From the point of view of functions and geometry, the derivative of a function at a certain point is a measure of the rate at which the function changes as its argument is modified; that is, a derivative expresses, in mathematical terms, a rate of change. A derivative is the computation of the instantaneous slope of f(x) at each point x. This corresponds to the slopes of the tangents to the graph of the function at its points (one tangent per point). Derivatives can be used to determine the concavity of a function, its intervals of growth, and its maxima and minima. The inverse of a derivative is called a primitive, antiderivative, or integral.

Differentiation and differentiability

A function of one variable is differentiable at a point x if its derivative exists at that point; a function is differentiable on an interval if it is differentiable at every point x of the interval. If a function is not continuous at c, then it cannot be differentiable at c; however, even if a function is continuous at c, it may not be differentiable there. In other words, every function differentiable at a point c is continuous at c, but not every function continuous at c is differentiable at c (for example, f(x) = |x| is continuous at x = 0 but not differentiable there).
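The absolute-value example can be checked numerically. The following Python sketch (an illustrative addition, not part of the standard treatment) compares the one-sided difference quotients of f(x) = |x| at 0; they converge to different values, so no single slope exists there.

```python
# f(x) = |x| is continuous at 0 but not differentiable there:
# the one-sided difference quotients converge to different values.
f = abs
h = 1e-6
right = (f(0 + h) - f(0)) / h     # slope from the right -> +1
left = (f(0 - h) - f(0)) / (-h)   # slope from the left  -> -1
print(right, left)  # 1.0 -1.0
```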

Notion of derivatives

Secant line through the points (x, f(x)) and (x + h, f(x + h)).

Derivatives are defined by taking the limit of the slopes of the secant lines as they approach the tangent line. It is difficult to find the slope of the tangent line directly, because only one point on it is known: the point at which it is tangent to the function. Therefore, the tangent line is approximated by secant lines. When the limit of the slopes of the nearby secants is taken, the slope of the tangent line is obtained.

To obtain these slopes, take an arbitrarily small number, which will be called h. Here h represents a small variation in x and can be either positive or negative. The slope of the line through the points (x, f(x)) and (x + h, f(x + h)) is:

\frac{f(x+h)-f(x)}{h}

This expression is Newton's difference quotient. The derivative of f at x is the limit of the value of the difference quotient as the secant lines get closer to the tangent:

f'(x) = \lim_{h \to 0} \frac{f(x+h)-f(x)}{h}

If the derivative of f exists at every point x, then it is possible to define the derivative of f as the function whose value at each point x is the derivative of f at x.

Since immediately replacing h with 0 results in division by zero, the derivative cannot be computed by direct substitution. One technique is to simplify the numerator so that the h in the denominator can be cancelled. This is straightforward for polynomial functions, but for most functions it is too complicated. Fortunately, there are general rules that make it easy to differentiate most common functions.
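As an illustration (a minimal Python sketch, with f(x) = x² chosen as an assumed example), the difference quotient can be evaluated for shrinking h and watched approach the derivative; algebraically the quotient simplifies to 2x + h, so the limit is 2x.

```python
def difference_quotient(f, x, h):
    """Slope of the secant line through (x, f(x)) and (x + h, f(x + h))."""
    return (f(x + h) - f(x)) / h

f = lambda t: t**2   # example function; the quotient simplifies to 2x + h
for h in (1.0, 0.1, 0.001, 1e-6):
    print(h, difference_quotient(f, 3.0, h))  # tends to f'(3) = 6
```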

The alternative difference quotient

The derivative of f(x) (as defined by Newton) was described above as the limit, as h approaches zero, of a difference quotient. An alternative form of the derivative can be obtained from Newton's quotient. Using the formula above, the derivative at c is equal to the limit, as h approaches zero, of [f(c + h) - f(c)] / h. If we let h = x - c (hence c + h = x), then x approaches c as h approaches zero. Thus, the derivative is equal to the limit, as x approaches c, of [f(x) - f(c)] / (x - c). This definition is used in a partial proof of the chain rule.
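Numerically the two quotients give the same value, as this small Python check (an illustrative sketch, with f(x) = x³ and c = 2 as assumed examples) shows:

```python
f = lambda t: t**3
c, h = 2.0, 1e-5
x = c + h                      # the substitution h = x - c
q1 = (f(c + h) - f(c)) / h     # Newton's quotient
q2 = (f(x) - f(c)) / (x - c)   # alternative quotient
print(q1, q2)  # both close to f'(2) = 12
```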

Functions of several variables

For functions of several variables f: ℝ^m → ℝ^n, the conditions for differentiability are stricter and require more than the existence of partial derivatives. In particular, the existence of a linear approximation to the function in a neighborhood of a point is required. Given a vector basis, this linear approximation is given by the Jacobian matrix:

\lim_{\|h\| \to 0} \frac{\left\| f(x_0 + h) - f(x_0) - J(x_0)\,h \right\|}{\|h\|} = 0

History

The problems that gave rise to the infinitesimal calculus began to be considered in the classical age of ancient Greece (3rd century BC), with concepts of a geometric type such as the problem of the tangent to a curve studied by Apollonius of Perga, but systematic methods of solution were not found until the 17th century, thanks to the work of Isaac Newton and Gottfried Wilhelm Leibniz.

The two of them synthesized the concepts and methods used by their predecessors into what are now called «differentiation» and «integration». They developed rules for manipulating derivatives (rules of differentiation) and showed that the two concepts are inverses of each other (the fundamental theorem of calculus).

Since the 17th century, many mathematicians have contributed to the differential calculus. In the 19th century, calculus took on a more rigorous style, due to mathematicians such as Augustin Louis Cauchy (1789-1857), Bernhard Riemann (1826-1866), and Karl Weierstrass (1815-1897). It was also during this period that the differential calculus was generalized to Euclidean space and the complex plane.

Important applications of differential calculus

Tangent line to a function at a point

The tangent line to a function f(x), as seen above, is the limit of the secant lines as one of the secant's points of intersection with the function is made to approach the other. The tangent line can also be defined as the best linear approximation to the function at its point of tangency; that is, the tangent line is the first-degree polynomial function that best approximates the function locally at the point of tangency considered.

If the equation of the tangent line T_a(x) to the function f(x) at the point a is known, T_a(x) can be taken as a reasonably good approximation of f(x) near the point a. This means that if a point a + h is taken and both the function and the tangent line are evaluated there, the difference f(a + h) - T_a(a + h) will be negligible compared with h in absolute value as h tends to zero. The closer the point is to a, the more precise the approximation of f(x).

For a function f(x) differentiable at a point a, the tangent line to f(x) at the point a is:

T_a(x) = f(a) + f'(a)\,(x - a)
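A short Python sketch (illustrative, with f(x) = x² and a = 1 as assumed examples) confirms that the error f(a + h) - T_a(a + h) shrinks much faster than h:

```python
def f(x):
    return x**2

def tangent(a, x):
    """Tangent line T_a(x) = f(a) + f'(a)(x - a); here f'(a) = 2a."""
    return f(a) + 2 * a * (x - a)

a = 1.0
for h in (0.1, 0.01, 0.001):
    err = abs(f(a + h) - tangent(a, a + h))
    print(h, err)   # the error behaves like h**2, negligible next to h
```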

Using derivatives to graph functions

Derivatives are a useful tool for examining the graphs of functions. In particular, the points in the interior of the domain of a real-valued function at which that function reaches a local extremum will have a first derivative of zero. However, not all critical points are local extrema. For example, f(x) = x³ has a critical point at x = 0, but at that point there is neither a maximum nor a minimum. The first derivative test and the second derivative test make it possible to determine whether the critical points are maxima, minima, or neither.

In the case of multidimensional domains, the function will have a partial derivative of zero with respect to each dimension at a local extremum. In this case, the second derivative test can still be used to characterize the critical points, by considering the eigenvalues of the Hessian matrix of second partial derivatives of the function at the critical point. If all the eigenvalues are positive, the point is a local minimum; if they are all negative, it is a local maximum. If some eigenvalues are positive and some are negative, the critical point is a saddle point; if none of these cases holds (i.e., some eigenvalues are zero), the test is inconclusive.
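For instance (a minimal sketch; f(x, y) = x² - y² is an assumed example whose Hessian is constant and diagonal, so its eigenvalues can be read off the diagonal):

```python
# Hessian of f(x, y) = x**2 - y**2 at the critical point (0, 0)
hessian = [[2, 0], [0, -2]]
eigenvalues = [hessian[0][0], hessian[1][1]]  # diagonal matrix

if all(e > 0 for e in eigenvalues):
    kind = "local minimum"
elif all(e < 0 for e in eigenvalues):
    kind = "local maximum"
elif any(e > 0 for e in eigenvalues) and any(e < 0 for e in eigenvalues):
    kind = "saddle point"
else:
    kind = "inconclusive"
print(kind)  # saddle point
```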

Once the local extrema are found, it is much easier to get a rough idea of the overall graph of the function, since (in the case of a one-dimensional domain) the function increases or decreases monotonically between critical points, and for this reason (assuming its continuity) it takes intermediate values between its values at the critical points on each side.

Local Taylor Approximation

As shown above, a function differentiable at a point can be locally approximated by its tangent line. If the function is smooth enough at the point or on the domain of study (that is, the function is of class C^n), then it can be approximated not only by polynomials of degree one, but by polynomials of degree two, three, four, and so on. This approximation is called the Taylor polynomial expansion and is defined as follows:

P(x) = f(a) + \frac{f'(a)}{1!}(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \frac{f'''(a)}{3!}(x-a)^3 + \cdots + \frac{f^{(n)}(a)}{n!}(x-a)^n

where P(x) is the polynomial of degree n that best approximates the function at the point x = a. Note that, if P(x) is evaluated at x = a, all the terms except the first vanish; hence P(a) = f(a). Note also that the equation of the tangent line in the previous section corresponds to the case n = 1.

When a = 0, the expansion is called a Maclaurin expansion. In practice, Maclaurin expansions are used most of the time. Examples of important Maclaurin expansions are:

e^x \approx 1 + \frac{x}{1!} + \frac{x^2}{2!} + \frac{x^3}{3!} + \frac{x^4}{4!} + \frac{x^5}{5!} + \cdots
\sin(x) \approx x - \frac{x^3}{3!} + \frac{x^5}{5!} - \frac{x^7}{7!} + \cdots
\ln(1+x) \approx x - \frac{x^2}{2} + \frac{x^3}{3} - \frac{x^4}{4} + \cdots

Note the symbol ≈, which denotes approximation, not equality. If the function to be approximated is infinitely differentiable (C^∞) and infinitely many terms are added to the expansion, then the ≈ becomes an = and the expansion becomes a Taylor series. Functions that are equal to their Taylor series are called analytic functions.
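The convergence of a Maclaurin expansion can be observed numerically. The following Python sketch (illustrative; maclaurin_exp is a hypothetical helper name) sums partial Maclaurin series of e^x:

```python
import math

def maclaurin_exp(x, n):
    """Sum of the first n + 1 terms of the Maclaurin series of e**x."""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

for n in (1, 3, 5, 9):
    print(n, maclaurin_exp(1.0, n))  # approaches e = 2.71828...
```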

Calculation of points


Given a function

f\colon [a,b] \to \mathbb{R}, \quad x \mapsto f(x)

we can distinguish the following points:

Boundary point
Interior point
Critical point
Singular point
Stationary point
Inflection point

Critical points

A critical point is understood as a singular or stationary point.

Singular points

Singular points are the values of the variable at which the derivative of the function, f'(x), presents some kind of discontinuity.

Stationary points

A stationary point is a value of the variable at which the derivative vanishes, f'(x) = 0; that is, if f'(x_i) = 0 for i = 1, 2, …, n, then x_1, x_2, …, x_n are stationary points of f(x).

The values f(x_1), f(x_2), …, f(x_n) are called stationary values.

If the second derivative is positive at a stationary point, the point is a local minimum; if it is negative, a local maximum; if it is zero, it may be a minimum, a maximum, or an inflection point. Differentiating and solving at the critical points is usually a simple way to find local maxima and minima, which can be used in optimization. However, extrema at the boundary of the domain should not be overlooked in such problems.
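As a minimal optimization sketch (with f(x) = (x - 2)² + 1 as an assumed example): its only stationary point solves f'(x) = 0, and the positive second derivative confirms a local minimum.

```python
def fprime(x):
    return 2 * (x - 2)      # derivative of (x - 2)**2 + 1

def fsecond(x):
    return 2.0              # second derivative, constant and positive

x_star = 2.0                # solves fprime(x) = 0
print(fprime(x_star), fsecond(x_star) > 0)  # 0.0 True -> local minimum
```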

Generalization of the differential calculus

When a function depends on more than one variable, the concept of a partial derivative is used. Partial derivatives can be thought of informally as taking the derivative of the function with respect to one of the variables while holding the others constant. Partial derivatives are written:

\frac{\partial}{\partial x}

where ∂ is a rounded 'd' known as the 'partial derivative symbol'.
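Partial derivatives can be approximated numerically by varying one variable while the other is held fixed. A hedged Python sketch (f(x, y) = x²y is an assumed example, with exact partials ∂f/∂x = 2xy and ∂f/∂y = x²):

```python
def f(x, y):
    return x**2 * y

def partial_x(f, x, y, h=1e-6):
    """Difference quotient in x with y held constant."""
    return (f(x + h, y) - f(x, y)) / h

def partial_y(f, x, y, h=1e-6):
    """Difference quotient in y with x held constant."""
    return (f(x, y + h) - f(x, y)) / h

print(partial_x(f, 3.0, 2.0))  # close to 2*3*2 = 12
print(partial_y(f, 3.0, 2.0))  # close to 3**2 = 9
```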

The concept of the derivative can be extended more generally. The common thread is that the derivative at a point serves as a linear approximation of the function at that point. Perhaps the most natural setting is that of functions differentiable on manifolds. The derivative at a given point then becomes a linear transformation between the corresponding tangent spaces, and the derivative of the function becomes a mapping between the tangent bundles.

To differentiate all continuous functions and much more, the concept of a distribution can be defined. For complex functions of a complex variable, differentiability is a much stronger condition than the mere differentiability of the real and imaginary parts of the function with respect to the real and imaginary parts of the argument. For example, the function f(x + iy) = x + 2iy satisfies the second condition but not the first. See also holomorphic function.

See also: differintegral.

Given real-valued functions, both with the same domain, the problem consists of finding the maximum or minimum values (extreme values) of one of them when it is restricted to take values in a given set.
