Antiderivatives

Let $f(x)$ be continuous on $[a,b]$. If $G(x)$ is continuous on $[a,b]$ and $G'(x)=f(x)$ for all $x\in (a,b)$, then $G$ is called an antiderivative of $f$.

We can construct antiderivatives by integrating. The function \[F(x)=\int^x_a f(t)\, dt\] is an antiderivative for $f$: by the first part of the Fundamental Theorem of Calculus, $F(x)$ constructed in this way is continuous on $[a,b]$ and $F'(x)=f(x)$ for all $x\in (a,b)$.
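For instance, taking $f(t)=t^2$ (an illustrative choice added here, not from the original tutorial) and any fixed $a$, \[F(x)=\int^x_a t^2\, dt=\frac{x^3-a^3}{3},\qquad F'(x)=x^2=f(x),\] so $F$ is indeed an antiderivative of $f$.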

Properties

Let $F(x)$ be any antiderivative for $f(x)$.

  • For any constant $C$, $F(x)+C$ is an antiderivative for $f(x)$.

    Proof: Since $\displaystyle \frac{d}{dx}[F(x)]=f(x)$, \begin{eqnarray*} \frac{d}{dx}[F(x)+C]&=&\frac{d}{dx}[F(x)]+\frac{d}{dx}[C]\\ &=&f(x)+0\\ &=&f(x) \end{eqnarray*} so $F(x)+C$ is an antiderivative for $f(x)$.

  • Every antiderivative of $f(x)$ can be written in the form \[F(x)+C\] for some constant $C$. That is, any two antiderivatives of $f$ differ by a constant.

    Proof: Let $F(x)$ and $G(x)$ be antiderivatives of $f(x)$. Then $F'(x)=G'(x)=f(x)$, so $\displaystyle\frac{d}{dx}[G(x)-F(x)]=0$ for all $x\in (a,b)$. A function whose derivative is identically zero on an interval is constant there (a consequence of the Mean Value Theorem, proved in most calculus texts), so $G(x)-F(x)=C$ for some constant $C$; that is, $G(x)=F(x)+C$. (A quick symbolic check of these properties follows this list.)
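As an informal check of these two properties (this snippet is an addition, not part of the original tutorial; the shift by $7$ is an arbitrary choice), SymPy can verify that $\frac{x^3}{3}$ and $\frac{x^3}{3}+7$ both differentiate to $x^2$ and differ by a constant:

```python
import sympy as sp

x = sp.symbols('x')
f = x**2

# SymPy's integrate() returns one antiderivative and omits the "+ C".
F = sp.integrate(f, x)      # x**3/3
G = F + 7                   # another antiderivative, shifted by a constant

assert sp.diff(F, x) == f                  # F'(x) = x**2
assert sp.diff(G, x) == f                  # G'(x) = x**2
assert sp.simplify(G - F).is_constant()    # the two differ only by a constant
```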

The process of finding antiderivatives is called antidifferentiation or integration: \[ \begin{array}{l@{\qquad}l@{\qquad}l} \displaystyle\frac{d}{dx}[F(x)]=f(x) & \Longleftrightarrow & \displaystyle\int f(x)\, dx=F(x)+C.\\ \displaystyle\frac{d}{dx}[g(x)]=g'(x) & \Longleftrightarrow & \displaystyle\int g'(x)\, dx=g(x)+C. \end{array} \]
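For example (an illustration added here, not part of the original page), since $\displaystyle\frac{d}{dx}[\sin x]=\cos x$, the correspondence gives \[\int \cos x\, dx=\sin x+C.\]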

Properties of the Indefinite Integral

  • $\displaystyle \frac{d}{dx}\left[\int\! f(x)\, dx\right]=f(x)$.

    Proof:

    Let $\displaystyle\int f(x)\, dx=F(x)$, where $F(x)$ is an antiderivative of $f$. Then \begin{eqnarray*} \frac{d}{dx}\left[\int f(x)\, dx\right]&=&\frac{d}{dx}F(x)\\ &=&f(x). \end{eqnarray*}

  • (Linearity) $\displaystyle \int [\alpha f(x)+\beta g(x)]\, dx=\alpha \int\! f(x)\, dx+\beta \int\! g(x)\, dx$.

    Proof:

    We need only show that $\displaystyle \alpha\!\int\! f(x)\, dx+\beta\!\int\! g(x)\, dx$ is an antiderivative of $\alpha f(x)+\beta g(x)$: \begin{eqnarray*} \frac{d}{dx}\left[\alpha \int f(x)\, dx+\beta \int g(x)\, dx\right]&=& \alpha \frac{d}{dx}\left[\int f(x)\, dx\right]+\beta \frac{d}{dx}\left[\int g(x)\, dx\right]\\ &=&\alpha f(x)+\beta g(x). \end{eqnarray*} (A symbolic check of linearity follows this list.)
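As with the earlier property, a quick SymPy check confirms linearity for one concrete case (an added illustration; the particular $f$, $g$, $\alpha$, $\beta$ below are arbitrary choices, not from the original tutorial):

```python
import sympy as sp

x = sp.symbols('x')
f, g = x**2, sp.cos(x)
alpha, beta = 3, -2

lhs = sp.integrate(alpha*f + beta*g, x)                  # integrate the combination
rhs = alpha*sp.integrate(f, x) + beta*sp.integrate(g, x) # combine the integrals

# The two antiderivatives agree (up to the arbitrary constant, which SymPy omits).
assert sp.simplify(lhs - rhs) == 0
```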

Examples
  1. Every antiderivative of $x^2$ has the form $\displaystyle \frac{x^3}{3}+C$, since $\displaystyle \frac{d}{dx}\left[\frac{x^3}{3}\right]=x^2$.
  2. $\displaystyle \frac{d}{dx}\left[\int\! x^5\, dx\right]=x^5$.
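To spell out Example 2 (an added intermediate step): $\displaystyle\int x^5\, dx=\frac{x^6}{6}+C$, and $\displaystyle\frac{d}{dx}\left[\frac{x^6}{6}+C\right]=x^5$, in agreement with the first property of the indefinite integral.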

Key Concepts

If $G(x)$ is continuous on $[a,b]$ and $G'(x) = f(x)$ for all $x\in (a,b)$, then $G$ is called an antiderivative of $f$.

We can construct antiderivatives by integrating. The function $F(x) = \displaystyle\int^x_a f(t)\, dt$ is an antiderivative for $f$. In fact, every antiderivative of $f(x)$ can be written in the form $F(x)+C$, for some $C$.

\[ \begin{array}{l@{\qquad}l@{\qquad}l} \displaystyle\frac{d}{dx}[F(x)]=f(x) & \Longleftrightarrow & \displaystyle\int f(x)\, dx=F(x)+C.\\ \displaystyle\frac{d}{dx}[g(x)]=g'(x) & \Longleftrightarrow & \displaystyle\int g'(x)\, dx=g(x)+C. \end{array} \]

