# First-order differential equation and summation problem

by fiksx   Last Updated January 19, 2018 12:20 PM

Find $a_n$ and $f(x)$ so that $f(x)=\sum_{n=0}^\infty \frac{a_n}{n!}x^n$ satisfies $f'(x)-f(x)=x^2$ and $f(0)=1$.

Here I tried to compute $f'(x) = \sum_{n=1}^\infty \frac{a_n}{(n-1)!}x^{n-1}$. From $\frac{df(x)}{dx}-f(x)=x^2$, solving it as a first-order linear differential equation, I got $y=x^2+2x+1$; is that $f(x)$? But I don't know how the summation relates to the first-order differential equation, or how to find $f(x)$ and $a_n$. Can someone give me a hint? Thanks!


Hint: If we assume a Taylor expansion of $f(x)$ at $x_0=0$, $$f(x) = \sum_{n=0}^{\infty}\dfrac{a_n}{n!}x^n,$$

what can we conclude about the constant $a_0$ from the value $f(0)=1$? Write out the first few terms of the series to see what happens.

The derivative is given as (note the change of the summation index!)

$$f'(x)=\sum_{n=1}^{\infty}\dfrac{a_n}{(n-1)!}x^{n-1} = \sum_{n=0}^{\infty}\dfrac{a_{n+1}}{n!}x^{n}.$$

Plug this into the differential equation:

$$f'(x)-f(x)=\sum_{n=0}^{\infty}\dfrac{a_{n+1}}{n!}x^{n}-\sum_{n=0}^{\infty}\dfrac{a_n}{n!}x^n=\sum_{n=0}^{\infty}\dfrac{a_{n+1}-a_n}{n!}x^{n} \stackrel{!}{=} x^2 = 0\cdot x^0 + 0\cdot x^1 + 1\cdot x^2+0\cdot x^3+\ldots.$$

What can you conclude by comparing the coefficients of equal powers of $x$? Use the value of $a_0$.
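Not part of the hint itself, but if you want to check your final answer, sympy can solve the initial value problem directly and expand it as a series (a minimal sketch; the variable names are my own):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')

# Solve f'(x) - f(x) = x**2 with the initial condition f(0) = 1
sol = sp.dsolve(sp.Eq(f(x).diff(x) - f(x), x**2), f(x), ics={f(0): 1}).rhs
print(sol)                      # closed-form solution
print(sp.series(sol, x, 0, 6))  # Taylor coefficients, i.e. a_n / n!
```

Multiplying the $n$-th series coefficient by $n!$ gives $a_n$, which you can compare against what the coefficient comparison above predicts.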

Alternative approach

Use that $a_n=f^{(n)}(0)=\left.\dfrac{d^{n}f}{dx^n}\right|_{x=0}$ and use the differential equation as a recursion formula:

$$f'(0)-f(0)=0^2 \implies f'(0)=f(0)=1$$ $$f''(0)-f'(0)=2\cdot 0 \implies f''(0)=f'(0)=1$$ $$f'''(0)-f''(0)=2 \implies f'''(0)=2+f''(0)=3$$ $$f^{(4)}(0)-f'''(0)=0 \implies f^{(4)}(0)=f'''(0)=3$$

and so forth.
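Since differentiating $f'(x)-f(x)=x^2$ a total of $n$ times and setting $x=0$ gives $a_{n+1}=a_n+\left.\frac{d^n}{dx^n}x^2\right|_{x=0}$, the recursion is easy to run in sympy (a sketch; the cutoff of 8 terms is arbitrary):

```python
import sympy as sp

x = sp.symbols('x')
rhs = x**2  # right-hand side of f'(x) - f(x) = x**2

# a_n = f^(n)(0); differentiating the ODE n times and setting x = 0 gives
# a_{n+1} = a_n + (d^n/dx^n rhs)(0)
a = [sp.Integer(1)]  # a_0 = f(0) = 1
for n in range(8):
    a.append(a[-1] + sp.diff(rhs, x, n).subs(x, 0))
print(a)  # [1, 1, 1, 3, 3, 3, 3, 3, 3]
```

Only the $n=2$ step contributes (the second derivative of $x^2$ is $2$), so the coefficients stabilize at $3$ from $a_3$ onward.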

MrYouMath
January 19, 2018 11:59 AM