
The superposition principle

Let \( (n, m) \in \hspace{0.05em} \mathbb{N}^2 \) be two natural numbers and:

  1. - a sequence of continuous functions \( a_0(x), a_1(x), \hspace{0.2em} ... \hspace{0.2em}, a_n(x) \)

  2. - a sequence of functions \( f_1(x), f_2(x), \hspace{0.2em} ... \hspace{0.2em}, f_m(x) \)

Let \( y(x) \) be a function of class \( \mathcal{C}^{n}\) on an interval \(I\). We denote by \(y^{(n)}\) its \(n\)-th derivative.


In the context of solving a linear differential equation of order \(n\) whose right-hand side is a linear combination of functions, denoted \( (E)\):

$$ a_n(x) y^{(n)}(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + \hspace{0.2em} a_1(x) y'(x) + a_0(x) y(x) = \lambda_1 f_1(x) + \lambda_2 f_2(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + \hspace{0.2em} \lambda_m f_m(x) \qquad (E) $$

with, for all \( k \in [\![ 1, m ]\!] \), the function \( y_k \) a particular solution of the equation \( (\tilde E_k) \):

$$ a_n(x) y^{(n)}(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + \hspace{0.2em} a_1(x) y'(x) + a_0(x) y(x) = f_k(x) \qquad (\tilde E_k) $$

The superposition principle tells us that:

$$ \forall k \in [\![ 1, m ]\!], $$

$$ y_k \enspace \underline{particular \ solution} \ of \ (\tilde E_k) \Longleftrightarrow (\lambda_1 y_1 + \lambda_2 y_2 \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + \hspace{0.2em} \lambda_m y_m) \ \underline{total \ particular \ solution} \ of \ (E) $$


Proof

Let \( (n, m) \in \hspace{0.05em} \mathbb{N}^2 \) be two natural numbers and:

  1. - a sequence of continuous functions \( a_0(x), a_1(x), \hspace{0.2em} ... \hspace{0.2em}, a_n(x) \)

  2. - a sequence of functions \( f_1(x), f_2(x), \hspace{0.2em} ... \hspace{0.2em}, f_m(x) \)

Let \( y(x) \) be a function of class \( \mathcal{C}^{n}\) on an interval \(I\). We denote by \(y^{(n)}\) its \(n\)-th derivative.


We start from the equation \( (E) \), a linear differential equation of order \( n\) whose right-hand side is a linear combination of functions:

$$ a_n(x) y^{(n)}(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + \hspace{0.2em} a_1(x) y'(x) + a_0(x) y(x) = \lambda_1 f_1(x) + \lambda_2 f_2(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + \hspace{0.2em} \lambda_m f_m(x) \qquad (E) $$

  1. Particular solution for each function \( f_k(x)\)

  2. For all \( k \in [\![ 1, m ]\!] \), we then have a series of equations \( (\tilde E_k) \) to solve:

    $$ \left \{ \begin{gather*} a_n(x) y^{(n)}(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + \hspace{0.2em} a_1(x) y'(x) + a_0(x) y(x) = f_1(x) \qquad (\tilde E_1) \\ a_n(x) y^{(n)}(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + \hspace{0.2em} a_1(x) y'(x) + a_0(x) y(x) = f_2(x) \qquad (\tilde E_2) \\ ........................ \\ a_n(x) y^{(n)}(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + \hspace{0.2em} a_1(x) y'(x) + a_0(x) y(x) = f_k(x) \qquad (\tilde E_k) \\ ........................ \\ a_n(x) y^{(n)}(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + \hspace{0.2em} a_1(x) y'(x) + a_0(x) y(x) = f_m(x) \qquad (\tilde E_m) \\ \end{gather*} \right \}$$

    In this series of equations, we then notice that:

    $$ \forall k \in [\![ 1, m ]\!], \enspace y_k \enspace solution \enspace of \enspace (\tilde E_k) $$

    If each \( y_k \) is a solution of \( (\tilde E_k) \), then each \( y_k \) satisfies:

    $$ a_n(x) y_k^{(n)}(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + \hspace{0.2em} a_1(x) y_k'(x) + a_0(x) y_k(x) = f_k(x) \qquad (\tilde E_k)$$

    Thus, multiplying \( (\tilde E_k) \) by \( \lambda_k \):

    $$ a_n(x) \lambda_k y_k^{(n)}(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + \hspace{0.2em} a_1(x) \lambda_k y_k'(x) + a_0(x) \lambda_k y_k(x) = \lambda_k f_k(x) \qquad ( \lambda_k \tilde E_k)$$

    Now, by linearity of the derivative, we know that:

    $$ \bigl( \lambda f \bigr)' = \lambda f' $$

    So in our case,

    $$ \forall k \in [\![ 1, m]\!], \enspace \Bigl( \lambda_k y_k \Bigr)' = \lambda_k y'_k \qquad (1) $$

    Thanks to \( (1) \), we can move each \( \lambda_k \) inside the derivatives and see that:

    $$ a_n(x) (\lambda_k y_k)^{(n)}(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + a_1(x) \bigl(\lambda_k y_k \bigr)'(x) + a_0(x) \bigl(\lambda_k y_k\bigr)(x) = \lambda_k f_k(x) \qquad ( \lambda_k \tilde E_k)$$

    This equation shows that:

    $$ \forall k \in [\![ 1, m]\!], \enspace \lambda_k y_k \enspace solution \enspace of \enspace (\lambda_k \tilde E_k) $$

  3. Total particular solution from the sum of the functions \( \lambda_k f_k \)

  4. Previously, we saw that each \( \lambda_k y_k \) is a solution of \( (\lambda_k \tilde E_k) \):

    $$ \left \{ \begin{gather*} a_n(x) (\lambda_1 y_1)^{(n)}(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + a_1(x) \bigl(\lambda_1 y_1 \bigr)'(x) + a_0(x) \bigl(\lambda_1 y_1\bigr)(x) = \lambda_1 f_1(x) \qquad (\lambda_1 \tilde E_1) \\ a_n(x) (\lambda_2 y_2)^{(n)}(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + a_1(x) \bigl(\lambda_2 y_2\bigr)'(x) + a_0(x) \bigl(\lambda_2 y_2\bigr)(x) = \lambda_2 f_2(x) \qquad (\lambda_2 \tilde E_2) \\ ........................ \\ a_n(x) (\lambda_k y_k)^{(n)}(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + a_1(x) \bigl(\lambda_k y_k \bigr)'(x) + a_0(x) \bigl(\lambda_k y_k \bigr)(x) = \lambda_k f_k(x) \qquad (\lambda_k \tilde E_k) \\ ........................ \\ a_n(x) (\lambda_m y_m)^{(n)}(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + a_1(x) \bigl(\lambda_m y_m\bigr)'(x) + a_0(x) \bigl(\lambda_m y_m \bigr)(x) = \lambda_m f_m(x) \qquad (\lambda_m \tilde E_m) \\ \end{gather*} \right \} $$

    By adding up the \( (\lambda_k \tilde E_k) \) from \(1 \) to \(m \):

    $$ a_n(x) \underbrace{\Biggl[ \sum_{k=1}^m \lambda_k y_k^{(n)} \Biggr]} _\text{ \( y_s^{(n)} \)} \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + \hspace{0.2em} a_1(x) \underbrace{\Biggl[ \sum_{k=1}^m \lambda_k y_k' \Biggr]} _\text{ \( y_s' \)} \hspace{0.2em} + \hspace{0.2em} a_0(x) \underbrace{\Biggl[ \sum_{k=1}^m \lambda_k y_k \Biggr]} _\text{ \( y_s\)} \hspace{0.2em} = \hspace{0.2em} \sum_{k=1}^m \lambda_k f_k \qquad (E-bis) $$

    We thus obtain a total particular solution \( y_s \), the sum of all the particular solutions \( \lambda_k y_k \):

    $$ y_s(x) = \lambda_1 y_1(x) + \lambda_2 y_2(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + \hspace{0.2em} \lambda_m y_m(x) $$

    In the end, thanks to \( (E-bis) \), we see that this function is indeed a solution of \( (E) \):

    $$ a_n(x) y_s^{(n)}(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + a_1(x) y_s'(x) + a_0(x) y_s(x) = \lambda_1 f_1(x) + \lambda_2 f_2(x) \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em}+ \hspace{0.2em} \lambda_m f_m(x) \qquad (E) $$

    And as a result,

    $$ \forall k \in [\![ 1, m ]\!], $$

    $$ y_k \enspace \underline{particular \ solution} \ of \ (\tilde E_k) \Longleftrightarrow (\lambda_1 y_1 + \lambda_2 y_2 \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + \hspace{0.2em} \lambda_m y_m) \ \underline{total \ particular \ solution} \ of \ (E) $$
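The mechanics of this proof can also be observed numerically. The sketch below is purely illustrative (the toy equation \( y'(x) + y(x) = \lambda_1 f_1(x) + \lambda_2 f_2(x) \), the right-hand sides, the coefficients and the step size are all hypothetical choices, not taken from the text): it solves each \( (\tilde E_k) \) with a forward Euler scheme and checks that the weighted sum of the individual solutions matches the solution of the combined equation.

```python
import math

def euler_solve(f, h=1e-3, steps=2000):
    """Forward Euler for y'(x) + y(x) = f(x) with y(0) = 0 (illustrative toy equation)."""
    x, y = 0.0, 0.0
    ys = [y]
    for _ in range(steps):
        y += h * (f(x) - y)   # y' = f(x) - y
        x += h
        ys.append(y)
    return ys

f1 = lambda x: x      # hypothetical right-hand sides
f2 = math.cos
lam1, lam2 = 2.0, 3.0

y1 = euler_solve(f1)                                        # solves y' + y = f1
y2 = euler_solve(f2)                                        # solves y' + y = f2
y_direct = euler_solve(lambda x: lam1*f1(x) + lam2*f2(x))   # solves y' + y = lam1*f1 + lam2*f2

# Superposition: lam1*y1 + lam2*y2 agrees with the direct solution up to roundoff.
err = max(abs(lam1*a + lam2*b - c) for a, b, c in zip(y1, y2, y_direct))
```

Since the Euler update is itself linear in \( y \) and \( f \), the agreement holds to floating-point accuracy, mirroring the algebraic summation performed above.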


Example of superposition of solutions


Let \( (E) \) be a first-order linear differential equation (\( LDE_1 \)) with a constant coefficient, and \( (H) \) its associated homogeneous equation:

$$ \Biggl \{ \begin{align*} y'(x) + 2 y(x) = 2x^2 + 3\cos(x) + 1 \qquad (E) \\ y'(x) + 2y(x) = 0 \qquad (H) \end{align*} $$

We then have three equations \( (\tilde E_1), (\tilde E_2), (\tilde E_3) \) to solve:

$$ \left \{ \begin{gather*} y'(x) + 2 y(x) = x^2 \qquad \qquad (\tilde E_1) \\ y'(x) + 2 y(x) = \cos(x) \qquad (\tilde E_2) \\ y'(x) + 2 y(x) = 1 \qquad \qquad (\tilde E_3) \\ \end{gather*} \right \} $$

  1. Solving the first equation \( (\tilde E_1)\)

  2. This equation was solved in the example of solving \( LDE_1 \) with constant coefficient.

    The particular solution \( y_1 \) of \( (\tilde E_1) \) is:

    $$ y_1(x)= \frac{x^2 }{2}-\frac{x}{2} + \frac{1}{4} $$
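As a sanity check, the claimed \( y_1 \) can be tested against \( (\tilde E_1) \) at a few points. A minimal pure-Python sketch (the sample points are arbitrary):

```python
# y1 and its hand-computed derivative
y1  = lambda x: x**2/2 - x/2 + 1/4
dy1 = lambda x: x - 1/2

# Residual of (E1~): y1'(x) + 2*y1(x) - x^2 should vanish everywhere.
residual = max(abs(dy1(x) + 2*y1(x) - x**2) for x in [-2.0, -0.5, 0.0, 1.0, 3.0])
```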
  3. Solving the second equation \( (\tilde E_2)\)

  4. We already have the general solution of the homogeneous equation \( (H) \) (see the example of solving an \( LDE_1 \) with constant coefficient):

    $$ y_h(x) = Ke^{-2x} $$

    We then look for a particular solution \( y_2 \) of the form:

    $$ y_2(x) = K(x) e^{-2x} \qquad (y_2) $$

    Applying the variation of parameters (injecting \( y_2 \) into \( (\tilde E_2) \)), we seek to determine the function \( K(x) \) such that:

    $$ K'(x) = \cos(x)e^{2x} $$

    Now, taking an antiderivative, we have:

    $$ K(x) = \int^x \cos(t)e^{2t} \hspace{0.2em} dt$$

    We perform an integration by parts with the following choice for \( u \) and \( v' \):

    $$ \Biggl \{ \begin{align*} u(t) = \cos(t) \\ v'(t) = e^{2t } \end{align*} $$

    $$ \Biggl \{ \begin{align*} u'(t) = -\sin(t) \\ v(t) = \frac{1}{2} e^{2t } \end{align*} $$

    $$K(x) = \frac{1}{2} \Biggl[\cos(t) e^{2t }\Biggr]^x - \int^x \frac{1}{2} (-\sin(t)) \hspace{0.2em} e^{2t } \hspace{0.2em} dt $$
    $$K(x) = \frac{1}{2} \Biggl[\cos(t)e^{2t }\Biggr]^x + \int^x \frac{1}{2} \sin(t) \hspace{0.2em} e^{2t } \hspace{0.2em} dt $$
    $$K(x) = \frac{1}{2} \Biggl[\cos(t) e^{2t }\Biggr]^x + \frac{1}{2} \Biggl( \frac{1}{2} \Biggl[\sin(t) e^{2t }\Biggr]^x - \frac{1}{2}\int^x \cos(t) e^{2t} \hspace{0.2em} dt \Biggr) $$

    We note that \( K(x) \) reappears, so we can solve for it:

    $$K(x) = \frac{1}{2} \Biggl[\cos(t) e^{2t }\Biggr]^x + \frac{1}{4} \Biggl[\sin(t) e^{2t }\Biggr]^x - \frac{1}{4} K(x) $$
    $$ \frac{5}{4} K(x) = \frac{1}{2} \cos(x)e^{2x } + \frac{1}{4} \sin(x) e^{2x } $$
    $$ K(x) =\frac{8}{20} \cos(x) e^{2x } + \frac{4}{20} \sin(x) e^{2x } $$
    $$ K(x) = e^{2x } \biggl( \frac{2}{5}\cos(x) + \frac{1}{5} \sin(x) \biggr) \qquad (K) $$

    Injecting \( K \) into \( (y_2) \), the exponentials cancel:

    $$ y_2(x) = e^{2x } \biggl( \frac{2}{5} \cos(x) + \frac{1}{5} \sin(x) \biggr) e^{-2x} $$

    The particular solution \( y_2 \) of \( (\tilde E_2) \) is:

    $$ y_2(x)= \frac{2 }{5}\cos(x) +\frac{1}{5}\sin(x) $$
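Both intermediate results of this step can be checked numerically: that \( K'(x) = \cos(x)e^{2x} \) (via a central finite difference) and that \( y_2 \) satisfies \( (\tilde E_2) \). A minimal sketch; the sample points and the difference step are arbitrary:

```python
import math

K   = lambda x: math.exp(2*x) * (2/5*math.cos(x) + 1/5*math.sin(x))
y2  = lambda x: 2/5*math.cos(x) + 1/5*math.sin(x)
dy2 = lambda x: -2/5*math.sin(x) + 1/5*math.cos(x)   # hand derivative of y2

def dnum(f, x, h=1e-6):
    """Central finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2*h)

pts = [-1.0, 0.0, 0.5, 1.0]
# K'(x) should equal cos(x) e^{2x}
err_K = max(abs(dnum(K, x) - math.cos(x)*math.exp(2*x)) for x in pts)
# y2 should satisfy y2' + 2*y2 = cos(x)
err_y2 = max(abs(dy2(x) + 2*y2(x) - math.cos(x)) for x in pts)
```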
  5. Solving the third equation \( (\tilde E_3)\)

  6. $$ y'(x) + 2 y(x) = 1 \qquad \qquad (\tilde E_3) $$

    Here, the constant function \(\frac{1 }{2} \) is an obvious solution.

    Thus, the particular solution \( y_3 \) of \( (\tilde E_3) \) is:

    $$ y_3(x)= \frac{1 }{2} $$
  7. Superposition of solutions: \( \sum \lambda_k y_k \)

  8. We saw in the demonstration above that:

    $$ \forall k \in [\![ 1, m ]\!], $$
    $$ y_k \ \underline{particular \ solution} \ of \ (\tilde E_k) \Longleftrightarrow (\lambda_1 y_1 + \lambda_2 y_2 \hspace{0.2em} + \hspace{0.2em} ... \hspace{0.2em} + \hspace{0.2em} \lambda_m y_m) \ \underline{total \ particular \ solution} \ of \ (E) $$

    So in our case:

    $$ 2 y_1(x) + 3 y_2(x) + y_3(x) \ \underline{total \ particular \ solution} \ of \ (E) $$

    Let us then compute this total particular solution \(y_s\):

    $$ y_s(x) = 2 y_1(x) + 3 y_2(x) + y_3(x) $$
    $$ y_s(x) = 2 \Biggl( \frac{x^2 }{2}-\frac{x}{2} + \frac{1}{4} \Biggr) + 3 \Biggl(\frac{2 }{5}\cos(x) +\frac{1}{5}\sin(x) \Biggr) + \frac{1}{2} $$
    $$ y_s(x) = x^2 - x + \frac{1}{2} + \frac{6 }{5}\cos(x) +\frac{3}{5}\sin(x) + \frac{1}{2} $$
    $$ y_s(x) = x^2 - x + 1 + \frac{6 }{5}\cos(x) +\frac{3}{5}\sin(x) $$
  9. Verification of the total particular solution \( y_s\)

  10. If \( y_s \) is a solution of \( (E) \), then:

    $$ y_s'(x) + 2 y_s(x) = 2x^2 + 3\cos(x) + 1 $$

    Let us check it.

    $$ y_s'(x) + 2 y_s(x) = \Biggl( 2x -1 -\frac{6 }{5} \sin(x) +\frac{3}{5} \cos(x)\Biggr) + 2\Biggl( x^2 - x + 1 + \frac{6 }{5}\cos(x) +\frac{3}{5}\sin(x) \Biggr) $$
    $$ y_s'(x) + 2 y_s(x) = 2x -1 -\frac{6 }{5} \sin(x) +\frac{3}{5} \cos(x) + 2x^2 - 2x +2 + \frac{12 }{5}\cos(x) + \frac{6}{5}\sin(x)$$

    By tidying up a little:

    $$ y_s'(x) + 2 y_s(x) = 2x^2 + \hspace{0.2em} \underbrace{ 2x - 2x} _\text{\(=0\)} \hspace{0.2em} + \hspace{0.2em} 2 -1 + \hspace{0.2em} \underbrace{\frac{6}{5}\sin(x) -\frac{6 }{5} \sin(x)} _\text{\(=0\)} \hspace{0.2em} + \frac{12 }{5}\cos(x) +\frac{3}{5} \cos(x) $$
    $$ y_s'(x) + 2 y_s(x) = 2x^2 +1 + \frac{15}{5}\cos(x) $$
    $$ y_s'(x) + 2 y_s(x) = 2x^2 + 3\cos(x) +1 $$

    We have indeed verified that \( y_s \) is a solution of \( (E) \).
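The symbolic verification above can also be replayed numerically; a minimal sketch with arbitrary sample points:

```python
import math

ys  = lambda x: x**2 - x + 1 + 6/5*math.cos(x) + 3/5*math.sin(x)
dys = lambda x: 2*x - 1 - 6/5*math.sin(x) + 3/5*math.cos(x)   # hand derivative of ys
rhs = lambda x: 2*x**2 + 3*math.cos(x) + 1                    # right-hand side of (E)

# ys'(x) + 2*ys(x) - rhs(x) should vanish at every sample point.
err = max(abs(dys(x) + 2*ys(x) - rhs(x)) for x in [-3.0, -1.0, 0.0, 0.7, 2.0])
```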
