Let \( n \hspace{0.1em} \in \hspace{0.05em} \mathbb{N} \) be a natural number, and consider a given dataset of antecedents and images \( \Bigl \{ \bigl \{x_i, y_i \bigr \}_{ 0 \hspace{0.1em} \leqslant \hspace{0.1em} i \hspace{0.1em} \leqslant \hspace{0.1em} n} \Bigr\}\), where the \(x_i\) are pairwise distinct.
The following figure shows an example of such a dataset:
We want to find a polynomial function that would correspond to this given series of values.
For a given natural number \(m \enspace (m \in \mathbb{N} )\), a polynomial of degree \( m\) is defined as follows:

$$ P_m(X) = \sum_{i = 0}^m a_i X^i = a_0 + a_1 X + a_2 X^2 + \hspace{0.1em}... \hspace{0.1em} + a_m X^m $$
To find this function \( L(X) \), we will therefore have to find a value for each coefficient \(a_i\) for a given \(m\).
We will then have the correspondence between our coefficients and our series of values, which gives us the following system \((S)\):
$$ (S) \enspace \left \{ \begin{gather*} a_0 + a_1 x_0 + a_2 x_0 ^2 + \hspace{0.1em}... \hspace{0.1em}+ a_m x_0 ^m = y_0 \\ a_0 + a_1 x_1 + a_2 x_1 ^2 + \hspace{0.1em}... \hspace{0.1em}+ a_m x_1 ^m = y_1 \\ \vdots \\ a_0 + a_1 x_k + a_2 x_k ^2 + \hspace{0.1em}... \hspace{0.1em}+ a_m x_k ^m = y_k \\ \vdots \\ a_0 + a_1 x_m + a_2 x_m ^2 + \hspace{0.1em}... \hspace{0.1em}+ a_m x_m ^m = y_m \\ \end{gather*} \right \} $$
There will be at least two ways to generate this polynomial \(L(X)\):
- by the direct construction of a polynomial \(L(X) = \sum L_j(X)\)
- by constructing a polynomial \(P_n(X)\), by finding the unique solution for the coefficients \(\bigl \{a_0, \ a_1, \ ..., \ a_n \bigr\} \), unknowns of the system \( (S) \)
This method consists of constructing a series of \(n+1\) distinct polynomials \( \Bigl \{L_j(X) \Bigr\}_{ 0 \hspace{0.1em} \leqslant \hspace{0.1em} j \hspace{0.1em} \leqslant \hspace{0.1em} n} \) such that:
$$ \forall (i,j) \in [\![ 0, n]\!]^2, \enspace i \neq j, \enspace \Biggl \{ \begin{gather*} L_j(x_j) = 1 \\ L_j(x_i) = 0 \qquad (C) \end{gather*} $$
We will therefore construct each polynomial one by one.
For all \( j \in [\![ 0, n]\!] \), the condition \( (C) \) must be satisfied, that is to say the \( x_i \ ( i\neq j) \) must be roots of \( L_j \), so that this polynomial vanishes at each of these points. Then,

$$ L_j(X) = \prod_{\underset{i \neq j}{i=0}}^n (X - x_i)$$
Finally, in order to have the polynomial satisfy \( L_j(x_j) = 1 \), we introduce a denominator that is equal to the numerator when \( X = x_j \). So,

$$ L_j(X) = \prod_{\underset{i \neq j}{i=0}}^n \frac{X - x_i}{x_j - x_i}$$
Adding this denominator changes nothing to the condition \( (C) \) established above.
Moreover, the condition \( i \neq j \) assures us that the denominator will never be zero.
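At this point the construction can be sanity-checked numerically: each \(L_j\) should take the value \(1\) at \(x_j\) and \(0\) at every other node. A minimal Python sketch (the function name `basis` and the sample abscissas are our own illustrative choices):

```python
def basis(xs, j, x):
    """Lagrange basis polynomial: L_j(x) = prod_{i != j} (x - x_i) / (x_j - x_i)."""
    p = 1.0
    for i, xi in enumerate(xs):
        if i != j:
            p *= (x - xi) / (xs[j] - xi)
    return p

xs = [0.0, 1.0, 2.0, 5.0]  # any pairwise-distinct abscissas will do
for j in range(len(xs)):
    for i, xi in enumerate(xs):
        # Condition (C): L_j(x_j) = 1 and L_j(x_i) = 0 for i != j
        assert basis(xs, j, xi) == (1.0 if i == j else 0.0)
```

Exact equality holds here because each factor is either exactly \(0\) or exactly \(1\); with arbitrary floating-point nodes one would compare up to a small tolerance.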
Now, by multiplying each polynomial \( L_j(X) \) by the value of \( y_j \), we ensure that:
$$ \forall (i,j) \in [\![ 0, n]\!]^2, \enspace i \neq j, \enspace \Biggl \{ \begin{gather*} y_j L_j(x_j) = y_j \\ y_j L_j(x_i) = 0 \end{gather*} $$
Then, this resulting polynomial:

$$ L(X) = \sum_{j = 0}^n y_j L_j(X) $$
will correctly match each image to its antecedent in our series of initial values \( \Bigl \{ \bigl \{x_0, y_0 \bigr\}, \bigl \{x_1, y_1 \bigr\}, ..., \bigl \{x_n, y_n \bigr\} \Bigr\} \) .
And as a result,
$$ L(X) = \sum_{j = 0}^n y_j \Biggl[ \prod_{\underset{i \neq j}{i=0}}^n \frac{X - x_i}{x_j - x_i} \Biggr]$$
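This closed-form expression can be evaluated directly. Below is a minimal sketch in plain Python (the function name `lagrange_eval` and the sample points, taken from \(y = x^2\), are our own illustrative choices):

```python
def lagrange_eval(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial L at the point x.

    xs, ys: the dataset {x_j, y_j}; the x_j must be pairwise distinct.
    """
    n = len(xs)
    total = 0.0
    for j in range(n):
        # Basis polynomial L_j(x) = prod_{i != j} (x - x_i) / (x_j - x_i)
        b = 1.0
        for i in range(n):
            if i != j:
                b *= (x - xs[i]) / (xs[j] - xs[i])
        total += ys[j] * b
    return total

# Three sample points lying on y = x^2:
xs, ys = [-1.0, 0.0, 2.0], [1.0, 0.0, 4.0]
print(lagrange_eval(xs, ys, 1.0))  # ≈ 1.0, since the data lies on y = x^2
```

The interpolant reproduces each given point exactly, and since three points determine a unique degree-2 polynomial, it coincides with \(x^2\) everywhere.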
Since our dataset contains \(n+1\) pairs of values, we take \(m = n\) and obtain the system \((S)\) depending on \(n\):
$$ (S) \enspace \left \{ \begin{gather*} a_0 + a_1 x_0 + a_2 x_0 ^2 + \hspace{0.1em}... \hspace{0.1em}+ a_n x_0 ^n = y_0 \\ a_0 + a_1 x_1 + a_2 x_1 ^2 + \hspace{0.1em}... \hspace{0.1em}+ a_n x_1 ^n = y_1 \\ \vdots \\ a_0 + a_1 x_n + a_2 x_n ^2 + \hspace{0.1em}... \hspace{0.1em}+ a_n x_n ^n = y_n \\ \end{gather*} \right \} $$
It is useless to take a polynomial of degree greater than \(n\): the system would then have more unknowns than equations, and one or more free variables would remain undetermined.
We can proceed at least in two different ways to solve \( (S) \):
- by directly solving the system \( (S) \) by substitution or by Gaussian elimination
- by solving the equivalent matrix system \( (S^*) \)
A system of linear equations such as \( (S) \) has a unique solution if and only if its associated homogeneous system \( (S_h) \) (i.e. with a zero right-hand side) has a unique solution.
$$ (S_h) \enspace \left \{ \begin{gather*} a_0 + a_1 x_0 + a_2 x_0 ^2 + \hspace{0.1em}... \hspace{0.1em}+ a_n x_0 ^n = 0 \\ a_0 + a_1 x_1 + a_2 x_1 ^2 + \hspace{0.1em}... \hspace{0.1em}+ a_n x_1 ^n =0 \\ \vdots \\ a_0 + a_1 x_n + a_2 x_n ^2 + \hspace{0.1em}... \hspace{0.1em}+ a_n x_n ^n = 0 \\ \end{gather*} \right \} $$
The polynomial \( a_0 + a_1 X + \hspace{0.1em}... \hspace{0.1em} + a_n X^n \), of degree at most \(n\), vanishes at the \(n+1\) distinct points \(x_0, x_1, ..., x_n\); since a non-zero polynomial of degree at most \(n\) has at most \(n\) roots, it must be the zero polynomial. Now, a polynomial is identically zero if and only if all of its coefficients are zero.
Then, the solution of the system \( (S_h)\) is unique and:

$$ a_0 = a_1 = \hspace{0.1em}... \hspace{0.1em} = a_n = 0 $$
We then conclude that the solution of the system \( (S)\) is also unique.
We then solve this system with the values \( \Bigl \{ \bigl \{x_0, y_0 \bigr\}, \bigl \{x_1, y_1 \bigr\}, ..., \bigl \{x_n, y_n \bigr\} \Bigr\}\) of the series, either by substitution or by Gaussian elimination.
We will then obtain a unique polynomial, which is:

$$ P_n(X) = a_0 + a_1 X + a_2 X^2 + \hspace{0.1em}... \hspace{0.1em} + a_n X^n $$
With \( \bigl \{a_0, \ a_1, \ ..., \ a_n \bigr\} \) as the unique solution of the system \( (S)\)
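As an illustration, solving \((S)\) by Gaussian elimination can be sketched as follows in Python. The helper `gauss_solve` and the sample dataset are our own choices; partial pivoting is added for numerical robustness, though it is not strictly needed here:

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting.

    A: square matrix as a list of rows, b: right-hand side (both left unmodified).
    """
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix [A | b]
    for col in range(n):
        # Pivot: bring the largest entry of the column onto the diagonal.
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # Back-substitution on the resulting triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

# Vandermonde system for sample nodes x_i: row i is (1, x_i, x_i^2, ...)
nodes, values = [0.0, 1.0, 2.0], [2.0, -3.0, 1.0]
V = [[x ** p for p in range(len(nodes))] for x in nodes]
coeffs = gauss_solve(V, values)  # → [2.0, -9.5, 4.5], i.e. [a_0, a_1, a_2]
```

The returned list contains the coefficients \(a_0, a_1, a_2\) of the interpolating polynomial for that sample dataset.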
It is possible to rewrite this system as a matrix system:
$$ (S^*) \enspace \Longleftrightarrow \enspace \underbrace{ \begin{pmatrix} 1 & x_0 & x_0^2 & \dots & x_0^n \\ 1 & x_1 & x_1^2 & \dots & x_1^n \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & x_n & x_n^2 & \dots & x_n^n \\ \end{pmatrix} } _\text{X} \times \underbrace{ \begin{pmatrix} a_0 \\ a_1 \\ a_2 \\ \vdots \\ a_n \\ \end{pmatrix} } _\text{A} = \underbrace{ \begin{pmatrix} y_0 \\ y_1 \\ y_2 \\ \vdots \\ y_n \\ \end{pmatrix} } _\text{Y} $$
With such a system, there exists a unique solution if and only if its matrix (here \(X\)) is invertible, which is the case if and only if its determinant is non-zero.
This will be the case here, because our matrix is a Vandermonde matrix, whose determinant is non-zero if and only if the \(x_i\) are pairwise distinct:

$$ det(X) = \prod_{0 \hspace{0.1em} \leqslant \hspace{0.1em} i \hspace{0.1em} < \hspace{0.1em} j \hspace{0.1em} \leqslant \hspace{0.1em} n} (x_j - x_i) $$
This is the case here because all the \(x_i\) of our series of values are distinct; the solution is therefore unique.
We must first calculate the determinant of the matrix \(X\), as well as each determinant \(det(X_k)\), to obtain our series of solutions \(\{ a_0, a_1, ..., a_n \}\), the coefficients of the polynomial to be determined, via Cramer's rule:

$$ \forall k \in [\![ 1, n+1]\!], \enspace a_{k-1} = \frac{det(X_k)}{det(X)} $$
Where \(X_k\) represents the square matrix formed from \(X\) by replacing its \(k\)-th column with the column matrix \(Y\).
We will then obtain a unique polynomial:

$$ P_n(X) = a_0 + a_1 X + a_2 X^2 + \hspace{0.1em}... \hspace{0.1em} + a_n X^n $$
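Cramer's rule as described can be sketched in Python. The helpers `det` (cofactor expansion along the first row, fine for the small matrices used here although exponential in general) and `cramer_solve` are our own names:

```python
def det(M):
    """Determinant by cofactor expansion along the first row."""
    if len(M) == 1:
        return M[0][0]
    total = 0.0
    for c in range(len(M)):
        minor = [row[:c] + row[c + 1:] for row in M[1:]]  # delete row 0, column c
        total += (-1) ** c * M[0][c] * det(minor)
    return total

def cramer_solve(X, Y):
    """Solve X a = Y via Cramer's rule: each coefficient is det(X_k) / det(X),
    where X_k is X with its k-th column replaced by the column Y."""
    d = det(X)  # non-zero for a Vandermonde matrix with distinct x_i
    sol = []
    for k in range(len(X)):
        Xk = [row[:k] + [Y[i]] + row[k + 1:] for i, row in enumerate(X)]
        sol.append(det(Xk) / d)
    return sol

# Sample 3x3 Vandermonde system (nodes 0, 1, 2):
X = [[1.0, 0.0, 0.0], [1.0, 1.0, 1.0], [1.0, 2.0, 4.0]]
Y = [2.0, -3.0, 1.0]
a = cramer_solve(X, Y)  # → [2.0, -9.5, 4.5], i.e. [a_0, a_1, a_2]
```

For larger systems, Gaussian elimination is far cheaper than computing \(n+2\) determinants this way.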
Let us construct a polynomial of degree \(2\) using this method, given the following dataset of three points:

$$ \Bigl \{ \bigl \{x_0, y_0 \bigr\}, \bigl \{x_1, y_1 \bigr\}, \bigl \{x_2, y_2 \bigr\} \Bigr\} = \Bigl \{ \bigl \{0, 2 \bigr\}, \bigl \{1, -3 \bigr\}, \bigl \{2, 1 \bigr\} \Bigr\} $$
Let us construct this unique polynomial with the two possibilities seen above:
- by the direct construction of a polynomial \(L(X) = \sum L_j(X)\)
- by constructing a polynomial \(P_n(X)\), by finding the unique solution for the coefficients \(\bigl \{a_0, \ a_1, \ ..., \ a_n \bigr\} \), unknowns of the system \( (S) \)
We have three polynomials to construct: \(L_0(X), L_1(X),L_2(X) \).
According to the formula, each \(L_j(X)\) will be of the form:

$$ L_j(X) = \prod_{\underset{i \neq j}{i=0}}^2 \frac{X - x_i}{x_j - x_i}$$
For \( j = 0 \):

$$ L_0(X) = \frac{(X - x_1)(X - x_2)}{(x_0 - x_1)(x_0 - x_2)} = \frac{(X - 1)(X - 2)}{(0 - 1)(0 - 2)} = \frac{1}{2} \bigl( X^2 - 3X + 2 \bigr) $$

We definitely have:
$$ \Biggl \{ \begin{gather*} L_0(x_0) = 1 \\ L_0(x_1) = 0, \enspace L_0(x_2) = 0 \end{gather*} $$
For \( j = 1 \):

$$ L_1(X) = \frac{(X - x_0)(X - x_2)}{(x_1 - x_0)(x_1 - x_2)} = \frac{X(X - 2)}{(1 - 0)(1 - 2)} = -X^2 + 2X $$

We definitely have:
$$ \Biggl \{ \begin{gather*} L_1(x_1) = 1 \\ L_1(x_0) = 0, \enspace L_1(x_2) = 0 \end{gather*} $$
For \( j = 2 \):

$$ L_2(X) = \frac{(X - x_0)(X - x_1)}{(x_2 - x_0)(x_2 - x_1)} = \frac{X(X - 1)}{(2 - 0)(2 - 1)} = \frac{1}{2} \bigl( X^2 - X \bigr) $$

We definitely have:
$$ \Biggl \{ \begin{gather*} L_2(x_2) = 1 \\ L_2(x_0) = 0, \enspace L_2(x_1) = 0 \end{gather*} $$
We can then construct our polynomial by adding the \(y_j L_j(X)\):

$$ L(X) = 2 L_0(X) - 3 L_1(X) + L_2(X) = \bigl( X^2 - 3X + 2 \bigr) + \bigl( 3X^2 - 6X \bigr) + \frac{1}{2} \bigl( X^2 - X \bigr) = \frac{9}{2} X^2 - \frac{19}{2} X + 2 $$
We seek to solve the following system \((S)\):
$$ (S) \enspace \left \{ \begin{gather*} a_0 + a_1 x_0 + a_2 x_0 ^2 = y_0 \\ a_0 + a_1 x_1 + a_2 x_1 ^2 = y_1 \\ a_0 + a_1 x_2 + a_2 x_2 ^2 = y_2 \\ \end{gather*} \right \} $$
With our dataset: \( \enspace x_0 = 0, \enspace x_1 = 1, \enspace x_2 = 2 \enspace \) and \( \enspace y_0 = 2, \enspace y_1 = -3, \enspace y_2 = 1 \).
So,
$$ (S) \enspace \left \{ \begin{gather*} a_0 \hspace{5.5em} = 2 \qquad \hspace{0.5em} (L_1) \\ a_0 + \hspace{0.3em} a_1 \hspace{0.2em} + \hspace{0.2em} a_2 \hspace{0.4em} = -3 \qquad (L_2) \\ a_0 + 2a_1 + 4a_2 \hspace{0.1em} = 1 \qquad \hspace{0.7em} (L_3) \\ \end{gather*} \right \} \Longrightarrow a_0 = 2$$
Then, we replace in the other lines \( a_0 \) by its value:
$$ (S) \enspace \left \{ \begin{gather*} 2 + \hspace{0.3em} a_1 \hspace{0.2em} + \hspace{0.2em} a_2 \hspace{0.2em} = -3 \qquad (L_2) \\ 2 + 2a_1 + 4a_2 = 1 \qquad \hspace{0.6em} (L_3) \\ \end{gather*} \right \} $$
With \( (L_2) \), we directly have:

$$ a_1 = -5 - a_2 $$
Now, we inject \( (a_1 = -5 - a_2) \) into \( (L_3) \):

$$ 2 + 2(-5 - a_2) + 4a_2 = 1 \enspace \Longrightarrow \enspace 2a_2 - 8 = 1 \enspace \Longrightarrow \enspace a_2 = \frac{9}{2} $$
Finally, we inject \( \Bigl( a_2 = \frac{9}{2} \Bigr) \) into \( (L_2) \):

$$ a_1 = -5 - \frac{9}{2} = -\frac{19}{2} $$
And we do obtain as a unique solution for \((S)\):

$$ (a_0, \ a_1, \ a_2) = \Bigl( 2, \ -\frac{19}{2}, \ \frac{9}{2} \Bigr) $$
And as a result, the polynomial \(P_2(X) \):

$$ P_2(X) = \frac{9}{2} X^2 - \frac{19}{2} X + 2 $$
We seek to solve the following matrix system \((S^*)\):
$$ (S^*) \enspace \Longleftrightarrow \enspace \begin{pmatrix} 1 & x_0 & x_0^2 & \dots & x_0^n \\ 1 & x_1 & x_1^2 & \dots & x_1^n \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & x_n & x_n^2 & \dots & x_n^n \\ \end{pmatrix} \begin{pmatrix} a_0 \\ a_1 \\ a_2 \\ \vdots \\ a_n \\ \end{pmatrix} = \begin{pmatrix} y_0 \\ y_1 \\ y_2 \\ \vdots \\ y_n \\ \end{pmatrix} $$
So in our case:
$$ (S^*) \enspace \Longleftrightarrow \enspace \begin{pmatrix} 1 & 0 & 0 \\ 1 & 1 & 1 \\ 1 & 2& 4 \\ \end{pmatrix} \times \begin{pmatrix} a_0 \\ a_1 \\ a_2 \\ \end{pmatrix} = \begin{pmatrix} 2 \\ \hspace{-0.8em} -3 \\ 1\\ \end{pmatrix} $$
We calculate \(det(X)\), the determinant of the matrix \(X\).
$$ det(X) = \begin{vmatrix} 1 & 0 & 0 \\ 1 & 1 & 1 \\ 1 & 2& 4 \\ \end{vmatrix} $$
$$ det(X) = 1\times \begin{vmatrix} 1 &1 \\ 2 & 4 \\ \end{vmatrix} -0 \times \begin{vmatrix} 1 &1 \\ 1 & 4 \\ \end{vmatrix} +0 \times \begin{vmatrix} 1 &1 \\ 1 &2 \\ \end{vmatrix} $$

$$ det(X) = 1 \times (4 - 2) = 2 $$
To obtain \(X_1 \), we replace the \(1 ^{st}\) column of the matrix \(X\) by \(Y\).
$$ X_1 = \begin{pmatrix} 2 & 0 & 0 \\ \hspace{-0.8em} -3 & 1 & 1 \\ 1 & 2& 4 \\ \end{pmatrix} $$
Then we calculate its determinant:
$$ det(X_1) = \begin{vmatrix} 2 & 0 & 0 \\ \hspace{-0.8em} -3 & 1 & 1 \\ 1 & 2& 4 \\ \end{vmatrix} $$
$$ det(X_1) = 2\times \begin{vmatrix} 1 &1 \\ 2 & 4 \\ \end{vmatrix} -0 \times \begin{vmatrix} -3&1 \\ \hspace{0.5em} 1 & 4 \\ \end{vmatrix} +0 \times \begin{vmatrix} -3 &1 \\ \hspace{0.5em} 1 &2 \\ \end{vmatrix} $$

$$ det(X_1) = 2 \times (4 - 2) = 4 $$
To obtain \(X_2 \), we replace the \(2 ^{nd}\) column of the matrix \(X\) by \(Y\).
$$ X_2 = \begin{pmatrix} 1 & 2 & 0 \\ 1 & \hspace{-0.8em} -3 & 1 \\ 1 & 1& 4 \\ \end{pmatrix} $$
Then we calculate its determinant:
$$ det(X_2) = \begin{vmatrix} 1 &2 & 0 \\ 1 & \hspace{-0.8em} -3 & 1 \\ 1 & 1& 4 \\ \end{vmatrix} $$
$$ det(X_2) = 1\times \begin{vmatrix} \hspace{-0.8em} -3 &1 \\ 1 & 4 \\ \end{vmatrix} -2\times \begin{vmatrix} 1&1 \\ 1 & 4 \\ \end{vmatrix} +0 \times \begin{vmatrix} 1& \hspace{-0.8em} -3 \\ 1 & 1 \\ \end{vmatrix} $$

$$ det(X_2) = 1 \times (-12 - 1) - 2 \times (4 - 1) = -19 $$
To obtain \(X_3 \), we replace the \(3 ^{rd}\) column of the matrix \(X\) by \(Y\).
$$ X_3 = \begin{pmatrix} 1 & 0 & 2 \\ 1 & 1 & \hspace{-0.8em} -3\\ 1 & 2& 1 \\ \end{pmatrix} $$
Then we calculate its determinant:
$$ det(X_3) = \begin{vmatrix} 1 & 0 & 2 \\ 1 & 1 & \hspace{-0.8em} -3\\ 1 & 2& 1 \\ \end{vmatrix} $$
$$ det(X_3) = 1\times \begin{vmatrix} 1 & \hspace{-0.8em} -3 \\ 2 & 1 \\ \end{vmatrix} -0\times \begin{vmatrix} 1& \hspace{-0.8em} -3 \\ 1 & 1 \\ \end{vmatrix} +2 \times \begin{vmatrix} 1& 1\\ 1 & 2 \\ \end{vmatrix} $$

$$ det(X_3) = 1 \times (1 + 6) + 2 \times (2 - 1) = 9 $$
Then, we have as a unique solution for \((S^*)\):

$$ (a_0, \ a_1, \ a_2) = \Bigl( \frac{det(X_1)}{det(X)}, \ \frac{det(X_2)}{det(X)}, \ \frac{det(X_3)}{det(X)} \Bigr) = \Bigl( 2, \ -\frac{19}{2}, \ \frac{9}{2} \Bigr) $$
And as a result, the polynomial \(P_2(X) \):

$$ P_2(X) = \frac{9}{2} X^2 - \frac{19}{2} X + 2 $$
In the three previous computations, we found the same polynomial of degree 2:

$$ P_2(X) = \frac{9}{2} X^2 - \frac{19}{2} X + 2 $$
Let us now check that our polynomial verifies the three points of our initial conditions:
We do have:

$$ P_2(0) = 2, \enspace P_2(1) = \frac{9}{2} - \frac{19}{2} + 2 = -3, \enspace P_2(2) = \frac{9}{2} \times 4 - \frac{19}{2} \times 2 + 2 = 1 $$
We definitely have checked that:
$$ \Biggl \{ \begin{gather*} P_2(x_0) = y_0 \\ P_2(x_1) = y_1 \\ P_2(x_2) = y_2 \end{gather*} $$
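These three checks can also be run numerically. A short Python sketch, assuming the coefficients obtained above, \(a_0 = 2,\ a_1 = -\frac{19}{2},\ a_2 = \frac{9}{2}\):

```python
def P2(x):
    # P_2(X) = (9/2) X^2 - (19/2) X + 2, the interpolating polynomial found above
    return 4.5 * x * x - 9.5 * x + 2.0

# The initial dataset {(x_i, y_i)}:
points = [(0.0, 2.0), (1.0, -3.0), (2.0, 1.0)]
for x, y in points:
    assert P2(x) == y  # exact here: every quantity is representable in floating point
```

Each assertion passes, confirming that the polynomial goes through the three given points.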
We finally obtain this smooth curve corresponding to our interpolation: