Finding the weights that minimize an error function (closed-form solution)

by Justin   Last Updated January 18, 2018 00:20 AM

Consider the following error function:

$E(\boldsymbol w) = \frac{1}{2} \sum\limits_{n=1}^N {\left(y(x_n,\boldsymbol w) - t_n\right)^2} $

where $\boldsymbol w$ is a vector of weights; $x_n$ and $t_n$ are the entries of two given vectors of length $N$; and $y$ is a polynomial of degree $M$:

$y(x, \boldsymbol w) = \sum\limits_{j=0}^M {w_j x^j} $

My task is to find the closed-form solution $ \boldsymbol w^* $ that minimizes $E(\boldsymbol w)$ and hence find $y(x, \boldsymbol w^*)$.

So, I set $\frac{\partial E(\boldsymbol w)}{\partial w_i} = 0$ for each $i$, and I obtained:

$ \sum\limits_{j=0}^M A_{ij} w_j = T_i$

where, $A_{ij} = \sum\limits_{n=1}^N (x_n)^{i+j}$ and $T_i = \sum\limits_{n=1}^N (x_n)^i t_n $
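
For reference, the intermediate step can be spelled out: differentiating $E$ with respect to a single weight $w_i$ and substituting the polynomial for $y$ gives

$\frac{\partial E}{\partial w_i} = \sum\limits_{n=1}^N \left( \sum\limits_{j=0}^M w_j x_n^j - t_n \right) x_n^i = \sum\limits_{j=0}^M \left( \sum\limits_{n=1}^N x_n^{i+j} \right) w_j - \sum\limits_{n=1}^N x_n^i t_n,$

and setting this to zero for every $i = 0, \dots, M$ yields exactly the system above.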

But I'm not really sure how to solve for $\boldsymbol w$ and find $y(x, \boldsymbol w^*)$.

Any suggestions on how I can proceed?
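
One concrete way to proceed is to note that the system is linear: in matrix form it reads $A \boldsymbol w = \boldsymbol T$ with $A = V^\top V$ and $\boldsymbol T = V^\top \boldsymbol t$, where $V$ is the $N \times (M+1)$ Vandermonde matrix $V_{nj} = x_n^j$, so any linear solver applies. A minimal sketch in NumPy, with made-up sample data for `x` and `t` (these values are just an illustration, not from the question):

```python
import numpy as np

# Hypothetical sample data: N = 5 points, cubic fit (M = 3)
x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
t = np.sin(2 * np.pi * x)  # example targets
M = 3

# Vandermonde matrix V[n, j] = x_n ** j, shape (N, M+1)
V = np.vander(x, M + 1, increasing=True)

# Normal equations A w = T, with A_ij = sum_n x_n^(i+j), T_i = sum_n x_n^i t_n
A = V.T @ V
T = V.T @ t

# Closed-form minimizer w* = A^{-1} T (solved without forming the inverse)
w_star = np.linalg.solve(A, T)

def y(x_new, w):
    """Evaluate the fitted polynomial y(x, w) = sum_j w_j x^j."""
    return np.polyval(w[::-1], x_new)  # polyval wants highest degree first
```

Provided the $x_n$ are distinct and $N \ge M+1$, $A$ is invertible and this returns the unique minimizer. In practice, calling `np.linalg.lstsq(V, t)` on the Vandermonde matrix directly is better conditioned numerically than explicitly forming $A = V^\top V$.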


