by Robert Smith
Last Updated September 07, 2018 07:19 AM

Why is the sum-of-squares error function for noisy input and noisy target variables so similar to the error function for noisy input alone?

This is the relevant part in Bishop's book:

Another viewpoint on kernel regression comes from a consideration of regression problems in which the input variables as well as the target variables are corrupted with additive noise. Suppose each target value $t_{n}$ is generated as usual by taking a function $y(z_{n})$ evaluated at a point $z_{n}$, and adding Gaussian noise. The value of $z_{n}$ is not directly observed, however, but only a noise corrupted version $x_{n}=z_{n}+\xi_{n}$ where the random variable $\xi$ is governed by some distribution $g(\xi)$.

$$E=\frac{1}{2}\sum_{n=1}^{N}\int \{y(x_{n}-\xi_{n})-t_{n}\}^{2}g(\xi_{n})d\xi_{n}$$
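To spell out where this integral comes from (a step the excerpt leaves implicit): since $z_{n}$ is not observed, the squared error at each data point is averaged over the noise variable $\xi_{n}$, using $z_{n}=x_{n}-\xi_{n}$:

$$\mathbb{E}_{\xi_{n}}\!\left[\{y(z_{n})-t_{n}\}^{2}\right]=\int \{y(x_{n}-\xi_{n})-t_{n}\}^{2}\,g(\xi_{n})\,d\xi_{n},$$

and summing over the $N$ data points with the usual factor of $\frac{1}{2}$ gives the expression for $E$ above.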

However, the error function for the case in which only the input variable is noisy (with noise density $\nu$) is extremely similar:

$$E=\frac{1}{2}\sum_{n=1}^{N}\int \{y(x_{n}+\xi_{n})-t_{n}\}^{2}\nu(\xi_{n})d\xi_{n}$$
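One way to make the resemblance precise (my own observation, not from the excerpt): if the noise density is symmetric, $g(\xi)=g(-\xi)$, then the change of variables $\xi_{n}\to-\xi_{n}$ in the first integral yields

$$E=\frac{1}{2}\sum_{n=1}^{N}\int \{y(x_{n}+\xi_{n})-t_{n}\}^{2}\,g(\xi_{n})\,d\xi_{n},$$

which matches the noisy-input form exactly, with $\nu=g$. The two expressions then differ only in which density is used to weight the noise.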

Why?
