(sub)gradient of $\ell_1$ norm

by jjjjjj   Last Updated January 18, 2018 20:20 PM

I know that $\|\cdot\|_1$ is non-differentiable, but I'm curious about what happens when we attempt to use the Fréchet definition to obtain a subgradient $T: \mathbb{R}^n \to \mathbb{R}$.

Specifically, consider $A \in \mathbb{R}^{m \times n},\, y \in \mathbb{R}^m,\, x \in \mathbb{R}^n$ and \begin{align} \lim_{\|v\|\to 0}\, \biggl\lvert \frac{\|y - A(x+v)\|_1 - \|y - Ax\|_1 - Tv}{\|v\|} \biggr\rvert \leq \lim_{\|v\|\to 0}\, \biggl\lvert \frac{\sum_{i} |a_i^T v| - Tv}{\|v\|} \biggr\rvert = 0, \end{align} where $a_i$ denotes the $i$-th row of $A$, and the inequality comes from the reverse triangle inequality and expanding the definition of $\|\cdot\|_1$.
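(For reference, any candidate $T$ here should be consistent with the standard coordinate-wise subdifferential of the $\ell_1$ norm from convex analysis:
\begin{align} \partial \|z\|_1 = \bigl\{ g \in \mathbb{R}^m : g_i = \operatorname{sign}(z_i) \text{ if } z_i \neq 0,\ g_i \in [-1,1] \text{ if } z_i = 0 \bigr\}. \end{align}
The set is a singleton, and the norm is differentiable, exactly when no component of $z$ is zero.)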

Is there a way to get something meaningful from this using the chain rule? Or does this not make sense because the $\operatorname{sign}(x)$ function (which should appear in $T$) is nonlinear?
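As a numerical sanity check (a sketch in NumPy; the matrix `A` and vectors `y`, `x` below are random stand-ins, not from the question), the chain-rule candidate $g = -A^T \operatorname{sign}(y - Ax)$ does satisfy the subgradient inequality $f(z) \geq f(x) + g^T(z - x)$ for $f(x) = \|y - Ax\|_1$:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 3
A = rng.standard_normal((m, n))  # random stand-in data
y = rng.standard_normal(m)
x = rng.standard_normal(n)

def f(x):
    """Objective f(x) = ||y - A x||_1."""
    return np.abs(y - A @ x).sum()

# Chain-rule candidate subgradient: g = -A^T sign(y - A x)
g = -A.T @ np.sign(y - A @ x)

# Check the subgradient inequality f(z) >= f(x) + g^T (z - x)
# at many random points z (small tolerance for floating point).
ok = all(f(z) >= f(x) + g @ (z - x) - 1e-12
         for z in rng.standard_normal((100, n)))
print(ok)  # True
```

The inequality holds at every $z$ because $\|y - Az\|_1 \geq \operatorname{sign}(r)^T (y - Az)$ with $r = y - Ax$, even when some components of $r$ are zero (there $\operatorname{sign}$ contributes $0 \in [-1,1]$).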
