Savitzky-Golay filters (yes, they go by other names) are robust estimators of slope. Where even a small amount of noise can substantially corrupt the slope estimate from textbook finite-difference methods, the S-G filter's response to that noise is much smaller.
What is the integration/integrator equivalent of this filter?
I would think it is not Gaussian quadrature, because quadrature is not derived from convolution, and it is sensitive to noise in much the same way as non-robust finite-difference methods for estimating the slope.
I have tried several Google searches, Google Scholar searches, and IEEE searches, but almost nothing comes up referring to S-G integration or integrators. Am I using the wrong terms (or proper nouns)?
Applications typically have the sliding time window much shorter than the overall data length. If you want the integral from one time to another over a period longer than the sliding window, you'd have to carry over the integration result from data earlier than the current window covers. That's inconsistent with the SG idea of using only the data within the time window.

Conceptually, the SG operation is like a general form of moving average in ARMA terms: it depends only on input values, not on any previous SG output. But integration, if you're running it from one time to another, inherently depends on the output of the previous integration step (more like the "autoregressive" part of ARMA models).
Why not just use the SG smoother to estimate values (not derivatives) using the polynomial order of your choice, and then integrate the smoothed values using your integration method of choice?
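As a sketch of that suggestion (the window length, polynomial order, and test signal below are illustrative choices, not prescribed values), one could smooth with `scipy.signal.savgol_filter` and then integrate with the trapezoidal rule:

```python
import numpy as np
from scipy.signal import savgol_filter
from scipy.integrate import cumulative_trapezoid

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 500)
clean = np.sin(t)
noisy = clean + 0.2 * rng.standard_normal(t.size)

# SG smoothing: fit a cubic polynomial over a 51-sample sliding window
# (both parameters are illustrative and should be tuned to the data).
smoothed = savgol_filter(noisy, window_length=51, polyorder=3)

# Integrate the smoothed values with the trapezoidal rule.
integral = cumulative_trapezoid(smoothed, t, initial=0.0)

# True running integral of sin(t) from 0 is 1 - cos(t), for comparison.
true_integral = 1.0 - np.cos(t)
print(np.max(np.abs(integral - true_integral)))
```

Any quadrature rule could replace the trapezoidal step; the point is only that the smoothing and the integration are decoupled.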
I assume by integration, you mean a running estimate of the integral of a signal (e.g. from 0 to the current point): $\int_0^t f(\tau) d\tau$.
Methods like Savitzky-Golay filters are needed to differentiate noisy signals because naive methods like finite differencing are very sensitive to noise. In contrast, integration inherently provides smoothing, so there's not as much need for specialized methods. One can sometimes get away with really simple approaches, like taking the cumulative sum over elements of the signal vector, multiplied by the sampling interval.
One way to think of this is that the finite difference estimate of the slope only depends on a couple points. The noise can completely dominate this estimate if the magnitude of the noise is large relative to the actual change in signal between the points. In contrast, integration requires summing over many previous points. To the extent that noise fluctuates up and down rapidly relative to the signal, it will average out.
For example, here's a signal with additive white Gaussian noise, integrated by the simple cumulative sum method mentioned above. The integral of the noisy signal matches the true value fairly well, without any special treatment.
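An example of this kind can be reproduced in a few lines; the particular signal, noise level, and sampling interval below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 0.01                        # sampling interval (assumed)
t = np.arange(0, 10, dt)
clean = np.cos(t)                # true underlying signal
noisy = clean + 0.5 * rng.standard_normal(t.size)

# Cumulative sum times the sampling interval: the simple running
# integral described above, with no special noise treatment.
est = np.cumsum(noisy) * dt

# Exact running integral of cos(t) from 0 is sin(t), for comparison.
exact = np.sin(t)
print(np.max(np.abs(est - exact)))
```

Despite the sizeable per-sample noise, the running integral tracks the exact value closely, because the zero-mean noise largely cancels in the sum.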
Of course, the nature of the noise matters, and it may be necessary to explicitly filter it out in some cases. For example, if the signal is contaminated by high-amplitude, low-frequency noise, this could heavily influence the integral (but wouldn't affect the derivative as much), so it would be best to apply a high-pass filter before integrating. If the noise is narrow-band (e.g. 50/60 Hz noise from the power lines), a notch filter would be appropriate, etc.