This is a follow-up on the previous analysis in 70134. In our discussions we wondered whether the low frequency excess noise can all be ascribed to the LSC feedforward being mistuned. The answer is: not all of the noise increase, but only some of it.
The attached plot shows a selection of about 1500 times: each one is a period of at least 600 seconds when the range was above 120 Mpc. For each segment I computed the average noise in the 20-50 Hz band as a proxy for the low frequency sensitivity. The gray circles use CAL-DELTAL_EXTERNAL_DQ as in 70134. I then used a multi-coherence-based subtraction to remove the coherent contribution of LSC-MICH, PRCL and SRCL from DARM. This method should be similar to what Valera did, and it is similar to the frequency-domain subtraction that NonSENS used to do. This "coherence-based subtraction" gives us an idea of the best that the LSC feedforward could ever do, since it completely removes the coherent contribution of the LSC signals from DARM.
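For reference, the idea behind a coherence-based subtraction can be sketched as follows: for each witness, the coherent part of the target spectrum is removed, leaving a residual PSD of Pyy * (1 - gamma^2), where gamma^2 is the magnitude-squared coherence. The snippet below is a minimal single-witness illustration with simulated data (the witness and coupling here are made up for the example; the actual analysis used the real LSC channels and a multi-coherence generalization that accounts for correlations among witnesses):

```python
import numpy as np
from scipy.signal import csd, welch

rng = np.random.default_rng(0)
fs = 256          # sample rate [Hz], arbitrary for this toy example
n = fs * 600      # 600 s of data, matching the minimum segment length

# Simulated witness (stand-in for an LSC channel) coupling into the
# target (stand-in for DARM), plus uncorrelated sensing noise
witness = rng.standard_normal(n)
darm = 0.5 * witness + 0.1 * rng.standard_normal(n)

nperseg = fs * 4
f, Pxx = welch(witness, fs=fs, nperseg=nperseg)
f, Pyy = welch(darm, fs=fs, nperseg=nperseg)
f, Pxy = csd(witness, darm, fs=fs, nperseg=nperseg)

# Magnitude-squared coherence and the residual PSD after an optimal
# frequency-domain subtraction of the coherent part
coh2 = np.abs(Pxy) ** 2 / (Pxx * Pyy)
P_resid = Pyy * (1 - coh2)

# Band-limited amplitude ratio in 20-50 Hz, analogous to the
# before/after comparison in the plot
band = (f >= 20) & (f <= 50)
ratio = np.sqrt(P_resid[band].mean() / Pyy[band].mean())
print(f"residual / original amplitude in 20-50 Hz: {ratio:.2f}")
```

With several witnesses, the single-channel formula generalizes by replacing the coherence with a multiple-coherence built from the inverse of the witness cross-spectral matrix, which avoids double-counting noise that is shared among the witnesses.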
The orange dots in the plot show the 20-50 Hz noise after the coherence-based subtraction is performed. It looks like: