Summary:
I did a jitter subtraction similar to the one in alog 31378 from Nov 09 last year.
It seems that the jitter is eating somewhat less than 6% of the BNS range, a loss of about 4 Mpc at the time of measurement. Look at the first attachment (but don't pay attention to the absolute BNS range numbers; just take the ratio of the BNS ranges, which is about 0.944. The sensemon range at the time of measurement was about 66 Mpc, and 66/0.944 = 69.9).
A simple spectrum comparison shows that the jitter bumps are bigger today than on Nov 09 2016.
What was done:
I picked a time window starting 25/01/2017 08:03:00 UTC, 8004 seconds long. This is a 66 Mpc-ish lock segment from today without any huge glitches (1st attachment, left).
Coherence and cross spectra were measured between CAL-DELTAL_EXTERNAL_DQ and various witness channels that are believed to carry some jitter-related information, both angular and doughnut (1st attachment, right). These channels are IMC WFS DC A and B PIT and YAW, plus ILS HV and PMC HV. As you can see, the coherence is relatively high for f > 100 Hz.
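The coherence and cross-spectrum measurement can be sketched with scipy.signal; the arrays below are synthetic stand-ins for the real DARM and witness time series (which would come from frames, e.g. via gwpy), and the sample rate and FFT length are illustrative choices, not the settings actually used:

```python
import numpy as np
from scipy.signal import coherence, csd

fs = 1024                 # sample rate in Hz (illustrative)
nperseg = 4 * fs          # 4 s FFT segments -> 0.25 Hz resolution

# Synthetic DARM and witness: a shared "jitter" component plus independent noise
rng = np.random.default_rng(0)
common = rng.standard_normal(64 * fs)
darm = common + 0.5 * rng.standard_normal(64 * fs)
witness = common + 0.5 * rng.standard_normal(64 * fs)

f, coh = coherence(darm, witness, fs=fs, nperseg=nperseg)   # coh(f) in [0, 1]
f, Pxy = csd(darm, witness, fs=fs, nperseg=nperseg)         # cross spectrum
```

With independent noise at half the amplitude of the shared component, the expected coherence is 1/1.25^2 = 0.64 across the band, so a well-witnessed jitter line would stand out well above that floor.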
Next, as a quick-and-dirty way of doing a noise projection with multiple channels, for each frequency bin I chose the channel with the highest coherence as the best witness for that specific bin, and used it to form a combined witness channel:
bestchannel(f) = the channel with the largest coh(f).
combinedWitnessASD(f) = ASD(bestchannel(f), f).
combinedCrossSpectrum(f) = crossSpectrum(CAL_DELTAL_EXTERNAL, bestchannel(f), f).
noiseProjection(f) = combinedCrossSpectrum(f) / combinedWitnessASD(f).
This way I only ever use the dominant noise witness in each bin, avoiding double counting (though without properly disentangling the multiple noise sources).
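The per-bin selection above can be written compactly with numpy indexing. Here `coh`, `asd`, and `cross` are hypothetical (n_channels, n_bins) arrays holding the coherence, witness ASD, and DARM-witness cross spectrum for each channel; this is a sketch of the recipe, not the actual analysis code:

```python
import numpy as np

def combine_witnesses(coh, asd, cross):
    """Per frequency bin, keep only the channel with the highest coherence."""
    best = np.argmax(coh, axis=0)                 # bestchannel(f)
    bins = np.arange(coh.shape[1])
    combined_asd = asd[best, bins]                # combinedWitnessASD(f)
    combined_csd = cross[best, bins]              # combinedCrossSpectrum(f)
    return np.abs(combined_csd) / combined_asd    # noiseProjection(f)
```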
Then I subtracted the projected noise from the original CAL_DELTAL_EXTERNAL.
subtracted(f) = sqrt(CAL_DELTAL_EXTERNAL(f)^2 - noiseProjection(f)^2).
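A minimal sketch of the quadrature subtraction, with a floor at zero for any bin where the projection happens to exceed the measured spectrum (otherwise the square root would return NaN):

```python
import numpy as np

def subtract_projection(darm_asd, projection):
    """Quadrature-subtract a projected noise ASD from the measured ASD."""
    residual_sq = np.maximum(darm_asd**2 - projection**2, 0.0)
    return np.sqrt(residual_sq)
```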
I used the dtt calibration file from the calibration SVN to convert CAL_DELTAL_EXTERNAL to displacement:
/ligo/svncommon/CalSVN/aligocalibration/trunk/Runs/O2/H1/Scripts/ControlRoomCalib/caldeltal_calib.txt
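Applying such a calibration file might look like the sketch below; the two-column (frequency, magnitude) layout and the function name are assumptions for illustration, not the actual format of caldeltal_calib.txt:

```python
import numpy as np

def apply_calibration(f, asd_counts, calib_file):
    """Convert an uncalibrated ASD to displacement using a frequency-dependent
    magnitude from a text file (assumed columns: frequency, magnitude)."""
    cal = np.loadtxt(calib_file)
    mag = np.interp(f, cal[:, 0], cal[:, 1])      # interpolate onto our bins
    return asd_counts * mag                       # displacement ASD
```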
I plugged the spectrum into BNS_range.m in the calibration SVN to obtain the BNS range. This produces a different number from sensemon, but it should be good for comparison purposes.
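For before/after comparisons only, one can get away without BNS_range.m's absolute normalization: the BNS inspiral SNR^2 scales as the integral of f^(-7/3)/S(f), so an un-normalized figure of merit proportional to the range is enough to form ratios. This is a hedged stand-in, not the calibration SVN code, and the band edges are illustrative:

```python
import numpy as np

def relative_bns_range(f, strain_asd, f_low=10.0, f_high=2048.0):
    """Quantity proportional to BNS range over a fixed band (arbitrary units)."""
    m = (f >= f_low) & (f <= f_high)
    integrand = f[m] ** (-7.0 / 3.0) / strain_asd[m] ** 2
    # trapezoidal integration of the SNR^2 integrand, then range ~ sqrt(SNR^2)
    return np.sqrt(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(f[m])))
```

Halving the noise ASD everywhere doubles this figure, matching the expected range scaling, so the 75.9/80.4 Mpc ratio below could equally be formed from this quantity.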
If you look at the second attachment, the range was 75.9 Mpc before subtraction and 80.4 Mpc after, i.e. about a 6% increase. Applying the same factor to the 66 Mpc sensemon number, the post-subtraction range would be 69.9 Mpc.
Comparison between now and Nov 09 2016:
Comparing the first attachment with the displacement spectrum in that alog's plot, it seems to me that the jitter bumps in DARM are larger now than they were then, though these things change with time.