TITLE: 06/14 Day Shift: 1430-2330 UTC (0730-1630 PDT), all times posted in UTC
STATE of H1: Observing at 147Mpc
INCOMING OPERATOR: Oli
SHIFT SUMMARY:
IFO is in NLN and OBSERVING as of 21:33 UTC.
21:07 UTC to 21:33 UTC: COMMISSIONING
19:28 UTC to 21:07 UTC: OBSERVING
Things of note:
LOG:
Start Time | System | Name | Location | Laser_Haz | Task | Time End |
---|---|---|---|---|---|---|
15:29 | PCAL | Tony | Pcal Lab | Local | LLO Measurements | 17:59 |
16:33 | PCAL | Francisco | Pcal Lab | Local | LLO Measurements | 17:21 |
17:13 | SQZ | Sheila | LVEA | Y | Move magnetometer | 17:59 |
19:03 | PCAL | Tony, Miriam | PCAL Lab | N | LLO Measurements | 19:55 |
19:43 | RUN | Camilla, Neil | Y Arm | N | Improve and/or maintain health | 19:43 |
19:56 | PCAL | Francisco | PCAL | N | Preparing for lab upgrade steps | 19:58 |
22:12 | PCAL | Francisco, Miriam | PCAL Lab | Local | Preparing for lab upgrade steps | 22:40 |
22:41 | SAF | LVEA | LVEA | YES | LVEA IS LASER HAZARD | 15:52 |
22:42 | WALK | Francisco, Miriam | Overpass | N | Walking | 23:02 |
TITLE: 06/14 Eve Shift: 2300-0800 UTC (1600-0100 PDT), all times posted in UTC
STATE of H1: Observing at 147Mpc
OUTGOING OPERATOR: Ibrahim
CURRENT ENVIRONMENT:
SEI_ENV state: CALM
Wind: 5mph Gusts, 3mph 5min avg
Primary useism: 0.02 μm/s
Secondary useism: 0.10 μm/s
QUICK SUMMARY:
Detector Observing and has been locked for over 4 hours. Range isn't great, around 148 Mpc, but besides that everything is good.
I looked at the SUS-{SR2,SR3,SRM}_M1_DAMP_{P,Y}_INMON channels for a period on the 21st of April before the OFI burn, and after we had re-aligned through the OFI to recover our optical gain on the 25th of April after the OFI burn.
I have compared these to the drift in alignment in the signal extraction cavity yesterday. This is partly to figure out why we have to keep retuning our SRCL feedforward.
Date | Mirror | delta pitch value | delta yaw value |
---|---|---|---|
21st April | SR2 | 2 | 0.8 |
25th April | SR2 | 3 | 2 |
13th June | SR2 | 3 | 3 |
21st April | SR3 | 0.5 | 0.9 |
25th April | SR3 | 0.3 | 6 |
13th June | SR3 | 0.5 | 0.3 |
21st April | SRM | 13 | 6 |
25th April | SRM | 23 | 7 |
13th June | SRM | 7 | 7 |
The only angular degrees of freedom that changed significantly between these dates were:
Ibrahim, Sheila, Camilla
After Ibrahim relocked the IFO, the range is lower, around 150 Mpc. Ibrahim ran the coherence checks from the low range wiki, and saw that there was broad SRCL coherence. He double checked that the SRC ASC offsets from yesterday (78415) are still correctly in place. The squeezing level is quite good in this lock.
I accidentally started to run the A2L script (by clicking in a terminal I didn't mean to), so we took a few minutes to try to retune the SRCL FF. We tried the feedforward that Gabriele fit last week, 78307, which gives us worse coupling than the April 29th filters do. Then we adjusted the gain using the April 29th filter, screenshot attached. The SRCL coherence is still high, and the range is still low. We would need a new filter to get good SRCL decoupling in this lock, but we do not understand why this has changed since the last lock.
IFO is in NLN and OBSERVING as of 19:26 UTC
Mini-Events Today:
Since the yaw ASC cross couplings to DARM were high in our last lock, Ibrahim and I took a couple of minutes before going into observing to run the script that TJ edited yesterday (78419) again. The attached screenshot shows that in our first run of the script the ADS values were still drifting when the script moved on to measuring the next step. This might have been because the IFO had locked recently, or because the script needs to wait longer. We added a 30 second wait after the A2L gains are changed, before the values are checked, by adding +30 on line 185; after this it looked like things had settled well.
We ran the script for all test masses, yaw only, from the terminal using: python a2l_min_multi.py --quads ETMX ETMY ITMX ITMY --dofs Y
After the script ran, there were many SDF diffs that Ibrahim has screenshots of. It would be good to edit the end of the script so that there are fewer of these diffs each time we run it. We added these new values to lscparams and loaded ISC_LOCK.
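For context, the settle-then-measure change described above (the +30 on line 185) boils down to a pattern like the sketch below. This is only an illustration, not the actual script code; the DRIVEALIGN channel pattern and the read_line_height helper are assumptions.
import time

SETTLE_TIME = 30  # extra seconds to wait before reading back the ADS values

def step_and_measure(ezca, quad, dof, new_gain, read_line_height):
    """Set an A2L gain, let the ADS demod outputs settle, then measure.
    Channel pattern and read_line_height are illustrative only."""
    ezca['SUS-{}_L2_DRIVEALIGN_{}2L_GAIN'.format(quad, dof)] = new_gain
    time.sleep(SETTLE_TIME)             # let the ADS outputs stop drifting
    return read_line_height(quad, dof)  # hypothetical read-back of the line height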
Here are our first and second run results.
*************************************************
RESULTS
*************************************************
ETMX Y
Initial: 4.94
Final: 4.95
ETMY Y
Initial: 0.94
Final: 1.01
ITMX Y
Initial: 2.89
Final: 2.89
ITMY Y
Initial: -2.51
Final: -2.39
Diff: 0.11999999999999966
*************************************************
RESULTS
*************************************************
ETMX Y
Initial: 4.95
Final: 4.91
Diff: -0.040000000000000036
ETMY Y
Initial: 1.01
Final: 1.0
Diff: -0.010000000000000009
ITMX Y
Initial: 2.89
Final: 2.85
Diff: -0.040000000000000036
ITMY Y
Initial: -2.39
Final: -2.45
Diff: -0.06000000000000005
Here are some plots that show the functioning of the fast shutter before, during, and after the June 7th pressure spikes. The fast shutter functions the same way before and after the pressure spikes. However, in the locking attempts where the HAM6 alignment was different, the beam going to AS_C was clipped, which meant that the fast shutter didn't block the beam on AS_A and AS_B as quickly as it normally does.
The first attachment shows a lockloss from June 6th, the lockloss before the pressure spikes started. The fast channel that records power on the shutter trigger diode (H1:ASC-AS_C_NSUM_OUT_DQ) is calibrated into Watts arriving in HAM6. Normally these NSUM channels are normalized by the input power scaling, but as the simulink screenshot shows, that is not done in this case. Using the time at which the power on AS_A is blocked, the shutter triggered when there was 0.733 W arriving in HAM6, and the light on AS_A, which is behind the shutter, was blocked. There is a bounce, where the beam passes by the shutter again 51.5 ms after it first closes; this bounce lasts 15 ms, and in that time the power into HAM6 rises to 1.3 W. In total, AS_A (and AS_B) were saturated for 24 ms. This pattern is consistent across 4 locklosses from higher powers, with the normal alignment on AS_C, both before and after the pressure spikes in HAM6.
In the first lockloss with a pressure spike, 78308, the interferometer input power was 10 W rather than the usual 60 W, and the alignment into HAM6 was in an unusual state. The shutter triggered when the power was 0.3 W according to AS_C, which was clipped at the time and so underestimates the power into HAM6. The trend of power on AS_A and AS_B was different this time, with what looks like two bounces and a total of 35 ms of time when AS_A was saturated. The first bounce is about 14 ms after the shutter first triggers, but the beam isn't unblocked enough to saturate the diode; a second bounce saturates the diode 36 ms after the shutter first closed and lasts 26 ms, during which time the power into HAM6 rose to 0.55 W. The power on AS_C peaked about 250 ms after the shutter triggered, at about 1 W onto AS_C. Keita is going to look at the energy deposited into HAM6 in typical locklosses, where AS_C is not clipping, as that will be more accurate. The pressure spike shows up on H0:VAC-LY_RT_PT152_MOD2_PRESS_TORR about 1.5 seconds after the power peaks on AS_C, and peaks at 1.1e-7 Torr.
At the next pressure spike, the interferometer was locked with 60 W input power, and AS_A was saturated for 80 ms before the shutter triggered, with no bouncing. This time the pressure rise was recorded 2 seconds after the lockloss, and 1.8 seconds after the power on AS_C peaked; this was a larger pressure spike than the first, at 1.3e-7 Torr.
The third pressure spike was also a 60 W lockloss, with AS_A saturated for 80 ms and no bounce from the AS shutter visible. The pressure rise was recorded 2.1 seconds after the lockloss, and was 3.1e-7 Torr.
The final attachment shows the first lockloss after we reverted the alignment on AS_C, where the fast shutter behavior follows the same pattern seen before the pressure spikes happened.
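To reproduce these timing numbers, the trigger-diode channel can be pulled with gwpy and scanned for the threshold crossing. A minimal sketch, with a placeholder GPS time rather than one of the actual locklosses:
from gwpy.timeseries import TimeSeries

t0 = 1402000000  # placeholder GPS time; substitute the lockloss of interest
# Calibrated Watts arriving in HAM6 (AS_A/AS_B can be pulled the same way)
as_c = TimeSeries.get('H1:ASC-AS_C_NSUM_OUT_DQ', t0 - 1, t0 + 1)

trigger_level = 0.733  # W, the level at which the shutter triggered on June 6th
crossings = as_c.times.value[as_c.value > trigger_level]
if crossings.size:
    print('trigger level first crossed at GPS', crossings[0])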
Fri Jun 14 10:09:56 2024 INFO: Fill completed in 9min 53secs
Jordan confirmed a good fill curbside.
Closes FAMIS 26252
Laser Status:
NPRO output power is 1.813W (nominal ~2W)
AMP1 output power is 66.77W (nominal ~70W)
AMP2 output power is 137.7W (nominal 135-140W)
NPRO watchdog is GREEN
AMP1 watchdog is GREEN
AMP2 watchdog is GREEN
PDWD watchdog is GREEN
PMC:
It has been locked 16 days, 20 hr 27 minutes
Reflected power = 21.32W
Transmitted power = 106.0W
PowerSum = 127.3W
FSS:
It has been locked for 0 days 18 hr and 48 min
TPD[V] = 0.8414V
ISS:
The diffracted power is around 2.7%
Last saturation event was 0 days 18 hours and 48 minutes ago
Possible Issues:
PMC reflected power is high
TITLE: 06/14 Day Shift: 1430-2330 UTC (0730-1630 PDT), all times posted in UTC
STATE of H1: Observing at 152Mpc
OUTGOING OPERATOR: Corey
CURRENT ENVIRONMENT:
SEI_ENV state: CALM
Wind: 9mph Gusts, 5mph 5min avg
Primary useism: 0.02 μm/s
Secondary useism: 0.12 μm/s
QUICK SUMMARY:
IFO is in NLN and OBSERVING (17 hrs 28 mins)
Woke up to see that the SQZ_OPO_LR Guardian had the message:
"disabled pump iss after 10 locklosses. Reset SQZ-OPO_ISS_LIMITCOUNT to clear message"
Followed 73053, but did NOT need to touch up the OPO temp (it was already at its max value); then took SQZ Manager back to FREQ_DEP_SQZ, and H1 went back to OBSERVING.
Received the wake-up call at 4:40am PDT (11:40 UTC). Took a few minutes to wake up, then logged into NoMachine. Spent some time figuring out the issue, ultimately doing an alog search to find steps to restore SQZ (found an alog by Oli which pointed to 73053). Once SQZ relocked, H1 was automatically taken back to OBSERVING at 5:17am (12:17 UTC).
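For reference, the recovery in 73053 essentially amounts to clearing the counter and re-requesting squeezing. A rough sketch of the equivalent channel writes, assuming the usual ezca interface; the guardian request channel name is an assumption, and in practice this is done from the MEDM screens:
from ezca import Ezca

ezca = Ezca(ifo='H1')

# Clear the saturation counter that disabled the pump ISS
ezca['SQZ-OPO_ISS_LIMITCOUNT'] = 0

# Re-request frequency-dependent squeezing from the SQZ manager
# (request channel name assumed; usually done from the guardian overview screen)
ezca['GRD-SQZ_MANAGER_REQUEST'] = 'FREQ_DEP_SQZ'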
Sheila, Naoki, Camilla. We've adjusted this so it should automatically relock the ISS.
The IFO went out of observing because of the OPO without the OPO Guardian going down: the OPO stayed locked, it just turned its ISS off. We're not sure what the issue with the ISS was; the SHG power was fine, as the controlmon was 3.5, which is near the middle of the range. Plot attached. It didn't reset until Corey intervened.
* This isn't really a lockloss counter, more a count of how many seconds the ISS is saturating.
Worryingly, the squeezing got BETTER while the ISS was unlocked; plot attached of DARM, SQZ BLRMs and range BLRMs.
In the current lock, the SQZ BLRMs are back to the good values (plot). Why was the ISS injecting noise last night? Has this been a common occurrence? What is a good way of monitoring this? Coherence with DARM and the ISS?
A check on this is in 78486. We think that the SQZ OPO temperature or angle wasn't well tuned for the green OPO power at this time: when the OPO ISS was off, the SHG launch power dropped from 28.8mW to 24.5mW (plot). It was just chance that SQZ was happier here.
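On the monitoring question, a DARM/ISS coherence spectrum is easy to compute with gwpy. A minimal sketch; the ISS channel name and the GPS span are placeholders:
from gwpy.timeseries import TimeSeries

start, end = 1402300000, 1402300600  # placeholder 10-minute span
darm = TimeSeries.get('H1:GDS-CALIB_STRAIN', start, end)
iss = TimeSeries.get('H1:SQZ-OPO_ISS_CONTROLMON', start, end)  # placeholder channel

# Match sample rates before computing coherence
if iss.sample_rate != darm.sample_rate:
    iss = iss.resample(darm.sample_rate.value)

# Coherence with 8 s FFTs and 50% overlap
coh = darm.coherence(iss, fftlength=8, overlap=4)
coh.plot().savefig('darm_iss_coherence.png')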
State of H1: Observing at 157Mpc, locked for 6.5 hours.
Quiet shift so far except for another errant Picket Fence trigger to EQ mode just like ones seen last night (alog78404) at 02:42 UTC (tagging SEI).
That's now two of these triggers in a short time. If the false triggers are an issue, we should consider triggering on picket fence only if there's a Seismon alert.
The picket fence-only transition was commented out last weekend, on the 15th, by Oli. We will now only transition on picket fence signals if there is a live Seismon notification.
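In other words, the gating condition is now essentially the following (flag names are illustrative; the real change lives in the SEI_ENV guardian code):
def go_to_earthquake_mode(picket_fence_triggered, seismon_alert_live):
    """Transition to the EQ state only when a picket fence trigger is
    backed up by a live Seismon notification (names illustrative only)."""
    return picket_fence_triggered and seismon_alert_live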
Thanks Jim,
I'm back from my vacation and will resume work on the picket fence to see if we can fix these errant triggers this summer.
I took the script that we have been using to run our A2L and converted it to run the measurements for all quads and degrees of freedom at the same time, or any subset, as desired. The new script is (userapps)/isc/h1/scripts/a2l/a2l_min_multi.py. Today Sheila and I tested it for all quads with just Y, with the results below. These values were accepted in SDF, updated in lscparams.py, and ISC_LOCK was reloaded. More details about the script are at the bottom of this log.
Results for ETMX Y
Initial: 4.99
Final: 4.94
Diff: -0.04999999999999982
Results for ETMY Y
Initial: 0.86
Final: 0.94
Diff: 0.07999999999999996
Results for ITMX Y
Initial: 2.93
Final: 2.89
Diff: -0.040000000000000036
Results for ITMY Y
Initial: -2.59
Final: -2.51
Diff: 0.08000000000000007
The script we used to use was (userapps)/isc/common/scripts/decoup/a2l_min_generic_LHO.py, which was, I think, originally written by Vlad B. and then changed by Jenne to work for us at LHO. I took this and changed a few things around so that the optimiseDOF function is called for each desired quad and dof under a ThreadPool class from multiprocess, running all of the measurements simultaneously. We had to move or change filters in the H1:ASC-ADS_{PIT,YAW}{bank#}_DEMOD_{SIG,I,Q} banks so that each optic and dof is associated with a particular frequency, and we used the ADS banks 6-9. These frequencies needed to be spaced far enough apart, but still within our area of interest. We also had to engage notches for all of these potential lines in the H1:SUS-{QUAD}_L3_ISCINF_{P,Y} banks (FM6&7). We also accepted the ADS output matrix values in SDF for these new banks with a gain of 1.
This hasn't been tested for all quads and both P&Y, so far only Y.
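The parallel dispatch described above is essentially the ThreadPool pattern sketched here; optimiseDOF is a stand-in and its signature is assumed:
from itertools import product
from multiprocessing.pool import ThreadPool

def optimiseDOF(quad, dof):
    """Placeholder for the per-quad, per-dof minimisation in a2l_min_multi.py
    (the real function drives an ADS line and steps the A2L gain)."""
    print('would optimise {} {} here'.format(quad, dof))

quads = ['ETMX', 'ETMY', 'ITMX', 'ITMY']
dofs = ['Y']  # only yaw has been tested so far

# One ADS demod bank (6-9) and line frequency per (quad, dof) pair, so all
# measurements can be driven and demodulated at the same time
with ThreadPool(processes=len(quads) * len(dofs)) as pool:
    pool.starmap(optimiseDOF, product(quads, dofs))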
I ran the DARM offset step code starting at:
2024 Jun 13 16:13:20 UTC (GPS 1402330418)
Before recording this time stamp, the script records the current PCAL line settings and makes sure notches for the 2 PCAL frequencies are set in the DARM2 filter bank.
It then puts all the PCAL power into these lines at 410.3 Hz and 255 Hz (giving them both a height of 4000 counts), and measures the current DARM offset value.
It then steps the DARM offset and waits for 120s each time.
The script stopped at 2024 Jun 13 16:27:48 UTC (GPS 1402331286).
In the analysis the PCAL lines can be used to calculate how the optical gain changes at each offset.
See the attached traces, where you can see that H1:OMC-READOUT_X0_OFFSET is stepped and the OMC-DCPD_SUM and ASC-AS_C respond to this change.
Watch this space for analysed data.
The script sets all the PCAL settings back to nominal after the test, from the record it took at the start.
The script lives here:
/ligo/gitcommon/labutils/darm_offset_step/auto_darm_offset_step.py
The data lives here:
/ligo/gitcommon/labutils/darm_offset_step/data/darm_offset_steps_2024_Jun_13_16_13_20_UTC.txt
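For orientation, the measurement loop in auto_darm_offset_step.py amounts to the pattern below. The offset list, dwell time, and ezca usage here are illustrative, not the script's actual values:
import time
from ezca import Ezca

ezca = Ezca(ifo='H1')

DWELL = 120  # seconds at each DARM offset
offsets = [10, 12, 14, 16, 18]  # counts; placeholder list of offsets to step through

# Record the nominal offset so it can be restored afterwards
nominal_offset = ezca['OMC-READOUT_X0_OFFSET']

try:
    for offset in offsets:
        ezca['OMC-READOUT_X0_OFFSET'] = offset
        time.sleep(DWELL)  # let OMC-DCPD_SUM and ASC-AS_C settle at each step
finally:
    # Restore the nominal DARM offset (the script also restores the PCAL line settings)
    ezca['OMC-READOUT_X0_OFFSET'] = nominal_offset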
See the results in the attached pdf also found at
/ligo/gitcommon/labutils/darm_offset_step/figures/plot_darm_optical_gain_vs_dcpd_sum/all_plots_plot_darm_optical_gain_vs_dcpd_sum_1402330422_380kW__Post_OFI_burn_and_pressure_spikes.pdf
The contrast defect is 0.889 ± 0.019 mW and the true DARM offset zero is at 0.30 counts.
I plotted the power at the antisymmetric port as in this entry to find the loss term between the input to HAM6 and the DCPDs, which in this case is (1/1.652) = 0.605, with 580.3 mW of light at the AS port insensitive to DARM length changes.
From Jennie's measurement of 0.88 mW contrast defect, and dcpd_sum of 40mA/resp = 46.6mW, this implies an upper bound on the homodyne readout angle of 8 degrees.
This readout angle can be useful for the noise budget (ifo.Optics.Quadrature.dc=(-8+90)*np.pi/180)
and analyzing sqz datasets e.g. May 2024, lho:77710.
Table of readout angles "recently":
Epoch | Date | DCPD sum | Contrast defect | Readout angle | alog |
---|---|---|---|---|---|
O4a | Aug 2023 | 46.6 mW | 1.63 mW | 10.7 deg | lho71913 |
ER16 | 9 March 2024 | 46.6 mW | 2.1 mW | 12.2 deg | lho76231 |
ER16 | 16 March 2024 | 46.6 mW | 1.15 mW | 9.0 deg | lho77176 |
O4b | June 2024 | 46.6 mW | 0.88 mW | 8.0 deg | lho78413 |
O4b | July 2024 | 46.6 mW | 1.0 mW | 8.4 deg | lho79045 |
##### quick python terminal script to calculate #########
# craig lho:65000
contrast_defect = 0.88 # mW # measured on 2024 June 14, lho78413, 0.88 ± 0.019 mW
total_dcpd_light = 46.6 # mW # from dcpd_sum = 40mA/(0.8582 A/W) = 46.6 mW
import numpy as np
darm_offset_power = total_dcpd_light - contrast_defect
homodyne_angle_rad = np.arctan2(np.sqrt(contrast_defect), np.sqrt(darm_offset_power))
homodyne_angle_deg = homodyne_angle_rad*180/np.pi # degrees
print(f"homodyne_angle = {homodyne_angle_deg:0.5f} deg\n")
##### To convert between dcpd amps and watts if needed #########
# using the photodetector responsivity (like R = 0.8582 A/W for 1064nm)
from scipy import constants as scc
responsivity = scc.e * (1064e-9) / (scc.c * scc.h)
total_dcpd_light = 40/responsivity # so dcpd_sum 40mA is 46.6mW
Following on from Sheila's alog 77427, checked on VAC trends and OFI temperature around April 22nd.
I checked (maybe a repeat of someone else's check) that there were no vacuum spikes during the locks when Kappa_C dropped, or during the locklosses afterwards. Plot of Kappa_C with VAC channels attached.
Looking at the OFI TEC readbacks during that time, they were significantly noisier than usual on the Tuesday maintenance day after the 5% optical gain drop, before our alignment shift. They were noisy between 6:30am and 4pm, which started before any Tuesday maintenance 77363 or injections. Plot attached, including a zoom out and a zoom in. It also happened from May 28th to May 30th.
In 78442, we show that these larger than usual OFI temperature swings on locklosses and powerups also happened on May 28th/29th when we adjusted our SRC alignment (SRC 250urad). Is this a sign that there is an alignment where we are hitting something in the OFI?
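For anyone repeating this check, the trends can be pulled in one go with gwpy. A minimal sketch; the OFI TEC and kappa_C channel names below are placeholders, substitute whichever readbacks were plotted:
from gwpy.timeseries import TimeSeriesDict

start, end = '2024-05-28 00:00', '2024-05-31 00:00'  # span covering the May 28th-30th swings
channels = [
    'H1:AOS-OFI_TEC_TEMPERATURE',     # placeholder OFI TEC readback
    'H1:CAL-CS_TDEP_KAPPA_C_OUTPUT',  # placeholder optical gain channel, for comparison
]
data = TimeSeriesDict.get(channels, start, end)
data.plot().savefig('ofi_tec_vs_kappa_c.png')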
Last night and today we are in a different spot through the OFI. See Sheila's alog 78096 for the move that was made.
Overall, SR2 and SRM yaw are much closer to center in this position; however, SRM pitch is farther from center. I did a quick double check of the SRM pitch, and indeed this is where it wants to be.
The previous spots (with the previous SR3 alignment) are recorded in alog 77443.
DOF | ampl [cts] of line at 31.0 Hz | A2L gain step size when minimizing | CAL-DELTAL line reduction factor | Final A2L gain | Inferred new spot position [mm] | Change from alog 77443 position |
---|---|---|---|---|---|---|
SR2 P2L | 1.0 | 0.1 | 100x | -1.0 | -2.0 | 13.1 mm other side of center |
SR2 Y2L | 1.0 | 0.1 | 100x | +0.3 | 0.6 | 9.7 mm other side of center |
SRM P2L | 2.0 | 0.1 | 50x | -5.5 | -11.1 | 4.3 mm farther from center |
SRM Y2L | 2.0 | 0.1 | 30x | +1.85 | 3.7 | 3.5 mm closer to center |
Attached are the saved SDF diffs for both Observe and Safe snap files.
Sheila and Keita have recently found and fixed sign convention errors. Please see alog 78393 for the corrected interpretation of A2L gains.
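As a cross-check of the table, the "Inferred new spot position" column is consistent with roughly 2.0 mm of spot motion per unit of A2L gain for these optics; that ratio is read off the table itself, not an independent calibration. For example:
# Spot positions implied by the final A2L gains, using the ~2.0 mm per unit
# gain ratio visible in the table above (approximate, not a calibration)
MM_PER_GAIN = 2.0

final_gains = {'SR2 P2L': -1.0, 'SR2 Y2L': 0.3, 'SRM P2L': -5.5, 'SRM Y2L': 1.85}
for dof, gain in final_gains.items():
    print('{}: spot ~ {:+.1f} mm from center'.format(dof, gain * MM_PER_GAIN))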
Sheila, Camilla. During this May 28th/29th SRC alignment, the OFI sees larger than usual temperature swings on locklosses and powerups. See plot attached. We saw similarly large OFI temperature swings on the re-locking attempts after our April 22nd optical gain drop, before we changed alignments, shown in 78399 and the second attached plot. Is the beam in this alignment hitting something in the OFI? The yaw alignment could more likely be the culprit, as that is common to both alignments with larger temperature swings.