I took advantage of the recent earthquake to tweak the beam alignment into the FSS RefCav from the Control Room to improve the transmission, which had been dropping for the last several days. With the IMC unlocked the starting RefCav TPD was ~0.660 V, and after the alignment tweak (IMC still unlocked) the TPD is ~0.833 V.
17:42 UTC lockloss from a M6.6 earthquake out of Cape Verde; holding in DOWN until the motion subsides.
Fri Mar 28 10:13:33 2025 INFO: Fill completed in 13min 29secs
Gerardo confirmed a good fill curbside. Note TC adjustments during fill when ice was being removed.
Sheila, Camilla
Data to take and how each data is used in the model:
Closes FAMIS26379
Laser Status:
NPRO output power is 1.864W
AMP1 output power is 70.39W
AMP2 output power is 140.2W
NPRO watchdog is GREEN
AMP1 watchdog is GREEN
AMP2 watchdog is GREEN
PDWD watchdog is GREEN
PMC:
It has been locked 51 days, 18 hr 58 minutes
Reflected power = 22.69W
Transmitted power = 106.0W
PowerSum = 128.7W
FSS:
It has been locked for 0 days 0 hr and 59 min
TPD[V] = 0.662V
ISS:
The diffracted power is around 4.2%
Last saturation event was 0 days 8 hours and 41 minutes ago
Possible Issues:
FSS TPD is low
After Jason's touch-up, the TPD is now ~0.833 V
TITLE: 03/28 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Lock Acquisition
OUTGOING OPERATOR: Ryan S
CURRENT ENVIRONMENT:
SEI_ENV state: CALM
Wind: 7mph Gusts, 6mph 3min avg
Primary useism: 0.03 μm/s
Secondary useism: 0.50 μm/s
QUICK SUMMARY:
ETMY modes 1 and 6 were both ringing up under nominal damping for the first 10 minutes, so I've turned off the gain for now.
I found new settings that are working: I added FM5 (-30 degrees) and switched the gain to negative. The working settings are FM1+FM5+FM10 with G = -0.2.
H1 called for assistance at 07:13 UTC when the ISIs for every QUAD, the BS, and HAM8 tripped due to a M7.7 earthquake and M6.4 aftershock out of Myanmar. The lockloss happened earlier, at 06:33 UTC, from the S-waves of this quake, I'm assuming. At the worst of the shaking, peakmon was the highest I can recall seeing it, at over 128,000 counts.
H1 will be down for a while. I'll check in after a bit, untrip platforms if the shaking has calmed enough, and start locking.
I've untripped all platforms and set up H1_MANAGER for the night. SEI_ENV is still in LARGE_EQ, but H1_MANAGER should wait until the seismic configuration is fully out of any earthquake mode before trying to lock.
H1 called again at 12:41 UTC as it wasn't able to finish an initial alignment. ALS-Y was having trouble staying locked, and I eventually ended up adjusting the BS in yaw to improve transmission after noticing the ALS beam was clipped on the camera (it's still a bit clipped on the right, but less so now). After that, initial alignment was able to finish without issue, and relocking since then has been going automatically as well. H1 is currently relocking in MOVE_SPOTS.
Follow-up to LHO alog 83200 and LHO alog 80863.
Brian and I thought of a better way of getting the M1 OSEMs in PR3 calibrated relative to the HAM2 GS13s: Use the ISI drives in L, T, V for frequencies above the suspension resonances to get an absolute calibration.
We can use the measurements Jeff took in LHO alog 80863 and project them into the OSEM basis, then use the translational (Longitudinal, Transverse, Vertical) drives to calibrate the OSEMs. Since the GS13s are calibrated to [m], all of the transfer functions (shown in the first attachment) are in [OSEM meters]/[GS13 meters]. At high frequencies (sufficiently above the resonances) the OSEMs should just be measuring (-1)*ISI motion, because the suspension is isolated. The calibration error will depend on how well we are able to drive L, T, V from the ISI without cross-coupling to any other DOF.
Figure 1 (attached) shows the 6x6 transfer matrix between the ISI drives and the OSEMs (inputs are columns, outputs are rows). We highlight in red the 6 translational transfer functions that can be used to calibrate the OSEMs. The drives seem decoupled enough to do the calibration (for example, we expect to see a length drive only in LF and RT, with no motion anywhere else).
Figures 2, 3, and 4 show details of the 6 transfer functions we will use for calibration; all of them should be scaled to 1 [OSEM m]/[GS13 m]. The scalings are:
Adding here the phases for the TFs used for the calibration. There's a 20 to 25 degree phase loss after the last resonance up to 20 Hz. I'm not sure what more to make of it; I thought the phase loss would be an artifact of the OSEM readouts, but maybe someone with more knowledge can chime in.
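As a rough illustration of the calibration step described above (my own sketch, not the actual script used for these measurements): the high-frequency plateau of each measured [OSEM m]/[GS13 m] transfer function gives the gain correction for that OSEM. The function name, frequency band, and use of a simple median over the plateau are assumptions on my part.

import numpy as np

def osem_calibration_factor(freq, tf, f_lo=5.0, f_hi=20.0):
    """Estimate an OSEM gain correction from the high-frequency plateau.

    Above the suspension resonances the OSEM should read (-1) * the ISI
    motion, so the measured |TF| in [OSEM m]/[GS13 m] should be 1 if the
    OSEM is calibrated correctly; 1/|TF| over that band is then the gain
    correction to apply.
    """
    band = (freq >= f_lo) & (freq <= f_hi)
    plateau = np.median(np.abs(tf[band]))  # magnitude of the plateau
    return 1.0 / plateau

# Example with made-up numbers: a plateau of 0.65 [OSEM m]/[GS13 m] would
# imply scaling the OSEM gain up by ~1.54 to read true meters.
freq = np.linspace(1, 30, 300)
tf = 0.65 * np.exp(-1j * np.deg2rad(20)) * np.ones_like(freq)
print(osem_calibration_factor(freq, tf))  # ~1.54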
One way to fix the cross-coupling without having to get the absolute calibration is to slightly modify the M1 PR3 OSEM gains. They currently sit at:
T1 T2 T3 LF RT SD
1.161 0.998 1.047 1.171 1.163 1.062
Their new values would be:
T1 T2 T3 LF RT SD
1.164 0.903 0.866 1.073 1.199 0.942
This change will still leave a scaling factor of 1.54 m/(OSEM m), but at least it should fix the L-Y and V-P-R cross-couplings.
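For bookkeeping, a minimal sketch (my own illustration, not the script used to derive the numbers above) of what this relative rescaling does: each OSEM gain changes by its own factor so the sensors agree with each other, while the common 1.54 m/(OSEM m) scale noted above is untouched and would still need the absolute calibration.

import numpy as np

# Current and proposed M1 PR3 OSEM gains from this entry.
osems     = ["T1", "T2", "T3", "LF", "RT", "SD"]
old_gains = np.array([1.161, 0.998, 1.047, 1.171, 1.163, 1.062])
new_gains = np.array([1.164, 0.903, 0.866, 1.073, 1.199, 0.942])

# Per-OSEM change implied by the proposal; a purely common scale factor
# would show up as six identical ratios, so the spread between these is
# what addresses the L-Y and V-P-R cross-couplings.
for name, ratio in zip(osems, new_gains / old_gains):
    print(f"{name}: x{ratio:.3f}")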
TITLE: 03/28 Eve Shift: 2330-0500 UTC (1630-2200 PST), all times posted in UTC
STATE of H1: Observing
INCOMING OPERATOR: Ryan S
SHIFT SUMMARY: Just got to NLN and into Observing. When trying to relock earlier, we did have a lockloss from DHARD_WFS for some reason, but no issues after that. The wind is currently low, but there have been a couple of instances of it quickly spiking above 20-25 mph before dying back down, the second of which came with very heavy rain. Hopefully that doesn't happen again during the night. The secondary microseism is still very high, but doesn't look to be getting worse.
LOG:
23:30 Relocking and in OMC_WHITENING damping violins
00:33 NOMINAL_LOW_NOISE
00:34 Observing
03:03 Lockloss
- Lost lock at DHARD WFS
- Ran an initial alignment
05:06 NOMINAL_LOW_NOISE
05:07 Observing
Start Time | System | Name | Location | Laser_Haz | Task | Time End |
---|---|---|---|---|---|---|
20:59 | ISC | Keita, Mayank | Opt Lab | Yes | ISS PD array (Keita out 21:55) | 01:27 |
21:23 | TCS | Camilla, Matt | Opt Lab | Yes | CO2 laser testing | 21:56 |
22:10 | ISC | Jennie | OptLab | Yes | Looking for Keita | 22:18 |
Lockloss @ 03/28 03:03 UTC after 2.5 hours locked. Of course, right after I put in my mid-shift report saying that we were locked.
Observing at 155 Mpc and have been locked for 2.5 hours. Nothing to report.
TITLE: 03/27 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Microseism
INCOMING OPERATOR: Oli
SHIFT SUMMARY: Calibration and commissioning started the day. After a lockloss at 12:00 PT, we have been struggling to relock. We just recently made it to the last state before low noise, but the violins are very high from a few locklosses at TRANSITION_FROM_ETMX and MOVE_SPOTS. Oli is damping those now, and hopefully we'll be back to observing soon. It's not entirely clear why we have had issues relocking; the primary and secondary useism are definitely elevated, but at levels we have locked at before, and wind is under 20 mph as well.
LOG:
Start Time | System | Name | Location | Laser_Haz | Task | Time End |
---|---|---|---|---|---|---|
15:38 | FAC | Tyler | EX | n | Adjust fan speed manually | 16:08 |
16:29 | ISC | Sheila, Jeff | LVEA | n | OMC DCPD injections, LVEA wifi on during this time | 17:34 |
16:44 | PSL | Jason | Opt Lab | n | Unpacking laser in lab and receiving | 16:55 |
18:28 | FAC | Kim, Nelly | MY | n | Tech clean | 18:48 |
18:58 | - | Tony | LVEA | n | Checking for water leaks | 19:33 |
19:11 | CDS | Marc | CER | n | Check on power supplies | 19:33 |
19:11 | CDS | Fil | LVEA | n | Check on CPS glitching | 19:33 |
20:17 | PSL | Jason | MX | n | Drop off | 20:32 |
20:59 | ISC | Keita, Mayank | Opt Lab | Yes | ISS PD array | 22:18 |
21:23 | TCS | Camilla, Matt | Opt Lab | Yes | CO2 laser testing | 21:56 |
22:10 | ISC | Jennie | OptLab | Yes | ISS help | 22:18 |
Back to Observing 03/28 00:34 UTC after accepting a couple of diffs for SQZ and IOPLSC0.
The Kepco power supply for SUS-C4 started chirping at the end of maintenance today. By 2pm the chirping was more regular. The draw on the supply is 7 A; failing fans typically last a few days once they get to this stage. We should replace this one before the weekend as a target of opportunity. Downtime will be 30 minutes start to finish.
M. Pirello, F. Clara
Due to the earthquake, we took a window of opportunity to replace the SUS-C4 Kepco power supply, which controls the HAM1 and HAM2 suspensions. These suspensions were placed into SAFE and the supply was replaced. We replaced only the -18V supply; the matching +18V supply was replaced in December 2022, so we left it in place.
M. Pirello, F. Clara
20:40 UTC Observing