Reports until 10:51, Friday 30 June 2023
LHO VE
david.barker@LIGO.ORG - posted 10:51, Friday 30 June 2023 (70972)
Fri CP1 Fill, not a good fill

Fri Jun 30 10:08:45 2023 INFO: Fill completed in 8min 45secs

Jordan confirmed this was not a good fill from curbside. We will try again later today.

Images attached to this report
H1 PSL (PSL)
corey.gray@LIGO.ORG - posted 09:15, Friday 30 June 2023 (70971)
PSL Status Report (#25485)

PSL Status FAMIS Report for this week:
Laser Status:
    NPRO output power is 1.829W (nominal ~2W)
    AMP1 output power is 67.26W (nominal ~70W)
    AMP2 output power is 135.5W (nominal 135-140W)
    NPRO watchdog is GREEN
    AMP1 watchdog is GREEN
    AMP2 watchdog is GREEN

PMC:
    It has been locked 8 days, 22 hr 37 minutes
    Reflected power = 16.31W
    Transmitted power = 109.1W
    PowerSum = 125.4W
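A quick consistency check on the PMC readbacks above, assuming (as the channel names suggest) that PowerSum is simply reflected plus transmitted power:

```python
# Sketch of a PMC power bookkeeping check; the assumption that
# PowerSum = Reflected + Transmitted is inferred from the names above.
reflected = 16.31    # W
transmitted = 109.1  # W
print(round(reflected + transmitted, 1))  # → 125.4, matching PowerSum
```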

FSS:
    It has been locked for 0 days 17 hr and 8 min
    TPD[V] = 0.8459V

ISS:
    The diffracted power is around 2.5%
    Last saturation event was 0 days 17 hours and 8 minutes ago


Possible Issues: None

H1 SUS (SEI, SYS)
jeffrey.kissel@LIGO.ORG - posted 08:32, Friday 30 June 2023 (70968)
UWash Picks Up HAM Optical Lever Package
J. Kissel, M. Ross, S. Fleisher, S. Apple
IIET 24286

With the UWash group here for the first time in a long while for this year's GWANW 2023, we remembered that they're interested in the spare optical lever parts prepared last year (LHO:64030).

At long last, they've now taken the gear home with them on loan in order to begin characterizing the noise of the system and prototyping an improved system (G2201199) based on the loose set of requirements outlined in G2201224. At this point, this low-cost loan is to support LSC involvement in improving a sensor for which the lab doesn't have the person power.

Images attached to this report
LHO General
corey.gray@LIGO.ORG - posted 08:32, Friday 30 June 2023 (70969)
Fri DAY Ops Transition

TITLE: 06/30 Day Shift: 15:00-23:00 UTC (08:00-16:00 PST), all times posted in UTC
STATE of H1: Observing at 142Mpc
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 5mph Gusts, 3mph 5min avg
    Primary useism: 0.01 μm/s
    Secondary useism: 0.05 μm/s
QUICK SUMMARY:

Got the run-down from Austin and am getting situated for the day---luckily with an H1 that just made it through an EQ.  Heard there is a possibility of H1 going down at some point during the shift to address the recent Green X-arm issues.

Calm winds (w/ a forecast near 100degF!) and the Indonesian EQ looks to be mostly past us.

LHO General
austin.jennings@LIGO.ORG - posted 08:01, Friday 30 June 2023 (70966)
Friday Owl Shift Summary

TITLE: 06/30 Owl Shift: 07:00-15:00 UTC (00:00-08:00 PST), all times posted in UTC
STATE of H1: Observing at 144Mpc
SHIFT SUMMARY:

- Arrived with H1 locked and observing for ~5 hours (though there are some pretty high peaks in the kHz frequency range)

- EX saturations @ 7:01/9:33/10:38/10:50

- S230630am @ 12:58

- 13:14 - incoming 5.8 EQ from Indonesia

Overall, a quiet night; H1 looks to be stable. Passing to Corey with the IFO in observing for 13 hours.
LOG:

No log for this shift.

LHO General
austin.jennings@LIGO.ORG - posted 04:01, Friday 30 June 2023 (70967)
Mid Shift Owl Report

H1 has been observing for just over 9 hours. Getting a couple back to back EX saturations, but otherwise all looks to be stable. Ground motion and wind speeds are low.

H1 General
anthony.sanchez@LIGO.ORG - posted 00:16, Friday 30 June 2023 (70965)
Thursday Ops Eve Shift End

TITLE: 06/30 Eve Shift: 23:00-07:00 UTC (16:00-00:00 PST), all times posted in UTC
STATE of H1: Observing at 20Mpc
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 18mph Gusts, 15mph 5min avg
    Primary useism: 0.05 μm/s
    Secondary useism: 0.06 μm/s
QUICK SUMMARY:

23:00 UTC Inherited an IFO that was in the middle of the locking process.
H1 sat in CHECK_VIOLINS for a while before moving on up to OMC_WHITENING.
The Violins were rung up so much that it took an hour and twenty minutes to get into NOMINAL_LOW_NOISE from OMC_WHITENING.

4:53 UTC H0:VAC-EX_INSTAIR_PT599_PRESS_PSIG alarm sounded.
Spoke to Jordan; he said it's fine because that device is offline.

Passing H1 off to Austin in NOMINAL_LOW_NOISE & OBSERVING with a Range of 146 Mpc.

LHO General
austin.jennings@LIGO.ORG - posted 00:00, Friday 30 June 2023 (70964)
Ops Owl Shift Start

TITLE: 06/30 Owl Shift: 07:00-15:00 UTC (00:00-08:00 PST), all times posted in UTC
STATE of H1: Observing at 142Mpc
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 17mph Gusts, 14mph 5min avg
    Primary useism: 0.05 μm/s
    Secondary useism: 0.06 μm/s
QUICK SUMMARY:

- H1 has been locked and observing for just over 5 hours

- CDS/SEI/DMs ok

H1 General
anthony.sanchez@LIGO.ORG - posted 18:55, Thursday 29 June 2023 (70963)
SDF DIFFS Accepted for Observing.

SDF Diffs were accepted to get into observing.
I imagine that both of these changes were made in response to this PLL issue.

Images attached to this report
LHO General
thomas.shaffer@LIGO.ORG - posted 17:02, Thursday 29 June 2023 (70959)
Ops Day Shift Summary

TITLE: 06/29 Day Shift: 15:00-23:00 UTC (08:00-16:00 PST), all times posted in UTC
STATE of H1: Locking

SHIFT SUMMARY: Shift started with the end of a 42-hour lock. Relocking, we've been having issues with the ALS X PLL beat note being too low. We started off with Jason adjusting the FSS, and then had highly rung-up violin modes. We then lost lock due to a PI that snuck up on us, since PI damping wasn't automatically engaged while we were in an odd state damping violins.

H1 SQZ
naoki.aritomi@LIGO.ORG - posted 16:54, Thursday 29 June 2023 (70958)
NLG measurement

Naoki, Vicky

We measured the NLG with the seed beam. The measured NLG is 12.78, which corresponds to a generated squeezing of 15.8 dB. The NLG calculator gives 12.64, so the calibration from February 3 is still OK.

Seed without pump: 0.00078
Seed with pump: 0.00997
NLG: 12.78
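The numbers above can be checked with a short script, using the standard below-threshold OPO relation g = 1/(1-x)^2 with x = sqrt(P/P_threshold) (the function names here are illustrative):

```python
import math

def nlg(seed_with_pump, seed_without_pump):
    """Nonlinear gain: ratio of amplified to unamplified seed power."""
    return seed_with_pump / seed_without_pump

def generated_squeezing_db(g):
    """Generated squeezing level (dB) for nonlinear gain g, using the
    standard below-threshold OPO relation g = 1/(1-x)**2,
    where x = sqrt(P_pump / P_threshold)."""
    x = 1 - 1 / math.sqrt(g)
    return 20 * math.log10((1 + x) / (1 - x))

g = nlg(0.00997, 0.00078)
print(round(g, 2))                          # → 12.78
print(round(generated_squeezing_db(g), 1))  # → 15.8
```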

How to measure NLG

H1 General
anthony.sanchez@LIGO.ORG - posted 16:28, Thursday 29 June 2023 (70957)
Thursday Ops Eve Shift start

TITLE: 06/29 Eve Shift: 23:00-07:00 UTC (16:00-00:00 PST), all times posted in UTC
STATE of H1: Lock Acquisition
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 18mph Gusts, 15mph 5min avg
    Primary useism: 0.02 μm/s
    Secondary useism: 0.06 μm/s
QUICK SUMMARY:
Inherited an IFO that was relocking.
There has been a PLL threshold change, due to an ALS PLL issue, that will have to be accepted before we reach Observing.

We are currently trying to get H1 relocked for the night.

H1 PSL
jenne.driggers@LIGO.ORG - posted 16:23, Thursday 29 June 2023 (70956)
Lockloss due to ISS second loop?

We may need to revisit the offsets for the ISS second loop.  It seems like we lost lock when we were just about to get to NLN quite soon after the ISS second loop closed. 

The diffracted power went basically to zero, which is not good, and can cause a lockloss - I suspect that's what happened here.

Images attached to this report
H1 ISC (ISC, OpsInfo)
jeffrey.kissel@LIGO.ORG - posted 15:32, Thursday 29 June 2023 - last comment - 11:16, Friday 30 June 2023(70951)
H1 ALS X Fiber PLL Giving Us Problems Again
J. Driggers, S. Dwyer, J. Kissel, T. Shaffer

TJ has been having issues locking ALS all day, and has traced the problem down to the ALS X fiber PLL. He can lower the threshold on ALS-X_FIBR_A_DEMOD_RFMON and squeak by enough to move on with lock acquisition, but the problem has been consistent all day and should be addressed by other means.

Sheila suspects these are similar to issues we've had in the past with the ALS laser mode hopping (most recently in May 2023, LHO:69773 and LHO:69712, but in years past as well). However, today's issue is a marked step-function drop in DEMOD power, rather than a slow gradual decay from +10 dBm-ish to below -10 dBm.

In the past we've handled the slow gradual decay by lowering the beat note threshold from its nominal -10 dBm to an unphysically low -30 dBm and leaving it there for a few days to get by; the beat note would then slowly come back up, we'd re-raise the threshold to -10 dBm, and move on.

Jenne has, for now, dropped the threshold (H1:ALS-X_FIBR_LOCK_BEAT_RFMIN) to -20 dBm.
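The workaround amounts to relaxing a simple threshold comparison in the lock-acquisition check; a minimal sketch, with the function and values illustrative rather than the actual guardian code:

```python
def beatnote_ok(rfmon_dbm, rfmin_dbm=-10.0):
    """Sketch of the beat note check: the measured RF monitor level must
    exceed the RFMIN threshold for lock acquisition to proceed.
    Illustrative only -- the real check lives in the ALS guardian."""
    return rfmon_dbm > rfmin_dbm

# With the nominal -10 dBm threshold a weakened beat note fails the check;
# dropping the threshold to -20 dBm lets it pass.
print(beatnote_ok(-15.0))                   # → False
print(beatnote_ok(-15.0, rfmin_dbm=-20.0))  # → True
```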

Earlier this morning, TJ and Jason tried increasing the power into the fiber coming from the PSL (see LHO:70944), but they were only able to gain a little more power, and it didn't help the issue; this further points to a problem with the ALS laser, rather than with what's coming in from the corner via the fiber.

We have trended this issue in ALS-X_FIBR_A_DEMOD_RFMON against XVEA temperature metrics and we *do not* see any correlation.

We tried flipping the laser's noise eater from ON to OFF then back ON again, and it had no effect.

We can again squeak by, quickly lowering the threshold and making it past this portion of the greater IFO lock acquisition sequence, but this is a bandaid for the real problem, which should be addressed ASAP.
Sheila suggests that going to the X end station and adjusting the laser current and temperature *might* work.

Meanwhile, the investigation continues.

Attached are the last 24 hours, the last 24 days, and the last 24 months of behavior from this beat note channel compared against the laser crystal frequency (slow laser control via laser temperature), the laser head PZT frequency (fast laser control via PZT), and the ALS PLL Common Mode Board control signal prior to splitting between fast and slow control.

Images attached to this report
Comments related to this report
jeffrey.kissel@LIGO.ORG - 15:38, Thursday 29 June 2023 (70953)
Opened FRS Ticket 28430 to track the issue.
jason.oberling@LIGO.ORG - 15:56, Thursday 29 June 2023 (70954)

One clarification, re: "...TJ and Jason tried increasing the power into the fiber coming from the PSL (see LHO:70944)..." The ALS lasers no longer get their PSL light for the PLL from the RefCav transmission, so improving the RefCav TPD will have no effect on the amount of power in the ALS fibers. The ALS fibers are now fed by a pickoff in the ALS path on the IOO side of the PSL table, after the PMC. From the as-built PSL layout, the fiber pickoff is ALS-PBS01, which directs the picked off beam into ALS-FC2; the IOO ALS path itself is from the transmission of IO_MB_M2, which is directly in front of the ISC EOM (IO_MB_EOM). This was installed in Oct 2019 in the break between O3a and O3b.

sheila.dwyer@LIGO.ORG - 11:16, Friday 30 June 2023 (70973)

The beatnote strength stayed low overnight rather than coming back up.  It seems like going to the end station to adjust the laser current and temperature before the long weekend is a good idea. 

A note for the alog above:  we looked at the fiber transmission and the fiber rejected polarization PDs, they do not show any problems.  When the beatnote strength drops, there is also a jump in the crystal frequency needed to lock the PLL, which makes this look like mode hopping of the ALS laser. 

Images attached to this comment
H1 ISC (Lockloss, OpsInfo)
victoriaa.xu@LIGO.ORG - posted 13:37, Thursday 29 June 2023 - last comment - 18:09, Thursday 29 June 2023(70949)
PI damping will now engage after 1 minute of OMC_WHITENING

The PI damping guardian has been edited to continue on to PI_DAMPING after the first minute of OMC_WHITENING. Edits committed to SVN; diff and commit screenshot attached.

Due to the DCPD glitches from OMC_WHITENING, I had previously not engaged PI_DAMPING to turn on damping gains, coil drivers, etc. during this lock stage. An oversight on my part: I didn't think about the case where we stay in that stage to damp violins for >1 hour. Violin damping looked really successful too, until PI24 ran away (nuc31 DCPDs screenshot). I guess this shows that we indeed need active PI damping to avoid locklosses due to the 10.4 kHz PIs overlapping with the 2x HOMs.

Images attached to this report
Comments related to this report
victoriaa.xu@LIGO.ORG - 17:16, Thursday 29 June 2023 (70952)OpsInfo

NUC25 has been updated with a new PI-damping ndscope, which shows the PI damping's drive to the ESDs. Screenshot is annotated.

In the scope, the first 2 plots show the ETMY (10.4 kHz, PI 24+31) and ETMX (80.3 kHz, PI 28+29) PI mode monitors. The 3rd plot shows the ESD drives: for ETMY it should be ~1000, for ETMX ~50,000. If the ESD drive is 0, there is no PI damping.

To damp manually, can do the following for e.g. PI 24 damping:

  • caput H1:SUS-ETMY_PI_ESD_DRIVER_PI_DAMP_SWITCH 1       # turn on coil drive for PI damping. If 80 kHz PI 28 or 29 rings up, change "ETMY" to "ETMX".
  • caput H1:SUS-PI_PROC_COMPUTE_MODE24_DAMP_GAIN 1000     # turn on damping gain for PI24 aka "MODE24". Change "MODE24" to whatever is ringing up.
  • caput H1:SUS-PI_PROC_COMPUTE_MODE24_PLL_FREQ_FILT2_RSET 2    # reset the PLL integrator. Change "MODE24" to whatever is ringing up.

If PI Guardian is in "PI_DAMPING", but you want to manually step the phase, take it to "IDLE".

This is what the PI_DAMPING guardian does (i.e., turns on the ESD switch, turns on the damping gain, resets the PLL integrator, then steps the phase around until the rms decreases).
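The three manual caput steps above can be wrapped in a small helper; a hedged sketch that only builds the (channel, value) pairs (the helper itself is illustrative, not site code):

```python
def pi_damping_caputs(mode, etm="ETMY", gain=1000):
    """Build the (channel, value) caput pairs for manually starting PI
    damping on a given mode, per the recipe above.  Defaults match the
    PI24 example; for the 80 kHz modes (28/29) use etm="ETMX".
    Illustrative helper only."""
    return [
        (f"H1:SUS-{etm}_PI_ESD_DRIVER_PI_DAMP_SWITCH", 1),              # coil drive on
        (f"H1:SUS-PI_PROC_COMPUTE_MODE{mode}_DAMP_GAIN", gain),         # damping gain on
        (f"H1:SUS-PI_PROC_COMPUTE_MODE{mode}_PLL_FREQ_FILT2_RSET", 2),  # reset PLL integrator
    ]

for chan, val in pi_damping_caputs(24):
    print(f"caput {chan} {val}")
```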

Images attached to this comment
victoriaa.xu@LIGO.ORG - 18:09, Thursday 29 June 2023 (70962)OpsInfo

Looks like this guardian edit is working, and we can now do PI damping in OMC_WHITENING; screenshot attached.

Images attached to this comment
H1 ISC (OpsInfo)
anthony.sanchez@LIGO.ORG - posted 20:10, Wednesday 28 June 2023 - last comment - 17:51, Thursday 29 June 2023(70933)
CAMERA_SERVO Guardian incident

00:19 UTC H1 dropped out of OBSERVING and into COMMISSIONING due to the CAMERA_SERVO Guardian detecting that ASC-CAM_PIT2_INMON & ASC-CAM_YAW2_INMON were "stuck" for 5 seconds. The Guardian node took itself to TURN_CAMERA_SERVO_OFF, stayed in that state for less than a minute, then took itself back to CAMERA_SERVO_ON. This was also seen in the Guardian log. Please see the attached ndscope screenshot.

00:34 UTC We went back to observing.


Looking back at yesterday's alog and some ndscopes, this appears to be the same issue we saw today; it was a mere coincidence that Elenna happened to hit the load button at the same time as the guardian node transitioned down to get the camera unstuck.

Images attached to this report
Comments related to this report
naoki.aritomi@LIGO.ORG - 17:51, Thursday 29 June 2023 (70961)

For such short camera freezes, we implemented the WAIT_FOR_CAMERA state in the camera guardian (alog68756). When a camera freeze happens, the camera guardian turns off the camera servo and waits 30 s for the camera without moving to ADS. If the cameras are OK after 30 s, the guardian turns the camera servo on again. So the camera guardian itself is working as expected.

In the camera guardian, is_chan_static in static_tester.py checks whether the cameras are stuck. If is_chan_static could skip over such a short camera freeze, that would be one possible solution.
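For reference, a static-channel test of this kind can be sketched as follows (assumed shape only; the real static_tester.py implementation differs in detail):

```python
import time

def is_chan_static(read_value, duration=5.0, poll=0.1):
    """Return True if the channel value is unchanged for `duration` seconds.
    `read_value` is a callable returning the current channel value.
    Sketch of the check described above, not the actual static_tester.py."""
    first = read_value()
    deadline = time.monotonic() + duration
    while time.monotonic() < deadline:
        if read_value() != first:
            return False  # channel moved: not stuck
        time.sleep(poll)
    return True  # no change for the full duration: flag as stuck

# A freeze shorter than `duration` would not trip this check; lengthening
# `duration` (or requiring several consecutive static windows) is one way
# to skip over brief camera freezes.
```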

H1 SQZ
naoki.aritomi@LIGO.ORG - posted 15:19, Tuesday 27 June 2023 - last comment - 18:23, Thursday 29 June 2023(70890)
SQZ ASC ran away

Sheila, Naoki

As shown in the attached figure, the SQZ ASC ran away and was turned off since the SQZ ASC trigger was below the threshold. We removed squeezing, reset the AS42 offsets, and pushed the graceful clear history. After we brought back the squeezing, everything seems to be working now.

Images attached to this report
Comments related to this report
naoki.aritomi@LIGO.ORG - 16:12, Thursday 29 June 2023 (70955)

Even with the new input matrix, the SQZ ASC ran away today. I reset the AS42 offset and pushed the graceful clear history. The SQZ ASC came back, but I am not sure why this happened.

Images attached to this comment
victoriaa.xu@LIGO.ORG - 18:23, Thursday 29 June 2023 (70960)

Naoki, Vicky - We added checkers to both SQZ_MANAGER and SQZ_FC guardians, to give notifications if ASC is not on.

SQZ_MANAGER now checks that the ASC is on, notifying if SQZ-ASC_ANG_P/Y_INMON = 0; the same goes for the filter cavity ASC, which checks if SQZ-FC_ASC_ANG_P/Y_INMON = 0. Earlier today, it looks like the ASC_WFS switch was ON but the alignment ran away, so the signal fell below the trigger threshold and the AS42 ASC did not engage. Hopefully these notifications make it easier to catch when the ASC isn't on.
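The new check amounts to treating an exactly-zero INMON reading as "ASC not engaged"; a minimal sketch, assuming (per the log) that a zero reading on either input monitor triggers the notification:

```python
def sqz_asc_notification(inmon_p, inmon_y):
    """Return a guardian-style notification string if an AS42 ASC input
    monitor reads exactly zero (i.e. ASC likely not engaged), else None.
    Illustrative sketch of the SQZ_MANAGER check, not the committed code."""
    if inmon_p == 0 or inmon_y == 0:
        return "SQZ ASC appears off: AS42 INMON reads zero"
    return None

print(sqz_asc_notification(0.0, 0.0))   # notification fires
print(sqz_asc_notification(0.3, -0.1))  # → None
```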

These guardian edits for SQZ_MANAGER and SQZ_FC were the only SVN diffs; the changes are now committed to SVN version 25945.
