Displaying reports 5701-5720 of 83264.Go to page Start 282 283 284 285 286 287 288 289 290 End
Reports until 12:47, Tuesday 03 September 2024
H1 General
oli.patane@LIGO.ORG - posted 12:47, Tuesday 03 September 2024 (79885)
Ops DAY Midshift Status

Currently Observing at 147Mpc and have been Locked for 50 minutes. We did lose lock during maintenance when the BRSs at the end stations were turned off, which left us unable to hold the lock, but with Jim's help we changed ISI_ETMX/Y_ST2_SC from SC_OFF to CONFIG_FIR, which helped us relock, and we were relocked by 19:05 UTC. The squeezer was also fixed, so we are squeezing now as well.

H1 CDS
jonathan.hanks@LIGO.ORG - posted 12:40, Tuesday 03 September 2024 - last comment - 14:10, Tuesday 03 September 2024(79881)
WP 12069 Backup CDS GC router install
As per WP 12069 I have installed a new system in the daq-2 rack slot 39, above the existing router.

I have hooked it up to the admin VLAN and have IPMI enabled on it.  This is all of the network connectivity that will be enabled today.  I have installed a Solarflare 10G card into the system for its eventual connection to the core switch.  I had to replace one power supply which had failed; I pulled the replacement power supply from a spare unit on the shelves.

Install notes:
 * I am installing the same version of VyOS on this router as is used on the current router.
   * the test stand log from the last time I did this (https://alog.ligo-la.caltech.edu/TST/index.php?callRep=15381).
 * I created a bootable thumb drive from the install iso and booted into a live image mode.
 * After booting to the thumb drive and logging in, I issued the 'install image' command.
   * select the local disk (sda)
   * automatically partition to a 40GB size
 * Default settings otherwise
 * After install, issue the reboot command.

To transfer the config over, I formatted a USB thumb drive with an ext4 filesystem and mounted it at /mnt.  I then entered config mode, issued a 'save config_3_sep_2024', and exited out.  I copied the /config/config_3_sep_2024 file to /mnt and unmounted it.  After moving the drive to the new router, I became root, mounted /mnt, copied the config file to /home/vyos/config_3_sep_2024, and changed its ownership to the vyos user.  At this time I also updated the config to have the correct hardware MAC addresses for this box.  Then, as the vyos user, I entered config mode and issued a 'load /home/vyos/config_3_sep_2024', then 'commit', then 'save' to make the config persist.

I rebooted to make sure the config was properly saved.  It took me two tries as the first time I only committed the change and did not save it.

I have installed an optic on the router and powered it off.  I will provide documentation for the operator on how to switch over to this router if there is a failure.  The basic procedure is:

 * power off the old router (rack 5, slot 37) using the power button on the front
 * go to the back of the rack
   * move the pink cable from the older router to the new one (the port is labeled GB1 on both systems).
   * move the fiber from the older router to the new one (there is presently only 1 optic in each so there should be no confusion).
 * go back to the front of the rack and power on the new router (rack 5, slot 39) using the power button on the front.
Comments related to this report
jonathan.hanks@LIGO.ORG - 14:10, Tuesday 03 September 2024 (79889)
LIGO-T2300212 has been updated to reflect these changes.
H1 ISC
sheila.dwyer@LIGO.ORG - posted 12:06, Tuesday 03 September 2024 (79883)
POPAIR B centering

While Oli was relocking I went to ISCT1 and checked the centering on POPAIR B (motivated by the observation that POP18 is low compared to before the vent, as well as the DC light on this diode, 79663).  The beam wasn't well centered on the diode, and I moved it to be more centered while the IFO was locked at 22W on PRM (this didn't make much difference in the powers at 22W).  This did improve the powers on POP18 after power-up by about 15-30%, but it doesn't recover us to the power levels we had in the earlier part of O4b.  It seems that the degradation started around May 15.  We might want to go to the table to check for clipping upstream (today I only touched the mirror in front of POP AIR B), or consider whether we need to touch that pico motor.

 

Images attached to this report
H1 PSL
ryan.short@LIGO.ORG - posted 11:52, Tuesday 03 September 2024 (79882)
PSL Cooling Water pH Test

FAMIS 21311

pH of PSL chiller water was measured to be just above 10.0 according to the color of the test strip.

H1 SEI (OpsInfo)
jim.warner@LIGO.ORG - posted 10:16, Tuesday 03 September 2024 (79880)
LIGHT_MAINTENANCE mode and staying locked on Maintenance Tuesdays

Now that we are trying to stay locked on some maintenance days, I've added a "LIGHT_MAINTENANCE" state to the SEI_ENV guardian. This state turns off the end station stage 1 sensor correction and all of the CPS_DIFF controls. It doesn't include all of the normal environmental tests, but it will do the LARGE_EQ transition that we added a while ago if the peakmon channel goes above 10 micron/s. It won't go to the normal earthquake state.
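
The transition logic described above can be sketched roughly as follows. This is an illustration, not the actual SEI_ENV guardian code; the function and constant names are hypothetical, and only the 10 micron/s LARGE_EQ threshold and the skip-the-normal-EQ-state behavior come from this entry.

```python
# Rough sketch of the LIGHT_MAINTENANCE decision described above.
# NOT the real SEI_ENV guardian code; names here are placeholders.
PEAKMON_LARGE_EQ_THRESHOLD = 10.0  # ground motion in micron/s, per this entry

def light_maintenance_transition(peakmon_um_per_s):
    """Return the state to jump to, or None to stay in LIGHT_MAINTENANCE."""
    if peakmon_um_per_s > PEAKMON_LARGE_EQ_THRESHOLD:
        # Very large ground motion still triggers the LARGE_EQ transition
        return "LARGE_EQ"
    # Normal environmental tests (including the usual EQ state) are skipped
    return None
```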

I don't think this will work when the microseism is high, but Oli has been able to do an alignment and work on getting the IFO locked while people were cleaning the end stations and working in the high bay.

Recovery to normal operations is the same as for the normal maintenance state: select AUTOMATIC on SEI_ENV, then INIT.

H1 PSL
ryan.short@LIGO.ORG - posted 09:53, Tuesday 03 September 2024 (79879)
PSL 10-Day Trends

FAMIS 21271

After tuning up the FSS path in the enclosure last week (alog79736), the signal on the RefCav trans TPD has held steady and the PMC looks like it came back at around the same levels for reflected and transmitted power. The incursion is easily seen on several environmental trends.

No other major events of note.

Images attached to this report
H1 SQZ
sheila.dwyer@LIGO.ORG - posted 09:28, Tuesday 03 September 2024 (79877)
SQZ not seeing RF6 MHz this morning

Sheila, Naoki, Daniel

Overnight, the OPO was scanning for about 5 hours, during which time the 6MHz demod was seeing flashes from the CLF reflected off the OPO.  This morning, we still see DC light on the diode, but no RF power on the demod channel.  There aren't any errors on the demod medm screen.

We did a manual check that we have nonlinear gain using the seed (we can't use the guardian because of the RF6 problem), and it seems that we do have NLG, so the OPO temperature is correct.

Daniel found that the CLF frequency was far off from normal (5MHz), because the boosts were on in the CLF common mode board.  Turning these off solved the issue.  We've added a check to the OPO guardian in PREP_LOCK_CLF: if this frequency is more than 50kHz off, the state will not return true and will give a notification to check the common mode board.
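
The added guardian check can be sketched like this. This is a hedged illustration only: the 50 kHz tolerance is from this entry, but the nominal frequency value and all names below are placeholders, not the real PREP_LOCK_CLF code or channel values.

```python
# Hypothetical sketch of the frequency-offset check added to the OPO
# guardian's PREP_LOCK_CLF state. Only the 50 kHz tolerance comes from
# the log entry; the nominal frequency below is a placeholder.
CLF_NOMINAL_FREQ_HZ = 3.125e6   # placeholder nominal CLF frequency
MAX_CLF_OFFSET_HZ = 50e3        # 50 kHz tolerance, per this entry

def clf_frequency_ok(measured_freq_hz):
    """True if the CLF frequency is within tolerance of nominal.

    If this returns False, the guardian state would not return true and
    would notify the operator to check the common mode board (e.g. for
    boosts left on, as happened here).
    """
    return abs(measured_freq_hz - CLF_NOMINAL_FREQ_HZ) <= MAX_CLF_OFFSET_HZ
```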

 

Images attached to this report
H1 CDS
david.barker@LIGO.ORG - posted 09:13, Tuesday 03 September 2024 - last comment - 09:21, Tuesday 03 September 2024(79875)
DTS Environment EPICS went flatline Fri 17:43

Starting 17:43 Fri 30aug2024, the DTS environment monitoring channels went flatline (no invalid error, just unchanging values).

We caught this early this morning when Jonathan rebooted x1dtslogin and the DTS channels did not go white-invalid. When x1dtslogin came back, we restarted the DTS cdsioc0 systemd services (dts-tunnel, dts-env) and the channels are active again.

Images attached to this report
Comments related to this report
david.barker@LIGO.ORG - 09:21, Tuesday 03 September 2024 (79878)

Opened FRS31994

LHO VE
david.barker@LIGO.ORG - posted 08:28, Tuesday 03 September 2024 (79874)
Tue CP1 Fill

Tue Sep 03 08:11:49 2024 INFO: Fill completed in 11min 45secs

Jordan confirmed a good fill curbside. The low TC temperatures outside of the fill over the weekend were tracked to an ice build-up at the end of the discharge line, which has now been cleared. A 1-week trend of TC-A is also attached.

Images attached to this report
H1 CDS
erik.vonreis@LIGO.ORG - posted 07:56, Tuesday 03 September 2024 (79873)
Workstations and displays updated

Workstations and displays were updated and rebooted.  This was an os packages update.  Conda packages were not updated.

H1 General
oli.patane@LIGO.ORG - posted 07:38, Tuesday 03 September 2024 (79872)
Ops Day Shift Start

TITLE: 09/03 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Lock Acquisition
OUTGOING OPERATOR: Ryan C
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 16mph Gusts, 12mph 5min avg
    Primary useism: 0.01 μm/s
    Secondary useism: 0.12 μm/s
QUICK SUMMARY:

Currently in NOMINAL_LOW_NOISE but not Observing due to some SDF diffs from the PEM injections. We are planning on trying to stay locked during today's maintenance.

H1 General
ryan.crouch@LIGO.ORG - posted 06:21, Tuesday 03 September 2024 (79871)
OPS OWL assistance

H1 called for assistance following some trouble relocking; the previous lock only lasted ~8 minutes.

08:50 UTC lockloss

09:46 UTC lockloss

10:40 UTC started an IA which took about 20 minutes

11:30 UTC lost it at LOW_NOISE_LENGTH_CONTROL

I had a lot of trouble getting DRMI to lock, flashes were fairly decent (>100)

12:41 UTC back to NLN. The ISS refused to stay locked; after many tries I finally put us into observing without squeezing at 13:21 UTC.

LHO General
ryan.short@LIGO.ORG - posted 22:28, Monday 02 September 2024 (79870)
Ops Eve Shift Summary

TITLE: 09/03 Eve Shift: 2300-0500 UTC (1600-2200 PST), all times posted in UTC
STATE of H1: Lock Acquisition
INCOMING OPERATOR: Ryan C
SHIFT SUMMARY: Only one lockloss this shift. Recovery has been taking a while as it's been windy this evening, but H1 is now finally mostly relocked, currently waiting in OMC_WHITENING to damp violins. Otherwise a pretty quiet shift.

LOG:

No log for this shift.

H1 General (Lockloss)
ryan.short@LIGO.ORG - posted 20:47, Monday 02 September 2024 (79869)
Lockloss @ 03:31 UTC

Lockloss @ 03:31 UTC - link to lockloss tool

No obvious cause. Looks like there was some shaking of the ETMs about half a second before the lockloss. Wind speeds have come up to 30mph in the past 45 minutes, so it's possible that could have something to do with the lockloss.

Images attached to this report
H1 General (Lockloss)
anthony.sanchez@LIGO.ORG - posted 16:35, Monday 02 September 2024 (79868)
Unknown Lockloss and Labor Day Shift End

TITLE: 09/02 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Observing at 143Mpc
INCOMING OPERATOR: Ryan S
SHIFT SUMMARY:
Since the last update, the only thing that happened was a lockloss @ 21:58 UTC from an unknown cause.
Wind wasn't elevated, nor was the primary microseism.
No PI ring-up.

Relocked and back to Observing at 22:51 UTC

LOG:
No log

LHO General
ryan.short@LIGO.ORG - posted 16:02, Monday 02 September 2024 (79867)
Ops Eve Shift Start

TITLE: 09/02 Eve Shift: 2300-0500 UTC (1600-2200 PST), all times posted in UTC
STATE of H1: Observing at 145Mpc
OUTGOING OPERATOR: Tony
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 19mph Gusts, 12mph 5min avg
    Primary useism: 0.04 μm/s
    Secondary useism: 0.12 μm/s
QUICK SUMMARY: H1 just began observing as I walked in; sounds like several locklosses have been due to PI ringups, and a couple from EQs.

H1 General (Lockloss)
anthony.sanchez@LIGO.ORG - posted 11:53, Monday 02 September 2024 - last comment - 13:08, Tuesday 03 September 2024(79866)
Small Sudden & Local Earthquake off the coast of Oregon

Lockloss page
No warning or notice of an earthquake was given; it was a sudden small spike in ground motion.
It was observed in Picket Fence.

USGS didn't post this right away, but it was an M 4.2, 210 km W of Bandon, Oregon, right off the coast.

I took ISC_LOCK to Initial Alignment after a lockloss at CHECK_MICH_FRINGES.
Relocking now.

Images attached to this report
Comments related to this report
neil.doerksen@LIGO.ORG - 13:08, Tuesday 03 September 2024 (79886)

Interesting.

H1 General (Lockloss)
anthony.sanchez@LIGO.ORG - posted 10:08, Monday 02 September 2024 (79865)
Labor day Locklosses - The Dreaded Double PI: The Series!

Naoki called the control room pretty early and suspected that the locklosses from last night were from some PI ring-ups, as seen in Ryan's alog 79860.
So I told him I'd check it out and document it.

Last night, during the spooky & completely automated OWL shift when no one was around to see it, there were 5 episodes of lock acquisition and lockloss that strangely all happened around the time the lock clock approached the 2-hour mark.
Turns out, Naoki's gut instinct was right: they were PI ring-ups!

Not only was he correct that the locklosses were caused by PI ring-ups, but they were the Dreaded Double PI ring-up! Some say that the Double PI ring-up is just a myth, an old operator's legend: a supposedly rare event in which 2 different parametric instability modes ring up at the same time!
But here is a list of the Dreaded Double PI ring-up sightings from just last night!

2024-09-02_05:34:01Z ISC_LOCK NOMINAL_LOW_NOISE -> LOCKLOSS  Cause: The Dreaded Double PI 28 And 29!!!  The SUS-PI guardian did not change the phase of compute mode 28 at all. Lockloss page

2024-09-02_08:06:14Z ISC_LOCK NOMINAL_LOW_NOISE -> LOCKLOSS Cause: Another Dreaded Double PI 28 & 29! Compute mode 28's phase was not moved this time either. Lockloss page


2024-09-02_10:37:06Z ISC_LOCK NOMINAL_LOW_NOISE -> LOCKLOSS Cause: Dreaded Double PI ring-up, but this time the phase for 28 changed; by then it was too late for everyone involved! (No one was involved, since this was completely automated.)
Lockloss page


2024-09-02_12:38:45Z ISC_LOCK NOMINAL_LOW_NOISE -> LOCKLOSS Ok, I'll admit that this one is not a Dreaded Double PI ring-up... but it certainly is a PI 24 ring-up!
Lockloss page  After getting some second eyes on this, I am now convinced that this was a wind gust lockloss.

Maybe the SUS-PI Guardian is just having trouble damping PI28 and is instead trying to damp PI29, but that must be ringing up PI mode 29.

If it happens again, Naoki has asked me to simply take the SUS-PI Guardian to IDLE and set the damping gains for 28 & 29 to 0.
Wish me luck.

Images attached to this report