Displaying reports 1-20 of 87001.
Reports until 07:44, Thursday 19 March 2026
LHO General
ibrahim.abouelfettouh@LIGO.ORG - posted 07:44, Thursday 19 March 2026 (89563)
OPS Day Shift Start

TITLE: 03/19 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Planned Engineering
OUTGOING OPERATOR: None
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 21mph Gusts, 18mph 3min avg
    Primary useism: 0.03 μm/s
    Secondary useism: 0.21 μm/s 
QUICK SUMMARY:

IFO is in IDLE and ENGINEERING

Plan today is to continue locking. I will start with an initial alignment.

The HAM3 ISI tripped around 4:30 UTC (9:30 PT) and was followed by 3 HAM3 CPS glitches.

H1 General
ryan.crouch@LIGO.ORG - posted 20:02, Wednesday 18 March 2026 (89560)
OPS Wednesday EVE shift summary

TITLE: 03/19 Eve Shift: 2330-0500 UTC (1630-2200 PST), all times posted in UTC
STATE of H1: Planned Engineering
INCOMING OPERATOR: None
SHIFT SUMMARY: I think the +90 phase is incorrect on RF45, and so is -90. I was able to lock PRMI and DRMI easily this evening, but I lose lock as soon as we engage DRMI_3F; there are more sensors to phase.
LOG:

Start Time System Name Location Laser_Haz Task Time End
22:53 TCS Sophie Vac prep lab N CHETA work 23:16
23:15 VAC Gerardo Site buildings N Cryotrap checks around site 00:39

 

Images attached to this report
H1 SUS (EPO, ISC)
oli.patane@LIGO.ORG - posted 18:07, Wednesday 18 March 2026 - last comment - 09:17, Thursday 19 March 2026(89562)
BHSS / BHD work Mar 18: Input alignment finished and OMCA in cradle!

Elenna, Keita, Oli

Input alignment:

Today Elenna and I finished fine-tuning the alignment into the BHSS. Our beams now pass through both sets of irises on the alignment tooling plate, so they should be well aligned going into the OMCs.

OMCA in cradle:

We also opened up what is going to be OMCA (SN 103) and placed it in its cradle. Following the installation instructions, we placed the OMC 1mm away from the three installed horizontal stops, but then noticed that lining the short end up against the side horizontal stop left the OMC overhanging the cradle on that side by a few mm. We're not sure if these end stops are also supposed to sit further in than 1mm (the instructions only specify that the horizontal stops will be touching the OMC 1mm in), but with the OMC in its current position, the other end stop would have to screw in at least 3 or 4 mm to meet the OMC. We will add photos of this tomorrow.

Installing DCPDs onto OMCA:

Keita and Elenna (who was chained to the optics table via grounding strap) very carefully trimmed the DCPD wires for D1-20 and C1-12 and placed them in the DCPD mounts for DCPD A (which is what we think TRANS is) and DCPD B (which is what we think REFL is), respectively. Faceplate SN 10 was installed onto DCPD A, and faceplate SN 06 was installed onto DCPD B, to hold the PDs in place. It was very hard to get the length of the DCPD pins right and to get the faceplates on and secured. There seems to be a small gap between the faceplates and the DCPD housing (pic1, pic2), but LLO sees the same thing, so it seems okay.

Checking pins:

Afterward, we used a multimeter to check the case pins against D220027. According to D220027 the case pins should go to pins 6 and 9, but we found that they went to what we thought were pins 1 and 4 when looking at the prongs. Later we realized that the DB9 pin layout is read the other way, so those were actually pins 2 and 5. This still doesn't line up with D220027, but looking at the schematic for the cable D2300119, the wires for the cases of the two DCPDs are shifted so that the cases do in fact go to pin 2 and pin 5, so they're wired correctly! It's just that D220027 is drawn incorrectly, and apparently the cable pins are read with the prongs facing away from you for some reason.
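The pin confusion above comes from the left-right mirroring between the two faces of a D-sub connector. A small illustrative helper (not site code; just the row-flip arithmetic, assuming pins 1-5 in the top row and 6-9 in the bottom row) shows why pins read as 1 and 4 from one side are really 2 and 5:

```python
def mirror_db9(pin: int) -> int:
    """Map a DB9 pin number as read from one face of the connector to
    the number it carries when viewed from the other face. The top row
    (pins 1-5) and bottom row (pins 6-9) each flip left-to-right."""
    if not 1 <= pin <= 9:
        raise ValueError("DB9 pins are numbered 1-9")
    return 6 - pin if pin <= 5 else 15 - pin

# The pins we read as 1 and 4 on the prong side are really 2 and 5:
print([mirror_db9(p) for p in (1, 4)])  # -> [5, 2]
```

This matches the log's finding: the apparent pins 1 and 4 mirror to 5 and 2, i.e. the case wires really land on pins 2 and 5.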

OMCA is looking pretty good so far! (pic3, pic4)

Summary of installed serial numbers:

OMCA (SN 103)

Images attached to this report
Comments related to this report
oli.patane@LIGO.ORG - 09:17, Thursday 19 March 2026 (89564)

Here are photos of the OMC overhang when the OMC is 1mm away from the horizontal stop (set screw is in by 1mm), and photos showing the other end.

overhang side

overhang top

underhang side

underhang top

Images attached to this comment
H1 IOO (ISC)
jennifer.wright@LIGO.ORG - posted 16:51, Wednesday 18 March 2026 (89561)
JAC guardian got stuck in 'LOCKING' state

Jennie W, Ryan C,

 

The JAC guardian got stuck in the 'LOCKING' state after I lost lock of JAC and the IMC while trying to inject onto the IMC PZT earlier today.

I think this was because the input power was at 2W when we lost lock, and this is high enough power that the shutter on the JAC REFL PD won't open.

Currently this gives an error message saying 'Shutter closed. Check shutter trigger values'; I added 'and input power' to the end of it, then loaded and committed this change to the SVN.

It would be good if we could have something that takes us back to 2W if the JAC unlocks.

After talking to Ryan C he has a decorator for the IMC guardian that does something similar so he is looking into this.


NB: from 1-3.30pm PST I was trying to do jitter transfer function measurements by injecting with the PSL PZT and measuring on the IMC WFS. I think what I actually need to do is measure on the JAC WFS, so I will come back to this tomorrow.

LHO General
ibrahim.abouelfettouh@LIGO.ORG - posted 16:39, Wednesday 18 March 2026 (89559)
OPS Day Shift Summary

TITLE: 03/18 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Planned Engineering
INCOMING OPERATOR: Ryan C
SHIFT SUMMARY:

IFO is in a single bounce configuration

Overall calm day (though wind is on the rise).

The day started with an initial alignment that could help align the squeezer better. Steps taken were:

1. GREEN_ARMS_MANUAL - lock ALS - INITIAL_ALIGNMENT_OFFLOAD. ALSY was not showing up; trending told us the BS was off, so we adjusted it.

2. Went to MANUAL_INITIAL_ALIGNMENT

3. POWER to 10W. MICH_DARK_LOCKED (NO WFS). Adjust BS by hand to minimize AS_A

4. POWER to 2W, PRX_LOCKED, Adjust PRM to maximize AS_A

5. POWER to 10W, OFFLOAD_SR2_ALIGN - Fully auto

6. SRY_LOCKED, Adjust SRM by hand to maximize AS_A

7. POWER to 2W, AS_CENTERING_SINGLE_BOUNCE - Fully Auto. Ensure SRM, ITMX, PRM, ETMX, ETMY are misaligned (should do this on its own).

8. AS_CENTERING_OFFLOAD.

This is the initial alignment procedure.
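The sequence above can be captured as a simple checklist structure for reference. This is purely illustrative (it is not the guardian code; the state names are the ones quoted above, and a power of None means the input power is left as-is):

```python
# Illustrative encoding of the manual initial-alignment sequence above:
# (power_W, guardian state, operator action)
ALIGNMENT_STEPS = [
    (None, "GREEN_ARMS_MANUAL / INITIAL_ALIGNMENT_OFFLOAD",
     "Lock ALS; check BS trend if ALSY is missing"),
    (None, "MANUAL_INITIAL_ALIGNMENT", ""),
    (10,   "MICH_DARK_LOCKED (no WFS)", "Adjust BS by hand to minimize AS_A"),
    (2,    "PRX_LOCKED", "Adjust PRM to maximize AS_A"),
    (10,   "OFFLOAD_SR2_ALIGN", "Fully auto"),
    (None, "SRY_LOCKED", "Adjust SRM by hand to maximize AS_A"),
    (2,    "AS_CENTERING_SINGLE_BOUNCE",
     "Fully auto; SRM, ITMX, PRM, ETMX, ETMY should be misaligned"),
    (None, "AS_CENTERING_OFFLOAD", ""),
]

# Print the checklist in the same style as the log entry:
for i, (power, state, note) in enumerate(ALIGNMENT_STEPS, 1):
    prefix = f"POWER to {power}W, " if power else ""
    print(f"{i}. {prefix}{state}" + (f" - {note}" if note else ""))
```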

Otherwise, SQZ work was the main task today - progress was made.

LOG:

Start Time System Name Location Laser_Haz Task Time End
14:41 FAC Randy LVEA n Removing +X of BSC2 (Craning) 15:10
14:44 FAC Nellie, Kim LVEA N Technical Cleaning 15:47
16:06 FAC Nellie, Kim LVEA N Technical Cleaning 17:01
17:02 TCS Sophie Vac-Prep N CHETA 19:24
17:17 ISC Oli, Elenna Optics Lab Local BHD Work 19:31
17:30 SQZ Camilla Optics Lab, LVEA Local Grabbing optics 18:26
17:41 EE Fil MY N Look for cables at MY 18:41
18:13 VAC Gerardo, Camilla, Gabriele Optics Lab N Viewport inspection 19:31
18:26 PEM RyanC LVEA/CER N Checking dust monitors 18:36
18:28 SAF Richard LVEA N Checking curtains 18:36
18:52 SPI Jeff, Jim Optics Lab Local SPI work 22:00
19:42 OPS Ryan C CER N Dust monitor data 19:45
20:01 EE Fil LVEA N Cabling by biergarten 23:13
20:15 SQZ Sheila FCES N Recovering SQZ 22:04
21:06 BHD Elenna, Keita, Oli OptLab y(local) BHD work 23:54
21:21 TCS Camilla, Sophie LVEA N Marking stickers in prep 22:04
22:53 TCS Sophie Vac prep lab N CHETA work 23:16
23:15 VAC Gerardo Site buildings N Cryotrap checks around site 01:15
H1 General
ryan.crouch@LIGO.ORG - posted 16:17, Wednesday 18 March 2026 (89558)
OPS Wednesday EVE shift start

TITLE: 03/18 Eve Shift: 2330-0500 UTC (1630-2200 PST), all times posted in UTC
STATE of H1: Planned Engineering
OUTGOING OPERATOR: Ibrahim
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 22mph Gusts, 12mph 3min avg
    Primary useism: 0.04 μm/s
    Secondary useism: 0.39 μm/s 
QUICK SUMMARY:

H1 SQZ
sheila.dwyer@LIGO.ORG - posted 16:01, Wednesday 18 March 2026 (89557)
looking for the filter cavity alignment

Sheila, Gabriele, Camilla

The squeezer wasn't initially aligned to the filter cavity yesterday. This morning I dither-locked the OPO and rastered the alignment of ZM2+ZM3, and saw some IR flashes on SQZ-FC_TRANS_D_LF_OUT_DQ (IR at FCES), but this misaligned the beam so that it didn't make it back through the VIP to SQZT7. I spent some time trying to walk FC1, FC2, and ZM3 to get flashes and the beam on SQZT7, partially successfully; Camilla has now done this same walk with FC2 misaligned and found both the transmission to FCES and the beam on SQZT7 at the same time.

Gabriele and I went to the FCES to look for beams with cards, we weren't able to see anything.  However, when we got back we realized that the green beam wasn't injected at that time, so we will have to go back.  Camilla is still walking beams now, and we will return to FCES in the morning.  

H1 CDS (SUS)
filiberto.clara@LIGO.ORG - posted 15:47, Wednesday 18 March 2026 (89556)
Cabling for testing of BS on West Bay Test Stand

WP 13101

Four DB25 cables were pulled from the HAM3 SUS-R2 rack down the Y-Arm. Cables around HAM3 are clear of any door removal activities. Cables reached the pylon on the Y arm. A second set of DB25 cables will be used to reach the mechanical test stand. These will be installed later, so as not to create a trip hazard or get damaged.

F. Clara, M. Pirello

H1 PEM
ryan.crouch@LIGO.ORG - posted 15:26, Wednesday 18 March 2026 (89555)
TemTop dust monitors testing pt2

Follow-up to alog88174 on the new brand of dust monitors. The colors are consistent throughout the plots: blue for the control GT521s, red for the PMS21, yellow for the PMD331 in differential, and green for the PMD331 in cumulative.

After some back and forth with the company's support, they ended up sending us 2 new devices, a PMD331 and a PMS21. I ran a few tests with the new devices; the PMD331 continues to look reasonable and similar to the GT521s. I still have some confusion about counting methods, though: the company says setting the unit to CFM (cubic foot / min) sets the counting method to differential, but I'm not convinced, and you can't specify the counting method on this device's interface, unlike the PMD331.

The first test I ran was in the front room (incorrectly labeled as MSR) with two MetOne GT521s, a PMD331, and a PMS21. The second was in the CER with both TemTops and one MetOne GT521s. The TemTops had filter elements built in, which the MetOnes do not; I removed these, cleaned the devices of any leftover particulate, and ran a bunch of cycles to clear out any lingering particles. The PMD331's sampling inlet was a multi-layered mesh style, unlike the PMS21 and GT521s, which are essentially just funnels; it seemed to be increasing the counts, so I swapped it out for the same style as the GT521s. After doing these things I prepared for the test by running both the zero-count and flow-rate calibrations on all the devices, set up side by side.

 

The PMS21 still seems to be showing lower counts, but it's not nearly as drastic as it was before. For the first test, I applied scale factors of 2 and 4 for the 0.3um and 0.5um particles respectively. The second test ran over a much longer time span (~20 hours vs the first test's 1.5 hours), and in it the PMS21 performed more as expected: the 0.3um counts were pretty much spot on with the GT521. For the larger 0.5ums the counting was low, as with the last device, but this time the best scale factor I found was 3, compared to the 40 I had with the last device. This is the model they prefer to make a pumpless version of, so I'm curious about testing a pumpless version from them to eliminate that potential source of error.

 

The PMD331 still seems reasonable, although I have questions about its internal counting method. I assumed the standard counting method was cumulative, so I did some bin subtraction on the counts columns to convert to differential counting, subtracting the 0.7um counts from the 0.5um and the 0.5um counts from the 0.3um. I assumed the raw data collection was cumulative since, if it were the opposite, the counts would be far higher than the GT521's. For the 0.3um particle size bin, differential counting more closely matches the control, but for the 0.5ums it's less clear: the control (blue) trace is in between the two, matching the cumulative trace a bit better.
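The bin subtraction described above is straightforward to do over a whole column. A minimal sketch, assuming "cumulative" means each bin counts particles at or above its size (so a differential bin is its cumulative count minus the next-larger bin's):

```python
def cumulative_to_differential(cum_counts):
    """Convert cumulative particle counts (>= bin size, ordered by
    ascending bin size) to differential counts (within each bin).
    The largest bin is left as-is since there is no bin above it."""
    diff = [cum_counts[i] - cum_counts[i + 1]
            for i in range(len(cum_counts) - 1)]
    diff.append(cum_counts[-1])
    return diff

# e.g. cumulative counts for the 0.3um, 0.5um, 0.7um bins:
print(cumulative_to_differential([1200, 300, 50]))  # -> [900, 250, 50]
```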

I have also relayed this info to the support engineer I've been talking to at TemTop.

Images attached to this report
LHO General
tyler.guidry@LIGO.ORG - posted 11:39, Wednesday 18 March 2026 (89554)
CEBEX Progress pt2
DGR has yet to break ground on the project, but there have been some additional mobilization efforts. This morning I affixed a time-lapse camera to the roof of Mid-Y for project documentation; it will capture an image every 10 seconds between the hours of 7AM and 5PM, M-F. Snapped pics attached. Earthwork soon.
Images attached to this report
LHO VE
david.barker@LIGO.ORG - posted 10:43, Wednesday 18 March 2026 (89552)
Wed CP1 Fill

Wed Mar 18 10:21:24 2026 INFO: Fill completed in 21min 20secs

 

Images attached to this report
H1 CDS
david.barker@LIGO.ORG - posted 08:02, Wednesday 18 March 2026 - last comment - 10:47, Wednesday 18 March 2026(89549)
h1ioplsc0 timing errors

We are tracking occasional timing errors on h1ioplsc0, which started following its restart yesterday morning when we modified h1lsc to add a DQ channel.

From that restart to now (22 hours) we have had 9 timing errors, of which 3 caused IPC receive errors on h1omcpi.

This had been seen before, in the 09dec2025 - 15dec2025 time period and appears to be related to restarts of the IOP, not code changes.

We don't think this will impact locking activities; we will keep investigating.

Comments related to this report
david.barker@LIGO.ORG - 10:47, Wednesday 18 March 2026 (89553)

I'm running a python script which clears the h1ioplsc0 TIM error if that is the only error in STATEWORD. It also clears h1omcpi's IPC Rx error if needed. It runs every minute.
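A loop of that shape might look like the following sketch. This is not the actual script: the channel names and the TIM bit position are placeholders, and the EPICS reads/writes are passed in as callables so the "clear only if TIM is the sole error" logic is shown on its own.

```python
TIM_BIT = 1 << 6  # placeholder: assumed position of the TIM bit in STATEWORD

def maybe_clear_tim(get, put,
                    stateword_ch="H1:FEC-PLACEHOLDER_STATE_WORD",
                    clear_ch="H1:FEC-PLACEHOLDER_DIAG_RESET"):
    """Clear the timing error only if TIM is the sole bit set in
    STATEWORD; any other error is left for a human to investigate.
    Returns True if a clear was issued."""
    word = int(get(stateword_ch))
    if word == TIM_BIT:  # TIM set and nothing else
        put(clear_ch, 1)
        return True
    return False

# Exercising the logic with stand-in I/O callables:
issued = []
maybe_clear_tim(lambda ch: TIM_BIT, lambda ch, v: issued.append((ch, v)))
print(len(issued))  # -> 1
```

In the real script the `get`/`put` arguments would be EPICS reads and writes (e.g. `caget`/`caput`) called from a once-a-minute loop, and it would also handle the h1omcpi IPC Rx clear mentioned above.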

LHO General
ibrahim.abouelfettouh@LIGO.ORG - posted 07:40, Wednesday 18 March 2026 (89548)
OPS Day Shift Start

TITLE: 03/18 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Planned Engineering
OUTGOING OPERATOR: None
CURRENT ENVIRONMENT:
    SEI_ENV state: USEISM
    Wind: 29mph Gusts, 24mph 3min avg
    Primary useism: 0.03 μm/s
    Secondary useism: 0.45 μm/s 
QUICK SUMMARY:

IFO is in IDLE and PLANNED ENGINEERING 

Plan today is to continue locking.

H1 General
ryan.crouch@LIGO.ORG - posted 20:07, Tuesday 17 March 2026 - last comment - 09:19, Wednesday 18 March 2026(89547)
OPS Tuesday eve shift summary

TITLE: 03/18 Eve Shift: 2330-0500 UTC (1630-2200 PST), all times posted in UTC
STATE of H1: Planned Engineering
INCOMING OPERATOR: None
SHIFT SUMMARY: I couldn't get SRY locked/aligned, SR2 and SRM alignment should probably be looked at tomorrow.
LOG:

Comments related to this report
camilla.compton@LIGO.ORG - 09:19, Wednesday 18 March 2026 (89551)

Ibrahim and I found the Y-arm ALS beam by restoring BS Pitch, which got misaligned ~100urad during MICH_DARK_LOCKED (attached). We also had to restore ETMY Pitch to get light back.

Images attached to this comment
H1 CDS (SEI, SPI, SUS)
filiberto.clara@LIGO.ORG - posted 17:04, Tuesday 17 March 2026 (89546)
Field Cable Work SUS-R2 and HAM2

WP 13089
WP 13095

The following work was completed today:

  1. Beckhoff media converter placed on rack 24V power. Previously powered by bench power supply.
  2. TCSX 24V power and IR sensor cables replaced with appropriate-length cables. Both TCS laser controllers for X&Y were keyed off, and the power supply outputs in the MER were disabled; this was done as a precaution since the media converter work could disable the TCS chillers. Laser controllers were keyed back on and power supply outputs re-enabled when work was complete.
  3. Three HAM2 L4C cables pulled from SEI-C2 to HAM2. Jim W. sorting out final flange locations.
  4. Two 8 AWG power cables pulled from CER Mezzanine to SUS-R2.
  5. A DB37 cable was pulled from CER SUS-C2 to SUS-R1. Part of QOSEM upgrade.

Remaining work on SUS-R2 includes installation of power junction box, power sequencer, and a ±18V power strip.

F. Clara, J. Figueroa, M. Pirello and R. Thompson

H1 SQZ
sheila.dwyer@LIGO.ORG - posted 16:49, Tuesday 17 March 2026 (89540)
SQZ nonlinear gain, beam in HAM6

Sheila, Gabriele

Last week, Camilla started to get the squeezer working again in 89453, today I'm continuing that.

On Jan 29th, when we injected 74mW of seed into the CLF fiber, the rejected-polarization PD in HAM7 for the IR fiber was saturated, the CLF reflection trigger PD saw 21mW of seed, and when locked on the dither we got 0.8mW onto the OPO IR PD. Now we inject 74mW, the rejected PD is still saturated, and we have 22mW on the CLF reflected PD (trigger diode), so the fiber coupling efficiency and transmission seem similar to January. However, we only have 0.1mW on the OPO IR PD (transmitted), as Camilla reported. Looking at the OPO scan, the higher-order mode is 13% of the 00 mode, so the alignment from the fiber into the OPO seems OK.

The problem was the alignment of ZM2+ZM3.  According to their osems, they moved during pump down as is expected.  Restoring them to the osem values didn't bring us fully back to the 0.79mW OPO IR PD transmission, but I was able to walk them from there to get 0.74mW.  

The OPO green transmission is much lower than with the old OPO. This is expected because we are using the transmission through M2, which is a high reflector for 532nm, and that transmission can be very different for different mirrors. During O4 we used 80uW as our setpoint for OPO trans; I set it to 23uW to get the pump ISS working. I also edited line 232 of the OPO guardian to set the trigger level for channel 1 (green trans) to 0.1 rather than 0.5.

Without adjusting the translation stage, I checked the NLG at a few power levels for a threshold measurement:

OPO green transmission   max OPO IR PD (unamplified seed = 1.49e-3)   green launch power   NLG
24uW                     0.435                                        21mW                 297
12uW                     0.0114                                       10mW                 7.8
18uW                     0.0423                                       15.2mW               28.97
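For context, if the nonlinear gain is taken as the simple ratio of the amplified to the unamplified seed level on the OPO IR PD (an assumption on my part; the logged NLG values were presumably derived with whatever corrections the standard procedure applies, so they differ slightly), the columns above can be checked directly:

```python
UNAMPLIFIED_SEED = 1.49e-3  # unamplified seed level, from the table above

def nlg(amplified_max: float, unamplified: float = UNAMPLIFIED_SEED) -> float:
    """Nonlinear gain as the ratio of amplified to unamplified seed
    power on the OPO IR PD (simple-ratio assumption, see lead-in)."""
    return amplified_max / unamplified

# Max OPO IR PD values from the table; ratios come out near the
# logged NLGs of 297, 7.8, and 28.97:
for amp in (0.435, 0.0114, 0.0423):
    print(round(nlg(amp), 1))
```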

We got rid of the global ir_locked_refl_min in the OPO guardian and replaced it with a self.ir_locked_refl_min in only the LOCKED_SEED_DITHER state. 

We moved FC1 and FC2 back to where they were in O4 according to the osems, and walked ZM3 to keep the OPO IR seed dither lock on SQZT7. Gabriele and I went to SQZT7 and tried to re-align the FCGS refl diode; I think I put the wrong beam on the diode.

We opened the beam diverter and we can see the seed beam on AS A, B, and C.  We tried engaging the centering servos as in 88767, but this saturated ZM4.  We would like to run an interferometer alignment to check the alignment of SRM, OM1, and OM2 before we walk the ZMs.  

Images attached to this report
H1 CDS
david.barker@LIGO.ORG - posted 11:43, Tuesday 17 March 2026 - last comment - 08:28, Wednesday 18 March 2026(89541)
CDS Maintenance Summary: Tuesday 17th March 2026

WP13096 Add missing h1lsc ALS DIFF PLL channel to DAQ

Jonathan, Erik, EJ, Dave:

The missing H1:ALS-C_DIFF_PLL_CTRL_OUT_DQ channel was added to the h1lsc model's DAQ Block.

We initially just restarted h1lsc, but this caused a DAC error in h1ioplsc0 (similar to h1ascimc and h1iopasc0) which meant we had to restart all the models on h1lsc0.

A DAQ restart was required.

WP13078 Add pending slow controls and h1sush6 to DAQ

WP13094 Add JAC CAM EPICS and CHETA slow chans to DAQ

DAQ RESTART

Jonathan, Erik, EJ, Dave:

A new H1EDC.ini was created with the following changes:

H1EPICS_DIGVIDEO.ini Regenerated with JAC CAM (cam38) added (+25)

H1EPICS_ECATTCSCS.ini Added CHETA laser pwr chans (+2)

H1EPICS_ECATAUXCS.ini Added C_PZT_SHUTTER chans (+11)

H1EPICS_FEC.ini Added h1sush6 models (+97)

H1EPICS_SDF.ini Added h1sush6 models (+160)

H1EPICS_DAQ.ini Added h1sush6 models (+32)

total +327. Num chans increased from 60582 to 60909.

Erik ran puppet to officially add all the new h1sush6 models to the system (I had done some hand-editing last week). He also determined the DAQ time slot h1sush6 should use.

Jonathan got the DAQ ready for the new front end and models. He also plugged the DAQ ethernet cable into the back of h1sush6 (the Dolphin cable remains unplugged for now).

We rebooted h1sush6 to get the DAQ sending started.

Jonathan then restarted the DAQ, starting with the 0-leg. I restarted the EDC at this time.

A second 0-leg restart was required to complete the configuration.

When this was stable, the 1-leg was restarted with no issues. 

We saw two spontaneous restarts of FW0 around this time, which had been seen before. At time of writing, FW0 has been up for 90 minutes.

Updates and Cleanup

Dave:

I regenerated the CDS Overview, changing h1sush6 from NO_DAQ to NOMINAL

I had to hand edit the python code generating the H1EPICS_[FEC,SDF].ini to add the new models even when they were not yet added to the DAQ, so we only needed to restart the DAQ once.

I updated DAQSTAT and CDS_HW_STAT to expect the new FECs, Models, ADCs, DACs.

Guardian Node Loading Modified Source Code.

Ryan S:

Recent changes to lscparams.py (last modified 18:43 16mar2026) and ISC_library.py (last modified 16:55 10mar2026) meant that 22 guardian nodes were pending reloading of these files.

Ryan verified that the changes made to the source code were orthogonal to the snippets used by these nodes and reloaded the code into them to clear the GRD CFC flags.

Loaded 273 opt/ file checksums from /opt/rtcds/lho/h1/data/guardian_files/current
Loaded 176 nodes from /opt/rtcds/lho/h1/data/guardian_files/current.yaml

================================================================================================
Node                    Status      File            Source Date            Running Date
================================================================================================
ALIGN_IFO               NOT LOADED  lscparams.py    18:43 Mon 16mar2026    12:16 Wed 11mar2026
ALS_COMM                NOT LOADED  lscparams.py    18:43 Mon 16mar2026    13:51 Mon 16mar2026
ALS_DIFF                NOT LOADED  lscparams.py    18:43 Mon 16mar2026    10:41 Fri 13mar2026
ALS_XARM                NOT LOADED  ISC_library.py  16:55 Tue 10mar2026    08:03 Tue 03mar2026
                        NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:03 Tue 03mar2026
ALS_YARM                NOT LOADED  lscparams.py    18:43 Mon 16mar2026    14:18 Wed 11mar2026
CAMERA_SERVO            NOT LOADED  ISC_library.py  16:55 Tue 10mar2026    08:02 Tue 03mar2026
                        NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:02 Tue 03mar2026
H1_MANAGER              NOT LOADED  ISC_library.py  16:55 Tue 10mar2026    08:03 Tue 03mar2026
                        NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:02 Tue 03mar2026
IMC_LOCK                NOT LOADED  lscparams.py    18:43 Mon 16mar2026    16:55 Tue 10mar2026
INIT_ALIGN              NOT LOADED  ISC_library.py  16:55 Tue 10mar2026    08:02 Tue 03mar2026
                        NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:02 Tue 03mar2026
ISC_DRMI                NOT LOADED  ISC_DRMI.py     07:57 Tue 17mar2026    18:43 Mon 16mar2026
LASER_PWR               NOT LOADED  ISC_library.py  16:55 Tue 10mar2026    08:03 Tue 03mar2026
                        NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:03 Tue 03mar2026
LOCKLOSS_SHUTTER_CHECK  NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:02 Tue 03mar2026
OMC_LOCK                NOT LOADED  ISC_library.py  16:55 Tue 10mar2026    08:03 Tue 03mar2026
                        NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:03 Tue 03mar2026
SEI_CONF                NOT LOADED  ISC_library.py  16:55 Tue 10mar2026    08:02 Tue 03mar2026
                        NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:02 Tue 03mar2026
SEI_ENV                 NOT LOADED  ISC_library.py  16:55 Tue 10mar2026    08:03 Tue 03mar2026
                        NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:02 Tue 03mar2026
SQZ_FC                  NOT LOADED  ISC_library.py  16:55 Tue 10mar2026    08:03 Tue 03mar2026
                        NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:03 Tue 03mar2026
SQZ_MANAGER             NOT LOADED  ISC_library.py  16:55 Tue 10mar2026    08:03 Tue 03mar2026
                        NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:03 Tue 03mar2026
SUS_CHARGE              NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:02 Tue 03mar2026
TCS_ITMX_CO2_PWR        NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:03 Tue 03mar2026
TCS_ITMY_CO2_PWR        NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:03 Tue 03mar2026
TEST                    NOT LOADED  lscparams.py    18:43 Mon 16mar2026    12:07 Tue 10mar2026
THERMALIZATION          NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:02 Tue 03mar2026
TMS_SERVO               NOT LOADED  ISC_library.py  16:55 Tue 10mar2026    08:03 Tue 03mar2026
                        NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:03 Tue 03mar2026
VIOLIN_DAMPING          NOT LOADED  ISC_library.py  16:55 Tue 10mar2026    08:02 Tue 03mar2026
                        NOT LOADED  lscparams.py    18:43 Mon 16mar2026    08:02 Tue 03mar2026
================================================================================================

36 file(s) NOT LOADED
 

Comments related to this report
david.barker@LIGO.ORG - 08:28, Wednesday 18 March 2026 (89550)

Tue17Mar2026
LOC TIME HOSTNAME     MODEL/REBOOT
09:37:51 h1lsc0       h1lsc       <<< new h1lsc model, but caused tim error on IOP
09:41:09 h1lsc0       h1ioplsc0   <<< restart all models on h1lsc0
09:41:23 h1lsc0       h1lsc       
09:41:37 h1lsc0       h1lscaux    
09:41:51 h1lsc0       h1sqz       
09:42:05 h1lsc0       h1ascsqzfc  


09:44:38 h1sush6      ***REBOOT*** <<< reboot h1sush6 to add to DAQ
09:45:40 h1sush6      h1iopsush6  
09:45:53 h1sush6      h1susom0    
09:46:06 h1sush6      h1susobs    
09:46:19 h1sush6      h1susam     
09:46:32 h1sush6      h1susomcab  
09:46:45 h1sush6      h1susom1ab  
09:46:58 h1sush6      h1susom2ab  
09:47:11 h1sush6      h1susom3ab  


09:52:00 h1daqgds0    [DAQ]  <<< first 0-leg restart to add sush6
09:52:01 h1daqnds0    [DAQ]
09:52:06 h1susauxh56 h1edc[DAQ] <<< EDC restart to add sush6, cam38, latest slow-ctrls
09:52:12 h1daqgds0    [DAQ]
09:52:13 h1daqfw0     [DAQ]
09:52:13 h1daqnds0    [DAQ]
09:52:13 h1daqtw0     [DAQ]


09:58:08 h1daqgds0    [DAQ] <<< second 0-leg restart to add sush6
09:58:08 h1daqnds0    [DAQ]
09:58:15 h1daqtw0     [DAQ]
09:58:19 h1daqfw0     [DAQ]
09:58:19 h1daqtw0     [DAQ]
09:58:21 h1daqgds0    [DAQ]
09:58:21 h1daqnds0    [DAQ]


10:05:49 h1daqnds0    [DAQ] <<< restart nds0 to make its uptime positive cf GDS0.


10:07:32 h1daqdc1     [DAQ] <<< 1-leg restart to add sush6
10:07:38 h1daqfw0     [DAQ] <<< spontaneous restart FW0, no FWs running at this point
10:07:42 h1daqnds1    [DAQ]
10:07:43 h1daqdc1     [DAQ]
10:07:47 h1daqtw1     [DAQ]
10:07:49 h1daqgds1    [DAQ]
10:07:52 h1daqfw1     [DAQ]
10:07:52 h1daqnds1    [DAQ]
10:07:52 h1daqtw1     [DAQ]
10:08:01 h1daqgds1    [DAQ] <<< 2nd GDS1 restart needed


10:12:48 h1daqtw0     [DAQ] <<< restart TW0 to get its delta time positive


10:23:07 h1daqfw0     [DAQ] <<< second spontaneous FW0 restart


14:34:50 h1seih16     ***REBOOT*** <<< power cycle h1seih16 following IO Chassis power glitch
14:36:23 h1seih16     h1iopseih16 
14:36:36 h1seih16     h1hpiham1   
14:36:49 h1seih16     h1hpiham6   
14:37:02 h1seih16     h1isiham1   
14:37:15 h1seih16     h1isiham6   
 
