LHO VE
david.barker@LIGO.ORG - posted 10:23, Monday 17 June 2024 (78487)
Mon CP1 Fill

Mon Jun 17 10:09:34 2024 INFO: Fill completed in 9min 30secs

Gerardo confirmed a good fill via camera.

Images attached to this report
LHO General
ibrahim.abouelfettouh@LIGO.ORG - posted 07:37, Monday 17 June 2024 (78484)
OPS Day Shift Start

TITLE: 06/17 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Observing at 152Mpc
OUTGOING OPERATOR: Corey
CURRENT ENVIRONMENT:
    SEI_ENV state: SEISMON_ALERT
    Wind: 7mph Gusts, 5mph 5min avg
    Primary useism: 0.04 μm/s
    Secondary useism: 0.08 μm/s
QUICK SUMMARY:

IFO is in NLN and OBSERVING since 11:45 UTC (2hr 55 min lock)

H1 DetChar (DetChar)
young-min.kim@LIGO.ORG - posted 03:55, Monday 17 June 2024 (78483)
DQ Shift Report on 2024 May 20 to 2024 May 26
H1 General
oli.patane@LIGO.ORG - posted 01:00, Monday 17 June 2024 (78481)
Ops EVE Shift End

TITLE: 06/17 Eve Shift: 2300-0800 UTC (1600-0100 PST), all times posted in UTC
STATE of H1: Observing at 155Mpc
INCOMING OPERATOR: Corey
SHIFT SUMMARY: Currently Observing at 160 Mpc and have been locked for over one hour. Two locklosses during my shift, but both were easy to relock from.

LOG:

23:00UTC Detector Observing and Locked for 21 hours

23:22 Kicked out of Observing due to two camera channels, ASC-CAM_PIT1_INMON and ASC-CAM_YAW1_INMON, glitching or restarting
    - They weren't able to turn back on fully and gave the warning message "[channel_name] is stuck! Going back to ADS"
    - Referencing alog 77499, we contacted Dave and he restarted camera 26 (BS cam)
23:43 Back into Observing

01:37 Lockloss
- During relocking, COMM wasn't able to get IR high enough
    - I stalled ALS_COMM and tried adjusting the COMM offset by hand, but still wasn't working
02:21 I started an initial alignment
02:43 Initial alignment done, relocking
03:27 NOMINAL_LOW_NOISE
03:29 Started running SQZ alignment (SDF)

03:38 Observing

05:28 Lockloss
05:35 Lockloss from LOCKING_ALS, started an initial alignment
06:00 Initial alignment done, starting relocking
06:43 NOMINAL_LOW_NOISE
06:45 Observing

H1 General (Lockloss)
oli.patane@LIGO.ORG - posted 22:30, Sunday 16 June 2024 - last comment - 00:58, Monday 17 June 2024(78478)
Lockloss

Lockloss @ 06/17 05:28 UTC from unknown causes. Definitely not from wind or an earthquake.

Comments related to this report
oli.patane@LIGO.ORG - 23:46, Sunday 16 June 2024 (78479)

06:45 Observing

oli.patane@LIGO.ORG - 00:58, Monday 17 June 2024 (78480)

Haven't figured out the cause. This lockloss generally follows the pattern we have seen for other 'DARM wiggle'** locklosses, but there are a couple of extra things that I noted and want recorded, even if they mean nothing.

Timeline (attachment1, attachment2-zoomed, attachment3-unannotated)
Note: I am taking the lockloss as starting at 1402637320.858, since that is when we see DARM and ETMX L3 MASTER_OUT lose and fail to regain control. The times below are milliseconds before this time (a channel-fetch sketch is appended at the end of this comment).

  • 152 - 147ms before LL (yellow box): EX L3 MASTER_OUT channel has a small but sharp drop. This might not actually be anything significant, but the slope of this little drop is much steeper than how EX L3 usually moves. This is not seen by DARM.
  • 116 - 95ms before LL (blue): LSC_DARM_IN1 and DCPD{A,B} are suddenly a bit noisier
  • 96 - 68ms before LL (pink): glitch
  • 68 - 1ms before LL: DARM and DCPDs go back to looking like normal (the classic requirement of what makes it a DARM wiggle)
  • 0ms (green): DARM/EX L3 MASTER_OUT/DCPDs lose control -> lockloss

It also kind of looks like ASC-CSOFT_P_OUT and ASC-DSOFT_P_OUT get higher in frequency in the ten seconds before the lockloss (attachment4), which is something I had previously noticed happening in the 2024/05/01 13:19 UTC lockloss (attachment5). However, that May 1st lockloss was NOT a DARM wiggle lockloss.

** DARM wiggle - when there is a glitch seen in DARM and ETMX L3 MASTER_OUT, then DARM goes back to looking normal before losing lock within the next couple hundreds of milliseconds
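
For completeness, a minimal gwpy sketch of the kind of query that pulls these fast channels around the lockloss time is below. The exact DQ channel names (the quadrant and suffix choices for the ETMX L3 master output and the OMC DCPDs) are my assumptions and may not match what is actually recorded.

# Sketch only: fetch ~1 s of the fast channels around the lockloss and zoom in
# on the last few hundred milliseconds. Channels marked "assumed" are guesses.
from gwpy.timeseries import TimeSeriesDict

t_ll = 1402637320.858   # GPS time DARM / ETMX L3 MASTER_OUT lose control (see above)
channels = [
    'H1:LSC-DARM_IN1_DQ',
    'H1:SUS-ETMX_L3_MASTER_OUT_UR_DQ',   # assumed quadrant/suffix
    'H1:OMC-DCPD_A_OUT_DQ',              # assumed
    'H1:OMC-DCPD_B_OUT_DQ',              # assumed
]
data = TimeSeriesDict.get(channels, t_ll - 1.0, t_ll + 0.2)

plot = data['H1:LSC-DARM_IN1_DQ'].plot()
ax = plot.gca()
ax.set_epoch(t_ll)                      # show times relative to the lockloss
ax.set_xlim(t_ll - 0.2, t_ll + 0.05)    # the ~150 ms window discussed above
plot.show()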

 

Images attached to this comment
LHO FMCS (PEM)
oli.patane@LIGO.ORG - posted 21:13, Sunday 16 June 2024 (78477)
HVAC Fan Vibrometers Check FAMIS

Closes FAMIS#26310, last checked 78362

Corner Station Fans (attachment1)
- All fans are looking normal and within range.

Outbuilding Fans (attachment2)
- All fans are looking normal and within range.

Images attached to this report
H1 General (Lockloss)
oli.patane@LIGO.ORG - posted 19:03, Sunday 16 June 2024 - last comment - 20:39, Sunday 16 June 2024(78475)
Lockloss

Lockloss at 06/17 01:37 UTC - looks like it may have been due to a jump in the wind speed?

Images attached to this report
Comments related to this report
oli.patane@LIGO.ORG - 20:39, Sunday 16 June 2024 (78476)

03:39 UTC Observing

H1 DetChar (DetChar)
sukanta.bose@LIGO.ORG - posted 17:41, Sunday 16 June 2024 (78474)
Data Quality Shift Report 2024-06-03 to 2024-06-09

Link to report.

LHO General
ibrahim.abouelfettouh@LIGO.ORG - posted 16:31, Sunday 16 June 2024 - last comment - 16:47, Sunday 16 June 2024(78472)
OPS Day Shift Summary

TITLE: 06/16 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Corrective Maintenance
INCOMING OPERATOR: Oli
SHIFT SUMMARY:

IFO was in NLN and OBSERVING as of 06:05 UTC (21hr 37 min lock) but is NOW in CORRECTIVE_MAINTENANCE while we briefly restart the ADS Camera.

About 5 minutes before my shift ended, the ADS Pitch1 Inmon and Yaw1 Inmon got stuck. It seems that they keep trying to turn on, but can't get past TURN_ON_CAMERA_FIXED_OFFSET. This has happened before (alog 77499) and it is likely that the cameras just need to be restarted. We did not lose lock. Oli (incoming Op) has called Dave and they are working on it.
 

LOG:

None

Comments related to this report
oli.patane@LIGO.ORG - 16:47, Sunday 16 June 2024 (78473)

Dave restarted the camera servo for camera 26 and we are back in Observing as of 23:43 UTC
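
As an aside, the kind of "is this channel stuck?" check described above could be scripted roughly as sketched below. This is not the guardian's actual test, and the H1: prefix on the INMON channel names is my assumption.

# Rough staleness check for the ADS camera INMON channels (sketch, not the guardian code).
import time
from epics import caget

CHANNELS = ['H1:ASC-CAM_PIT1_INMON', 'H1:ASC-CAM_YAW1_INMON']

def is_stuck(channel, wait=5.0, samples=10):
    """Return True if the channel value does not change over `wait` seconds."""
    values = [caget(channel)]
    for _ in range(samples - 1):
        time.sleep(wait / samples)
        values.append(caget(channel))
    return len(set(values)) == 1

for chan in CHANNELS:
    print(chan, 'STUCK' if is_stuck(chan) else 'updating')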

H1 General
oli.patane@LIGO.ORG - posted 16:20, Sunday 16 June 2024 (78471)
Ops EVE Shift Start

TITLE: 06/16 Eve Shift: 2300-0800 UTC (1600-0100 PST), all times posted in UTC
STATE of H1: Observing at 152Mpc
OUTGOING OPERATOR: Ibrahim
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 13mph Gusts, 11mph 5min avg
    Primary useism: 0.02 μm/s
    Secondary useism: 0.07 μm/s
QUICK SUMMARY:

Observing and locked for 21.5 hours.

LHO General
ibrahim.abouelfettouh@LIGO.ORG - posted 13:30, Sunday 16 June 2024 (78470)
OPS Day Midshift Update

IFO is in NLN and OBSERVING (Now 18hr 35 min lock!)

Nothing else of note.

LHO VE
david.barker@LIGO.ORG - posted 10:13, Sunday 16 June 2024 (78469)
Sun CP1 Fill

Sun Jun 16 10:10:34 2024 INFO: Fill completed in 10min 30secs

Note TCs did not reach -200C because of lower outside temps this morning (15C, 59F).
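
(For anyone wanting to double-check a fill like this one, a quick-look along the lines of the sketch below would trend the fill-line thermocouples over the fill window and report how cold they got. The TC channel names here are placeholders, not the real H0:VAC records.)

# Sketch: how cold did the CP1 fill TCs get? Fill ran ~10:10 local on 2024-06-16 (~17:10 UTC).
from gwpy.timeseries import TimeSeries
from gwpy.time import to_gps

start = to_gps('2024-06-16 17:05:00')
end = to_gps('2024-06-16 17:25:00')

for chan in ['H0:VAC-CP1_TC_A', 'H0:VAC-CP1_TC_B']:   # placeholder channel names
    tc = TimeSeries.get(chan, start, end)
    print(chan, 'minimum: %.1f C' % tc.min().value)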

Images attached to this report
LHO General
ibrahim.abouelfettouh@LIGO.ORG - posted 07:39, Sunday 16 June 2024 (78468)
OPS Day Shift Start

TITLE: 06/16 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Observing at 156Mpc
OUTGOING OPERATOR: Corey
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 9mph Gusts, 4mph 5min avg
    Primary useism: 0.01 μm/s
    Secondary useism: 0.09 μm/s
QUICK SUMMARY:

IFO is in NLN and OBSERVING since 06:06 UTC (12hr 45 min lock)

NUC27 Glitch screen is giving a warning: "Cluster is down, glitchgram is not updated", which I haven't seen before.

H1 General (SQZ)
oli.patane@LIGO.ORG - posted 01:06, Sunday 16 June 2024 (78467)
Ops EVE Shift End

TITLE: 06/16 Eve Shift: 2300-0800 UTC (1600-0100 PST), all times posted in UTC
STATE of H1: Observing at 158Mpc
INCOMING OPERATOR: Corey
SHIFT SUMMARY: We are Observing at 158 Mpc and have been locked for 6 hours now. The only issues I had today were the OPO ISS maxing out at 54uW and not being able to catch at 80uW, so the setpoint needed adjusting, and then, while relocking after the 06/16 00:01 UTC lockloss, some trouble getting PRMI and DRMI to lock, which makes sense since the wind was still a bit high at the time. After that the night has been quiet.
LOG:

23:00 Detector relocking and at DARM_TO_RF

23:39 NOMINAL_LOW_NOISE
- SQZ OPO ISS pump having trouble locking. The OPO transmission couldn't go higher than 54.6uW
    - Adjusted the OPO temp, but the current temp was the best so I put it back. Reloaded the OPO guardian (so I had changed nothing!). The OPO was able to get up to 72uW after this, but still not to 80uW.
    - Naoki came on teamspeak and lowered the threshold to 70, and it caught very soon after that (see the threshold-check sketch appended after this log).
00:01 While dealing with the OPO, we lost lock. We had been locked for 22 minutes

00:29 Lockloss from ACQUIRE_DRMI
00:30 Started an initial alignment
00:51 Initial alignment done, relocking
01:04 Lockloss from ACQUIRE_DRMI
01:52 NOMINAL_LOW_NOISE
01:56 Observing
02:01 Our range was low so I took us out of Observing and ran the sqz tuning guardian states
02:10 Back to Observing, with a 7Mpc increase in range

05:25 Left Observing and started calibration measurements
05:53 Calibration measurements done, running sqz alignment (new optic offsets accepted)

06:05 Back into Observing
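
(The threshold-check sketch referenced above: a hedged illustration of comparing the measured OPO transmission against the ISS catch threshold before deciding to lower it. The channel names are placeholders, not the real SQZ EPICS records, and any caput like this should only be done deliberately by the SQZ team.)

# Sketch only: read OPO transmission vs. the ISS catch threshold (placeholder channel names).
from epics import caget, caput

trans = caget('H1:SQZ-OPO_TRANS_LF_OUTPUT')        # placeholder
threshold = caget('H1:SQZ-OPO_ISS_LOCK_THRESHOLD') # placeholder

print('OPO transmission: %.1f uW (catch threshold %.1f uW)' % (trans, threshold))
if trans < threshold:
    # e.g. the 80 -> 70 change described above
    caput('H1:SQZ-OPO_ISS_LOCK_THRESHOLD', 70.0)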

Images attached to this report
H1 CAL
oli.patane@LIGO.ORG - posted 22:27, Saturday 15 June 2024 - last comment - 23:05, Saturday 15 June 2024(78464)
Dropped Observing for Calibration

06/16 05:24 UTC I took us out of Observing to run a calibration sweep that we weren't able to run earlier.

Comments related to this report
oli.patane@LIGO.ORG - 23:05, Saturday 15 June 2024 (78466)

06/16 06:05 UTC Back into Observing after running calibration sweep and tuning squeeze

LHO VE (VE)
gerardo.moreno@LIGO.ORG - posted 02:05, Saturday 15 June 2024 - last comment - 02:18, Monday 17 June 2024(78453)
HAM4 Annulus Ion Pump Signal is Railed

HAM4 annulus ion pump signal railed at about 7:50 UTC 06/15/2024. No immediate attention is required; per the trend of PT120, an adjacent gauge, the internal pressure does not appear to be affected. HAM4 AIP will be assessed next Tuesday.
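
(A sketch of the PT120 cross-check mentioned above: trend the adjacent gauge over the morning to confirm the HAM4 pressure is unaffected. The channel name is a guess at the vacuum naming convention, not the real record.)

# Sketch: trend PT120 around the time the HAM4 AIP signal railed (~7:50 UTC 06/15).
from gwpy.timeseries import TimeSeries
from gwpy.time import to_gps

start = to_gps('2024-06-15 04:00:00')
end = to_gps('2024-06-15 12:00:00')

pt120 = TimeSeries.get('H0:VAC-LY_Y1_PT120_PRESS_TORR', start, end)  # assumed channel name
print('PT120 range: %.2e to %.2e Torr' % (pt120.min().value, pt120.max().value))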

Images attached to this report
Comments related to this report
gerardo.moreno@LIGO.ORG - 02:18, Monday 17 June 2024 (78482)VE

AIP is showing some good signs of coming back (see attached plot); regardless, we will keep the appointment to go and investigate this system on Tuesday.

Images attached to this comment