H1 General
oli.patane@LIGO.ORG - posted 16:33, Sunday 08 September 2024 (79979)
Ops Day Shift End

TITLE: 09/08 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Lock Acquisition
INCOMING OPERATOR: Ryan C
SHIFT SUMMARY: Just got back into Observing after waiting in OMC_WHITENING for a while to damp violin modes - they seem to have been rung up by a lockloss from ENGAGE_ASC_FOR_FULL_IFO. This last relocking process otherwise went smoothly, and the earlier lockloss today was also uncomplicated, aside from needing to wait a good while for ALSY WFS 3 Yaw to converge.
LOG:

14:30 UTC Observing and Locked for 16:50 hours
15:21 Lockloss after 17:39 hours locked


15:41 Going to run an initial alignment after locking green arms
    - ALS_YARM sat in the INITIAL_ALIGNMENT state for a while without starting offloading because WFS_3_Y was taking a while to get under the threshold. I took ALS_YARM to UNLOCKED just in case, then back to INITIAL_ALIGNMENT_OFFLOAD, and it eventually converged (see the convergence-wait sketch at the end of this log).
16:36 Initial alignment done, relocking
17:18 NOMINAL_LOW_NOISE
17:21 Observing

21:45 Lockloss
22:16 Lockloss from ENGAGE_ASC_FOR_FULL_IFO
23:30 NOMINAL_LOW_NOISE
23:30 Observing
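
For reference, the "wait for WFS to converge" step in the initial alignment above is essentially a threshold wait. Below is a minimal sketch of that pattern, assuming a pyepics environment; the channel name, threshold, and timeout are illustrative guesses, not the actual Guardian logic.

```python
# Minimal sketch of a threshold-convergence wait (not the actual Guardian
# code). Channel name, threshold, and timeout are illustrative assumptions.
import time
import epics  # pyepics

WFS_CHANNEL = "H1:ALS-Y_WFS_DOF_3_Y_OUT16"  # assumed channel name
THRESHOLD = 50.0                            # assumed convergence threshold
TIMEOUT_S = 600                             # give up after 10 minutes

def wait_for_convergence() -> bool:
    """Return True once |WFS error| drops below threshold, False on timeout."""
    deadline = time.time() + TIMEOUT_S
    while time.time() < deadline:
        value = epics.caget(WFS_CHANNEL)
        if value is not None and abs(value) < THRESHOLD:
            return True
        time.sleep(1)
    return False  # timed out: operator re-requests the state, as done above

if __name__ == "__main__":
    print("converged" if wait_for_convergence() else "timed out")
```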

H1 General
ryan.crouch@LIGO.ORG - posted 16:10, Sunday 08 September 2024 - last comment - 17:56, Sunday 08 September 2024 (79977)
OPS Sunday EVE shift start

TITLE: 09/08 Eve Shift: 2330-0500 UTC (1630-2200 PST), all times posted in UTC
STATE of H1: Lock Acquisition
OUTGOING OPERATOR: Oli
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 8mph Gusts, 5mph 3min avg
    Primary useism: 0.01 μm/s
    Secondary useism: 0.07 μm/s
QUICK SUMMARY:


Comments related to this report
ryan.crouch@LIGO.ORG - 16:33, Sunday 08 September 2024 (79980)

Back to Observing at 23:30 UTC

ryan.crouch@LIGO.ORG - 17:56, Sunday 08 September 2024 (79982)CDS

CDS reports 12 disconnected channels, all related to NUC33. The NUC could probably use a restart: I can't VNC into it, and pinging it gives nothing back, so it's frozen.
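
A minimal sketch of the kind of reachability check described above; the hostname is an assumption, and a real CDS check would also query the twelve disconnected EPICS channels directly.

```python
# Minimal sketch: ping a display machine to see if it answers at all.
# Hostname is an assumed stand-in for the real NUC33 address.
import subprocess

def host_alive(host: str, timeout_s: int = 2) -> bool:
    """Return True if the host answers a single ICMP ping."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", str(timeout_s), host],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

if __name__ == "__main__":
    host = "nuc33"  # assumed hostname
    print(f"{host} responds" if host_alive(host) else f"{host} is unreachable")
```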

H1 General (Lockloss)
oli.patane@LIGO.ORG - posted 14:47, Sunday 08 September 2024 (79978)
Lockloss

Lockloss @ 09/08 21:45UTC after 4.5 hours locked

H1 General
oli.patane@LIGO.ORG - posted 12:51, Sunday 08 September 2024 (79976)
Ops DAY Midshift Status

Currently Observing at 158Mpc and have been Locked for 2.5 hours. Quiet day with nothing to report.

H1 General (Lockloss)
oli.patane@LIGO.ORG - posted 08:24, Sunday 08 September 2024 - last comment - 10:21, Sunday 08 September 2024 (79974)
Lockloss

Lockloss @ 09/08 15:21UTC after 17:39 locked

Comments related to this report
oli.patane@LIGO.ORG - 10:21, Sunday 08 September 2024 (79975)

17:21 Observing

LHO VE
david.barker@LIGO.ORG - posted 08:19, Sunday 08 September 2024 (79973)
Sun CP1 Fill

Sun Sep 08 08:09:39 2024 INFO: Fill completed in 9min 35secs

Images attached to this report
H1 General
oli.patane@LIGO.ORG - posted 07:33, Sunday 08 September 2024 (79972)
Ops Day Shift Start

TITLE: 09/08 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Observing at 152Mpc
OUTGOING OPERATOR: Ryan S
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 9mph Gusts, 5mph 3min avg
    Primary useism: 0.01 μm/s
    Secondary useism: 0.08 μm/s
QUICK SUMMARY:

Observing at 158Mpc and have been Locked for almost 17 hours. Everything is looking normal, but the dust monitor alarm for the optics lab was going off, so I'll make sure we don't have more sand appearing in there.

LHO General
ibrahim.abouelfettouh@LIGO.ORG - posted 22:00, Saturday 07 September 2024 (79971)
OPS Eve Shift Summary

TITLE: 09/08 Eve Shift: 2330-0500 UTC (1630-2200 PST), all times posted in UTC
STATE of H1: Observing at 152Mpc
INCOMING OPERATOR: Ryan S
SHIFT SUMMARY:

IFO is in NLN and OBSERVING as of 21:52 UTC (7 hr 20 min lock), with some minor squeezer exceptions. Otherwise a very smooth day, with zero locklosses during my shift.

The squeezer has been acting up today:

23:04 UTC COMMISSIONING: The squeezer was far from optimal, and while still observing we were only getting around 120 Mpc of range. Oli and I went into temporary commissioning to run the temperature optimization script before trying to relock it. While this was happening, Naoki called, said he thought it was a pump ISS issue, and took hold of the IFO to fix it. He was successful and we were back to observing at our recent ~155 Mpc. OBSERVING again as of 23:27 UTC.
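
For illustration, here is a rough sketch of what a temperature-optimization scan like the one mentioned above could look like, assuming pyepics access. The channel names, scan span, and settle time are made up; this is not the actual SQZ script.

```python
# Rough sketch of an OPO temperature scan (not the site script): step the
# TEC setpoint, let it settle, and keep the setting that maximizes the CLF
# RF6 readback. Channel names, span, and settle time are illustrative.
import time
import numpy as np
import epics  # pyepics

SETPOINT = "H1:SQZ-OPO_TEC_SETTEMP"        # assumed setpoint channel
MONITOR = "H1:SQZ-CLF_REFL_RF6_ABS_OUT16"  # assumed readback channel

def scan_opo_temperature(center, span=0.05, steps=11, settle_s=10.0):
    """Scan setpoints around `center`; return (best_temp, best_power)."""
    best_temp, best_power = center, float("-inf")
    for temp in np.linspace(center - span, center + span, steps):
        epics.caput(SETPOINT, temp)
        time.sleep(settle_s)               # let the TEC loop settle
        power = epics.caget(MONITOR)
        if power is not None and power > best_power:
            best_temp, best_power = temp, power
    epics.caput(SETPOINT, best_temp)       # leave the best setting in place
    return best_temp, best_power
```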

01:53 UTC COMMISSIONING: The squeezer unlocked, sending us into commissioning, but within a few minutes it relocked automatically while I watched. We were OBSERVING again as of 01:58 UTC.

Other:

We rode through a 6.0-magnitude EQ from Tonga; EQ mode triggered successfully.

Dust is high in the Optics Lab - Oli told me yesterday that there's a strange accumulation of sand by a dust monitor and that some measures were taken to remove it, though perhaps more has built up. The 300NM PCF monitor is at RED alert and the 500NM PCF monitor is at YELLOW.

LOG:

None

H1 General
oli.patane@LIGO.ORG - posted 16:30, Saturday 07 September 2024 (79970)
Ops Day Shift End

TITLE: 09/07 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Commissioning
INCOMING OPERATOR: Ibrahim
SHIFT SUMMARY: We have been Locked for close to 2 hours. Not currently Observing because Naoki is adjusting some squeezer settings, since our range is quite poor right now. Quiet day with one lockloss and easy relocking.
LOG:

14:30 UTC Observing and Locked for 7:47 hours
15:28 Plane passes overhead
15:38 Superevent S240907cg
18:30 Left Observing to run calibration sweep
19:04 Calibration measurements finished, back into Observing
19:36 Lockloss

20:20 We started going through CHECK_MICH_FRINGES for the second time so I took us to DOWN and started an initial alignment
20:41 Initial alignment done, relocking
21:41 NOMINAL_LOW_NOISE
    - The OPO couldn't catch, so I lowered opo_grTrans_setpoint_uW to 69 in sqzparams, reloaded the OPO, locked the ISS, and then adjusted the OPO temperature a bit until SQZ-CLF_REFL_RF6_ABS was maximized. Accepted the new temperature setpoint and went into Observing.
21:52 Observing
23:04 Left Observing to run sqz alignment

LHO General
ibrahim.abouelfettouh@LIGO.ORG - posted 16:03, Saturday 07 September 2024 (79968)
OPS Eve Shift Start

TITLE: 09/07 Eve Shift: 2330-0500 UTC (1630-2200 PST), all times posted in UTC
STATE of H1: Observing at 136Mpc
OUTGOING OPERATOR: Oli
CURRENT ENVIRONMENT:
    SEI_ENV state: SEISMON_ALERT
    Wind: 13mph Gusts, 6mph 3min avg
    Primary useism: 0.04 μm/s
    Secondary useism: 0.09 μm/s
QUICK SUMMARY:

IFO is in NLN and OBSERVING as of 21:52 UTC

X1 SUS
oli.patane@LIGO.ORG - posted 13:58, Saturday 07 September 2024 (79966)
08/30 BBSS M1 Transfer function measurements

We have another installment of transfer functions for the BBSS. These were taken on August 30th; results can be found in $(sussvn)/BBSS/X1/BS/SAGM1/Results/2024-08-30_2300_tfs/ and the pdf is attached. We've been dealing with the mystery of the F1 OSEM on M1 sometimes drifting downwards (see 79941 - very important info), and though we believe the drift does not affect the transfer functions, we still want to make it clear that this measurement was taken shortly after adjusting for the drift, and over the next few days we confirmed that it was still drifting down. We also did a comparison by plotting this measurement set next to the set from July 12 (79181), which was taken close to when the new (shorter by 9mm) wire loop was installed and was also a time when we had F1 drift; in that case, though, the measurements were taken over a week after F1 started drifting, so it had already travelled most of the way down that it was going to travel (it drifts down in an exponential-decay-like fashion). I also added the measurements from back in January when we completed the first iteration (75787). It's interesting to see how the July and August Pitch TFs line up with each other around 1Hz compared to the January measurement and the model. The location of this peak depends heavily on the distance between the M1 blades and the center of mass of M1, so this shift makes sense: the M1 blade heights have been changed since the initial build in January, and the current model doesn't yet reflect this change (a comparison of how d1 changes the model is also plotted against this measurement).
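
For anyone wanting to reproduce this kind of overlay, here is a minimal sketch assuming the pitch-to-pitch TFs were exported to .mat files; the file names and variable keys are hypothetical stand-ins for the actual SUS export format.

```python
# Minimal sketch of a TF magnitude overlay for the three measurement epochs.
# File names and .mat keys are hypothetical assumptions.
import matplotlib.pyplot as plt
from scipy.io import loadmat

def load_tf(path, freq_key="freq", tf_key="P2P"):
    """Return (frequency, complex TF) arrays from an assumed .mat export."""
    data = loadmat(path)
    return data[freq_key].ravel(), data[tf_key].ravel()

measurements = {                           # hypothetical file names
    "2024-01-05": "tf_2024-01-05_P2P.mat",
    "2024-07-12": "tf_2024-07-12_P2P.mat",
    "2024-08-30": "tf_2024-08-30_P2P.mat",
}

fig, ax = plt.subplots()
for label, path in measurements.items():
    freq, tf = load_tf(path)
    ax.loglog(freq, abs(tf), label=label)
ax.axvline(1.0, color="gray", ls="--", lw=0.5)  # mode of interest near 1 Hz
ax.set_xlabel("Frequency [Hz]")
ax.set_ylabel("|M1 P to M1 P|")
ax.legend()
plt.show()
```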

Non-image files attached to this report
H1 General (Lockloss)
oli.patane@LIGO.ORG - posted 12:37, Saturday 07 September 2024 - last comment - 14:57, Saturday 07 September 2024 (79965)
Lockloss

Lockloss @ 09/07 19:36UTC after nearly 13 hours Locked

Comments related to this report
oli.patane@LIGO.ORG - 14:57, Saturday 07 September 2024 (79967)

21:52 UTC Observing

H1 General
oli.patane@LIGO.ORG - posted 12:25, Saturday 07 September 2024 (79964)
Ops DAY Midshift Status

Currently Observing right around 160Mpc and have been Locked for 12.5 hours. We dropped Observing a little while ago to run a calibration sweep, but we've been back Observing for over half an hour now.

H1 CAL
oli.patane@LIGO.ORG - posted 12:04, Saturday 07 September 2024 (79963)
Calibration Measurements September 07 2024

Calibration measurements run. Before starting the measurements we had been Locked for 11 hours 45 minutes. Calibration monitor screenshot attached.

Measurements completed, but we got this error in simulines - I'm not sure if this exact error is already known, but I thought it wouldn't hurt to attach it just in case (a quick file-integrity sketch follows the output list below).


Broadband (18:30 - 18:35UTC)

Output file:

/ligo/groups/cal/H1/measurements/PCALY2DARM_BB/PCALY2DARM_BB_20240907T183023Z.xml


Simulines (18:38 - 19:01UTC)

Output files:

/ligo/groups/cal/H1/measurements/DARMOLG_SS/DARMOLG_SS_20240907T183846Z.hdf5

/ligo/groups/cal/H1/measurements/PCALY2DARM_SS/PCALY2DARM_SS_20240907T183846Z.hdf5

/ligo/groups/cal/H1/measurements/SUSETMX_L1_SS/SUSETMX_L1_SS_20240907T183846Z.hdf5

/ligo/groups/cal/H1/measurements/SUSETMX_L2_SS/SUSETMX_L2_SS_20240907T183846Z.hdf5

/ligo/groups/cal/H1/measurements/SUSETMX_L3_SS/SUSETMX_L3_SS_20240907T183846Z.hdf5
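
As referenced above, a quick sanity-check sketch (assuming h5py is installed) that each simulines output file exists and opens as valid HDF5 - useful after a run that printed an error.

```python
# Quick integrity check of the simulines outputs listed above: confirm each
# file exists and opens as HDF5 before handing it off to analysis.
import os
import h5py

OUTPUTS = [
    "/ligo/groups/cal/H1/measurements/DARMOLG_SS/DARMOLG_SS_20240907T183846Z.hdf5",
    "/ligo/groups/cal/H1/measurements/PCALY2DARM_SS/PCALY2DARM_SS_20240907T183846Z.hdf5",
    "/ligo/groups/cal/H1/measurements/SUSETMX_L1_SS/SUSETMX_L1_SS_20240907T183846Z.hdf5",
    "/ligo/groups/cal/H1/measurements/SUSETMX_L2_SS/SUSETMX_L2_SS_20240907T183846Z.hdf5",
    "/ligo/groups/cal/H1/measurements/SUSETMX_L3_SS/SUSETMX_L3_SS_20240907T183846Z.hdf5",
]

for path in OUTPUTS:
    if not os.path.isfile(path):
        print(f"MISSING: {path}")
        continue
    try:
        with h5py.File(path, "r") as f:
            print(f"OK ({len(f.keys())} top-level groups): {path}")
    except OSError as err:
        print(f"UNREADABLE: {path} ({err})")
```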

Images attached to this report
LHO VE
david.barker@LIGO.ORG - posted 08:15, Saturday 07 September 2024 (79962)
Sat CP1 Fill

Sat Sep 07 08:08:26 2024 INFO: Fill completed in 8min 23secs

Images attached to this report
X1 SUS (SUS)
ibrahim.abouelfettouh@LIGO.ORG - posted 18:57, Thursday 05 September 2024 - last comment - 16:19, Saturday 07 September 2024 (79941)
BBSS M1 Pitch Instability F1 BOSEM Drift: The Saga Continues

Ibrahim, Oli, Jeff, Betsy, Joe, Others

Summary:

Relevant Alogs:

alog 79079: Recent Post-TF Diagnostic Check-up - one of the early discoveries of the drift and pitch instability.

alog 79181: Recent M1 TF comparisons. More recent TFs have been taken (found at /ligo/svncommon/SusSVN/sus/trunk/BBSS/X1/BS/SAGM1/Data on the X1 network). We are waiting on updated confirmation of model parameters so we know what we should be comparing our measurements against. We confirmed d4 a few days ago following the bottom wire loop change, and now seek to confirm d1 and what it means with respect to our referential calibration block.

alog 79042: First investigation into the BOSEM drift - still operating erroneously under the temperature assumption.

alog 79032: First discovery of the drift issue, originally (and erroneously) thought to be part of the diurnal temperature-driven suspension sag (where I thought that some blades sagging more than others contributed to the drift in pitch).

Hypothesis:

We think that this issue is related to the height of the blades for these reasons:

  1. The issue was fixed when we lowered all blades by -1.5mm from the calibration block's "nominal" (zero), with all 4 blades roughly at this number (avg -1.5mm).
  2. The issue came back when we attempted to fix the S-shaped M1 blade tip by correcting the extra swivel it needed in order to stay at the same height (Joe's recommendation to Betsy).
  3. Oli and Jeff have a d1 investigation in alog 76071 that overlays different P-to-P model TFs for blade heights above/below their physical D (called FD in the attached plots).
    1. Interestingly, there is a new mode at roughly 1.9Hz when d is above the model's physical D by ±4mm. This mode is confirmed not to be cross coupling. Our recent TFs don't have it, but earlier TFs with the drift do - I think this is a red herring.
    2. More clearly, the attached file shows overlays from different d1 sizes (Pitch).
  4. While the F1 blades are at an avg height of -1.5mm below nominal calibration block height, the spread between the individual blades is larger than before, with the problematic "soft/S" blade measuring at only -1mm and another blade at -1.8mm. This spread of individual blade heights is the only difference between our current drifty -1.5mm avg and the earlier non-drifty -1.5mm avg. At this point I'm interested in seeing how the spread of the individual blades affects the drift, in addition to just the average d1 drop - could it be a combination of these effects? We can investigate the average-d1 effect by playing with the model and the spread effect by empirically measuring the drift itself (see the sketch after this list).
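
As a concrete illustration of point 4, here is a trivial sketch comparing the net (average) offset against the blade-to-blade spread, using the heights quoted below in Config 2.

```python
# Trivial sketch of the point-4 comparison: same net (average) offset,
# different blade-to-blade spread. Heights are the measured Config 2 values.
import statistics

drifty_config = [-1.6, -1.5, -1.0, -1.8]  # mm below calibration-block nominal
target_avg = -1.5                         # avg of the earlier non-drifty config

avg = statistics.mean(drifty_config)
spread = max(drifty_config) - min(drifty_config)
print(f"average offset: {avg:+.3f} mm (non-drifty config avg {target_avg:+.1f} mm)")
print(f"blade-to-blade spread: {spread:.1f} mm")
```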

Our Units:

Sensor Calibration Block Nominal: 0mm = 25.5mm using shims, drifty - the measurements below are referenced to this.
Config 1: -1.5mm avg = 24mm using shims, no drift.
Config 2: -1.5mm avg = 24mm using shims, drifty. The only difference is a greater spread of the individual blade-tip heights: -1.6mm, -1.5mm, -1.0mm, -1.8mm.

We need to know how the calibration block converts to model parameters in d1 and whether that's effective or physical d1 in the model. Then we can stop using referential units.

To further investigate, we have questions:

  1. What is the "sensor calibration block" calibrated to - physical D (center of mass to blade tip) or effective D? What are these values? We want a way to test parameters against the model directly, rather than via the cal block or shim methods, since right now we're going off potentially old information.
  2. Could differences between the 4 individual blades be causing a drift this stark? (i.e. it's not a net d1 height issue but a blade-to-blade height issue, or a combination.) I suspect this may be the case, since we have two configurations with equal net height (-1.5mm avg) whose only difference is the spread of the individual heights.

Some Early Observations (attempting to constrain our model to our measurements):

  1. TFs taken before and after the F1 drift manifested (now vs. 7 days ago) barely shift the actual peak locations, but that's expected given the nature of TFs (I think).
  2. The difference between -1mm and -3mm drastically changes the position of the 1.05Hz peak. In general, mm-level changes produce noticeable sub-Hz shifts in peak frequency.
  3. The shape of the model curve is different for -3mm and -5mm, having positive inflection; anything higher has our straight/negative-inflection shape.

Attachments:

F1Drift09052024: BOSEM drift over the last 7 days (a trend-fetch sketch follows this list). Notice that the F1 OSEM is the only one showing a drift; LF and RT show a diurnal, temperature-driven change due to suspension sag, but that is unrelated.
F1DriftEuler09052024: BOSEM Euler-basis drift over the last 7 days. Notice that only Pitch shows the drift.
F1DriftM2CountsEuler09052024: BOSEM counts drift in the M2 (PUM) stage, in both the Euler and direct bases. Notice that there is no perceivable drifting or pitching here. Disclaimer: the M2 sat-amp box is old and has a transimpedance issue; I just got a spare and will swap it in when not on shift.
triplemodelcomp_2024-08-30_2300_BBSS_M1toM1: Oli's TF model-to-measurement comparison with different physical d1 offsets in mm. Pitch is the most important here. We want to empirically fit the model to the measurement, but we do not yet know the absolute height of the calibration block in model terms.
allbbss_2024-jan05vJuly12Aug30_X1SUSBS_M1_ALL_ZOOMED_TFs: Oli's drift vs. no-drift vs. model comparison. Oli plans to post an alog with both this information and the d1 distance comparisons once we ascertain the calibration block's absolute units.
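
A minimal sketch of how a drift trend like F1Drift09052024 could be regenerated with gwpy; the channel name and time range here are assumptions (X1 test-stand data may not be in the standard frame archive reachable by TimeSeries.get).

```python
# Minimal sketch: fetch a week of the M1 F1 OSEM readback and plot the trend.
# Channel name and dates are illustrative assumptions.
from gwpy.timeseries import TimeSeries

CHANNEL = "X1:SUS-BS_M1_OSEMINF_F1_OUT_DQ"  # assumed channel name

data = TimeSeries.get(CHANNEL, "2024-08-29", "2024-09-05")
plot = data.plot()
ax = plot.gca()
ax.set_ylabel("F1 OSEM readback [counts]")
ax.set_title("M1 F1 OSEM drift over one week")
plot.show()
```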
Images attached to this report
Non-image files attached to this report
Comments related to this report
oli.patane@LIGO.ORG - 16:19, Saturday 07 September 2024 (79969)

Update to the triplemodelcomp_2024-08-30_2300_BBSS_M1toM1 file Ibrahim attached - the legend has been updated. In the earlier version I described the July 12th measurement as 'New wire loop, d1=-1.5mm, no F1 drift', but there actually was F1 drift during that measurement - it had just started over a week before, so the OSEM values weren't declining as fast as they had been earlier that week. I also want to be more specific about what d1 means in that context, so in this updated version I changed July's d1 to d1_indiv, to better show that that value of -1.5mm is the same for each blade, whereas for the August measurements (now posted) we have d1_net, because the blade heights differ by several tenths of a millimeter but still average out to the same -1.5mm.

Non-image files attached to this comment