Reports until 13:36, Saturday 09 August 2025
H1 CAL
anthony.sanchez@LIGO.ORG - posted 13:36, Saturday 09 August 2025 (86279)
Well Thermalized Calibration Sweep

Command Ran:
pydarm measure --run-headless bb
2025-08-09 12:06:24,938 config file: /ligo/groups/cal/H1/ifo/pydarm_cmd_H1.yaml


notification: new test result
notification: end of measurement
notification: end of test
diag> save /ligo/groups/cal/H1/measurements/PCALY2DARM_BB/PCALY2DARM_BB_20250809T190625Z.xml
/ligo/groups/cal/H1/measurements/PCALY2DARM_BB/PCALY2DARM_BB_20250809T190625Z.xml saved
diag> quit
EXIT KERNEL

2025-08-09 12:11:35,388 bb measurement complete.
2025-08-09 12:11:35,388 bb output: /ligo/groups/cal/H1/measurements/PCALY2DARM_BB/PCALY2DARM_BB_20250809T190625Z.xml
2025-08-09 12:11:35,389 all measurements complete.
------------------------------------------------------------------------------------

Command Ran:
gpstime;python /ligo/groups/cal/src/simulines/simulines/simuLines.py -i /ligo/groups/cal/H1/simulines_settings/newDARM_20231221/settings_h1_20250212.ini;gpstime
Approximate Time ran:
~UTC:  2025-08-09 19:17:38
~GPS: 1438802276
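As an aside, the UTC-to-GPS conversion shown above can be checked with a few lines (a sketch assuming the current GPS-UTC offset of 18 leap seconds; production tools such as the `gpstime` utility used above handle leap seconds automatically):

```python
from datetime import datetime, timezone

GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)  # GPS time counts from this epoch
LEAP_SECONDS = 18  # GPS-UTC offset as of 2025; this changes when leap seconds are added

def utc_to_gps(utc_dt):
    """Convert an aware UTC datetime to integer GPS seconds."""
    return int((utc_dt - GPS_EPOCH).total_seconds()) + LEAP_SECONDS

print(utc_to_gps(datetime(2025, 8, 9, 19, 17, 38, tzinfo=timezone.utc)))  # 1438802276
```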

2025-08-09 19:17:38,535 | INFO | File written out to: /ligo/groups/cal/H1/measurements/DARMOLG_SS/DARMOLG_SS_20250809T191738Z.hdf5
2025-08-09 19:17:38,543 | INFO | File written out to: /ligo/groups/cal/H1/measurements/PCALY2DARM_SS/PCALY2DARM_SS_20250809T191738Z.hdf5
2025-08-09 19:17:38,548 | INFO | File written out to: /ligo/groups/cal/H1/measurements/SUSETMX_L1_SS/SUSETMX_L1_SS_20250809T191738Z.hdf5
2025-08-09 19:17:38,553 | INFO | File written out to: /ligo/groups/cal/H1/measurements/SUSETMX_L2_SS/SUSETMX_L2_SS_20250809T191738Z.hdf5
2025-08-09 19:17:38,557 | INFO | File written out to: /ligo/groups/cal/H1/measurements/SUSETMX_L3_SS/SUSETMX_L3_SS_20250809T191738Z.hdf5
2025-08-09 19:17:38,557 | INFO | Overall driving parameters:
2025-08-09 19:17:38,557 | INFO | Preparing to scan the following sweeps ['DARM_OLGTF', 'PCALY2DARMTF', 'L1_SUSETMX_iEXC2DARMTF', 'L2_SUSETMX_iEXC2DARMTF', 'L3_SUSETMX_iEXC2DARMTF']
2025-08-09 19:17:38,557 | INFO | With parameters: BW: 0.14; DrivingCycles: 40; MinDriveTime: 3; Averages: 5; Overlap: 0; RampUpTime: 3; RampDownTime: 3; SettleTimePercent: 10; ScanTime: 1229
2025-08-09 19:17:38,558 | INFO | For DARM_OLGTF, prepaing EXC: H1:LSC-DARM1_EXC; A: ['H1:LSC-DARM1_IN2', 'H1:LSC-DARM1_EXC']; B: ['H1:LSC-DARM1_IN1', 'H1:LSC-DARM1_IN2']
2025-08-09 19:17:38,560 | INFO | For PCALY2DARMTF, prepaing EXC: H1:CAL-PCALY_SWEPT_SINE_EXC; A: ['H1:CAL-PCALY_RX_PD_OUT_DQ', 'H1:CAL-PCALY_RX_PD_OUT_DQ', 'H1:CAL-PCALY_TX_PD_OUT_DQ']; B: ['H1:CAL-DELTAL_EXTERNAL_DQ', 'H1:LSC-DARM_IN1_DQ', 'H1:LSC-DARM_IN1_DQ']
2025-08-09 19:17:38,562 | INFO | For L1_SUSETMX_iEXC2DARMTF, prepaing EXC: H1:SUS-ETMX_L1_CAL_EXC; A: ['H1:SUS-ETMX_L1_CAL_EXC']; B: ['H1:LSC-DARM_IN1_DQ']
2025-08-09 19:17:38,563 | INFO | For L2_SUSETMX_iEXC2DARMTF, prepaing EXC: H1:SUS-ETMX_L2_CAL_EXC; A: ['H1:SUS-ETMX_L2_CAL_EXC']; B: ['H1:LSC-DARM_IN1_DQ']
2025-08-09 19:17:38,565 | INFO | For L3_SUSETMX_iEXC2DARMTF, prepaing EXC: H1:SUS-ETMX_L3_CAL_EXC; A: ['H1:SUS-ETMX_L3_CAL_EXC']; B: ['H1:LSC-DARM_IN1_DQ']
2025-08-09 19:17:38,579 | INFO | Start data collection while monitoring 5 processes
2025-08-09 19:17:49,583 | INFO | Scanning frequency 7.97 in Scan : PCALY2DARMTF on PID: 2113968
2025-08-09 19:17:49,583 | INFO | Scanning frequency 12.86 in Scan : L1_SUSETMX_iEXC2DARMTF on PID: 2113971


...
...
...


2025-08-09 19:40:11,968 | INFO | Drive, on L1_SUSETMX_iEXC2DARMTF, at frequency: 12.33, and amplitude 15.362, is finished. GPS start and end time stamps: 1438803608, 1438803625
2025-08-09 19:40:12,277 | INFO | 0 still running.
2025-08-09 19:40:12,277 | INFO | gathering data for a few more seconds
2025-08-09 19:40:18,285 | INFO | Finished gathering data. Data ends at 1438803635.0
2025-08-09 19:40:18,396 | INFO | It is SAFE TO RETURN TO OBSERVING now, whilst data is processed.
2025-08-09 19:40:18,396 | INFO | Commencing data processing.
2025-08-09 19:40:18,396 | INFO | Ending lockloss monitor. This is either due to having completed the measurement, and this functionality being terminated; or because the whole process was aborted.
2025-08-09 19:41:00,607 | INFO | File written out to: /ligo/groups/cal/H1/measurements/DARMOLG_SS/DARMOLG_SS_20250809T191738Z.hdf5
2025-08-09 19:41:00,615 | INFO | File written out to: /ligo/groups/cal/H1/measurements/PCALY2DARM_SS/PCALY2DARM_SS_20250809T191738Z.hdf5
2025-08-09 19:41:00,620 | INFO | File written out to: /ligo/groups/cal/H1/measurements/SUSETMX_L1_SS/SUSETMX_L1_SS_20250809T191738Z.hdf5
2025-08-09 19:41:00,625 | INFO | File written out to: /ligo/groups/cal/H1/measurements/SUSETMX_L2_SS/SUSETMX_L2_SS_20250809T191738Z.hdf5
2025-08-09 19:41:00,630 | INFO | File written out to: /ligo/groups/cal/H1/measurements/SUSETMX_L3_SS/SUSETMX_L3_SS_20250809T191738Z.hdf5
PDT: 2025-08-09 12:41:00.802402 PDT
UTC: 2025-08-09 19:41:00.802402 UTC
GPS: 1438803678.802402
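Each sweep in the log above drives an EXC channel and records A/B channel pairs; the transfer function from A to B is then estimated from the cross spectrum over the drive power spectrum. A minimal sketch of that kind of estimate on synthesized data (not the actual simuLines code; the plant gain and noise level here are made up):

```python
import numpy as np
from scipy import signal

fs = 512
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)

# Stand-ins for an A (excitation) and B (response) channel pair, with plant gain 2
a = rng.standard_normal(t.size)
b = 2.0 * a + 0.01 * rng.standard_normal(t.size)

# H1-style transfer function estimate: H(f) = CSD(a, b) / PSD(a)
f, Pab = signal.csd(a, b, fs=fs, nperseg=4096)
_, Paa = signal.welch(a, fs=fs, nperseg=4096)
H = Pab / Paa
print(abs(H[10]))  # close to the injected gain of 2.0
```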
Images attached to this report
Non-image files attached to this report
LHO VE
david.barker@LIGO.ORG - posted 10:21, Saturday 09 August 2025 (86278)
Sat CP1 Fill

Sat Aug 09 10:06:22 2025 INFO: Fill completed in 6min 18secs

Images attached to this report
H1 General
anthony.sanchez@LIGO.ORG - posted 08:11, Saturday 09 August 2025 (86277)
Saturday Ops Shift Start & Lockloss From NLN


TITLE: 08/09 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Observing at 153Mpc
OUTGOING OPERATOR: Tony
CURRENT ENVIRONMENT:
    SEI_ENV state: EARTHQUAKE
    Wind: 0mph Gusts, 0mph 3min avg
    Primary useism: 0.25 μm/s
    Secondary useism: 0.12 μm/s
QUICK SUMMARY:
When I walked in, H1 was locked and observing, but an incoming 6.0-magnitude earthquake from Severo-Kuril’sk, Russia caused a lockloss at 14:36 UTC.

Images attached to this report
LHO General
ibrahim.abouelfettouh@LIGO.ORG - posted 22:00, Friday 08 August 2025 (86276)
OPS Eve Shift Summary

TITLE: 08/09 Eve Shift: 2330-0500 UTC (1630-2200 PST), all times posted in UTC
STATE of H1: Lock Acquisition
INCOMING OPERATOR: Oli
SHIFT SUMMARY:

IFO is LOCKING at PREP_FOR_LOCKING

IFO was locked for the majority of the shift, with one lockloss at 04:07 UTC ending a nearly 9-hour lock. The cause is still unknown, but it looks like a glitch based on the saturation plots from the H1 lockloss tool.

H1 had poor flashes in DRMI and even worse flashes in PRMI, so after letting it try for a bit I started an initial alignment, which completed fully automatically. We're now locking, hopefully also fully automatically, with H1 in managed mode (for the OWL shift).

LOG:

None

H1 General
ibrahim.abouelfettouh@LIGO.ORG - posted 21:36, Friday 08 August 2025 (86275)
Lockloss 04:07 UTC

Unknown-cause lockloss. The H1 lockloss tool doesn't flag it as an ETM glitch, though EX L3 seems to have glitched before the other suspensions (attached).

Images attached to this report
H1 TCS (TCS)
ibrahim.abouelfettouh@LIGO.ORG - posted 16:44, Friday 08 August 2025 (86272)
TCS Monthly Trends - FAMIS 28463

Closes FAMIS 28463. Last checked in alog 85690.

CO2 Trends:

HWS FAMIS:

Plots attached

Images attached to this report
H1 General
anthony.sanchez@LIGO.ORG - posted 16:32, Friday 08 August 2025 (86271)
Ops Friday Mid Shift report

TITLE: 08/08 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Observing at 153Mpc
INCOMING OPERATOR: Ibrahim
SHIFT SUMMARY:
Fairly quick shift due to all the tours coming through.
Lockloss at 17:26:00 UTC alog 86264
GRB-Short E588408 at 18:26:03 UTC
Nominal_Low_Noise Reached at 19:21:18 UTC
Observing reached at 19:32:18 UTC
LOG:

Start Time | System | Name           | Location                | Laser_Haz | Task                              | End Time
19:33      | SAF    | Laser HAZARD   | LVEA                    | YES       | LVEA is Laser HAZARD              | 11:33
17:04      | Tour   | Jennie & tour  | Overpass                | N         | Running a tour                    | 17:34
17:29      | PEM    | Sam            | LVEA                    | Yes       | Removing tape from accelerometers | 17:40
17:45      | Tour   | Cassidy & tour | Overpass                | N         | Running a tour                    | 18:45
19:53      | Tour   | Cassidy & tour | Control room & overpass | N         | Running a tour                    | 20:53
22:32      | ISS    | Jennie         | Optics Lab              | N         | ISS Array work                    | 23:32
H1 SUS (SUS)
ibrahim.abouelfettouh@LIGO.ORG - posted 16:20, Friday 08 August 2025 (86270)
Weekly In-Lock SUS Charge Measurement - FAMIS 28417

Closes FAMIS 28417. Last checked in alog 86121.

Neither the IX nor the EX plots could generate new values due to insufficient coherence. EX is one Tuesday behind, while IX is now two Tuesdays behind.
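The coherence cut works roughly like this: transfer-function points are only trusted where the coherence between the drive and response channels is high. A toy illustration with synthesized data (the 0.9 threshold and signal levels are hypothetical, not the actual FAMIS criterion):

```python
import numpy as np
from scipy import signal

fs = 256
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(2)

drive = rng.standard_normal(t.size)
weak = 0.5 * drive + rng.standard_normal(t.size)            # poor SNR: low coherence
strong = 0.5 * drive + 0.05 * rng.standard_normal(t.size)   # good SNR: high coherence

f, coh_weak = signal.coherence(drive, weak, fs=fs, nperseg=1024)
_, coh_strong = signal.coherence(drive, strong, fs=fs, nperseg=1024)

# A measurement point would only be kept where coherence exceeds some threshold
threshold = 0.9
print(np.median(coh_weak) < threshold, np.median(coh_strong) > threshold)
```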

All plots attached.

Images attached to this report
H1 General (Lockloss)
anthony.sanchez@LIGO.ORG - posted 16:13, Friday 08 August 2025 (86264)
Lockloss from NLN

After a good run of 21 hours and 48 minutes, H1 was unlocked by an ETM glitch at 17:26:01 UTC.

Images attached to this report
LHO FMCS (PEM)
ibrahim.abouelfettouh@LIGO.ORG - posted 16:08, Friday 08 August 2025 (86269)
Checking HVAC Fans - Weekly FAMIS

Closes FAMIS 26590. Last checked in alog 86188

Everything is below threshold, but:

- MY FAN 270 1 ACC jumped from 0.2 counts to ~0.35 counts 1.5 days ago.

- All active Corner Station fans turned on and off, presumably due to the drill that happened on 08/05 at ~1 PM PT.

Plots attached.

Images attached to this report
LHO General
ibrahim.abouelfettouh@LIGO.ORG - posted 16:00, Friday 08 August 2025 (86268)
OPS Eve Shift Start

TITLE: 08/08 Eve Shift: 2330-0500 UTC (1630-2200 PST), all times posted in UTC
STATE of H1: Observing at 151Mpc
OUTGOING OPERATOR: Tony
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 17mph Gusts, 11mph 3min avg
    Primary useism: 0.04 μm/s
    Secondary useism: 0.07 μm/s
QUICK SUMMARY:

IFO is in NLN and OBSERVING since 19:32 UTC (3hr 40 min lock)

Nothing else to report.

H1 CAL
anthony.sanchez@LIGO.ORG - posted 13:38, Friday 08 August 2025 (86267)
PCAL Lab SPI Beamsplitter Investigations

On Tuesday, Rick and I went into the PCAL lab to change the optical layout to assist with the SPI beamsplitter characterization.
Everything outside of the TX module was changed.

The spheres were lowered.
The periscopes were removed.

We are now using just one beam (the inner beam).
It hits a polarizing beamsplitter cube before a beamsplitter, which splits the beam down one arm into a PCAL power sensor integrating sphere.
The other beam is sent to an HR mirror, which sends the majority of the beam down to the other PCAL power sensor integrating sphere. Some of the beam power goes through the back of that HR mirror and is ~153 uW according to our power meter.

Images attached to this report
H1 SUS (SEI)
brian.lantz@LIGO.ORG - posted 11:44, Friday 08 August 2025 (86265)
updated blend filter for SR3 yaw OSEM estimator

I've updated the blend filter for the SR3 yaw estimator; v2 is called "skinnynotch" (v1 was "doublenotch"). I have not yet updated the installation scripts for it. It's based on the yaw fits by Ivey at the end of July. There are more recent fits, but they seem very similar, so I'm going to post this and let folks use it for the prediction calculations.

The new blend is based on putting a simple notch in the model path at each lightly damped resonance, and then rolling off the model at low frequency. The goal is to use the OSEM signal at the resonances and for the position info. (NOTE - the damping loop is AC coupled, so the estimator probably doesn't need the real OSEM signal at DC, but for now I'm leaving it in place so that the OSEM signal and the estimator signal will be as similar as possible for comparison purposes.)

Figure 1 shows the MODEL path of the blend vs. the plant model. The dashed lines are at the plant resonances. The notch width is chosen by eye. The minimum notch transmission is 0.2, so we would expect 0.2 of the signal from the model and 0.8 from the OSEM. However, note that the phase at the bottom of the third notch is about 15 deg instead of 0. This means (see later) that the peak of the complementary OSEM signal is slightly shifted. For v3 I think we should try to make peaks for the OSEMs instead of notches for the models, because I think this will be better. Ivey and Edgard have some math to look at the performance.

Figures 2 and 3 show the complementary filters; Figure 3 is a zoom around the resonances.

Figure 4 compares the plant model with the OSEM path. 
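The 0.2/0.8 split at the resonances can be illustrated with a toy complementary pair: a notch floored at 0.2 for the model path, with the OSEM path defined as its exact complement. This uses `scipy.signal.iirnotch` as a stand-in; the Q and the filter form are illustrative, not the actual skinnynotch design:

```python
import numpy as np
from scipy import signal

fs = 64.0
f0, Q = 2.297, 30.0  # second yaw resonance; Q here is a guess for illustration

# Unity-gain notch (zero at f0), then floor it at 0.2 for the model path;
# the OSEM path is the exact complement so the two always sum to 1
b, a = signal.iirnotch(f0, Q, fs=fs)
freqs = np.linspace(0.1, 10.0, 2000)
_, N = signal.freqz(b, a, worN=freqs, fs=fs)
model_path = 0.2 + 0.8 * N
osem_path = 1.0 - model_path
assert np.allclose(model_path + osem_path, 1.0)  # complementarity by construction

# At the resonance itself the model contributes ~0.2 and the OSEM ~0.8
_, N0 = signal.freqz(b, a, worN=[f0], fs=fs)
print(abs(0.2 + 0.8 * N0[0]))  # ~0.2
```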

I've also attached a pdf of the four plots.

Design script: Estimator_blend_skinnynotch_SR3yaw_20250723.m 
in the SUS_svn at  .../HLTS/Common/FilterDesign/Estimator/   (revision 12586)

The first blend and its installation are described in LHO log 84004

Images attached to this report
Non-image files attached to this report
LHO VE
david.barker@LIGO.ORG - posted 10:40, Friday 08 August 2025 (86262)
Fri CP1 Fill

Fri Aug 08 10:07:45 2025 INFO: Fill completed in 7min 41secs


Images attached to this report
H1 General
anthony.sanchez@LIGO.ORG - posted 08:10, Friday 08 August 2025 (86260)
Friday Morning Ops Day shift

TITLE: 08/08 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Observing at 156Mpc
OUTGOING OPERATOR: Ryan C
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 1mph Gusts, 0mph 3min avg
    Primary useism: 0.01 μm/s
    Secondary useism: 0.07 μm/s
QUICK SUMMARY:
H1 Has been locked for 19 hours and is currently Observing. 

CDS Overview screen has the same VAC channel alarm on that was listed in an alog 86186 from Monday. 
All other systems look good.

LHO General
ryan.short@LIGO.ORG - posted 22:04, Thursday 07 August 2025 (86258)
Ops Eve Shift Summary

TITLE: 08/08 Eve Shift: 2330-0500 UTC (1630-2200 PST), all times posted in UTC
STATE of H1: Observing at 147Mpc
INCOMING OPERATOR: Ryan C
SHIFT SUMMARY: Extremely quiet shift with H1 observing throughout; current lock stretch is at 9.5 hours. The wind picked up a couple hours ago and gusts have been peaking at around 30mph, but this hasn't appeared to have much of an impact.

H1 PEM
samantha.callos@LIGO.ORG - posted 18:09, Thursday 07 August 2025 - last comment - 09:17, Monday 11 August 2025(86257)
20-40 Hz HVAC Noise

Samantha Callos, Robert Schofield

There is a persistent peak at 20 Hz that has been appearing and disappearing intermittently for several months. The times when it was present corresponded to working hours on weekdays. Robert and I looked at the summary pages and DTT for the floor accelerometers and noted that the noise was strongest in the area around YCRYO. After ruling out other locations around the site, we determined that the noise was coming from the Vacuum Prep Warehouse (VPW). We tested the seismic isolation of the various HVAC units around the warehouse and identified the Liebert AC unit inside the VPW, which can be turned on and off manually and was moved there sometime in the last year.

I cycled through intervals of turning the Liebert unit off and on and checked the CS floor accelerometers for those times (see times below, Fig. 1, and blue arrows in Fig. 4). When the unit is off, the 20 Hz spike in the accelerometers disappears. I then checked coupling to DARM and noted that the noise at 20 Hz was present, with a harmonic at 40 Hz as well (see Fig. 2).

Additionally, for the external AC unit (Daikin), we noted that the springs it is mounted on are fully compressed, so there is little to no seismic isolation for the entire unit. Noise from the previous unit has been found in DARM before (see alog 77477). The closest accelerometer to this HVAC unit is the YCRYO floor accelerometer, and the shutdown period can be seen in its spectrogram (see Fig. 3). The unit can also be seen automatically turning off and on in the summary-page 24-hour spectrogram (see Fig. 4). Comparing the shutdown period to when the unit is on, the noise does not appear to be making its way into DARM for now; however, we recommend seismically isolating the Daikin, since we have seen it couple to DARM before.
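The on/off test amounts to comparing the accelerometer spectrum between intervals. A sketch of that comparison on synthesized data (the 20 Hz line amplitude and noise floor are made up; the real analysis used DTT and the summary pages):

```python
import numpy as np
from scipy import signal

fs = 256
t = np.arange(0, 120, 1 / fs)
rng = np.random.default_rng(1)

# Synthetic floor-accelerometer data: "unit on" injects a 20 Hz line into the background
unit_on = rng.standard_normal(t.size) + 0.5 * np.sin(2 * np.pi * 20 * t)
unit_off = rng.standard_normal(t.size)

def line_height(x, f_line=20.0):
    """PSD at the line frequency relative to the local background median."""
    f, p = signal.welch(x, fs=fs, nperseg=fs * 8)
    i = np.argmin(abs(f - f_line))
    background = np.median(p[(f > 15) & (f < 25)])
    return p[i] / background

print(line_height(unit_on), line_height(unit_off))  # large ratio when on, ~1 when off
```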

Liebert AC on/off times for 08-05-2025:

Liebert AC on/off times for 08-06-2025:

Liebert/Daikin on/off times 08-06-2025:

Images attached to this report
Comments related to this report
shivaraj.kandhasamy@LIGO.ORG - 02:25, Friday 08 August 2025 (86259)

In a-log 85984, we noted that in addition to the sharp line at 21.26 Hz in DARM there was also a ~1 Hz broad feature around the 21.26 Hz line. At that time it wasn't clear whether those two features were connected; looking at the on/off times from this test, it seems they are. The first attached figure is DARM (GDS STRAIN CLEAN) and the second is the LVEAFLOOR YCRYO accelerometer. The feature in the accelerometer is sharp, while in DARM we see both the sharp line and the ~1 Hz broad feature. In the comment to a-log 85984, we looked at different auxiliary and PEM channels to check whether any other channels show both features. Among the channels we looked at, we saw both features in the ASC-PRC1_{Y,P} signals. The third figure shows the spectrogram of ASC-PRC1_P_IN1_DQ during these on/off tests, in which we see the sharp as well as the ~1 Hz broad feature. It is not clear whether this is just another witness of the DARM feature or the place where it gets into DARM.

Images attached to this comment
richard.mccarthy@LIGO.ORG - 08:10, Friday 08 August 2025 (86261)

The Liebert unit was installed circa 2018. The outdoor unit was replaced more recently.

elenna.capote@LIGO.ORG - 09:17, Monday 11 August 2025 (86293)

I just want to add that we do not use PRC1 P/Y in loop in full IFO lock. However, the PRC1 error signal is the POP A DC signal from the POP QPD on HAM3. This means that the POP A QPD may be a good witness of this line, but is not the coupling source of the line.

H1 PEM
carlos.campos@LIGO.ORG - posted 17:19, Thursday 07 August 2025 - last comment - 11:28, Friday 08 August 2025(86252)
Broken 3T Seismometer

Carlos Campos, Robert Schofield.

We had a faulty Guralp 3T seismometer that would not unlock its pendula. When we contacted the supplier, we were told we could send it back to be diagnosed and fixed, or we could open it up on site. We chose the latter, as we thought it was a simple mechanical issue.

fig 1  fig 2  fig 3 

The seismometer is protected by a metal outer layer as well as two layers of shielding, to reduce noise affecting the system. Additionally, the internal components are made of brass to further limit noise.

While looking around at the internal structures, we found that a ball bearing had fallen onto the table. We then searched for where it came from, as it was most likely the cause of the failure.

fig 4

The pendula that measure horizontal movement rest on a triangular base of two ball bearings and a drive screw. The drive screw is connected to a motor which can lock, unlock, and center the mass. The screw also sits on a ball bearing. This way, the pendula rock on the two free bearings, while the drive screw controls the movement of the masses.

fig 5

This is a picture of how the mass system should look. The screw presses on the bearing and will push the mass up or pull it down.

fig 6

This is a picture of the problem pendulum. The ball bearing for the drive screw is missing, meaning that when the seismometer first tried to unlock and balance the mass, the motor drove the screw into the brass. This either damaged the screw, the brass, the motor, or all of the above.

The manufacturer told us that we would have to ship it back so they could fix it.

Images attached to this report
Comments related to this report
jennifer.wright@LIGO.ORG - 11:28, Friday 08 August 2025 (86263)EPO

Tagging EPO for cool equipment pictures.

H1 SUS (SEI)
brian.lantz@LIGO.ORG - posted 17:59, Friday 18 April 2025 - last comment - 11:53, Friday 08 August 2025(84004)
Estimator progress/ SR3 model status

We've had an excellent week of progress on the estimator - thanks to everyone on site for the great hospitality!

Status of things as we go

1. The estimator is OFF. We set the damping of M1 Yaw back to -0.5.

2. There are new YAW estimator blends in the SR3 model. These were put into foton with autoquack. The foton file in userapps was committed to the SVN.

3. We updated the safe.snap SDF file with a decent version of the OFF estimator. We HAVE NOT updated the observing.snap file. At this point, all the estimator settings should be the same in safe and observing. (I'm not sure how to update the observing.snap file)

4. All the work on the estimator design is all committed to the {SUS_SVN}/sus/trunk/HLTS/Common/FilterDesign/Estimator/  

 

-- some detailed notes on the blend design and svn commits follow --

Design new blend filters, load them into the model, commit the updated foton file

It seems like a 2% error in the peak finding makes a bunch of noise in the estimator with the aggressive blend, and is not a reasonable error (judgement call by Brian and Edgard).
 
>> print -dpng fig_2pcnt_error.png
>> print -dpng fig_2pcnt_error_result.png

Check noise again with 2% error in model/actual using the robust blend (EB blend) - we see the peaks are not any better.

I can't get a broad notch for notch 3 without causing the OSEM filter to be larger than 1. The issue seems to be the freq of the notches going past 60 deg. Could be tuned further.
Instead - use a simple notch. This means we'll need to be quite accurate with the 3rd peak - probably within 0.5% of the actual frequency.

figures
gain error - no performance hit
2% freq error - clear perf hit
1% freq error - acceptable perf hit - top mode clearly worse, but only a little
0.5% freq error - tiny perf hit at top mode only

print -dpng fig_perf1_gainerror.png
>> print -dpng fig_perf2_0p5freqerror.png
>> print -dpng fig_perf3_1p0freqerror.png
>> print -dpng fig_perf4_perfectmatch.png
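The tolerance trend above can be illustrated numerically: evaluate how much of a resonance leaks through a notch when the notch frequency is mistuned. `iirnotch` is a hypothetical stand-in here, and the Q is a guess, so the absolute numbers differ from the real doublenotch blends, but the monotonic degradation is the point:

```python
import numpy as np
from scipy import signal

fs = 64.0
f0, Q = 3.385, 30.0  # third yaw mode; Q is illustrative, not the installed design

def leakage(freq_error):
    """|notch response| at the true resonance when the notch is mistuned."""
    b, a = signal.iirnotch(f0 * (1 + freq_error), Q, fs=fs)
    _, h = signal.freqz(b, a, worN=[f0], fs=fs)
    return abs(h[0])

# Leakage grows monotonically with mistuning, matching the observed perf hits
for err in (0.0, 0.005, 0.01, 0.02):
    print(f"{err:.1%} frequency error -> leakage {leakage(err):.2f}")
```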

turn the script into a blend design script - Estimator_blend_doublenotch_SR3yaw.m

update the yaw frequencies to 1.016, 2.297, 3.385
can we use autoquack? - yes!
Real foton file is: /opt/rtcds/userapps/release/sus/h1/filterfiles/H1SUSSR3.txt

(make a backup copy): /opt/rtcds/userapps/release/sus/h1/filterfiles$ cp H1SUSSR3.txt H1SUSSR3backup.txt

the file make_SR3_yaw_blend.m uses autoquack to put the new filters into the SR3 foton file.

(log notes)
please review the recent foton -c log file at
/opt/rtcds/lho/h1/log/h1sussr3/autoquack_foton_log_recent.log
   Checking foton file to see if filters got implemented correctly
BAD - Filter SR3_M1_YAW_EST_MEAS_BP has issues in sect. 1 : DBL_notch
at least one filter got messed up, please follow up...
Autoquack process complete
initial foton call succeeded
foton file ready for updating
starting foton cleanup process
final foton call succeeded
log file updated
please review the recent foton -c log file at
/opt/rtcds/lho/h1/log/h1sussr3/autoquack_foton_log_recent.log
   Checking foton file to see if filters got implemented correctly
BAD - Filter SR3_M1_YAW_EST_MODL_BP has issues in sect. 1 : DBL_notch
at least one filter got messed up, please follow up...
Autoquack process complete
>>

Check the foton file - it looks good - I checked the TFs by eye, and they look correct. the matlab error checker is irritated, but the matlab plots it makes look fine. I think it's OK.

Do a diff on the updated file and my backup - the only diffs I see are the new lines I added (that's good)

save the foton file, delete my backup.

press 'coef load' to get the new filters
(the CFC light goes green)

commit the updated foton file in userapps R31301

Save the work in the estimator folder

Estimator$ svn1.6 add fig*
A  (bin)  fig_2pcnt_error.png
A  (bin)  fig_2pcnt_error_result_EBblend.png
A  (bin)  fig_2pcnt_error_result.png
A  (bin)  fig_blend.png
A  (bin)  fig_perf1_gainerror.png
A  (bin)  fig_perf2_0p5freqerror.png
A  (bin)  fig_perf3_1p0freqerror.png
A  (bin)  fig_perf4_perfectmatch.png

$ svn1.6 add Estimator_blend_doublenotch_SR3yaw.m make_SR3_yaw_blend.m
A         Estimator_blend_doublenotch_SR3yaw.m
A         make_SR3_yaw_blend.m

committed in R12257

Set the model to a good state:

final switch = OFF.
gain of the normal yaw damping set back to -0.5

OSEM_Damper  = populated, but off (in=off, out=off, gain=0)
Estim_Damper = populated, but off (in=off, out=off, gain=0)

OSEM bandpass = populated and set to running state (on, gain=1)
MODEL bandpass =  populated and set to running state (on, gain=1)

 

accept SDF changes in H1:SUS-SR3_M1_
YAW_EST_MODL_BP
YAW_EST_OSEM_BP
YAW_DAMP_EST
YAW_DAMP_OSEM
save this to the safe file - I have not changed the observing file!

the SDF shows 0 differences

 

notes on Diff of foton file:

brian.lantz@cdsws44:/opt/rtcds/userapps/release/sus/h1/filterfiles$ diff H1SUSSR3.txt H1SUSSR3backup.txt
1025,1030d1024
< # DESIGN   SR3_M1_YAW_EST_MEAS_BP 0 sos(0.00026333867650529759, [0.99999999999999867; 0; -0.9999616512111712; 0; -1.999953052714962; \
< #                                   0.99995343154104388; -1.9997062783494941; 0.99970796324744837; -1.9998957078482871; \
< #                                   0.99989686159668212; -1.9999090565491939; 0.99990984807473893; -1.9999426937932141; \
< #                                   0.99994346902553977; -1.9999108726927539; 0.99991163318180731; -1.9999774146135141; \
< #                                   0.99997744772231478; -1.9999375279667919; 0.99993768590749621; -1.999963134023643; \
< #                                   0.99996328560740799; -1.9999399837273171; 0.9999401295235868])
1032,1037d1025
< SR3_M1_YAW_EST_MEAS_BP 0 21 6      0      0 DBL_notch  2.633386765052975870236851e-04  -0.9999616512111712   0.0000000000000000   0.9999999999999987   0.0000000000000000
<                                                                  -1.9997062783494941   0.9997079632474484  -1.9999530527149620   0.9999534315410439
<                                                                  -1.9999090565491939   0.9999098480747389  -1.9998957078482871   0.9998968615966821
<                                                                  -1.9999108726927539   0.9999116331818073  -1.9999426937932141   0.9999434690255398
<                                                                  -1.9999375279667919   0.9999376859074962  -1.9999774146135141   0.9999774477223148
<                                                                  -1.9999399837273171   0.9999401295235868  -1.9999631340236430   0.9999632856074080
1042,1047d1029
< # DESIGN   SR3_M1_YAW_EST_MODL_BP 0 sos(0.99973666135688777, [-1.000000000000002; 0; -0.9999616512111712; 0; -1.999965862142562; \
< #                                   0.99996754725922932; -1.9997062783494941; 0.99970796324744837; -1.9999754834715859; \
< #                                   0.99997627502341802; -1.9999090565491939; 0.99990984807473893; -1.999975984305477; \
< #                                   0.99997674481928778; -1.9999108726927539; 0.99991163318180731; -1.9999871245769849; \
< #                                   0.99998728252160574; -1.9999375279667919; 0.99993768590749621; -1.9999876354434321; \
< #                                   0.99998778124317389; -1.9999399837273171; 0.9999401295235868])
1049,1054d1030
< SR3_M1_YAW_EST_MODL_BP 0 21 6      0      0 DBL_notch  9.997366613568877680151559e-01  -0.9999616512111712   0.0000000000000000  -1.0000000000000020   0.0000000000000000
<                                                                  -1.9997062783494941   0.9997079632474484  -1.9999658621425620   0.9999675472592293
<                                                                  -1.9999090565491939   0.9999098480747389  -1.9999754834715859   0.9999762750234180
<                                                                  -1.9999108726927539   0.9999116331818073  -1.9999759843054770   0.9999767448192878
<                                                                  -1.9999375279667919   0.9999376859074962  -1.9999871245769849   0.9999872825216057
<                                                                  -1.9999399837273171   0.9999401295235868  -1.9999876354434321   0.9999877812431739

Images attached to this report
Comments related to this report
brian.lantz@LIGO.ORG - 11:53, Friday 08 August 2025 (86266)

see LHO log 86265 for v2 of this blend.
