Displaying reports 2521-2540 of 77280.
Reports until 05:12, Sunday 07 April 2024
H1 PEM (DetChar, PEM)
shivaraj.kandhasamy@LIGO.ORG - posted 05:12, Sunday 07 April 2024 (77009)
A couple of disconnected(?) accelerometer channels

The spectra of CS_ACC_LVEAFLOOR_YCRYO_Z and CS_ACC_HAM3_PR2_Y are much lower than those of the other accelerometer channels.

(i) The channel CS_ACC_LVEAFLOOR_YCRYO_Z has been like that since March 12. 

(ii) The channel CS_ACC_HAM3_PR2_Y has been like that since Jan 30. 

It seems they are either disconnected or have a loose connection. 

H1 General
oli.patane@LIGO.ORG - posted 00:02, Sunday 07 April 2024 (77008)
Ops EVE Shift End

TITLE: 04/07 Eve Shift: 23:00-07:00 UTC (16:00-00:00 PDT), all times posted in UTC
STATE of H1: Observing at 157Mpc
INCOMING OPERATOR: TJ
SHIFT SUMMARY: Currently Observing and have been Locked for 3 hours. A lockloss earlier in the evening required me to adjust ALSY and move the beamsplitter for MICH/PRMI/DRMI, but we got back up quickly after that.
LOG:

23:00UTC Detector Observing and Locked for 7.5 hours

02:47 Lockloss
    - ALSY had really high flashes but was struggling to catch until I bumped the flashes up by ~0.02.
    - We ended up going through CHECK_MICH_FRINGES but I was able to get us aligned enough to go through DRMI without needing to do an initial alignment - we'll definitely need to run an initial alignment next time
03:54 NOMINAL_LOW_NOISE
03:56 Observing

H1 General (Lockloss)
oli.patane@LIGO.ORG - posted 19:48, Saturday 06 April 2024 (77007)
Lockloss

Lockloss 04/07 02:47UTC from unknown cause

H1 General
oli.patane@LIGO.ORG - posted 16:03, Saturday 06 April 2024 (77006)
Ops EVE Shift Start

TITLE: 04/06 Eve Shift: 23:00-07:00 UTC (16:00-00:00 PDT), all times posted in UTC
STATE of H1: Observing at 156Mpc
OUTGOING OPERATOR: Ryan S
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 9mph Gusts, 6mph 5min avg
    Primary useism: 0.02 μm/s
    Secondary useism: 0.19 μm/s
QUICK SUMMARY:

Detector is Observing and has been Locked for 7.5 hours. Things are looking good!

LHO General
ryan.short@LIGO.ORG - posted 16:02, Saturday 06 April 2024 (77005)
Ops Day Shift Summary

TITLE: 04/06 Day Shift: 15:00-23:00 UTC (08:00-16:00 PDT), all times posted in UTC
STATE of H1: Observing at 154Mpc
INCOMING OPERATOR: Oli
SHIFT SUMMARY: Quiet shift today with just one break from observing for calibration sweeps. The expected sitewide network outage didn't happen today.

H1 has now been locked for over 7 hours.

LOG:

No log for this shift.

H1 CAL
ryan.short@LIGO.ORG - posted 12:03, Saturday 06 April 2024 (77004)
Broadband and Simulines Calibration Sweeps

Following instructions from the TakingCalibrationMeasurements wiki, broadband PCal and Simulines sweeps were run at 18:30 UTC after dropping observing.

Broadband start:

PDT: 2024-04-06 11:31:05.690691 PDT
UTC: 2024-04-06 18:31:05.690691 UTC
GPS: 1396463483.690691

Simulines start:

PDT: 2024-04-06 11:37:45.702793 PDT
UTC: 2024-04-06 18:37:45.702793 UTC
GPS: 1396463883.702793
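For reference, the UTC-to-GPS conversion behind these timestamps can be reproduced with a small shell helper. This is a hedged sketch, not a site tool: it hardcodes the 18 s leap-second offset in effect as of 2024 and assumes GNU date.

```shell
# Convert a UTC timestamp to integer GPS seconds.
# GPS epoch 1980-01-06 00:00:00 UTC = Unix 315964800; 18 leap seconds
# have accumulated since then (valid as of 2024). GNU date assumed.
utc_to_gps() {
    local unix
    unix=$(date -u -d "$1" +%s)
    echo $(( unix - 315964800 + 18 ))
}
utc_to_gps "2024-04-06 18:31:05"   # broadband start -> 1396463483
```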

Files written:

2024-04-06 18:59:41,732 | INFO | File written out to: /ligo/groups/cal/H1/measurements/DARMOLG_SS/DARMOLG_SS_20240406T183810Z.hdf5
2024-04-06 18:59:41,740 | INFO | File written out to: /ligo/groups/cal/H1/measurements/PCALY2DARM_SS/PCALY2DARM_SS_20240406T183810Z.hdf5
2024-04-06 18:59:41,745 | INFO | File written out to: /ligo/groups/cal/H1/measurements/SUSETMX_L1_SS/SUSETMX_L1_SS_20240406T183810Z.hdf5
2024-04-06 18:59:41,750 | INFO | File written out to: /ligo/groups/cal/H1/measurements/SUSETMX_L2_SS/SUSETMX_L2_SS_20240406T183810Z.hdf5
2024-04-06 18:59:41,755 | INFO | File written out to: /ligo/groups/cal/H1/measurements/SUSETMX_L3_SS/SUSETMX_L3_SS_20240406T183810Z.hdf5

H1 resumed observing at 19:01 UTC.

Images attached to this report
LHO VE
david.barker@LIGO.ORG - posted 10:12, Saturday 06 April 2024 (77003)
Sat CP1 Fill

Sat Apr 06 10:06:39 2024 INFO: Fill completed in 6min 35secs

Dave confirmed a good fill curbside.

Images attached to this report
H1 SEI
ryan.short@LIGO.ORG - posted 09:56, Saturday 06 April 2024 (77002)
Ground Seismometer Mass Position Check - Monthly

FAMIS 26488, last checked in alog75989

There are 12 T240 proof masses out of range ( > 0.3 [V] )!
ITMX T240 1 DOF X/U = -1.14 [V]
ITMX T240 1 DOF Y/V = 0.364 [V]
ITMX T240 1 DOF Z/W = 0.471 [V]
ITMX T240 3 DOF X/U = -1.177 [V]
ITMY T240 3 DOF X/U = -0.535 [V]
ITMY T240 3 DOF Z/W = -1.549 [V]
BS T240 1 DOF Y/V = -0.368 [V]
BS T240 3 DOF Y/V = -0.326 [V]
BS T240 3 DOF Z/W = -0.464 [V]
HAM8 1 DOF X/U = -0.366 [V]
HAM8 1 DOF Y/V = -0.365 [V]
HAM8 1 DOF Z/W = -0.647 [V]

All other proof masses are within range ( < 0.3 [V] ):
ETMX T240 1 DOF X/U = -0.065 [V]
ETMX T240 1 DOF Y/V = -0.021 [V]
ETMX T240 1 DOF Z/W = -0.058 [V]
ETMX T240 2 DOF X/U = -0.259 [V]
ETMX T240 2 DOF Y/V = -0.209 [V]
ETMX T240 2 DOF Z/W = -0.212 [V]
ETMX T240 3 DOF X/U = -0.01 [V]
ETMX T240 3 DOF Y/V = -0.14 [V]
ETMX T240 3 DOF Z/W = -0.003 [V]
ETMY T240 1 DOF X/U = 0.122 [V]
ETMY T240 1 DOF Y/V = 0.136 [V]
ETMY T240 1 DOF Z/W = 0.202 [V]
ETMY T240 2 DOF X/U = -0.059 [V]
ETMY T240 2 DOF Y/V = 0.187 [V]
ETMY T240 2 DOF Z/W = 0.114 [V]
ETMY T240 3 DOF X/U = 0.218 [V]
ETMY T240 3 DOF Y/V = 0.15 [V]
ETMY T240 3 DOF Z/W = 0.142 [V]
ITMX T240 2 DOF X/U = 0.17 [V]
ITMX T240 2 DOF Y/V = 0.274 [V]
ITMX T240 2 DOF Z/W = 0.277 [V]
ITMX T240 3 DOF Y/V = 0.176 [V]
ITMX T240 3 DOF Z/W = 0.151 [V]
ITMY T240 1 DOF X/U = 0.111 [V]
ITMY T240 1 DOF Y/V = 0.095 [V]
ITMY T240 1 DOF Z/W = 0.014 [V]
ITMY T240 2 DOF X/U = 0.065 [V]
ITMY T240 2 DOF Y/V = 0.24 [V]
ITMY T240 2 DOF Z/W = 0.114 [V]
ITMY T240 3 DOF Y/V = 0.084 [V]
BS T240 1 DOF X/U = -0.185 [V]
BS T240 1 DOF Z/W = 0.117 [V]
BS T240 2 DOF X/U = -0.077 [V]
BS T240 2 DOF Y/V = 0.038 [V]
BS T240 2 DOF Z/W = -0.138 [V]
BS T240 3 DOF X/U = -0.169 [V]

There are 2 STS proof masses out of range ( > 2.0 [V] )!
STS EY DOF X/U = -4.089 [V]
STS EY DOF Z/W = 2.831 [V]

All other proof masses are within range ( < 2.0 [V] ):
STS A DOF X/U = -0.506 [V]
STS A DOF Y/V = -0.75 [V]
STS A DOF Z/W = -0.641 [V]
STS B DOF X/U = 0.431 [V]
STS B DOF Y/V = 0.933 [V]
STS B DOF Z/W = -0.456 [V]
STS C DOF X/U = -0.681 [V]
STS C DOF Y/V = 0.854 [V]
STS C DOF Z/W = 0.421 [V]
STS EX DOF X/U = -0.072 [V]
STS EX DOF Y/V = 0.052 [V]
STS EX DOF Z/W = 0.046 [V]
STS EY DOF Y/V = 0.101 [V]
STS FC DOF X/U = 0.252 [V]
STS FC DOF Y/V = -1.001 [V]
STS FC DOF Z/W = 0.692 [V]
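These threshold checks (|V| > 0.3 for T240s, |V| > 2.0 for STSs) are straightforward to script. Below is a hedged sketch, not the actual FAMIS check tool; `check_masses` is an illustrative name, applied to two example lines from the lists above.

```shell
# Flag proof-mass readings whose magnitude exceeds a threshold.
# Input lines are expected in the "NAME ... = <value> [V]" form above.
check_masses() {
    awk -v thr="$1" 'function abs(x){return x<0?-x:x}
        { v = $(NF-1); if (abs(v) > thr) print $0 " <- OUT OF RANGE" }'
}
check_masses 0.3 <<'EOF'
ITMX T240 1 DOF X/U = -1.14 [V]
ETMX T240 1 DOF X/U = -0.065 [V]
EOF
```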

LHO General
ryan.short@LIGO.ORG - posted 07:57, Saturday 06 April 2024 (77001)
Ops Day Shift Start

TITLE: 04/06 Day Shift: 15:00-23:00 UTC (08:00-16:00 PST), all times posted in UTC
STATE of H1: Lock Acquisition
OUTGOING OPERATOR: TJ
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 4mph Gusts, 3mph 5min avg
    Primary useism: 0.01 μm/s
    Secondary useism: 0.19 μm/s
QUICK SUMMARY: H1 lost lock seconds before I arrived; working on relocking now. The network is expected to go down sometime soon.

H1 CAL (AOS)
louis.dartez@LIGO.ORG - posted 01:53, Saturday 06 April 2024 (76886)
Updated LHO calibration & Procedure for deploying new pyDARM release at the sites
The calibration has been updated at LHO using Cal report 20240330T211519Z. 
The TDCF EPICS channel changes and the CALCS filter changes are attached. 

In O4b, the Calibration group is using a slightly different scheme for keeping track of cal reports, the front end pipeline settings, the GDS pipeline, and the hourly online uncertainty budgets. The biggest change is that each cal report is now also a git repository with its own history. This will allow for better tracking in situations for which it is deemed necessary to regenerate / reprocess calibration measurements. 

Additionally, two new channels are available on the front end: H1:CAL-CALIB_REPORT_HASH_INT and H1:CAL-CALIB_REPORT_ID_INT. These channels are populated by the pyDARM tools when someone in the control room runs 'pydarm export --push'. 

New Channels and their purpose:
H1:CAL-CALIB_REPORT_HASH_INT: numeric representation of the git commit hash for the report that was used to generate the current calibration pipeline 
H1:CAL-CALIB_REPORT_ID_INT: numeric representation of the report id (e.g. 20240330T211519Z) that the current calibration pipeline is configured with.


Current channel values:

caget H1:CAL-CALIB_REPORT_HASH_INT H1:CAL-CALIB_REPORT_ID_INT


H1:CAL-CALIB_REPORT_HASH_INT   4.14858e+07
H1:CAL-CALIB_REPORT_ID_INT     1.39587e+09
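For what it's worth, the REPORT_ID_INT value above appears consistent with the report id timestamp converted to GPS seconds. The sketch below is a guess at that mapping, not documented pyDARM behavior; it assumes the 18 s leap-second offset and GNU date.

```shell
# Possible derivation of H1:CAL-CALIB_REPORT_ID_INT: report id -> GPS time.
# 1395868537 displays as 1.39587e+09, matching the caget output above.
report_id_to_gps() {
    local iso
    # 20240330T211519Z -> "2024-03-30 21:15:19"
    iso=$(echo "$1" | sed -E 's/(....)(..)(..)T(..)(..)(..)Z/\1-\2-\3 \4:\5:\6/')
    echo $(( $(date -u -d "$iso" +%s) - 315964800 + 18 ))
}
report_id_to_gps 20240330T211519Z   # -> 1395868537
```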


End-to-end procedure for updating the Calibration pipeline:
0. Take new calibration measurements following the instructions in OpsWiki/TakingCalibrationMeasurements.

1. make sure that the current pyDARM deployment version is up-to-date

    1.a) run pydarm -v and check that the returned version (e.g. 20240405.1) matches the latest 'production release' tag listed at https://git.ligo.org/Calibration/pydarm/-/tags.
    1.b) if the tags do not match, have a member of the Calibration group deploy the latest pyDARM tools to the site and the ldas cluster. They should follow the instructions laid out here.

2. generate a new cal report 
    2.a) run pydarm report (if this measurement set should be considered an epoch in sensing or actuation then apply the appropriate command line options as listed in the pyDARM help menu (pydarm  report -h)). Report generation will now populate the report directory at /ligo/groups/cal/H1/reports/<report id>/ with various 'export' products. These include dtt calibration files, inverse sensing foton exports, and TDCF EPICS records that would be updated if this report were to be exported to the front end.

    Here is a quick list of some of the products that get stored at this step:
    pcal_calib_dtt.txt: Pcal calibration into meters of displacement
    deltal_external_calib.txt: calibration of DELTAL_EXTERNAL into strain
    pydarm_version: the pyDARM tag indicating the version of pyDARM used to generate the report
    export_epics_records.txt: list of each EPICS channel name and the value it would get set to when the report is exported to the front end
    gstlal_compute_strain_C00_filters_H1.npz: a set of GDS filters and meta data that is sent to the GDS pipeline when the report is exported.

3. inspect the plots in the cal report to make sure they're reasonable. Typically this is done by a member of the calibration group who is well-acquainted with the IFO and the calibration pipeline.
    3.a) if the cal report is valid, set the 'valid' tag in the cal report: touch /ligo/groups/cal/H1/reports/<report id>/tags/valid.
    3.b) if the cal report was marked valid in 3.a), then 'commit' the report now that its contents have been changed: pydarm commit <report id>. If you have not done this before, you may see a message from git complaining about dubious ownership. If that happens, follow the instructions in the message and try committing again. If you continue to have trouble, reach out to me, Jamie Rollins, or another member of the Calibration group that is knowledgeable about the new infrastructure.

4. if the report is 'valid', export the new calibration to the front end.
    4.a) to first compare the cal report against the currently installed calibration pipeline, run pydarm status
    4.b) to have pyDARM list all of the changes it would make if exported, run pydarm export
    4.c) once you are certain that you want to update the calibration, run pydarm export --push. This will write to all of the EPICS channels listed in export_epics_records.txt and perform various CAL-CS frontend foton filter changes. 
    4.d) reload the CAL-CS front end coefficients via the MEDM screen system to make sure the new changes are loaded into place.
    4.e) add an 'export' tag to the current report (touch /ligo/groups/cal/H1/reports/<report id>/tags/exported) and commit it again (pydarm commit <report id>).

5. upload the newly exported report to the ldas cluster.
    5.a) run pydarm upload <report id>
    5.b) wait about 1-2 minutes after the upload to allow time for the systemd timers on the ldas cluster to recognize that the new report exists. You can confirm that the latest report is recognized by the ldas cluster by verifying that https://ldas-jobs.ligo-wa.caltech.edu/~cal/archive/H1/reports/latest/ points to the correct report.

6. restart the GDS pipeline
    6.a) run pydarm gds restart to begin the process of restarting the GDS pipeline. This will show prompts from the DMT machines (DMT1 and DMT2) asking you to confirm that the hash for the GDS pipeline package (gstlal_compute_strain_C00_filters_H1.npz) is correct. The prompts will contain the following line:

        b1c9f6cd1ba3c202a971c6b56c7a1774afb1931625a7344e9a24e6795f3837d7  gstlal_compute_strain_C00_filters_H1.npz

        To confirm that the hash above is correct, run sha256sum /ligo/groups/cal/H1/reports/<report id>/gstlal_compute_strain_C00_filters_H1.npz and verify that the two hashes are identical. If they are the same then type 'yes' and continue with the GDS restart process. After performing this process for the second DMT machine, pyDARM will continue with the pipeline restart. The GDS pipeline currently takes about 12 minutes to fully reboot and begin producing data again. During this time, no GDS calibration data will be available. 
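The hash comparison in 6.a can be wrapped in a small helper. This is a hedged sketch: 'verify_gds_hash' is an illustrative name, not a site tool, and the usage line repeats the example path and hash from the prompt above.

```shell
# Compare a file's sha256 against the hash printed in the DMT prompt
# before answering 'yes' to the GDS restart.
verify_gds_hash() {
    local file=$1 expected=$2 actual
    actual=$(sha256sum "$file" | awk '{print $1}')
    if [ "$actual" = "$expected" ]; then
        echo "hashes match - safe to answer 'yes'"
    else
        echo "MISMATCH - abort the GDS restart"
        return 1
    fi
}
# Usage, with the report path from step 6.a:
# verify_gds_hash /ligo/groups/cal/H1/reports/<report id>/gstlal_compute_strain_C00_filters_H1.npz \
#     b1c9f6cd1ba3c202a971c6b56c7a1774afb1931625a7344e9a24e6795f3837d7
```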

    6.b) If the two hashes are not the same and all of the above checks were done, then something is likely wrong with the pyDARM+GDS pipeline system and you cannot continue with the calibration push. Take the following steps to reset the calibration to its former state: 

        1. open the CAL-CS SDF table and revert all of the EPICS channel pushes listed in export_epics_records.txt.
        2. reset the foton filters by reverting to the last installed h1calcs filter before you exported the calibration report.
        3. remove the exported tag from the new report (rm /ligo/groups/cal/H1/reports/<report id>/tags/exported), commit it (pydarm commit <report id>), and re-upload it to the ldas cluster (pydarm upload <report id>).


Summary of pyDARM commands (for use in the control room):
pydarm report [<args ...> <report id>]: generate a calibration report based on the measurement set at <report id>. See output of pydarm report -h for additional customization.
pydarm status [<args ...> <report id>]: compare current calibration pipeline against what the pipeline would be if report <report id> were exported to the front end
pydarm commit [<args ...> <report id>]: make a new commit in the report <report id> git repository. this creates a new hash and should be done any time the report's contents are changed.
pydarm upload <report id>: upload/sync the report <report id> with the ldas cluster
pydarm gds restart: initiate a GDS pipeline restart
pydarm ls -r: list all reports

Images attached to this report
H1 General
oli.patane@LIGO.ORG - posted 00:07, Saturday 06 April 2024 (77000)
Ops EVE Shift End

TITLE: 04/06 Eve Shift: 23:00-07:00 UTC (16:00-00:00 PDT), all times posted in UTC
STATE of H1: Observing at 157Mpc
INCOMING OPERATOR: TJ
SHIFT SUMMARY: We are in Observing and have been Locked for 8 hours now. Super quiet shift but we had a gw candidate a bit ago (S240406aj).
LOG:

23:00UTC Detector in NOMINAL_LOW_NOISE
23:07 Observing

04:10 Kicked out of Observing when squeezer lost lock
04:14 Squeezer relocked itself and we went back into Observing

06:29 Superevent S240406aj

Start Time System Name Location Lazer_Haz Task Time End
23:08 PCAL Francisco PCAL Lab y(local) PCALing 00:03
23:36 RUN Camilla MY n RUN 00:06
H1 General
oli.patane@LIGO.ORG - posted 20:08, Friday 05 April 2024 (76999)
Ops Eve Midshift Status

We're Observing at 160 Mpc and have been Locked for just over four hours. Quiet evening so far.

H1 ISC
gabriele.vajente@LIGO.ORG - posted 18:38, Friday 05 April 2024 (76998)
Coherence with PRCL and REFL_RIN

Coherence with PRCL and REFL_RIN is back, so maybe the PRCL offset tuned a few days ago is not optimal anymore.

Images attached to this report
H1 ISC (ISC)
jennifer.wright@LIGO.ORG - posted 16:54, Friday 05 April 2024 - last comment - 18:35, Friday 05 April 2024(76994)
Tuning Y2L gains for ITMY

Sheila, Jennie W

Today Sheila changed the guardian to use the camera offsets that we determined via beam walking yesterday. Since we improved the camera offset that sets the spot on the BS (CAM YAW1) we decided to optimise the YAW to Length gains for the ITMs to improve on this change.

Sheila and I tuned the Y2L drive align gains to see if it increased or decreased our coupling to DHARD and CHARD.

We used templates at each step to check the transfer function and coherence from H1:ASC_{CHARD,DHARD}_Y_SM to DARM by injecting broadband noise.

The nominal Y2L gains are 2.1 for ITMX and -1.9 for ITMY.

We found a minimum in the coupling with both a common and a differential step, which brought us back to our nominal gain for ITMX but changed the gain on ITMY DRIVEALIGN to -2.5. The yellow trace in both plots shows the transfer function and coherence for this configuration (DHARD to DARM and CHARD to DARM), with nominal in dark blue.

Templates are saved in /ligo/home/sheila.dwyer/Alignment/DHARD/DHARD_A2L_tuning.xml and
/ligo/home/sheila.dwyer/Alignment/CHARD/CHARD_A2L_tuning.xml

I updated the gains in SDF and Sheila has also put them in the guardian.

Images attached to this report
Comments related to this report
gabriele.vajente@LIGO.ORG - 18:35, Friday 05 April 2024 (76997)

Coherence with CHARD_Y and DHARD_Y is gone, as expected.

Now time to tackle CHARD_P

Images attached to this comment
H1 OpsInfo (ISC)
jennifer.wright@LIGO.ORG - posted 16:15, Friday 05 April 2024 (76992)
Accepted Y2L gains in SDF and reverted ramp time changes

I accepted the current DRIVEALIGN gain for SUS-ITMX_L2_DRIVEALIGN_Y2L_SPOT_GAIN (which is unmonitored). We ended up leaving it at its set point of 2.1, but since this was set by the guardian it had not been accepted in SDF, which I have now done.

We updated the Y2L gain for SUS-ITMY_L2_DRIVEALIGN_Y2L_SPOT_GAIN (also unmonitored) to -2.5, so I accepted this in SDF as well.

I also reverted the changes we had made to the ramp times (tramps) for both of these, as they made diffs in OBSERVE.snap. We might want to shorten these tramps after the guardian changes made to the camera servos today, but I figured we can do that outside of observing.

Images attached to this report
H1 ISC
camilla.compton@LIGO.ORG - posted 16:12, Friday 05 April 2024 (76993)
New SRCL FF in place

New SRCL feedforward in place, with some improvement from 10-100 Hz. Still work to be done to improve above 100 Hz, but SRCL coherence is lower there (see Bruco from 76927).

Measurement was taken yesterday in 76967. Old FM3, new FM1, accepted in safe and observe sdf and ISC_LOCK updated.

Images attached to this report
LHO General
ryan.short@LIGO.ORG - posted 16:08, Friday 05 April 2024 (76987)
Ops Day Shift Summary

TITLE: 04/05 Day Shift: 15:00-23:00 UTC (08:00-16:00 PDT), all times posted in UTC
STATE of H1:
INCOMING OPERATOR: Oli
SHIFT SUMMARY:

LOG:

Start Time System Name Location Lazer_Haz Task Time End
16:15 FAC Kim H2 - Technical cleaning 16:23
16:41 VAC Gerardo LVEA - Retrieving aux cart 16:48
21:37 VAC Gerardo, Jordan MX - Retrieving cable trays 22:26
H1 General
oli.patane@LIGO.ORG - posted 16:03, Friday 05 April 2024 (76991)
Ops EVE Shift Start

TITLE: 04/05 Eve Shift: 23:00-07:00 UTC (16:00-00:00 PDT), all times posted in UTC
STATE of H1: Commissioning
OUTGOING OPERATOR: Ryan S
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 10mph Gusts, 7mph 5min avg
    Primary useism: 0.03 μm/s
    Secondary useism: 0.29 μm/s
QUICK SUMMARY:

Detector just got into NOMINAL_LOW_NOISE

H1 AOS (CAL)
louis.dartez@LIGO.ORG - posted 23:51, Tuesday 12 March 2024 - last comment - 17:48, Friday 05 April 2024(76315)
Successful transition to new DARM offloading state + quiet-ish time
I was able to request and transition to the new DARM state without issue. ETMX still saw a pretty large kick. I'll have to circle back re: how large compared to previous attempts. We will also want to take a look at any effects moving the integrator on L2 LOCK L had.

Note: We do get a few SUS_PI warnings shortly after transitioning to this state.

Quiet time to monitor for non-stationarity:
Start GPS: 1394344327
Stop GPS: 1394346169

Turned off cal lines at GPS 1394346770.39

Elenna started a Bruco after the cal lines were turned off. Here is a screenshot of DARM in the new configuration. 20-90 Hz looks high and below 15 Hz looks low. It's hard to tell how much of this is real until CAL-CS is calibrated and that calibration is propagated to the DTT template.

I started an L2 LOCK IN1/IN2 injection using noise recorder at 1394347034.426. We used a tuned broadband measurement that Craig and I put together. Apparently it was too strong because we lost lock from this injection. It also tripped EX. I requested DOWN and the EX trip alarm reset. 
Images attached to this report
Comments related to this report
elenna.capote@LIGO.ORG - 23:53, Tuesday 12 March 2024 (76316)

Bruco is here: https://ldas-jobs.ligo-wa.caltech.edu/~elenna.capote/brucos/New_DARM/

Just from first glance looks like some residual LSC coherence, but not enough to explain the strange shape of New DARM.

craig.cahillane@LIGO.ORG - 11:29, Wednesday 13 March 2024 (76333)
Here are plots comparing the MASTER OUTS on ETMX during Louis's quiet time here with L3 offloaded, vs Gabriele's quiet time in alog 76278 during Nominal DARM.

We are concerned only with the filter changes that Louis and Sheila made to offload more of L3's length actuation onto L2.  
Because L2 is used to control both length and angular degrees of freedom, it can be easy to ask too much of L2. 
This lock seems to indicate that this configuration is fairly stable.

I looked at the UL MASTER OUTs for the L1, L2, L3 on ETMX.
1) RMS on L3 drives is halved.  There is much less L3 drive from 1 to 6 Hz, which dominates the RMS.  
2) L2 drives are largely unchanged.  
3) L1 drives are changed, but the RMS remains similar.  There is much less HF content in the L1 drive with L3 offloaded, and the shape of the resonances around 3 and 6 Hz is altered.

Overall it's hard to tell which stage is picking up L3's slack from these PSDs.  I believe the intention was to offload to L2, but we don't see any obvious change in what control signal is being sent to the L2 stage.  This could simply mean that the angular controls are relatively stronger in the L2 controllers.  We'll look at the DRIVEALIGN signals to try and figure that one out quantitatively. 
Images attached to this comment
gabriele.vajente@LIGO.ORG - 12:29, Wednesday 13 March 2024 (76341)

The new DARM loop configuration reduces the DARM noise non-stationarity at low frequency.

First plot compares the ESD drive with the Old DARM and the New DARM, confirming that the RMS is significantly reduced, especially at the relevant frequencies.

Second and third plots are spectrograms and whitened spectrograms of GDS-CALIB_STRAIN in the two configurations. Despite GDS-CALIB_STRAIN being wrongly calibrated with the New DARM, it is clear that the low-frequency non-stationarity is gone in New DARM.

Last two plots are the bicoherence of DARM with the ESD drives, showing that in the Old DARM there is still some bicoherence for noise in the 10-30 Hz region, while in the New DARM this is gone.

 

Images attached to this comment
sheila.dwyer@LIGO.ORG - 15:23, Wednesday 13 March 2024 (76344)

These transitions last night were made with a different L2 LOCK filter (which is in L2 LOCK L FM6, replacing the filter used in earlier new DARM configurations that was at FM2). The attached screenshot shows the filter change: I replaced the poles at zero with poles at 0.03 Hz to get rid of the integrator without changing the phase at the crossover much. This was done with guardian version 27211.

Plots of the actuators during the transitions are attached, here and here,  they can be compared to the one that Louis posted where we used L2 LOCK FM2.   This suggests that the change to these poles didn't help to reduce the transient during the transition. 

Today we tried another change to the transition, this time Evan and I moved the poles in L2 LOCK L from 0.03 Hz to 0.1 Hz, and changed the ramp time for the transition to 10 seconds (from 5).  The model is shown in the attached PDF where the new filter is in place in the transition traces.  This transition wasn't smoother than the others, see here

Images attached to this comment
Non-image files attached to this comment
evan.hall@LIGO.ORG - 15:30, Wednesday 13 March 2024 (76355)CAL, ISC

The new UGF is 70 Hz with 20° of phase margin. The crossover between L2 and L3 is at 18 Hz with probably about 40° of phase margin (low coherence due to interference with calibration lines). We have not measured the L1 to L2 crossover yet.

Images attached to this comment
Non-image files attached to this comment
louis.dartez@LIGO.ORG - 19:36, Wednesday 13 March 2024 (76365)
S. Dwyer, E. Capote, E. Hall, S. Pandey, L. Dartez


Here are some notes from our efforts to measure IN1/IN2 at the L1 LOCK L input.

- Sheila adjusted UIM measurement template for new darm config. This template is at /opt/rtcds/userapps/release/lsc/h1/templates/DARM/UIM_crossover.xml.
- Evan ran the template initially and saw that the UGF is near 1Hz. He adjusted the excitation amplitude along the way to improve coherence for the next time we run this measurement.
- Evan added a high pass filter in the L3 DRIVEALIGN bank with a cutoff frequency at 5Hz
  - the first filter attempt was at 8Hz; this possibly excited a roll mode near 13.75Hz
  - the second filter attempt was at 5Hz; this seemed to mitigate the roll mode excitation

We ended up losing lock shortly after the injection finished due to PRC activity.
Images attached to this comment
Non-image files attached to this comment
evan.hall@LIGO.ORG - 17:48, Friday 05 April 2024 (76995)

Comparison of DARM ESD drive from end of O4a versus a few days ago. The microseism was about 0.2 µm/s in both cases. The rms DAC drive from 0.1 Hz to 0.3 Hz is about 400 ct, so even in cases of exceptionally high microseism it will be subdominant to the 7000 ct rms that is accumulated above 1 Hz.
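The "subdominant" claim is easy to sanity-check by adding the two rms contributions in quadrature, using the ct figures quoted above:

```shell
# 400 ct rms (0.1-0.3 Hz microseism band) added in quadrature to the
# 7000 ct rms accumulated above 1 Hz barely changes the total.
awk 'BEGIN { printf "total rms: %.0f ct\n", sqrt(400^2 + 7000^2) }'
# -> total rms: 7011 ct (a ~0.2% increase over 7000 ct)
```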

Images attached to this comment