Reports until 16:45, Friday 25 August 2023
H1 ISC
elenna.capote@LIGO.ORG - posted 16:45, Friday 25 August 2023 (72441)
H1 O3b to O4a DARM noise comparison

Inspired by Valera's alog demonstrating the improvement in the LLO sensitivity from O3b to O4a (LLO:66948), I have made a similar plot.

I chose a time in February 2020 for the O3b reference. I'm not aware of a good time without calibration lines, so I used the CALIB_STRAIN traces from both times. Our range today just got a new bump up, due either to the commissioning work today or to the EY chiller work (72414). Note: I am quoting the GDS calib strain range!

I am adding a second plot showing O3a to O4a as well, using a reference time from May 2019.

There is a significant amount of work that gave us this improvement. I will try to list what I can recall:

Images attached to this report
H1 SEI
austin.jennings@LIGO.ORG - posted 16:44, Friday 25 August 2023 (72442)
BRS Drift Trends - Monthly

Closes FAMIS 26434, last completed in alog 71777

All BRS channels are within their nominal regions.

Images attached to this report
LHO General
corey.gray@LIGO.ORG - posted 16:10, Friday 25 August 2023 (72414)
Fri DAY Ops Summary

TITLE: 08/25 Day Shift: 15:00-23:00 UTC (08:00-16:00 PDT), all times posted in UTC
STATE of H1: Observing at 150Mpc
INCOMING OPERATOR: Austin
SHIFT SUMMARY:

H1's been locked 53.5hrs (1hr from our O4 longest lock *knock on wood*).

Main news of today was the EY Chiller Pump #1 going down.  Tyler and a contractor spent most of the last 6hrs dealing with this, and we are currently set and running with Chiller Pump #2.

There were 2hrs of commissioning by Gabriele & Elenna as well.

We appear to have a ~5Mpc increase in range... waiting to hear whether this is due to the commissioning or to running on the different chiller pump!

LOG:

H1 OpsInfo (CDS)
thomas.shaffer@LIGO.ORG - posted 16:06, Friday 25 August 2023 (72439)
Added more filtering for external events that we normally ignore

Recently we've been receiving groups of SubGRB or long GRB alerts from our igwn-alert subscription system that are often old or repeated events. Since we don't normally react to these and will ignore them, we don't need to have them update in EPICS. I've added more filtering for these types of events in igwn_response.py and also in tests.py for VerbalAlarms. The new code is running for both of these systems.
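The filtering described above might look something like the following minimal sketch; the function and field names here (should_process, notice_type, the ignored-type set) are illustrative assumptions, not the actual contents of igwn_response.py:

```python
# Hypothetical sketch of event-type filtering like that described above.
# These names (should_process, notice_type, the ignored set) are
# illustrative assumptions, not the real igwn_response.py code.
IGNORED_NOTICE_TYPES = {"SubGRB", "SubGRBTargeted", "LongGRB"}

def should_process(alert: dict) -> bool:
    """Return False for alert types we ignore, so EPICS never updates."""
    return alert.get("notice_type", "") not in IGNORED_NOTICE_TYPES
```

Dropping ignored types before any EPICS write keeps the filtering logic in one place for both igwn_response.py and the VerbalAlarms tests.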

Another related issue I'm looking into is why we haven't received any superevents from our igwn-alert listener in maybe a month. I don't think the issue is on our end, but I have a few more checks to do. We still get phone calls in the control room, so operators are still informed.

LHO General
austin.jennings@LIGO.ORG - posted 16:04, Friday 25 August 2023 (72438)
Ops Eve Shift Start

TITLE: 08/25 Eve Shift: 23:00-07:00 UTC (16:00-00:00 PDT), all times posted in UTC
STATE of H1: Observing at 159Mpc
OUTGOING OPERATOR: Tony
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 13mph Gusts, 9mph 5min avg
    Primary useism: 0.02 μm/s
    Secondary useism: 0.06 μm/s
QUICK SUMMARY:

- H1 has been locked for 53.5 hours, all systems appear stable

H1 ISC
elenna.capote@LIGO.ORG - posted 15:24, Friday 25 August 2023 (72432)
HAM1 FF updated following new HEPI FF

Jim installed and updated the HAM1 HEPI feedforward, which he describes in alog 72393. The HAM1 to ASC feedforward has not been updated in some time, and Jim's results showed some difference in CHARD P from his HEPI FF update. Jenne and Jim turned off the HAM1 ASC FF a few days ago (72395) to gather data for retraining.

I used data from that time to retrain the HAM1 to CHARD P, PRC2 P, and INP1 P feedforward. All new filters are labeled with today's date "0825". I turned on these new filters and saw small improvement in the CHARD P and INP1 P error signals. There was no evident improvement in DARM. We are not limited by CHARD P in DARM at this time so I am not that surprised (see third attachment in 72245).
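For context, retraining a feedforward path like HAM1 to CHARD P amounts to estimating the witness-to-error coupling on data taken with the FF off, then fitting and installing its negative. A minimal illustration with synthetic data; this is NOT the actual retraining code, and the sample rate, coupling, and channel roles are made up:

```python
import numpy as np
from scipy import signal

# Illustrative sketch with synthetic data (NOT the actual retraining
# code): estimate the witness-to-error coupling with the FF off; the
# ideal feedforward is then a fit to minus this transfer function.
fs = 512.0                                   # assumed sample rate (Hz)
rng = np.random.default_rng(0)
witness = rng.standard_normal(int(60 * fs))  # stand-in for HAM1 motion
coupling = signal.lfilter([0.5], [1.0, -0.3], witness)       # toy plant
error = coupling + 0.01 * rng.standard_normal(witness.size)  # toy CHARD P

f, Pxy = signal.csd(witness, error, fs=fs, nperseg=4096)
_, Pxx = signal.welch(witness, fs=fs, nperseg=4096)
tf_est = Pxy / Pxx   # coupling TF estimate; fit -tf_est as the new FF
```

In practice the estimated transfer function would then be fit with a rational (zpk) filter before being loaded into the feedforward bank.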

The attached plot shows all ASC REFL loop error signals. The blue trace is the previous HAM1 feedforward in use, and the red live trace shows the new feedforward on.

The filters are not guardian controlled. I updated the SDF observe file in the SEIPROC model. I then asked Corey to take us to the "SDF to SAFE" guardian state so I could also update the SEIPROC safe file. However, that guardian state does not change the SEIPROC model to safe, so I had to change to the safe table by hand. Here are the steps:

On the "SDF restore screen", click "! select request file", choose "safe.snap", open, then "load table". Once the diffs are accepted and confirmed, follow the same steps to load the "observe.snap" file. I took a screenshot of the safe SDFs I accepted before hitting "confirm".


Images attached to this report
H1 ISC (ISC)
gabriele.vajente@LIGO.ORG - posted 15:02, Friday 25 August 2023 - last comment - 12:44, Wednesday 30 August 2023(72435)
Boosting DARM low frequency gain

[Elenna, Dan, Gabriele]

We tested a filter (FM8 in DARM2) that increased the DARM gain below 3-4 Hz, where most of the RMS is accumulated.

The DARM RMS is reduced by a factor of 3. There is no immediately evident effect on the DARM noise above 10 Hz. However, it would be useful to test this new filter for a longer time in the future.
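The RMS figure quoted above comes from integrating the DARM ASD over frequency; a minimal sketch of a band-limited RMS, assuming a uniformly spaced one-sided ASD:

```python
import numpy as np

# Minimal band-limited RMS from a one-sided ASD (assumes a uniform
# frequency grid); RMS^2 is the integral of the PSD (= ASD^2).
def band_rms(freqs, asd, f_lo, f_hi):
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    df = freqs[1] - freqs[0]
    return float(np.sqrt(np.sum(asd[mask] ** 2) * df))
```

Comparing, e.g., the 0.1-4 Hz band of the DARM ASD with the filter on and off is how the factor-of-3 reduction above would be quantified.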


Images attached to this report
Comments related to this report
jenne.driggers@LIGO.ORG - 15:10, Friday 25 August 2023 (72436)CAL

Tagging the Cal group, since I think we'd like them to weigh in on when they might have availability to recalibrate using this new filter. 

elenna.capote@LIGO.ORG - 12:44, Wednesday 30 August 2023 (72562)

We have implemented this filter today and a calibration sweep was run with the filter on to determine the changes to the calibration.

The new filter is in FM8 of DARM2, and will be engaged in lownoise length control along with another DARM resonant gain ("res g") filter that is engaged there.

I accepted the SDF in observe and loaded the guardian.

Images attached to this comment
H1 SEI (SEI)
ibrahim.abouelfettouh@LIGO.ORG - posted 14:58, Friday 25 August 2023 (72434)
H1 ISI CPS Noise Spectra Check - Weekly

Weekly FAMIS Task

1. Alerts from running the script:

ITMX_ST2_CPSINF_H1 high freq noise is high!

2. All other "floors" from the spectra look normal.

Non-image files attached to this report
H1 ISC
gabriele.vajente@LIGO.ORG - posted 14:30, Friday 25 August 2023 (72433)
Trying increasing SRCL gain

[Elenna, Gabriele]

We tried increasing the SRCL gain by a factor of 2. As expected, SRCL_IN got better while SRCL_OUT did not change. No effect on the DARM RMS.

Gain is back to nominal
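This is the expected single-loop algebra: with disturbance d and open-loop gain G, the error point sees d/(1+G) while the control signal sees d*G/(1+G), so doubling G where G >> 1 halves IN but barely changes OUT. A toy numerical check:

```python
# Toy single-frequency loop model: doubling the open-loop gain G halves
# the error point (IN) when G >> 1, but the control signal (OUT), which
# must keep tracking the disturbance, barely changes.
def loop_signals(disturbance, G):
    error = disturbance / (1 + G)         # in-loop error signal
    control = disturbance * G / (1 + G)   # control (feedback) signal
    return error, control

e1, c1 = loop_signals(1.0, 100.0)   # nominal gain
e2, c2 = loop_signals(1.0, 200.0)   # gain doubled
```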

Images attached to this report
H1 ISC (DetChar, ISC)
gabriele.vajente@LIGO.ORG - posted 14:03, Friday 25 August 2023 - last comment - 14:20, Friday 25 August 2023(72430)
Retuned MICH feedforward

As the title says, we retuned the MICH feedforward, and the new filter performs better at all relevant frequencies.

Guardian has been updated to engage FM9 instead of FM8.

Quoting Elenna: "It's been 0 days since we retuned the LSC FF"

Images attached to this report
Comments related to this report
elenna.capote@LIGO.ORG - 14:20, Friday 25 August 2023 (72431)

I have accepted the SDF diff in both OBSERVE and SAFE. Forgot to screenshot both times, sorry.

Process for accepting in SAFE:

Select "SDF_TO_SAFE" guardian state in ISC_LOCK

Wait for SDF table to switch to safe

Search for my SDF diff in the LSC table, sorting on substring

Accept diff

Confirm

Select "Nominal Low Noise" in ISC_LOCK guardian

H1 FMP (DetChar, ISC)
jeffrey.kissel@LIGO.ORG - posted 13:41, Friday 25 August 2023 - last comment - 12:01, Monday 28 August 2023(72428)
Chilled Water Pump Has Failed for EY HVAC Air Handlers
J. Kissel, for T. Guidry, R. McCarthy

Just wanted to get a clear separate aLOG in regarding what Corey mentioned in passing in his mid-shift status LHO:72423:

The EY HVAC Air Handler's chilled water pump 1 of 2 failed this morning 2023-08-25 at 9:45a PDT, and thus the EY HVAC system has been shut down for repair at 17:35 UTC (10:35 PDT). The YVEA temperature is therefore rising as it equilibrates with the outdoor temperature; thus far from 64 deg F to 67 deg F.

Tyler, Richard, and an HVAC contractor are on it, actively repairing the system, and I'm sure we'll get a full debrief later.

Note -- we did not stop our OBSERVATION INTENT until 2h 40m later, at 2023-08-25 20:18 UTC (13:18 PDT), when we went out to do some commissioning.
Images attached to this report
Comments related to this report
jenne.driggers@LIGO.ORG - 13:53, Friday 25 August 2023 (72429)

The work that they've been doing so far today to diagnose this issue has been in the 'mechanical room'.  Their work should not add any significant noise over what already occurs in that room at all times, so I do not expect any data quality issues as a result of this work.  But, we shall see (as Jeff points out) whether there are any issues from the temperature change itself. 

corey.gray@LIGO.ORG - 15:51, Friday 25 August 2023 (72437)FMP

They are done for the weekend and temperatures are returning to normal values. 

Chiller Pump #2 is the pump we are now running.

Chiller Pump #1 will need to be looked at some more (Tyler mentioned the contractor will return on Tues).

Attached is a look at the last 4+ yrs of both EY chillers (1 = ON & 0 = OFF).

Images attached to this comment
jeffrey.kissel@LIGO.ORG - 12:01, Monday 28 August 2023 (72484)DetChar, ISC, SUS
See Tyler's LHO:72444 for a more accurate and precise description of what happened to the HVAC system.
H1 CAL
louis.dartez@LIGO.ORG - posted 13:01, Friday 25 August 2023 - last comment - 13:16, Friday 25 August 2023(72422)
DARM OLG UGF now
J. Kissel, L. Dartez

I'm attaching a plot of the DARM UGF that compares two measurements taken ~3.5 months apart. The blue trace is taken from the calibration sweep that Ryan C. took a few days ago (LHO:72392) and the orange trace is from a similar sweep taken back in May (20230506T170817Z). 

The DARM loop UGF has moved from ~58Hz to ~66.4Hz and the phase margin has increased by about a degree since May. 

There is no immediate need to adjust the DARM open loop gain (DRIVEALIGN_L2L gain, as mentioned in LHO:72416).


The fact that we don't see the loop dip near 20Hz is a rough indicator that the [actuation stage] crossovers are stable. I'll be following up with a more in-depth look at that.
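As an aside, the UGF quoted above can be read off a measured OLG magnitude by interpolating its unity crossing; a minimal sketch (not the actual plotting script):

```python
import numpy as np

# Sketch (not the actual analysis script): read off the UGF by log-log
# interpolation of where the measured |OLG| crosses unity.
def find_ugf(freqs, olg_mag):
    logf, logm = np.log(freqs), np.log(olg_mag)
    idx = np.where(np.diff(np.sign(logm)) != 0)[0][0]  # unity crossing
    f0, f1, m0, m1 = logf[idx], logf[idx + 1], logm[idx], logm[idx + 1]
    return float(np.exp(f0 - m0 * (f1 - f0) / (m1 - m0)))
```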


The script used to generate the attached plot lives at: /ligo/home/louis.dartez/projects/20230825/plot_olg/plot_olg_meas.py
Images attached to this report
Comments related to this report
jeffrey.kissel@LIGO.ORG - 13:16, Friday 25 August 2023 (72427)ISC
Tagging ISC, and adding a copy of Louis' script.
Non-image files attached to this comment
LHO General
corey.gray@LIGO.ORG - posted 12:57, Friday 25 August 2023 (72423)
Friday Day Mid Shift Status

H1 is currently at almost 50.5hrs for a lock (current record is ~54hrs).

We are approaching 1pm with the start of 2hrs of commissioning.

We are also dealing with increasing temperatures at EY due to HVAC chilled water issues.  Contractor is on their way.

Images attached to this report
H1 CAL (ISC)
jeffrey.kissel@LIGO.ORG - posted 12:05, Friday 25 August 2023 - last comment - 13:14, Friday 25 August 2023(72416)
Why is DELTAL_EXTERNAL BNS Range Reporting So Much Higher Than GDS-CALIB_STRAIN BNS Range? Test Mass Actuation Strength Has Drifted by 8% And CAL-DELTAL Doesn't Compensate For It; GDS-STRAIN Does.
J. Betzwieser, L. Dartez, J. Kissel

Ryan Short recently updated the control room FOM for the BNS range (LHO:72415) which now shows -- with a clear legend -- the range computed using CAL-DELTAL_EXTERNAL_DQ vs. GDS-CALIB_STRAIN_CLEAN for both H1 and L1 observatories -- see example attached. 

This makes it dreadfully obvious that "L1's DELTAL_EXTERNAL range is right on top of the CALIB_STRAIN range -- but H1's is not, and DELTAL_EXTERNAL is *higher*." -- see First Attachment from the control room FOM screenshots.

The natural questions to ask then are "why?" "is something wrong with H1's calibration?"

No, there's nothing wrong.***
The discrepancy between DELTAL_EXTERNAL and CALIB-STRAIN at H1 is because the static test-mass stage actuation strength hasn't been updated since 2023-05-04 -- before the observing run started -- and it has slowly drifted due to test mass ESD charge accumulation -- it's now 8% larger than the 2023-05-04 value. See the current value for the past 24 hours and a trend of the whole run thus far. L1's ESD strength has *not* drifted as much (see similar L1 trend), and they also regularly "fudge" their DELTAL_EXTERNAL actuator strength gains in order to get DELTAL_EXTERNAL more accurate (and they do so in a way that doesn't impact GDS-CALIB_STRAIN). H1 has chosen not to, to date.

This drift is tracked and accounted for in our "time dependent correction factor" or TDCF system for that test-mass stage actuation strength, \kappa_T -- and GDS-CALIB_STRAIN (and STRAIN_NOLINES, and STRAIN_CLEANED) all have this correction in place. Check out the Second attachment from the same day's "CAL" > "h(t) generation" summary page, and walk with me:
This plot is showing the ASD ratio (and thus roughly analogous to the magnitude of the transfer function) between all of the various stages of the calibration pipeline.
    - GDS-CALIB_STRAIN, GDS-CALIB_STRAIN_NOLINES, and GDS-CALIB_STRAIN_CLEANED are all the same from this perspective. Thus the ratio between these three channels with DELTAL_EXTERNAL in the denominator highlights that DELTAL_EXTERNAL is a preliminary product, NOT corrected for TDCFs, and thus there's a huge ~16% systematic difference between the two "stages" of product.
    - Recall that *all* four paths of the calibration -- UIM, PUM, TST, and Sensing -- are being summed, and the cross-over frequencies for these sums all culminate around 50-200 Hz -- and in that region there are factors of 2x to 3x gain peaking (see e.g. Figure 4 of P1900245) -- and thus the 8% drift in the TST stage strength means a 16% systematic error in the DELTAL_EXTERNAL calibration.
    - However, the front-end version of the preliminary product that *is* corrected for TDCFs is also shown in the plot -- CFTD-DELTAL_EXTERNAL. The ASD ratio for this channel has MUCH less systematic discrepancy -- indicating that correcting for time dependence (get it? CFTD!) does a LOT of the heavy lifting of accounting for this 8% TST drift. 
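The 8% -> 16% scaling in the walk-through above is just the actuation drift multiplied by the quoted gain peaking; as a back-of-envelope check:

```python
# Back-of-envelope check of the 8% -> 16% scaling: near the actuation/
# sensing crossover, gain peaking amplifies a TST strength error.
kappa_T_drift = 0.08    # 8% TST actuation strength drift
gain_peaking = 2.0      # low end of the quoted 2x-3x factor
systematic_error = kappa_T_drift * gain_peaking   # ~0.16, i.e. ~16%
```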

Of course, these ratios of different portions of the calibration pipeline don't *really* tell you if you've *really* done the right thing in an absolute sense. They only tell you what changes from step to step. (And indeed, the CFTD-DELTAL_EXTERNAL to GDS-CALIB_CLEANED ratio still shows *some* discrepancy.)

The fact that the fifth attachment, from the archive showing the continuous *direct measurement* of the systematic error in the calibration -- against the absolute reference, the PCALs -- is nice and low (i.e. the transfer function is close to unity magnitude and zero phase) indicates that all of the correction for time dependence is doing the right thing.

*** Yet. In O3, L1 suffered a lot from TST strength drift. Joe has shown repeatedly that if you let an actuator TDCF drift too far beyond 10%, the approximation we use to calculate these TDCFs breaks down (and see Aaron's work discussing it as a motivation for P2100107). In addition, since the real ESD strength is changing -- which is corroborated, I think, by the in-lock charge measurements; see the highlighted red region of the sixth attachment from LHO:72310 -- that means the DARM open loop gain TF is also changing. 

This may impact the DARM loop stability (see e.g. LLO aLOGs 50900 and 50639). So, *eventually* we should resurrect the things we did in O3:
    (1) Reset the model of the static actuation strength for the TST stage to a more current value. (And thus start a new calibration epoch)
    (2) Potentially change the actual DARM loop by adjusting the DRIVEALIGN_L2L gain
    (3) Work up a solution to mitigate the drift -- perhaps doing something similar to what was done in O3, and play games with turning on the ESD Bias voltage with the opposite sign when we're not in observing.
Images attached to this report
Comments related to this report
jeffrey.kissel@LIGO.ORG - 13:14, Friday 25 August 2023 (72426)
Louis has plotted DARM open loop gain transfer functions from May 2023 vs. Aug 2023 in LHO:72422. The comparison concludes we do NOT need to adjust the actual DARM loop (as suggested in item (2) above). In fact, the DARM OLG TF from August is *more* stable than it was in May (but not by much). This is indicative of a good robust loop design -- that ~10% level drifts don't impact the stability of the loop.

We discussed further actions (1) and (3) based on the OLG TF results, and conclude the actions can wait until next week. But... probably not 2 weeks, again because the drift is close enough to the TDCF calculation's approximation breakdown point that we need to take action and "reset" the TST stage actuation strength.
H1 PEM (DetChar, FMP, OpsInfo, PEM)
lanceanderson.blagg@LIGO.ORG - posted 11:40, Thursday 24 August 2023 - last comment - 13:07, Friday 25 August 2023(72404)
Potential Noise in DARM from Garbage Truck at LSB

Following up on TJ's alog from 8/17 (72293), it was noted that a garbage truck at LSB was quite loud. The seismometers in the LVEA clearly picked up the noise, and it seems to coincide with noise in DARM. It's hard to be certain with only one signal, but it probably warrants further investigation.


Spectrograms attached of one seismometer and DARM for a 2 minute time span around when the noise was reported:
-Of signal
-Zoomed in with boxes on noise regions
-Stacked with some boxes around correlated noise
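Spectrograms like the attached ones can be produced with standard tools; a minimal sketch with placeholder data (the real channel names and sample rates are not shown here):

```python
import numpy as np
from scipy import signal

# Sketch of producing a spectrogram like the attached ones; the sample
# rate and the random stand-in data are placeholders, not real channels.
fs = 256.0                                # assumed sample rate (Hz)
duration = 120.0                          # 2-minute span, as in the plots
data = np.random.default_rng(0).standard_normal(int(fs * duration))
f, t, Sxx = signal.spectrogram(data, fs=fs, nperseg=1024, noverlap=512)
# plot with e.g. matplotlib: pcolormesh(t, f, 10 * np.log10(Sxx))
```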

Images attached to this report
Comments related to this report
lanceanderson.blagg@LIGO.ORG - 13:07, Friday 25 August 2023 (72425)DetChar, FMP, OpsInfo
Oli relayed to Genevieve that the garbage truck left at 16:05:00 local time yesterday (8/24). No noise was reported on site, but approximately 2 minutes before the truck left we see a signal in the LVEA seismometers similar to that from the truck last week, and the signal once again shows up in DARM.
Images attached to this comment