H1 OpsInfo (SUS)
thomas.shaffer@LIGO.ORG - posted 08:16, Wednesday 10 July 2024 - last comment - 08:25, Wednesday 10 July 2024(78997)
ITMY Violin modes 5 & 6 slowly ringing up again

I noticed this at the end of my shift yesterday, and now that the IFO has had a longer lock, it's confirmed that ITMY mode 6 is slowly ringing up through the lock. These modes are right on top of each other, so we have always struggled to damp them well, but we had found settings that worked for most of O4. Whatever changes we've made to the IFO recently have made those settings stop working.

I'm mainly writing this as a reminder for operators to keep an eye on these modes. Hopefully I'll find a setting that works, but these modes are always a headache.

Images attached to this report
Comments related to this report
rahul.kumar@LIGO.ORG - 08:25, Wednesday 10 July 2024 (78998)SUS

I am on it now and will play with the gain and phase to try to find a setting that works.

LHO General
thomas.shaffer@LIGO.ORG - posted 07:32, Wednesday 10 July 2024 (78996)
Ops Day Shift Start

TITLE: 07/10 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Observing at 156Mpc
OUTGOING OPERATOR: Tony
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 7mph Gusts, 6mph 5min avg
    Primary useism: 0.01 μm/s
    Secondary useism: 0.05 μm/s
QUICK SUMMARY: Locked for almost 6 hours after the earthquake passed through. Range is much better than yesterday, tipping into the 160s a bit.

H1 General
oli.patane@LIGO.ORG - posted 01:20, Wednesday 10 July 2024 (78995)
Ops Eve Shift End

TITLE: 07/10 Eve Shift: 2300-0800 UTC (1600-0100 PST), all times posted in UTC
STATE of H1: Earthquake
INCOMING OPERATOR: Tony
SHIFT SUMMARY: Currently relocking and at MOVE_SPOTS. We were knocked out by a large earthquake that kept us from trying to lock for two hours. I ran an initial alignment and relocking is going quickly. The two other locklosses from my shift are a mystery to me. From the first lock we can clearly see that our range is lower after today's activities, although during the second lock of my shift, the 36-minute one, we did get up to 155Mpc pretty fast, so maybe it was just that first lock back.

And maybe this already has an alog somewhere, but Gerardo noticed that a peak at 20 Hz was pretty big during our first lock back after maintenance. Looking back, it first appeared on May 21st, and although its height fluctuates, over the past week and a half it seems to have gotten progressively larger.


LOG:

23:00 Detector Observing at 147 Mpc
00:29 Out of Observing to tune sqz
00:39 Back to Observing

01:20 Lockloss
01:47 We were cycling through MICH_FRINGES so I took us to DOWN to start an initial alignment
02:11 Initial alignment done, relocking
02:57 NOMINAL_LOW_NOISE
03:00 Observing

03:37 Lockloss
    - going to run initial alignment again since during the last one we had some ground motion that might've affected alignment
    - ALS_XARM had to adjust the fiber polarization
04:06 Initial alignment done, relocking
05:04 NOMINAL_LOW_NOISE
05:06 Observing

05:16 Lockloss from earthquake, staying in DOWN while earthquake passes
07:26 Starting an initial alignment
07:45 Initial alignment done, relocking


Images attached to this report
H1 SUS
oli.patane@LIGO.ORG - posted 00:51, Wednesday 10 July 2024 (78994)
Weekly In-Lock SUS Charge Measurements FAMIS
Closes FAMIS#28361, last checked in alog78530

Note: ITMs still only have measurements processed through May 21st. I can look into this over my next couple of shifts.

Images attached to this report
H1 SEI
oli.patane@LIGO.ORG - posted 00:34, Wednesday 10 July 2024 (78993)
H1 ISI CPS Noise Spectra Check - Weekly FAMIS

Closes FAMIS#25998, last checked in alog78873

Frequencies above 10 Hz look good for all spectra.

Many spectra have peaks at 1 Hz on their vertical sensors that haven't been there for at least the past few weeks - every HAM, and Stage 1 of every BSC.

Compared to the last check, ETMY Stage 1's vertical sensors are much louder between 4 and 10 Hz and seem to line up with the oscillations of the horizontal sensors. The horizontal sensors are also elevated between 4 and 6 Hz.

Non-image files attached to this report
H1 General (Lockloss)
oli.patane@LIGO.ORG - posted 22:20, Tuesday 09 July 2024 (78992)
Lockloss

Another :(

Lockloss @ 07/10 05:16 UTC due to an incoming magnitude 6.7 earthquake. We had been locked for only 12 minutes :(

We were knocked out by the S waves; the R waves won't arrive for another hour, so we are going to be down for a good while.

H1 CAL
francisco.llamas@LIGO.ORG - posted 21:06, Tuesday 09 July 2024 (78964)
Pcal XY comparison investigation - ninth movement, inner beam right

FranciscoL

After one week of having the inner beam centered on the Rx sensor (LHO:78810), on July 09, we moved the beam from center to the right.

This movement was done to double-check the changes seen after our move on May 14 (LHO:77840). LHO_XY_Comparison_dataVmodel_1399180506-1404571480.pdf shows the full cycle and nominal results of our series of pcal beam movements, which ended on 24-07-09 @ 08:00 PT. The plot legend shows the value given by the weighted mean of each week, using the mean value of each observing stretch for the distribution. The *raw* data for each observing stretch is plotted in gray but, for scaling purposes, some of that data is outside the range of the plot. The black line is what we expected the X/Y comparison to yield from the beam movements that we made. EDITOR'S NOTE: The vertical axis on the plot is in HOPs (hundredths of a percent). Although some changes are unexpected, they are not significant or concerning changes that will affect LHO calibration. This is also an ongoing investigation; we are learning as we go.

The movement done on May 14 is seen in the first step of the plot, a week after having both beams centered, where the weighted mean yields a value of 0.99747 – a change of -25 HOPs from the previous week. However, as seen in the model, we expected the line to change by +19.2 HOPs. The rest of the movements *agree* with our model. Further analysis and diagnostics are pending, but the main takeaway is that the data does not match the model for this one move, which motivates repeating the measurement.
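
To make the statistic concrete, here is a minimal sketch of the weekly number quoted above, assuming each observing stretch is summarized by the mean of its X/Y samples and the weekly value is an inverse-variance weighted mean of those stretch means (the weighting scheme and all numbers are illustrative assumptions, not the actual Pcal analysis code):

    import numpy as np

    # Weekly statistic sketch: weight each observing stretch's mean X/Y ratio
    # by its inverse variance (an assumed weighting), then express the result's
    # deviation from a reference in HOPs (hundredths of a percent, 1e-4 relative).
    def weekly_weighted_mean(stretch_means, stretch_stds):
        w = 1.0 / np.asarray(stretch_stds) ** 2
        return np.sum(w * np.asarray(stretch_means)) / np.sum(w)

    def to_hops(ratio, reference=1.0):
        return (ratio / reference - 1.0) * 1e4

    # Made-up stretch means and uncertainties for one week:
    ratio = weekly_weighted_mean([0.99745, 0.99750, 0.99746], [3e-4, 2.5e-4, 4e-4])
    print(f"weekly weighted mean = {ratio:.5f} ({to_hops(ratio):+.1f} HOPs from unity)")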

During this procedure, we used Keithley DVM (KVM) 1424277 instead of the Fluke handheld DVM.

EndStationLog.pdf lists the voltage values after each significant step of procedure T2400163. The steps represent writing down a voltage value after a particular change to the beam position. Some highlights from the recorded values:

The KVM was calibrated using a Martel voltage/current calibrator (Martel) after the procedure. The KVM was configured to a range of 10 V and a resolution of 6.5 "FAST" digits ("FAST" refers to the time it takes the ADC on the KVM to integrate; a 6.5-digit FAST resolution corresponds to an integration time of 16 s). The following table lists the voltages provided by the Martel and the corresponding readings displayed on KVM 1424277.

Martel [V]    KVM [V]
0.000         00.00012
3.000         03.00018
3.383         03.38328
4.000         04.00027
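
As a quick plausibility check on that table, the deviations can be cast into the same HOPs units used above (a sketch using the tabulated values; the framing as offset-plus-fractional-error is mine, not part of the procedure):

    # Sanity check of the Martel-vs-KVM points above: absolute offset in
    # microvolts and, where defined, the fractional error in HOPs (1e-4 relative).
    points = [(0.000, 0.00012), (3.000, 3.00018), (3.383, 3.38328), (4.000, 4.00027)]
    for martel, kvm in points:
        offset_uv = (kvm - martel) * 1e6
        if martel > 0:
            print(f"{martel:.3f} V: {offset_uv:+.0f} uV offset, "
                  f"{(kvm / martel - 1.0) * 1e4:+.2f} HOPs")
        else:
            print(f"{martel:.3f} V: {offset_uv:+.0f} uV offset (zero point)")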

In summary: (1) using a KVM increased the resolution of our measurements for the beam movement procedure; (2) the 'Initial' measurement changed by 8 HOPs from the last voltage measurement of the previous movement (3.382 V), done on July 02 (LHO:78810); (3) the initial and final voltage measurements during today's procedure differ by 4 HOPs.

Images attached to this report
Non-image files attached to this report
H1 General (Lockloss)
oli.patane@LIGO.ORG - posted 20:41, Tuesday 09 July 2024 - last comment - 22:07, Tuesday 09 July 2024(78988)
Lockloss

Lockloss @ 07/10 03:37

Comments related to this report
oli.patane@LIGO.ORG - 22:07, Tuesday 09 July 2024 (78991)

05:06 Observing

H1 General (Lockloss)
oli.patane@LIGO.ORG - posted 18:21, Tuesday 09 July 2024 - last comment - 21:01, Tuesday 09 July 2024(78985)
Lockloss

Lockloss @ 07/10 01:20UTC

Comments related to this report
oli.patane@LIGO.ORG - 21:01, Tuesday 09 July 2024 (78986)

Currently not sure of the cause, but the lockloss was a lot smoother/less sudden than most other locklosses, which I thought was interesting. I think EX L3 sees it first (the little jolt right after the cursor and before it dips down), but there aren't any saturations or glitches before the lockloss like we tend to see in a lot of other locklosses.

Images attached to this comment
oli.patane@LIGO.ORG - 20:00, Tuesday 09 July 2024 (78987)

03:00 Observing


H1 SQZ
oli.patane@LIGO.ORG - posted 17:31, Tuesday 09 July 2024 - last comment - 17:41, Tuesday 09 July 2024(78983)
Out of Observing for SQZ tuning

07/10 00:29 UTC I took us out of Observing to tune the squeezer since we have now been locked for over three hours.

Comments related to this report
oli.patane@LIGO.ORG - 17:41, Tuesday 09 July 2024 (78984)

00:39UTC New opticalign offsets accepted in sdf, back to Observing

Images attached to this comment
LHO General
thomas.shaffer@LIGO.ORG - posted 16:31, Tuesday 09 July 2024 (78954)
Ops Day Shift End

TITLE: 07/09 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Observing at 146Mpc
INCOMING OPERATOR: Oli
SHIFT SUMMARY: Relock after maintenance day was fairly straightforward, aside from needing to run the baffle align scripts to get light in the xarm. This was somewhat expected since EX tripped during the DAC card work. Once the IFO was up, there was some excess noise ~25-60 Hz, but things are now starting to improve in that area. Our range is still lower than usual, but it's still increasing.

Lock acquisition notes:


LOG:

Start | System | Name | Location | Laser Haz | Task | End
16:08 | SAF | LVEA | LVEA | YES | LVEA IS LASER HAZARD | 15:47
14:56 | FAC | Tyler | VPW | n | Forklifting package to VPW | 15:16
15:09 | FAC | Kim | EX | n | Tech clean | 16:24
15:09 | FAC | Karen, Nelly | EY | n | Tech clean | 16:10
15:23 | CDS | Fil | EY | n | Unplug picomotors | 16:19
15:29 | SAF | Ryan C | LVEA | yes | Transition LVEA to laser safe | 15:48
15:30 | PCAL | Francisco | EX | YES | PCAL meas. | 16:46
15:36 | ISC | Jennifers | CR | n | IMC alignment | 17:22
15:42 | VAC | Travis | MX, EX | - | Turbo tests | 18:36
15:52 | PSL | Jason, Ryan S | LVEA - PSL | local | PSL alignment from PMC | 18:59
15:55 | CC | Ryan C | Ends | - | Dust monitor checks | 16:58
16:11 | FAC | Karen, Nelly, Kim | FCES | n | Tech clean | 16:46
16:17 | FAC | Tyler | LVEA, mids | n | FAMIS checks | 16:52
16:19 | FAC | Chris | EY, EX | n | HEPA filter swaps in fan room | 18:49
16:22 | ISC | Keita | LVEA | n | Look at HAM6 racks | 16:37
16:24 | SEI | Jim | FCES | n | SEI GS13 investigation | 17:49
16:55 | CDS | Fil | EX | n | Unplug picomotors | 17:54
16:58 | PCAL | Francisco | PCAL lab | local | PCAL meas wrapup | 18:34
16:59 | CC | Ryan C | LVEA, FCES | n | Dust monitor tests | 17:36
17:08 | FAC | Karen, Kim, Nelly | LVEA | n | Tech clean | 18:27
17:11 | - | Rick, SURFs | LVEA | n | Tour for the SURF students | 18:49
18:16 | SEI | Jim | LVEA | n | Setup L4Cs under HAM3 | 18:24
18:30 | CDS | Marc, Erik | EX | n | Swap card for SUS EX | 19:08
18:59 | - | Ryan C | LVEA | n | Sweep | 19:13
20:57 | CC | Ryan C | LVEA | n | Turn off DM5 | 20:57
H1 PSL
ryan.short@LIGO.ORG - posted 16:04, Tuesday 09 July 2024 (78980)
PSL 10-Day Trends

FAMIS 21007

Work done in the enclosure today and last week (including PMC swap and adjustments needed as a result) all show up clearly on this week's trends.

I still see a very slight increase in PMC REFL since last week's PMC swap. We saw the same slow increase with the old PMC, which is what told us its cavity loss was increasing, so now that things are a bit more stable we'll keep an eye on it. PMC REFL was able to get at least a few watts lower than before the PMC swap, so that's promising.

Relevant alogs: 78813, 78839, 78814, and 78979

Images attached to this report
H1 General
oli.patane@LIGO.ORG - posted 15:59, Tuesday 09 July 2024 (78981)
Ops Eve Shift Start

TITLE: 07/09 Eve Shift: 2300-0800 UTC (1600-0100 PST), all times posted in UTC
STATE of H1: Observing at 146Mpc
OUTGOING OPERATOR: TJ
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 7mph Gusts, 5mph 5min avg
    Primary useism: 0.07 μm/s
    Secondary useism: 0.05 μm/s
QUICK SUMMARY:

Detector Observing at 147Mpc and has been locked for almost 2 hours.

H1 PSL
jason.oberling@LIGO.ORG - posted 15:40, Tuesday 09 July 2024 (78979)
Work on PSL Stabilization Systems Post-PMC Swap (WP 11947)

J. Oberling, R. Short

Today we worked to better optimize the PSL's stabilization systems (PMC, FSS, ISS) after last week's PMC swap.

PMC

We found PMC Refl at ~21 W this morning, when we had left it at ~23 W at the end of maintenance last week.  The drop is due to the enclosure settling after having the environmental controls on for almost 7 hours (this is normal behavior).  PMC Refl had been drifting between 20 W and 21 W since settling down after our incursion, and we did not see a sharp increase over the last week as we had been seeing with the previous PMC.  This is good news, but we will continue to monitor this, as the old PMC's slow increase in Refl took place over many weeks.

We began by adjusting the operating currents of the PSL pump diodes, as these have a large effect on the output beam quality; this work was done with the ISS OFF.  We had adjusted these while PMC Refl for the old PMC was increasing, so it made sense that they would no longer be optimal with the new PMC.  We found that this PMC really likes Amp1 to be pumped less and Amp2 to be pumped more.  In the end we had PMC Trans at ~108 W and PMC Refl at ~17.8 W.  The pump diode currents are shown in the first picture and given here:

With these pump diode currents Amp1 is outputting ~65.5 W and Amp2 is outputting ~138.0 W (Amp1 output is down a little while Amp2 is unchanged).  Interesting note: lowering the pump diode currents for Amp1 (and therefore lowering its output power) did not have a large effect on the output power of Amp2, and increasing the pump diode currents for Amp2 did not have a large effect on its output power but did have a large effect on PMC Refl (0.1 A changes resulted in PMC Refl changes of 1.0 W or more).

We then went to the LVEA and took a TF of the PMC, results shown in the 2nd picture.  The UGF is ~1.15 kHz and the phase margin is 54.4 degrees.  Also, with the enclosure environmental controls OFF, the TF is much smoother than the first one we took after the swap (with the environmental controls still ON).  Good to see things looking better here.  This done, we went into the enclosure to tune the ISS and FSS.  In discussion with Jenne, we decided to wait on tweaking the mode matching lenses until we have done a PMC sweep to see what our mode content looks like.
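
For reference, pulling the UGF and phase margin out of a measured TF is just interpolation on the magnitude and phase traces; a minimal sketch below, with placeholder arrays standing in for the actual analyzer output:

    import numpy as np

    # Placeholder TF data (Hz, dB, deg) standing in for the measured PMC loop;
    # these numbers are invented so the example reproduces the values quoted above.
    freq = np.array([500.0, 800.0, 1000.0, 1150.0, 1400.0, 2000.0])
    mag_db = np.array([12.0, 6.0, 2.5, 0.0, -3.0, -9.0])
    phase_deg = np.array([-95.0, -105.0, -115.0, -125.6, -135.0, -150.0])

    # UGF = frequency where the magnitude crosses 0 dB (monotonic fall assumed);
    # phase margin = distance of the phase from -180 deg at that frequency.
    ugf = np.interp(0.0, mag_db[::-1], freq[::-1])
    margin = 180.0 + np.interp(ugf, freq, phase_deg)
    print(f"UGF ~ {ugf:.0f} Hz, phase margin ~ {margin:.1f} deg")
    # -> UGF ~ 1150 Hz, phase margin ~ 54.4 deg (by construction of the fake data)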

ISS

We opened up the ISS box to tweak the power on the two ISS PDs.  The ISS likes the PDs to output 10 VDC, so the ISS box's internal half-wave plate was adjusted until PDB (which usually reads a little higher) was outputting roughly 10 V.  The alignment on each PD was then tweaked; PDB was well aligned but PDA was a little off.  After tweaking the alignment, both PDs read closer to the same value than they have in a while (the BS that splits the light between the two ISS PDs isn't a perfect 50/50), so that's a nice improvement.  The final PD voltage values are:

This done, we shut the ISS box and moved on to the FSS.

FSS

The FSS tune-up was done with the ISS ON and diffracting ~3% (RefSignal set to -1.99 V); this is done to keep measured power levels stable and helps speed up the alignment (it's hard to align an AOM when the power is jumping around).  As usual, we began by taking a power budget of the FSS beam path using our 3W-capable Ophir stick head:

We then set about tweaking the beam alignment to increase both the single- and double-pass diffraction efficiencies, as these are both regularly above 70%.  There was a change in measured power between the initial power budget and our adjustments.  We tweaked the HWP WP05, which controls the input polarization of the FSS (WP05 is used in combination with PBS01 to set horizontal polarization w.r.t. the table), and upon re-checking the FSS input power it had "increased" to 292.8 mW.  We've seen this with these little Ophir stick heads before.  Their small size is very convenient for fitting into tight spaces, like those found in the FSS beam path, but they come with drawbacks: they aren't the most stable in the world, and they are VERY angle-of-incidence dependent; we usually have to spend some time aligning the stick head to the beam to ensure it is as perpendicular as we can get it.  My guess is that's what happened here; we thought we had it perpendicular when we did not.  We confirmed our FSS In and AOM In powers made sense and moved on with the alignment.  We tweaked the AOM alignment to increase the single-pass diffraction efficiency, tweaked mirror M21 to improve the double-pass diffraction efficiency, and ensured we were not clipping on the FSS EOM (which provides the 21.5 MHz PDH sidebands for locking the RefCav).  Our final power budget:

We did have to move the EOM a little to re-center its input and output apertures on the beam.  We then adjusted mirrors M23 and M47 (our picomotor-equipped mirrors) to align the beam to the RefCav using our alignment iris (M23 for the front of the iris, M47 for the back) and locked the RefCav.  The picomotors were used to tweak up the beam alignment into the RefCav; we started with an initial TPD of ~0.63 V and ended with a TPD of 0.793 V.  The beam alignment onto the RefCav RFPD was then checked and tweaked, and we took locked and unlocked voltages for a visibility measurement:

The visibility here is a little lower than its norm, which generally hangs out in the lower-80% range (the last couple of FSS tune-ups had visibility around 83%).  Best guess is a combination of mode matching changes (there's some evidence the output mode of this PMC is slightly different from the old one's) and not-quite-tweaked-up alignment (it's hard to get the alignment really good while in the enclosure with the environmental controls on).  It seems there may be some room for mode matching improvement to the RefCav, as both the visibility and TPD are lower than we usually see after a tune-up.  Power in the FSS beam is also a little lower than usual (generally around 300 mW; as seen above it's closer to 290 mW now), which also contributes.  This being the best we could do with what we currently have in the time allotted, we left the enclosure.  We'll probably have to do some remote alignment tweaks to both the PMC and RefCav at a later date, once the enclosure is at a better thermal equilibrium.  We left the enclosure ACs running for ~15 minutes to bring the enclosure temperature down to its usual ~71.4 degF (as measured by the Table South temperature sensor), then put everything in Science mode and left the LVEA.
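
For anyone repeating this measurement, the visibility figure comes straight from the locked/unlocked RFPD voltages; a minimal sketch using the standard cavity-visibility definition (the example numbers are invented, not today's readings):

    # RefCav visibility from RFPD DC voltages: the fraction of incident carrier
    # power that couples into the cavity when locked.
    def visibility(v_unlocked, v_locked):
        return 1.0 - v_locked / v_unlocked

    # AOM diffraction efficiency from the power budget (single or double pass).
    def diffraction_efficiency(p_out_mw, p_in_mw):
        return p_out_mw / p_in_mw

    print(f"visibility: {visibility(1.00, 0.22):.1%}")                            # 78.0%
    print(f"double-pass efficiency: {diffraction_efficiency(210.0, 290.0):.1%}")  # 72.4%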

Once back in the control room, we turned the ISS back ON.  It was already diffracting close to 2.5%, so we left the RefSignal alone.  Watchdogs were re-enabled, Ryan performed a rotation stage calibration, and we handed things off to TJ for IFO recovery.  This closes WP 11947.

We will continue to monitor PMC Refl with this new PMC over the coming weeks.

Images attached to this report
H1 General (Lockloss)
oli.patane@LIGO.ORG - posted 18:47, Monday 08 July 2024 - last comment - 21:04, Tuesday 09 July 2024(78947)
Lockloss

Lockloss @ 07/09 01:46UTC after 8.5 hours locked

Comments related to this report
oli.patane@LIGO.ORG - 19:58, Monday 08 July 2024 (78948)

02:57 UTC Observing

oli.patane@LIGO.ORG - 21:04, Tuesday 09 July 2024 (78989)
Images attached to this comment
H1 General
thomas.shaffer@LIGO.ORG - posted 20:54, Sunday 07 July 2024 - last comment - 21:50, Tuesday 09 July 2024(78930)
Lock loss 0341 UTC

Lock loss 1404445312

5 hours and 42 mins seems like a standard lock for this week. LSC-DARM saw an odd wiggle before the lock loss; normally I see an ETMX output that matches this, but I didn't this time.

Images attached to this report
Comments related to this report
thomas.shaffer@LIGO.ORG - 22:28, Sunday 07 July 2024 (78932)

Back to Observing at 0528 UTC.

I ended up slightly moving PR3 P from -122.2 -> -121.6 and Y from 98.8 -> 98.6. This brought the beatnote up to around -14.5 dBm, up from the -18 or so that I started at, and ALSX power moved up as well. When DRMI locked, POP18 looked higher than it had over the last handful of locks these past few days. I ran through an initial alignment and then it went right up.

oli.patane@LIGO.ORG - 21:50, Tuesday 09 July 2024 (78990)

EX L3 does oscillate 16 ms before DARM sees the lockloss, although it's possible that the slight drop (marked in green) is DARM seeing it?

Images attached to this comment