H1 General (Lockloss)
oli.patane@LIGO.ORG - posted 17:07, Saturday 15 June 2024 - last comment - 19:15, Saturday 15 June 2024(78460)
Lockloss

Lockloss @ 06/16 00:01 UTC (LDAS is currently down, so no link). We had only been locked for 20 minutes and I was trying to get the SQZ OPO to lock, so we had not been in Observing. There was a small jump up in the wind at that time, but I'm not sure whether that would have caused the lockloss.

Images attached to this report
Comments related to this report
oli.patane@LIGO.ORG - 18:58, Saturday 15 June 2024 (78461)

01:56 UTC Observing

oli.patane@LIGO.ORG - 19:15, Saturday 15 June 2024 (78462)

Went back out of Observing for ten minutes (02:01 - 02:10 UTC) to tune SQZ because our range was 147 Mpc. Now back in Observing at 154 Mpc.

LHO General
ibrahim.abouelfettouh@LIGO.ORG - posted 16:32, Saturday 15 June 2024 (78459)
OPS Day Shift Summary

TITLE: 06/15 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Wind
INCOMING OPERATOR: Oli
SHIFT SUMMARY:

IFO is relocking and is currently at LOWNOISE_COIL_DRIVERS

Wind gusts have been high since the last lockloss, and as usual I would put H1 in DOWN whenever the wind went above 30mph and wait for opportunistic low-wind periods to lock ALS - here's a timeline since the last lockloss (a minimal sketch of this wind-gating rule follows the timeline). The only SDF revert I had to accept was in my midshift alog 78457.

All times are UTC

19:31 - NLN/Observing Lockloss

19:38 - ALS Lockloss

19:39 - DOWN due to 40-50mph winds

21:43 - Attempt to re-lock due to wind speeds below 30mph

22:08 - Two failures at MICH_FRINGES and no ability to lock PRMI, let alone DRMI - decided to run an initial alignment

22:25 - Initial Alignment done fully auto, attempt to relock - winds picking up about to surpass 30mph

22:41 - DOWN again after 2 ALS locklosses and 3 IR Locklosses, with winds around 35mph.

22:54 - Saw a small opportunistic sub-30mph calm spell (that only lasted 10 mins) and decided to try locking again - it worked!

23:02 - DRMI locked!

23:30 - Shift ended while in LOWNOISE_COIL_DRIVERS
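For reference, here is a minimal sketch of the 30mph wind-gating rule described above. The gust channel name and the use of pyepics are illustrative assumptions, not the actual operator tooling.

```python
# Hedged sketch of the 30 mph wind-gating rule from this shift summary.
# The gust channel name is a placeholder and pyepics is assumed available.
from epics import caget

GUST_CHANNEL = "H1:PEM-CS_WIND_GUST_MPH"   # placeholder channel name
GUST_LIMIT_MPH = 30.0

def locking_advice(channel=GUST_CHANNEL, limit=GUST_LIMIT_MPH):
    """Suggest whether to hold H1 in DOWN or try an opportunistic ALS lock."""
    gust = caget(channel)
    if gust is None:
        return "no wind data - check the channel name"
    if gust > limit:
        return f"{gust:.0f} mph gusts: keep ISC_LOCK in DOWN and wait for a lull"
    return f"{gust:.0f} mph gusts: opportunistic window, try locking ALS"

if __name__ == "__main__":
    print(locking_advice())
```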

LOG:

None

H1 General
oli.patane@LIGO.ORG - posted 16:06, Saturday 15 June 2024 (78458)
Ops EVE Shift Start

TITLE: 06/15 Eve Shift: 2300-0800 UTC (1600-0100 PST), all times posted in UTC
STATE of H1: Wind
OUTGOING OPERATOR: Ibrahim
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 28mph Gusts, 21mph 5min avg
    Primary useism: 0.05 μm/s
    Secondary useism: 0.08 μm/s
QUICK SUMMARY: Relocking and at DARM_TO_RF

 

LHO General
ibrahim.abouelfettouh@LIGO.ORG - posted 12:13, Saturday 15 June 2024 (78457)
OPS Day Midshift Update

IFO is in NLN and OBSERVING as of 18:29 UTC

We lost lock 2 hours after reaching observing at 16:56 UTC due to unknown reasons. The range was definitely lower than usual and while I was attempting to investigate this using Sheila and TJ's new wiki low range checks page, we had our initial lockloss. SDF Table of acceptances is attached.

After PRMI failed, I ran an initial alignment and then we locked swiftly despite winds in the high 20's and low 30's (mph). The range is still quite low now but I will wait the recommended hour or two (per Sheila's instruction) until IFO is thermalized enough to run the low range check dtt.

The wind forecast shows gusts are expected to keep picking up until around 4PM, so here's hoping we don't need to lock ALS during that time!

 

Images attached to this report
LHO VE
david.barker@LIGO.ORG - posted 10:12, Saturday 15 June 2024 (78455)
Sat CP1 Fill

Sat Jun 15 10:10:28 2024 INFO: Fill completed in 10min 25secs

 

Images attached to this report
LHO General
ibrahim.abouelfettouh@LIGO.ORG - posted 07:34, Saturday 15 June 2024 (78454)
OPS Day Shift Start

TITLE: 06/15 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Lock Acquisition
OUTGOING OPERATOR: Corey
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 19mph Gusts, 15mph 5min avg
    Primary useism: 0.05 μm/s
    Secondary useism: 0.08 μm/s
QUICK SUMMARY:

IFO is LOCKING at MOVE_SPOTS

IFO was mid auto relock from its last lockloss at 12:56 UTC when I arrived. Expect to get locked and observing soon.

LHO VE (VE)
gerardo.moreno@LIGO.ORG - posted 02:05, Saturday 15 June 2024 - last comment - 02:18, Monday 17 June 2024(78453)
HAM4 Annulus Ion Pump Signal is Railed

The HAM4 annulus ion pump signal railed at about 7:50 UTC on 06/15/2024. No immediate attention is required; per the trend of PT120, an adjacent gauge, the internal pressure does not appear to be affected. The HAM4 AIP will be assessed next Tuesday.

Images attached to this report
Comments related to this report
gerardo.moreno@LIGO.ORG - 02:18, Monday 17 June 2024 (78482)VE

The AIP is showing some good signs of coming back, see the attached plot; regardless, we will keep the appointment to go and investigate this system on Tuesday.

Images attached to this comment
H1 General
oli.patane@LIGO.ORG - posted 01:00, Saturday 15 June 2024 (78452)
Ops EVE Shift End

TITLE: 06/15 Eve Shift: 2300-0800 UTC (1600-0100 PST), all times posted in UTC
STATE of H1: Observing at 148Mpc
INCOMING OPERATOR: Corey
SHIFT SUMMARY: We are Observing at 150 Mpc and have been locked for 13 hours now. Super quiet shift with no issues.
LOG:

23:00 Detector Observing and Locked for over 4 hours

Start Time System Name Location Lazer_Haz Task Time End
23:31 PCAL Francisco PCAL Lab y(local) Putting covers on 23:33
H1 General
oli.patane@LIGO.ORG - posted 21:07, Friday 14 June 2024 (78451)
Ops Eve Midshift Status

We are Observing at 154 Mpc and have been Locked for over 9 hours. Nothing to note

LHO VE
janos.csizmazia@LIGO.ORG - posted 18:00, Friday 14 June 2024 (78450)
Another scroll pump is running in the Mech. room
We are continuing the activities with the experimental vacuum chamber in the Mechanical Room, along the eastern wall. We are now running another scroll pump there. As with the one already running, foam is placed underneath it.
In the meantime, the bakeout of this chamber's RGA has been finished.
LHO General
ibrahim.abouelfettouh@LIGO.ORG - posted 16:30, Friday 14 June 2024 - last comment - 17:04, Friday 14 June 2024(78448)
OPS Day Shift Summary

TITLE: 06/14 Day Shift: 1430-2330 UTC (0730-1630 PST), all times posted in UTC
STATE of H1: Observing at 147Mpc
INCOMING OPERATOR: Oli
SHIFT SUMMARY:

IFO is in NLN and OBSERVING as of 21:33 UTC.

21:07 UTC to 21:33 UTC: COMMISSIONING

19:28 UTC to 21:07 UTC: OBSERVING

Things of note:

LOG:

Start Time System Name Location Lazer_Haz Task Time End
15:29 PCAL Tony Pcal Lab Local LLO Measurements 17:59
16:33 PCAL Francisco Pcal Lab Local LLO Measurements 17:21
17:13 SQZ Sheila LVEA Y Move magnetometer 17:59
19:03 PCAL Tony, Miriam PCAL Lab N LLO Measurements 19:55
19:43 RUN Camilla, Neil Y Arm N Improve and/or maintain health 19:43
19:56 PCAL Francisco PCAL N Preparing for lab upgrade steps 19:58
22:12 PCAL Francisco, Miriam PCAL Lab Local Preparing for lab upgrade steps 22:40
22:41 SAF LVEA LVEA YES LVEA IS LASER HAZARD 15:52
22:42 WALK Francisco, Miriam Overpass N Walking 23:02
Images attached to this report
Comments related to this report
camilla.compton@LIGO.ORG - 17:04, Friday 14 June 2024 (78449)

SRCLFF gain started at 1.14, was adjusted to 1.2, then readjusted to 1.18. The first SDF accept is attached.

Images attached to this comment
H1 General
oli.patane@LIGO.ORG - posted 16:08, Friday 14 June 2024 (78447)
Ops EVE Shift Start

TITLE: 06/14 Eve Shift: 2300-0800 UTC (1600-0100 PST), all times posted in UTC
STATE of H1: Observing at 147Mpc
OUTGOING OPERATOR: Ibrahim
CURRENT ENVIRONMENT:
    SEI_ENV state: CALM
    Wind: 5mph Gusts, 3mph 5min avg
    Primary useism: 0.02 μm/s
    Secondary useism: 0.10 μm/s
QUICK SUMMARY:

Detector Observing and has been locked for over 4 hours. The range isn't great, around 148 Mpc, but besides that everything is good.

H1 ISC
jennifer.wright@LIGO.ORG - posted 16:07, Friday 14 June 2024 (78446)
Checking SRC (SEC) alignment before and after April 22nd and comparing with SRC alignment now

I looked at the SUS-{SR2,SR3,SRM}_M1_DAMP_{P,Y}_INMON channels for a period on the 21st of April, before the OFI burn, and on the 25th of April, after we had re-aligned through the OFI to recover our optical gain.

I have compared these to the drift in alignment of the signal extraction cavity yesterday. This is partly to figure out why we have to keep retuning our SRCL feedforward.

Date        Mirror   delta pitch   delta yaw
21st April  SR2      2             0.8
25th April  SR2      3             2
13th June   SR2      3             3
21st April  SR3      0.5           0.9
25th April  SR3      0.3           6
13th June   SR3      0.5           0.3
21st April  SRM      13            6
25th April  SRM      23            7
13th June   SRM      7             7
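For anyone repeating this, deltas like those in the table above can be pulled from the top-mass witness channels with a trend such as the sketch below (assuming gwpy/NDS2 access; the GPS windows are placeholders).

```python
# Sketch of trending the SR2/SR3/SRM top-mass witness channels to get
# pitch/yaw shifts between two periods.  GPS windows are placeholders; the
# channel naming follows the alog and gwpy/NDS2 access is assumed.
from gwpy.timeseries import TimeSeries

CHANNELS = [f"H1:SUS-{opt}_M1_DAMP_{dof}_INMON"
            for opt in ("SR2", "SR3", "SRM") for dof in ("P", "Y")]

def mean_value(channel, start, stop):
    """Average a slow witness channel over a quiet stretch."""
    return TimeSeries.get(channel, start, stop).mean().value

def alignment_delta(channel, before, after):
    """Change in the channel's mean between two (start, stop) GPS windows."""
    return mean_value(channel, *after) - mean_value(channel, *before)

if __name__ == "__main__":
    before = (1397805000, 1397805600)   # placeholder: quiet stretch on 21 April
    after = (1402500000, 1402500600)    # placeholder: quiet stretch on 13 June
    for chan in CHANNELS:
        print(f"{chan}: {alignment_delta(chan, before, after):+.1f}")
```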

 

The only angular degrees of freedom that changed significantly between these dates were:

 

Images attached to this report
H1 SQZ (SQZ)
corey.gray@LIGO.ORG - posted 05:25, Friday 14 June 2024 - last comment - 10:27, Monday 17 June 2024(78428)
SQZ OPO ISS Hit Its Limit and Took H1 Out Of Observing

Woke up to see that the SQZ_OPO_LR Guardian had the message:

"disabled pump iss after 10 locklosses. Reset SQZ-OPO_ISS_LIMITCOUNT to clear message"

Followed 73053, but did NOT need to touch up the OPO temp (it was already at its max value); then took SQZ Manager back to FRE_DEP_SQZ, and H1 went back to OBSERVING.

Comments related to this report
corey.gray@LIGO.ORG - 05:38, Friday 14 June 2024 (78429)

Received the wake-up call at 4:40am PDT (1140 UTC). Took a few minutes to wake up, then log into NoMachine. Spent some time figuring out the issue, ultimately doing an alog search to find steps to restore SQZ (found an alog by Oli which pointed to 73053). Once SQZ relocked, we were automatically taken back to OBSERVING at 5:17am (1217 UTC).

camilla.compton@LIGO.ORG - 11:05, Friday 14 June 2024 (78435)

Sheila, Naoki, Camilla. We've adjusted this so it should automatically relock the ISS.

The IFO went out of observing because of the OPO without the OPO Guardian going down: the OPO stayed locked, it just turned its ISS off. We're not sure what the issue with the ISS was; the SHG power was fine, as the controlmon was 3.5, which is near the middle of the range. Plot attached. It didn't reset until Corey intervened.

Sheila and I changed the logic in SQZ_OPO_LR's LOCKED_CLF_DUAL state so that now, if the ISS lockloss counter* reaches 10, it will go to LOCKED_CLF_DUAL_NO_ISS, where it turns off the ISS before trying to relock it to get back to LOCKED_CLF_DUAL. This will drop us from observing but should resolve itself in a few minutes. Naoki tested this by changing the power to make the ISS unlock. (A sketch of this state logic is below.)
The message "disabled pump iss after 10 locklosses. Reset SQZ-OPO_ISS_LIMITCOUNT to clear message." has been removed, and the wiki has been updated. It shouldn't get caught in a loop since, in ENGAGE_PUMP_ISS, if its lockloss counter reaches 20, it will take the OPO to DOWN.

* this isn't really a lockloss counter, more of a count of how many seconds the ISS is saturating.
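A hedged sketch of the state logic described above, in Guardian-style pseudocode. The state names and the SQZ-OPO_ISS_LIMITCOUNT channel come from this alog; the helper functions are illustrative placeholders, not the actual SQZ_OPO_LR code, and ezca is the channel-access object Guardian provides at runtime.

```python
# Guardian-style pseudocode for the ISS reset logic described above.
from guardian import GuardState

ISS_SATURATION_LIMIT = 10   # counts tolerated in LOCKED_CLF_DUAL before resetting the ISS
ISS_RELOCK_GIVEUP = 20      # counts tolerated in ENGAGE_PUMP_ISS before taking the OPO to DOWN

def turn_off_pump_iss():
    """Placeholder for the real ISS-off action."""

def engage_pump_iss():
    """Placeholder for the real ISS engage/relock attempt; returns True on success."""
    return True

class LOCKED_CLF_DUAL(GuardState):
    def run(self):
        if ezca['SQZ-OPO_ISS_LIMITCOUNT'] >= ISS_SATURATION_LIMIT:
            # ISS has been saturating: drop out to reset it (this takes us out of observing)
            return 'LOCKED_CLF_DUAL_NO_ISS'
        return True

class LOCKED_CLF_DUAL_NO_ISS(GuardState):
    def main(self):
        turn_off_pump_iss()                    # turn the ISS off
        ezca['SQZ-OPO_ISS_LIMITCOUNT'] = 0     # clear the saturation counter
    def run(self):
        if engage_pump_iss():                  # try to relock the ISS
            return 'LOCKED_CLF_DUAL'
        return False                           # keep trying

class ENGAGE_PUMP_ISS(GuardState):
    def run(self):
        if ezca['SQZ-OPO_ISS_LIMITCOUNT'] >= ISS_RELOCK_GIVEUP:
            return 'DOWN'                      # give up rather than loop forever
        if engage_pump_iss():
            return 'LOCKED_CLF_DUAL'
        return False
```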

Images attached to this comment
camilla.compton@LIGO.ORG - 15:23, Friday 14 June 2024 (78445)

Worryingly, the squeezing got BETTER while the ISS was unlocked; plot attached of DARM, SQZ BLRMs and range BLRMs.

In the current lock, the SQZ BLRMs are back to the good values (plot). Why was the ISS injecting noise last night? Has this been a common occurrence? What is a good way of monitoring this - coherence between DARM and the ISS?
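One way to do that coherence check is sketched below, assuming gwpy/NDS2 access; the ISS channel name and GPS window are placeholders.

```python
# Sketch of the coherence check suggested above: DARM vs. an OPO ISS readback.
from gwpy.timeseries import TimeSeriesDict

DARM = "H1:GDS-CALIB_STRAIN"
ISS = "H1:SQZ-OPO_ISS_CONTROLMON"   # placeholder channel name

def darm_iss_coherence(start, stop, fftlength=8):
    data = TimeSeriesDict.get([DARM, ISS], start, stop)
    darm, iss = data[DARM], data[ISS]
    # coherence needs matching sample rates, so resample DARM if necessary
    if darm.sample_rate != iss.sample_rate:
        darm = darm.resample(iss.sample_rate)
    return darm.coherence(iss, fftlength=fftlength, overlap=fftlength / 2)

if __name__ == "__main__":
    coh = darm_iss_coherence(1402450000, 1402450600)   # placeholder GPS window
    coh.plot().show()   # look for frequency bands with coherence well above ~0.1
```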

Images attached to this comment
camilla.compton@LIGO.ORG - 10:27, Monday 17 June 2024 (78488)

A check on this is in 78486. We think that the SQZ OPO temperature or angle wasn't well tuned for the green OPO power at this time; when the OPO ISS was off, the SHG launch power dropped from 28.8mW to 24.5mW (plot). It was just chance that SQZ was happier here.

Images attached to this comment
H1 ISC
camilla.compton@LIGO.ORG - posted 16:34, Wednesday 12 June 2024 - last comment - 15:10, Friday 14 June 2024(78399)
April 22nd Output Arm Shift: Check on VAC trends and OFI temperature

Following on from Sheila's alog 77427, checked on VAC trends and OFI temperature around April 22nd.

I checked (possibly repeating someone else's check) that there were no vacuum spikes during the locks when Kappa_C dropped, or during the locklosses afterwards. A plot of Kappa_C with the VAC channels is attached.

Looking at the OFI TEC readbacks during that time, they were significantly noisier than usual on the Tuesday after the 5% optical gain drop, before our alignment shift: noisy between 6:30am and 4pm, i.e. starting before any Tuesday maintenance 77363 or injections. Plot attached, including a zoom out and a zoom in. The same thing also happened May 28th to May 30th.
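A sketch of one way to quantify "noisier than usual" for the TEC readback; the channel name and GPS windows are placeholders and gwpy/NDS2 access is assumed.

```python
# Compare the detrended RMS of an OFI TEC readback in a quiet hour vs. the
# noisy 6:30am-4pm stretch.
from gwpy.timeseries import TimeSeries

CHAN = "H1:AOS-OFI_TEC_TEMP_MON"   # placeholder channel name

def detrended_rms(start, stop):
    data = TimeSeries.get(CHAN, start, stop)
    return float(data.detrend().std().value)

quiet = detrended_rms(1397900000, 1397903600)   # placeholder quiet hour
noisy = detrended_rms(1397950000, 1397953600)   # placeholder noisy hour
print(f"quiet RMS {quiet:.4g}, noisy RMS {noisy:.4g}, ratio {noisy / quiet:.1f}x")
```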

Images attached to this report
Comments related to this report
camilla.compton@LIGO.ORG - 15:10, Friday 14 June 2024 (78444)

In 78442, we show that these larger-than-usual OFI temperature swings on locklosses and power-ups also happened May 28th/29th, when we adjusted our SRC alignment (SRC 250urad). Is this a sign that there is an alignment where we are hitting something in the OFI?

H1 CAL
anthony.sanchez@LIGO.ORG - posted 17:52, Tuesday 28 May 2024 - last comment - 17:36, Friday 14 June 2024(78105)
PCAL End Station Measurement

Today Dripta and I went to EY and did what we would have previously called a standard ES measurement with PS4.

We also employed the new End Station procedure.

Details and analysis are coming in a comment to this alog.

Comments related to this report
anthony.sanchez@LIGO.ORG - 17:36, Friday 14 June 2024 (78360)

A PCAL ENDY station measurement was done on May 28: the PCAL team (Dripta B. & Tony S.) went to ENDY with the Working Standard Hanford, aka WSH (PS4), and took two end-station measurements to verify that the results were consistent with each other, one with our previous version of the procedure (T1500062-v16) and another with T1500062-v17.
The ENDY station measurements were carried out mostly according to the procedures outlined in LIGO-T1500062-v16 & v17, Pcal End Station Power Sensor Responsivity Ratio Measurements: Procedures and Log.


Measurement Log
The first thing we did was take a picture of the beam spot before anything was touched!


Martel:
The Martel voltage source applies a voltage to the PCAL chassis's Input 1 channel. We recorded the GPS times at which -4.000V, -2.000V and 0.000V were applied to the channel; this can be seen in Martel_Voltage_Test.png. We also did a measurement of the Martel's voltages in the PCAL lab to calculate the ADC conversion factor, which is included in the above document.
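For illustration, the ADC conversion-factor step amounts to a linear fit of recorded ADC counts against the applied Martel voltages. In the sketch below, only the -4V / -2V / 0V set points come from this log; the readback counts are made-up placeholders.

```python
# Sketch of the ADC conversion-factor calculation: fit ADC readback vs. applied
# Martel voltage.  The readback counts are placeholders.
import numpy as np

applied_volts = np.array([-4.000, -2.000, 0.000])
adc_readback = np.array([-26214.0, -13105.0, 2.0])   # placeholder counts

counts_per_volt, offset_counts = np.polyfit(applied_volts, adc_readback, 1)
print(f"ADC conversion factor ~ {counts_per_volt:.1f} counts/V (offset {offset_counts:+.1f} counts)")
```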

Plots while the Working Standard (PS4) is in the Transmitter Module, with the inner beam blocked, then the outer beam blocked, followed by the background measurement: WS_at_TX.png.

The inner, outer, and background measurements while the WS is in the Receiver Module: WS_at_RX.png.

The inner, outer, and background measurements while the RX sphere is in the RX enclosure, which is our nominal setup without the WS in the beam path at all: TX_RX.png.

 


-----------------------------------------------------------------------------------------------------------------------------------------------

The new document specifies a different order of measurements, and some of them are taken in a different way.

We placed the Working Standard (PS4) in the path of the INNER Beam at the TX module.
Then the Working Standard (PS4) in the path of the OUTER Beam at the TX module.
A background measurement.

Then we take the Working Standard and put it in the RX module to get the INNER Beam.
Then the OUTER Beam in the RX module.
And a background.

This is where things get different....
We remove the beam block and give the Working Standard both the Inner and Outer Beams at the SAME TIME while it's at the RX module.
We also put the RX sphere back in the RX module and put both beams on it at the same time, like nominal operation when the PCAL lines are turned off.
Then we take a background.

This was repeated ~10 mins later because we wanted to see whether there were any time-dependent variations.

The last picture is of the Beam spots after we had finished the measurement.

Old procedure measurement result: rhoR_prime = 10565.2
New procedure measurement: rhoR_prime = 10571.1. This 5 hop difference is well within our uncertainty.
Second new-procedure measurement: rhoR_prime = 10574.3, off by less than 3 hops, also well within uncertainty.

Preliminary analysis suggests that the discrepancy between rhoR_prime calculated via the two methods is within the uncertainty.
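As a quick sanity check on the scale of these differences (no uncertainty budget implied), the fractional spread of the three quoted values:

```python
# Fractional spread of the quoted rhoR_prime values, to show the agreement is
# at the ~0.1% level (this is not an uncertainty analysis).
values = {
    "old procedure": 10565.2,
    "new procedure #1": 10571.1,
    "new procedure #2": 10574.3,
}
mean = sum(values.values()) / len(values)
for label, val in values.items():
    print(f"{label}: {val:.1f} ({100 * (val - mean) / mean:+.3f}% from mean)")
spread = max(values.values()) - min(values.values())
print(f"total spread: {spread:.1f} ({100 * spread / mean:.3f}% of mean)")
```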


All of this data and analysis has been committed to the SVN or git:
https://svn.ligo.caltech.edu/svn/aligocalibration/trunk/Projects/PhotonCalibrator/measurements/LHO_ENDY/


Obligatory BackFront PS4/PS5 Responsivity Ratio:
PCAL Lab Responsivity Ratio Measurement:
A WSH/GS (PS4/PS5) BF responsivity ratio measurement was run, analyzed, and pushed to the SVN.
PS4PS5_alphatrends.pdf is attached to show that the recent changes to the lab have not impacted the lab measurements.

This adventure has been brought to you by Dripta B. & Tony S.

 

 

Images attached to this comment
Non-image files attached to this comment
H1 ISC
sheila.dwyer@LIGO.ORG - posted 16:27, Thursday 25 April 2024 - last comment - 16:08, Saturday 13 July 2024(77427)
Summary of alogs related to changes in our output arm this week

In the observing stretch that started Monday around 8 pm, there was a drop in optical gain (by 4%) and in power at the AS port PDs (4% drop in AS_A sum, AS_B sum, AS_C sum, and OMC REFL). There was not, at that time, any change in circulating power in the arms, PRG, coupled cavity pole, or alignment of suspensions (shown are SR2, SRM, OM1+2, but we have also looked at many other suspension-, ISI- and HEPI-related channels for this time, 77382). This time is shown by the first vertical cursor in the attachment.

We were unable to relock the interferometer after that Monday evening lock (a new lockloss type appeared early Tuesday morning, 77359), and after the maintenance window we were unable to lock, with difficulties powering up (77363) and an AS camera image that had lobes whenever the beam was well centered on AS_C. We locked that evening by adding large offsets to AS_C's set point (77368). This is the cause of the large alignment shifts seen in the screenshot from Tuesday evening to Wednesday morning, which allowed us to lock the IFO but created a large scatter shelf and resulted in a lower coupled cavity pole and lower optical gain, and we did not have much squeezing that night.

We do not think that Tuesday maintenance activities are the cause of our problems, although some of the Tuesday work has been reverted (77350, 77369).

Yesterday we spent much of the day with SRY locked, single bounce, or using the squeezer beam reflected off SRM to investigate our AS port alignment (77392). We were able to recover the same throughput of a single-bounce beam to HAM6 by moving the alignment of SR2 + SR3 by a huge amount 77388, which also produced a round-looking beam on the AS camera. We haven't since tried to explore this aperture, to see if we could have also recovered this throughput with a pitch move, for example. We also saw that the squeezer beam does not arrive in HAM6 with good transmission when injected with the alignment used previously, but that we could recover good transmission to HAM6 by moving ZM5 by 150-200 urad, in pitch or yaw. With this shift in the SRC axis and squeezer alignment we were able to relock and not have the large scattering issues we had Tuesday night.

Today's time was spent recovering from a seemingly unrelated problem in the SQZ racks, and on some commissioning aimed at recovering our previous (165 Mpc) sensitivity with this new alignment. This will continue during tomorrow's commissioning time.


additional related information:

 

Images attached to this report
Comments related to this report
sheila.dwyer@LIGO.ORG - 13:58, Wednesday 08 May 2024 (77718)

Adding some trends, which have otherwise been mentioned. The optical levers and top mass OSEMs suggest that there hasn't been any shift in the PRC, Michelson, or arms. There was also no shift in alignment of the SRC in the first low-optical-gain lock on the 22nd; that alignment didn't shift until the manual move of the SRC in the recovery effort.

Images attached to this comment
sheila.dwyer@LIGO.ORG - 10:45, Saturday 15 June 2024 (78456)

Camilla notes that the OFI temperature controller had unusual behavior in the Tuesday relocking attempts: 78399

 

sheila.dwyer@LIGO.ORG - 16:08, Saturday 13 July 2024 (79100)

comparisons of single bounce throughputs: 77441
