Summary:

The core functionality of the Matlab DARM model has now been replicated in Python. Attached are figures in a single PDF file showing the primary results. By eye, these look reasonable; a more detailed study comparing against the Matlab model is forthcoming. I also replicated a study made for the L1 detector by Joe B. (see LLO aLOG 29622). I propose to call this code pyDARM.

Details:

I have ported most of the core functionality of the DARM model from Matlab into Python. So far, I have done spot checks by eye to make sure that the results look sensible. I have not yet done a detailed study comparing against the Matlab version, but will do so soon.

I have produced plots showing:
1) DARM digital filters
2) Sensing function
3) Actuation function
4) Open loop gain
5) Frequency-dependent actuation authority of each stage compared to inverse sensing
6) Ratio of each stage to the overall calibration

As can be seen, the scale of each figure appears reasonable (comparing with, e.g., G1700316), the OLG is stable, and the UGF has the correct value (by eye). The contribution of each suspension stage closely matches L1's results from the Matlab model (see LLO aLOG 29622).

What was done:
- Wrote Python versions of the Matlab functions that parse Foton filter files and that compute IOP downsampling filters from the RCG code coefficients
- Exported the numerical values of zeros, poles, gain, and delay from the analog AA and AI models (these are objects in .mat files)
- Exported an ASCII file of the frequency response of the suspension force-to-length transfer function for each stage. This is read in and used in the Python DARM model; so far it is only available at specific frequency points
- Computed the DARM filter bank digital filters, and computed the sensing and actuation functions from parameters
- Made intermediary data products accessible
- Made the code structure "flatter", meaning fewer jumps between different functions/files. Hopefully this makes the code more accessible and readable.
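As a rough sketch of how the pieces above fit together, assuming the usual convention that the DARM open loop gain is the product of the sensing function C, the digital filters D, and the actuation function A, the frequency responses can be combined with plain numpy/scipy arrays. The zpk values below are made-up placeholders, not the real model parameters:

```python
import numpy as np
from scipy import signal

freq = np.logspace(0, 3, 1000)      # 1 Hz to 1 kHz
w = 2 * np.pi * freq                # angular frequency [rad/s]

# Toy sensing function: a single pole (placeholder values only)
C = signal.freqresp(([], [-2 * np.pi * 400], 1e6), w)[1]
# Toy digital-filter and actuation responses (placeholders)
D = np.ones_like(C)
A = signal.freqresp(([], [-2 * np.pi, -2 * np.pi], 1e3), w)[1]

G = C * D * A                                    # open loop gain
ugf = freq[np.argmin(np.abs(np.abs(G) - 1.0))]   # crude UGF estimate
```

With intermediary products like C, D, and A exposed as arrays, plots such as the stage-by-stage ratios above fall out of simple elementwise division.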
Required python modules (so far): scipy (e.g., filters), numpy (e.g., arrays/array math), collections (for namedtuples), matplotlib (for plotting)

Quirks found along the way:
- You can't directly multiply or add scipy filter objects together (the + and * operators are not overloaded). I had to code up my own versions: 1) add filters by computing polynomials from the roots, forming a transfer-function filter, and then converting back to a zpk object, all using scipy built-in functions; and 2) multiply filters by appending the zeros together, appending the poles together, and multiplying the gains.
- The scipy sos2zpk() function is not exactly like Matlab's version. I found an extra zero and pole when converting, because Matlab removes any zeros in the 3rd and 6th positions of an sos section before computing a filter from the sos coefficients.

To do list (short term):
- More detailed study comparing the Python and Matlab models
- Read in a config file
- Make an L1 model to check for any differences
- See if there is a way to read a Matlab .mat file and the objects therein. This did not look trivial when I first tried, which is why the analog AA and AI models were exported, as well as the frequency response of each suspension stage
- Address how to get the force-to-length transfer function of each stage at arbitrary frequencies
- Add computations for the GDS / DCS pipelines

Longer term:
- Hook this into a pipeline from measurement to model to uncertainty estimate
Attached is an updated figure that now includes the ratio of the inverse sensing contribution to the overall calibration. Observe that the inverse sensing has an impact on the overall calibration above ~10 Hz.
Evan, this looks great, but I don't see any links to the actual pyDARM code. Can you push it to a git repository somewhere? I would be happy to help with Python packaging if needed so that this is trivially distributable.