ABSTRACT
Third generation (3G) gravitational-wave detectors will observe thousands of coalescing neutron star binaries with unprecedented fidelity. Extracting the highest precision science from these signals is expected to be challenging owing to both high signal-to-noise ratios and long-duration signals. We demonstrate that current Bayesian inference paradigms can be extended to the analysis of binary neutron star signals without breaking the computational bank. We construct reduced-order models for ∼90-min-long gravitational-wave signals covering the observing band (5-2048 Hz), speeding up inference by a factor of ∼1.3×10^{4} compared to the calculation times without reduced-order models. The reduced-order models incorporate key physics including the effects of tidal deformability, amplitude modulation due to Earth's rotation, and spin-induced orbital precession. We show how reduced-order modeling can accelerate inference on data containing multiple overlapping gravitational-wave signals, and determine the speedup as a function of the number of overlapping signals. Thus, we conclude that Bayesian inference is computationally tractable for the long-lived, overlapping, high signal-to-noise-ratio events present in 3G observatories.
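The speedup quoted above comes from replacing full-grid likelihood inner products with reduced-order quadratures evaluated at a handful of empirical interpolation nodes. The toy sketch below illustrates the core idea under stated assumptions: the basis is a random orthonormal stand-in (real analyses build it greedily over the waveform space), the node selection follows the Discrete Empirical Interpolation Method (DEIM), and all sizes and names (`n_full`, `n_basis`, `deim_nodes`) are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_full, n_basis = 4096, 16  # hypothetical full-grid and reduced-basis sizes

# Stand-in reduced basis: n_basis orthonormal rows over an n_full-point grid.
# In a real pipeline these rows would be greedily selected waveforms.
B = np.linalg.qr(rng.standard_normal((n_full, n_basis)))[0].T


def deim_nodes(B):
    # Discrete Empirical Interpolation Method: greedily pick one distinct
    # grid node per basis vector, at the peak of the interpolation residual.
    nodes = [int(np.argmax(np.abs(B[0])))]
    for j in range(1, B.shape[0]):
        # Coefficients matching B[j] on the nodes chosen so far.
        c = np.linalg.solve(B[:j, nodes].T, B[j, nodes])
        r = B[j] - c @ B[:j]          # residual vanishes at existing nodes
        nodes.append(int(np.argmax(np.abs(r))))
    return np.array(nodes)


nodes = deim_nodes(B)
V = B[:, nodes]                        # basis evaluated at the nodes
interp = np.linalg.solve(V, B)         # h ≈ h[nodes] @ interp for h in span

d = rng.standard_normal(n_full)        # stand-in for the data stream
weights = interp @ d                   # quadrature weights, precomputed once

h = B.T @ rng.standard_normal(n_basis)  # a waveform in the basis span

full = np.dot(d, h)                    # O(n_full) inner product
roq = np.dot(weights, h[nodes])        # O(n_basis) reduced-order quadrature
print(np.isclose(full, roq))           # quadrature reproduces the inner product
```

Because `weights` depends only on the data and the basis, each likelihood evaluation inside the sampler then costs O(`n_basis`) waveform evaluations instead of O(`n_full`), which is the source of the order-of-magnitude speedups the abstract reports.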
ABSTRACT
Strongly squeezed states of light are a key technology in boosting the sensitivity of interferometric setups, such as in gravitational-wave detectors. However, the practical use of squeezed states is limited by optical loss, which reduces the observable squeeze factor. Here, we experimentally demonstrate that introducing squeezed states in additional, higher-order spatial modes can significantly improve the observed nonclassical sensitivity improvement when the loss is due to mode-matching deficiencies. Our results could be directly applied to gravitational-wave detectors, where this type of loss is a major contributor.