The first direct detection of gravitational waves from a binary black hole merger in September 2015 heralded the start of gravitational wave astronomy. Eleven detections later, the first observation of a binary neutron star merger gave multi-messenger astronomy a new observational channel. This is only the start of this exciting new era: for statistically meaningful gravitational wave astronomy it is imperative to continually improve detector sensitivity and hence increase the detection rate.
Interferometric gravitational wave detectors (such as Advanced LIGO) employ high power solid-state lasers to maximise their detection sensitivity and hence their "reach" into the universe. These sophisticated light sources are ultra-stabilised with regard to output power, emission frequency, and beam geometry; this is crucial to achieve low detector noise. However, even when all technical laser noise is reduced as far as possible, the unavoidable quantum noise of the light remains. This is a consequence of the Heisenberg uncertainty principle, a cornerstone of quantum mechanics: it is fundamentally impossible to reduce both the phase noise and the amplitude noise of a laser to arbitrarily low levels simultaneously. This fact manifests in the detector noise budget as two distinct noise sources, photon shot noise and quantum radiation pressure noise, which together set a lower boundary for present-day gravitational wave detector sensitivities: the standard quantum limit of interferometry.

To overcome this limit, various techniques have been proposed, among them different uses of non-classical light and alternative interferometer topologies. MH will explain how quantum noise enters and manifests in an interferometric gravitational wave detector, and will touch on some of the proposed schemes to overcome this seemingly fundamental limitation, all aimed at the goal of higher gravitational wave event detection rates.
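To make the power scaling behind this limit concrete, the following is a minimal textbook-style sketch (an illustration, not material from the talk), assuming the simplest free-mass position measurement: a Michelson interferometer read out with laser power P at angular frequency \omega_0, free test masses of mass m, and signal (measurement) angular frequency \Omega. The displacement-noise power spectral densities of the two quantum noise contributions are

    S_x^{\mathrm{shot}}(\Omega) = \frac{\hbar c^2}{4\,\omega_0 P}, \qquad
    S_x^{\mathrm{rp}}(\Omega) = \frac{4\hbar\,\omega_0 P}{m^2 \Omega^4 c^2},

whose product, S_x^{\mathrm{shot}} S_x^{\mathrm{rp}} = \hbar^2 / (m^2 \Omega^4), is independent of P: a Heisenberg-type bound. Minimising their sum over P gives the standard quantum limit

    S_x^{\mathrm{SQL}}(\Omega) = \frac{2\hbar}{m\Omega^2},

attained at the optimal power P_{\mathrm{opt}} = m \Omega^2 c^2 / (4\omega_0). Shot noise falls with increasing power while radiation pressure noise grows with it, so tuning the laser power only trades one against the other and can never push the sum below S_x^{\mathrm{SQL}}; the non-classical light and alternative topologies mentioned above aim to evade this trade-off rather than merely balance it.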