We’re all familiar with the use of lasers for fiber-based communications links, and even for short- to medium-range free-space links; you can buy standard systems for the latter, and they are effective and straightforward to set up.
But what about using a laser-based link for deep-space communication? That’s a very different story, as detailed in an interesting interview with Bill Klipstein, a physicist at the Jet Propulsion Laboratory who is project manager for NASA’s Deep Space Optical Communications (DSOC) mission, Figure 1. The interview, “NASA Tests Lasers’ Ability to Transmit Data from Space” in a recent issue of Tech Briefs Photonics and Imaging, gives an overview of the project and its unique challenges, and it also made me reconsider some basic assumptions about the nature of sensor-based data.
While using laser-based optical links for short-distance free-space communications is a standard technique, doing so for deep-space communications between fast-moving objects is a far different challenge; NASA’s Deep Space Optical Communications System (DSOC) project is an attempt to see if it is viable, given all the extraordinary impediments. (Image source: NASA)
In brief, the objective of this project is to develop a practical optical link between Earth and spacecraft, as such links could support much higher data rates than present RF links can offer. The need is clear: all those satellites and space probes are generating more data than the available RF bandwidth can handle, and the situation is getting worse as advanced sensors and data acquisition systems stream even more useful, important data like a fire hose with ever-growing water pressure and volume. The project’s goal is to launch a test system on the Psyche mission to an asteroid orbiting the Sun between Mars and Jupiter, with a target launch date of 2023 and on-station arrival in 2026.
The interview, which was relatively short, readable, and interesting, pointed out the obvious: sending data via a laser from Earth to an object in space is difficult but feasible, since you can increase the source laser power (up to a fairly high maximum, of course). It’s the downlink that is the real challenge, as the optical output of photons from the spacecraft is very limited.
The immediate problem is twofold: precisely aiming the source laser at the target in space, and getting the spacecraft to perform the complementary operation of pointing its own laser back to Earth. An interesting aspect of this problem is that both source and target are moving, so the downlink beam must be pointed to where the Earth will be when those photons actually arrive, due to the propagation transit time.
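To get a feel for the size of that pointing correction, here is a minimal sketch of the standard first-order “point-ahead” estimate for a two-way link, roughly twice the transverse relative velocity divided by the speed of light. The velocity figure below is illustrative (on the order of Earth’s orbital speed), not an actual DSOC mission parameter.

```python
# Point-ahead correction: aim the downlink where Earth WILL be when the
# photons arrive. First-order estimate for a two-way link: theta ~ 2*v_t/c,
# where v_t is the transverse relative velocity. Values are illustrative.

C = 299_792_458.0  # speed of light, m/s


def point_ahead_angle_rad(v_transverse_m_s: float) -> float:
    """First-order point-ahead angle in radians for a round-trip link."""
    return 2.0 * v_transverse_m_s / C


# Example: ~30 km/s transverse velocity (order of Earth's orbital speed)
theta = point_ahead_angle_rad(30_000.0)
print(f"{theta * 1e6:.1f} microradians")  # roughly 200 microradians
```

Even at these speeds the angle is only a few hundred microradians, but with a laser beam that narrow, missing by that much means missing the receiver entirely.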
OK, so everything is lined up and corrections for transit delay are worked into the setup…now what? According to Dr. Klipstein, in the near-Earth environment, even out to the moon, enough photons arrive to make their capture relatively easy. But when the mission extends to deep space, there are very few photons coming back, so the ground-based sensor must look for individual photons via a photon-counting detector.
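A quick way to see why photon counting becomes unavoidable: photon arrivals are well modeled as a Poisson process, so if the mean number of signal photons reaching the detector in a given interval is λ, the probability of seeing at least one is 1 − e^(−λ). The mean counts below are made-up numbers to show the scaling, not DSOC link-budget values.

```python
# Illustrative only: why a deep-space downlink forces photon counting.
# With Poisson arrival statistics, P(at least one photon) = 1 - exp(-lam),
# where lam is the mean photon count per detection interval.

import math


def p_detect(lam: float) -> float:
    """Probability that at least one photon arrives, given mean count lam."""
    return 1.0 - math.exp(-lam)


for lam in (10.0, 1.0, 0.1):
    print(f"mean {lam:>4} photons -> P(detect) = {p_detect(lam):.3f}")
```

Near Earth (many photons per interval) detection is nearly certain; in deep space, with a mean well below one photon per interval, most intervals contain nothing at all, and the receiver must register each individual photon that does arrive.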
Anytime you start looking for individual photons, you are in a very strange world, since photons do not like to be captured, according to well-known principles of quantum physics. Any conventional photodetector, such as a CCD device, no matter how good it is, simply won’t work in this unforgiving world of photon detection.
According to the interview, the project instead uses a nanowire array made of tungsten silicide, which becomes superconducting at about 1 K. When a single photon hits this material and deposits enough energy, it produces a very strong signal, so a single-photon event can be detected.
Adding to the detection challenge is timing, as even the data-encoding format is non-intuitive: all the available energy of the spacecraft-based source goes into a single, narrow laser pulse. Rather than sending many photons for each bit, a given sequence of ones and zeros is encoded by the time position of that pulse, so timing skew and variation along the path and at the detector contribute to decoder errors.
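This time-position encoding is classic pulse-position modulation (PPM). A minimal sketch, assuming a generic PPM scheme with 2^k time slots per symbol (slot counts and framing here are illustrative, not the actual DSOC parameters):

```python
# Minimal pulse-position modulation (PPM) sketch: each group of k bits is
# sent as one narrow pulse whose time-slot index (out of 2**k slots per
# symbol frame) encodes the bit pattern. Parameters are illustrative.

def ppm_encode(bits: list[int], k: int) -> list[int]:
    """Group bits k at a time; return the pulse slot index for each symbol."""
    assert len(bits) % k == 0, "bit stream must be a multiple of k bits"
    slots = []
    for i in range(0, len(bits), k):
        symbol = bits[i:i + k]
        slot = int("".join(map(str, symbol)), 2)  # bit pattern -> slot index
        slots.append(slot)
    return slots


def ppm_decode(slots: list[int], k: int) -> list[int]:
    """Recover the bit stream from the pulse slot indices."""
    bits: list[int] = []
    for slot in slots:
        bits.extend(int(b) for b in format(slot, f"0{k}b"))
    return bits


# 4 bits per symbol -> 16 time slots; one detected pulse carries 4 bits
data = [1, 0, 1, 1, 0, 0, 1, 0]
tx = ppm_encode(data, k=4)       # slot indices: [11, 2]
assert ppm_decode(tx, k=4) == data
```

The sketch also makes the timing sensitivity obvious: if jitter along the path shifts a pulse into an adjacent slot, the decoder recovers an entirely different bit pattern, which is why skew at the detector translates directly into decoder errors.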
Here’s my quandary: we’ve always associated “analog” with sensors, because physical parameters such as temperature, pressure, speed, flow, and energy are inherently analog in the real world. Translating these analog sensor signals and doing the signal conditioning has always been one of the key roles of analog components, and one that keeps their vendors very happy despite the relentless push to “digital” – in fact, providing the interface between the sensor world and the digital domain is what many analog components do.
But is providing a single-photon detector the realization of an analog function, or a digital one? That’s a subject for some interesting philosophical discussions, since it is in the realm of metaphysics and the meaning of quantum reality. It certainly makes you wonder if equating “sensors” with “analog” will continue to be a natural alignment, as sensing technologies do a deep dive into functioning at atomic-particle level, and also engage in single-photon capture.
What’s your view on the analog-versus-digital nature of this sort of photon- and quantum-level sensing?