Interferometer Device Sees Text from a Mile Away

(physics.aps.org)

140 points | by bookofjoe 4 days ago ago

35 comments

  • IAmBroom 5 hours ago ago

    OK, this part was brilliant:

    "To avoid this problem, the team divided their 100-milliwatt laser into eight beams. Each beam travels along a slightly different path through the turbulent atmosphere and thus receives a different random phase perturbation. Counterintuitively, this incoherent illumination makes the interference effects observable.

    When I first started studying optical engineering, my teacher had worked on the first under-the-radar guidance system for bombers. He told lots of amusing stories, like how the pilots insisted on a manual override - so they "agreed" to provide a switch, noting to us that manual piloting at near-treetop level and 1,000 ft/s is insane.

    He taught us about the nominal amount of turbulence in the atmosphere, and that it limited space-based cameras to about half a foot resolution - a limit he said couldn't be broken. Therefore, license plates would never be readable from space...

    Before I was out of grad school, they had broken it with laser techniques on nearby targets. Flash the laser at the same time as the image, scan the laser-illuminated spot, calculate the perturbance, and reverse-filter the image. A lot of processing (for that day), but it could be done back on Earth.

    As you can see from the test images, the 8 lasers aren't enough to perfectly smooth out the noise. The noise is probably square-root-8 improved, so resolution should improve by a factor of not quite 3. Move those lasers slightly and repeat 12 times; you've improved resolution by 10. This is easy to do quickly; you should be able to read fine print held by a car passenger on the highway.
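    A quick sanity check of that √N arithmetic (a hedged sketch; the 8-beam and 12-repeat figures come from the comment above):

```python
import math

# Averaging N independent noise realizations improves SNR by roughly sqrt(N).
def snr_gain(n_samples: int) -> float:
    return math.sqrt(n_samples)

print(snr_gain(8))       # ~2.83: "not quite 3" for the 8 beams
print(snr_gain(8 * 12))  # ~9.80: roughly 10 after 12 repeats
```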

    • dekhn 2 hours ago ago

      We are in the middle of a renaissance of image processing across a wide range of fields. Many of the previous limits are being smashed by using new materials and algorithms. See https://en.wikipedia.org/wiki/Fourier_ptychography for an example

    • kevmo314 3 hours ago ago

      That's how night mode works on Pixel phones, right? I believe it takes a few images in rapid succession and takes advantage of the noise being random, which lets signal processing pull a high-quality image out of a noisy sensor.

      • picture 3 minutes ago ago

        Integrating over a longer time to get more accurate light measurements of a scene has been a principal feature of photography. You need to slow down the shutter and open up the aperture in dark conditions.

        Combining multiple exposures is not significantly different from a single longer exposure, except for the key innovation of combining motion data with digital image stabilization, which allows smartphones to approximate longer exposures without the need for a tripod.
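        To make the stacking point concrete, a toy simulation (assumed Gaussian noise; the signal and noise values are made up for illustration):

```python
import random
import statistics

random.seed(0)
true_signal = 100.0
noise_sigma = 10.0

def frame() -> float:
    # One short exposure: true signal plus random sensor noise.
    return true_signal + random.gauss(0, noise_sigma)

# Averaging 16 short exposures should cut noise by about sqrt(16) = 4.
stacks = [statistics.mean(frame() for _ in range(16)) for _ in range(2000)]
single = [frame() for _ in range(2000)]

print(statistics.stdev(single))  # ~10: noise of a single exposure
print(statistics.stdev(stacks))  # ~2.5: noise after stacking 16 frames
```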

      • Calwestjobs 2 hours ago ago

        Some phones shine an IR floodlight, too.

    • hammock 3 hours ago ago

      >He told lots of amusing stories, like how the pilots insisted on a manual override - so they "agreed" to provide a switch, noting to us that manual piloting at near-treetop level and 1,000 ft/s is insane.

      You ought to read Tom Wolfe’s “The Right Stuff” ASAP if you haven’t already.

    • perihelions 3 hours ago ago

      - "Flash the laser at the same time as the image, scan the laser-illuminated spot, calculate the perturbance, and reverse-filter the image"

      That's also how some adaptive optics work in astronomy,

      https://en.wikipedia.org/wiki/Laser_guide_star

      • embwbam 2 hours ago ago

        The adaptive optics system for the DKIST solar telescope actually deforms each point of the mirror at 60Hz or something to do wavefront correction!

  • mrexroad 41 minutes ago ago

    > He imagines that the remote-imaging system could have several applications, including monitoring insect populations across agricultural land.

    “Insect populations” is a funny way to spell secrets. Jokes aside, it does seem like this could serve a wide range of non-espionage related use cases. Really cool.

  • unyttigfjelltol 39 minutes ago ago

    That interesting article led me down a research rabbit hole of microwave maser interferometers and whether they could explain the controversial Havana Syndrome. And, having skimmed descriptions of the historical SIGINT projects Buran[1] and Luch[2], and the theoretical advantages of such a system ... my interest in Faraday cages is renewed.

    [1]https://en.wikipedia.org/wiki/Laser_microphone

    [2]https://en.wikipedia.org/wiki/Olymp-K

  • 27theo 5 hours ago ago

    > The team demonstrated that this intensity interferometer can image millimeter-wide letters at a distance of 1.36 km

    • mturmon 2 hours ago ago

      1mm at 1.36 km works out to about 150 milliarcsec (mas), if you're used to those units from astronomy contexts.
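      The small-angle conversion behind that figure, as a sketch:

```python
import math

def angular_size_mas(size_m: float, distance_m: float) -> float:
    """Small-angle approximation: size/distance in radians -> milliarcseconds."""
    rad = size_m / distance_m
    return rad * (180 / math.pi) * 3600 * 1000

print(angular_size_mas(1e-3, 1360))  # ~151.7 mas, i.e. about 150 mas
```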

    • abcd_f 3 hours ago ago

      Letters were 8 mm.

      > To demonstrate the system’s capabilities, the team created a series of 8-mm-wide targets, each made from a reflective material and imprinted with a letter.

      • hannasanarion 2 hours ago ago

        I checked the paper, by "8mm wide" they mean that the letters were 8mm tall, which is a 22pt font (name-tag size), for those curious.

    • Calwestjobs 4 hours ago ago

      "Intensity interferometer" means it interferes the intensity of the light (rather than its amplitude).

      "Imaging technologies" - not to be confused with Imagination Technologies and their GPUs inside the Sega Dreamcast, iPhone, iPad, ...

      1.36 km = 0.85 miles

  • hammock 3 hours ago ago

    Lasers really are an underrated miracle. So many diverse uses for things that would be impossible without them.

    And we are about to be saturated in them as soon as LiDAR-based full self-driving goes mainstream.

  • 1minusp 4 hours ago ago

    I think the applications to spy-craft could be quite interesting here. Something for the next Mission: Impossible movie, maybe?

  • admash 4 hours ago ago

    Presumably this could be used for color imaging by using lasers of different wavelengths?

    • jdiff 2 hours ago ago

      I believe it'd be pretty wonky coloring, or at least it could be, since it'd be capturing snapshots of individual frequency responses. If something is visibly green, reflecting across most of the greenish areas of spectrum, but happens to absorb the exact frequency of the laser, it'd appear black when imaged this way. Or at least not green.

      • echoangle 2 hours ago ago

        I think that’s the case for regular cameras too though, the filter for the pixels doesn’t exactly replicate the response of the cones in the eyes either, right? So you have things where the camera sees a different color than a human eye.

        • jofer 2 hours ago ago

          Regular cameras respond to a wide range of wavelengths, and they actually mimic the response of the human eye reasonably well.

          Either way, it's the "range" vs "single wavelength" that's key here. The green band (or blue band or red band) isn't one wavelength. It's an average over a fairly broad range. Single-wavelength (or very narrow range) images are quite different.
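          A toy illustration of that band-average vs. single-line difference (the spectrum and the 532 nm laser line are invented for the example):

```python
# Toy reflectance spectrum: broadly reflective across the green band,
# but with a narrow absorption dip right at a hypothetical 532 nm laser line.
def reflectance(wavelength_nm: float) -> float:
    green = 1.0 if 500 <= wavelength_nm <= 560 else 0.1
    dip = 0.0 if abs(wavelength_nm - 532) < 2 else 1.0
    return green * dip

# Broadband "green channel": average over the 500-560 nm band.
band = [reflectance(w) for w in range(500, 561)]
broadband = sum(band) / len(band)

single_line = reflectance(532)  # what a 532 nm laser imager would see

print(round(broadband, 2))  # 0.95: still looks green to a broadband camera
print(single_line)          # 0.0: looks black to the single-line imager
```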

  • knotimpressed 5 hours ago ago

    I wonder if the requirement to rotate the target is inherent, or if it could be optimized away eventually?

    • stevemadere 5 hours ago ago

      I suspect this was an easy way to test it without having to build a rotatable optical bench.

      A practical device may be an array of light sources and telescopes on a rotating mount or a set of moveable mirrors that achieve the same effect.

    • nkrisc 5 hours ago ago

      If it is required, then in a real application you could just rotate the laser array instead.

    • Noumenon72 4 hours ago ago

      I also wonder about the requirement for the letters to be made of reflective material.

    • xnx 5 hours ago ago

      Or rotate the telescopes

      • IAmBroom 5 hours ago ago

        ... which are radially symmetric.

        • xnx 4 hours ago ago

          I recognize the ambiguity, but was referring to the orientation of the telescope system to the target.

  • erikerikson 4 hours ago ago

    How does this compare to the state of the art?

  • ck2 5 hours ago ago

    My favorite "lasers at distance" thing will be when amateurs can get a few photons back from the mirrors left on the moon

    https://en.wikipedia.org/wiki/Lunar_Laser_Ranging_experiment...

    Not quite there yet at the amateur level, private industry soon, but then there is the question of safety to air traffic.

    Can you imagine the first moon data link? JWST has 8 Mbps

  • codeulike 3 hours ago ago

    ... but only if it's written on shiny paper