By Patricia Daukantas
If you’ve watched TV lately, you may have noticed an ad from a domestic beer company that made an extraordinary claim involving lasers. The company said it would project its logo on the near side of the moon during its full phase last Friday.
It didn’t take much Googling for me to debunk the “moonvertising” plan as a hoax dreamed up by an advertising agency team. But it raises a fascinating question: Is it possible to shine a laser beam onto the moon and see the light from Earth?
One entertainment executive says that a soft-drink company had similar moon-lighting plans for New Year's Eve in 2000, but the Federal Aviation Administration nixed the idea out of concern for aviation safety. Although the executive claims that scientists had done the math and found that such a feat was possible, he doesn't cite any research to back up the claim.
So I asked Tony Campillo, OSA’s senior director of science policy, to work out some back-of-the-envelope calculations. Tony has more than 40 years of experience in optics and photonics, as an optical scientist with the Naval Research Laboratory and other institutions, and as the former editor of the journal Optics Letters. His verdict: Making a light spot on the moon that people could see without assistance would likely require continuous laser power beyond anything we have on Earth right now.
Let’s imagine a single beam of green laser light originating on Earth’s surface and aimed at the moon. (The human eye is most sensitive to light of about 555 nm in wavelength, and green just happens to be the favorite color of the beer company that started this whole thing.) Assume that the outgoing beam is 3.5 m wide, as in the Apache Point Observatory laser-ranging program. After atmospheric distortion, the beam width would be 2 km at the moon, and about 90 percent of the light would reach the lunar surface.
But wait! The moon reflects only about 10 percent of the light that hits its surface. And even that, according to Tony, is scattered into a Lambertian pattern that covers an area that is 100 times the size of the Earth by the time it returns.
We know that the light-gathering part of the human eye—the dark-adapted pupil—is 1 cm wide at best, and let’s make a rough assumption that the diameter of the scattered beam is 1 million km wide when it hits the Earth. Thus, the naked-eye observer is catching only about 1 photon in every 10²².
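That dilution factor is just the ratio of two collecting areas: the pupil versus the returning beam. Here is a quick sketch using the rough figures above (a 1 cm pupil and a 1-million-km return beam); the π/4 factors for the circular areas cancel in the ratio.

```python
# Back-of-the-envelope: what fraction of the light scattered back from
# the moon lands in a single dark-adapted pupil?
pupil_diameter_m = 0.01          # dark-adapted pupil, ~1 cm at best
return_beam_diameter_m = 1e9     # assumed 1 million km at Earth

# Both collecting areas are circles, so the pi/4 factors cancel
# and the area ratio is just the square of the diameter ratio.
fraction = (pupil_diameter_m / return_beam_diameter_m) ** 2

print(f"fraction collected by one eye: {fraction:.0e}")  # -> 1e-22
```

So roughly 1 photon in 10²² of the scattered light reaches any one observer's eye, matching the figure above.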
How much light is needed for the human eye to see? In other words, what’s the threshold of human vision? I know more about astronomy than biology, so I’m a little shaky on that answer. Although some say that, in principle, the human eye should be sensitive to single photons of visible frequencies, in practice noise from both the visual field and the observer’s own neurological system gets in the way. Astronomers must subtract out background light when they are trying to measure the brightness of heavenly objects with their telescopes.
In Optics InfoBase, I found a 1919 (!) report by P.G. Nutting, OSA’s very first president. On the next-to-last page of the 25-page document, there’s a table that provides visual detection thresholds for sources of various areas; for the smallest source, the threshold light energy is 17.1 × 10⁻¹⁰ erg/s, or 0.17 femtowatt (fW). (You could quibble that the table heading should be “power entering eye” instead of “energy entering eye,” because the erg is the CGS unit for energy, not power. However, maybe unit accounting was different in 1919.)
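Converting Nutting's CGS figure to SI is a one-liner, since 1 erg/s is 10⁻⁷ W; it reproduces the 0.17 fW quoted above.

```python
# Convert Nutting's 1919 visual threshold from CGS to SI.
threshold_erg_per_s = 17.1e-10   # smallest-source entry in Nutting's table

ERG_PER_S_TO_WATT = 1e-7         # 1 erg = 1e-7 joule
threshold_watts = threshold_erg_per_s * ERG_PER_S_TO_WATT

threshold_fw = threshold_watts / 1e-15   # 1 fW = 1e-15 W
print(f"threshold: {threshold_fw:.2f} fW")  # -> threshold: 0.17 fW
```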
Tony also guessed that an input of at least 1 fW from a continuous-wave (cw) laser would be needed to trigger sight in the human eye. Extrapolating to the originating laser, he guesstimated that a 100-GW cw laser would be needed to produce a visible spot on the moon.
Another way of thinking about the sensitivity of human vision involves the astronomers’ system of apparent magnitudes for measuring the brightness of objects in the night sky.
The apparent-magnitude scale is a logarithmic scale dating back to ancient days. The faintest stars that the human eye can see have a magnitude of 6, while stars with a magnitude of 1 are 100 times brighter than their 6th-magnitude cousins. Nowadays, you might still be able to see 6th-magnitude stars from a high desert on a clear night. However, from the typical suburb of a brightly lit American city, you would probably see only 3rd-magnitude and brighter stars.
Suppose that the brightness of the one-pixel lunar display is 1st magnitude. With the assumption that the pupil of the dark-adapted eye is 7 mm in diameter, Tony calculated that the eye will receive 200 photons/s from a 6th-magnitude star and 20,000 photons/s from a 1st-magnitude star. According to Tony, 1 W of green light corresponds to 2 × 10¹⁸ photons/s. By this line of reasoning, 0.01 fW of light from a 1st-magnitude star hits the retina, and thus only a 1-GW cw laser would be required to make a visible dot on the moon.
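The magnitude arithmetic in the last two paragraphs can be checked in a few lines. Five magnitudes is defined as a factor of 100 in brightness, and the 1-GW figure follows from scaling the earlier 100-GW estimate by the ratio of the two eye-input powers (0.01 fW versus 1 fW). All the input numbers below are Tony's round figures from the text, not independent measurements.

```python
# Check the magnitude arithmetic (all inputs are Tony's round figures).

# Five magnitudes is defined as a factor of 100 in brightness,
# so one magnitude step is 100**(1/5), about 2.512.
ratio_m1_to_m6 = 100 ** ((6 - 1) / 5)
print(ratio_m1_to_m6)            # -> 100.0 (1st mag is 100x brighter)

photons_per_s_m6 = 200           # 6th-magnitude star into a 7 mm pupil
photons_per_s_m1 = photons_per_s_m6 * ratio_m1_to_m6
print(photons_per_s_m1)          # -> 20000.0 photons/s

# Scale the earlier estimate: if 1 fW at the eye implies a 100-GW laser,
# a pixel needing only 0.01 fW scales the laser down by the same factor.
laser_gw = 100 * (0.01 / 1.0)    # 100 GW * (0.01 fW / 1 fW)
print(laser_gw)                  # -> 1.0 GW
```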
In this case, Tony adds, his estimate assumes that the laser is painting a single 1st-magnitude pixel on the darkened portion of a first-quarter or last-quarter moon. The beer commercial calls for shining an image on the full moon, which would require a lot more energy.
How does this estimate compare with existing lasers? The petawatt laser-fusion projects at the University of Rochester and Lawrence Livermore National Laboratory will generate enormous peak powers – but only during pulses in the nanosecond-to-picosecond range, not cw.
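To see why a pulsed petawatt laser doesn't help here, compare peak power with average power. A sketch with illustrative numbers: the 1-ps pulse length is typical of petawatt systems, but the one-shot-per-hour repetition rate is my own assumption, not a figure from any particular facility.

```python
# Peak power is not average power: a petawatt laser fires very short pulses.
# Illustrative numbers; the 1 shot/hour repetition rate is an assumption.
peak_power_w = 1e15          # 1 PW peak power
pulse_duration_s = 1e-12     # 1 ps pulse

pulse_energy_j = peak_power_w * pulse_duration_s
print(pulse_energy_j)        # -> 1000.0 J per pulse

shots_per_hour = 1
average_power_w = pulse_energy_j * shots_per_hour / 3600
print(average_power_w)       # ~0.28 W average, nowhere near 100 GW cw
```

Even a kilojoule per shot averages out to well under a watt at that repetition rate, so a steady visible spot would indeed demand a cw source.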
What would it take to build and fire a 100-GW cw laser? My best guess (from U.S. Energy Information Administration data) is that the United States generates roughly 4,000 billion kWh of electricity per year, which works out to an average rate of about 450 GW, but I could be way off. Would it really take a continuous draw equal to nearly a quarter of the U.S. electrical supply to make a visible green dot on the surface of the moon?
Here’s where you come in. We’d like your feedback on our calculations and estimates. Many of the assumptions that Tony and I have made need to be refined, especially the size of the Lambertian scattered beam returned to the Earth. A better calculation of that beam size would have to take into account the cosine nature of Lambertian scattering to estimate the peak flux; that could change the guesstimate by an order of magnitude or more. Some questions to ask yourself:
• What is the threshold at which a human on Earth could detect a bright spot on the moon? What is the optimal size of a pixel on the moon?
• How different would the experiment be if the bright spot were on the face of the full moon (which has an apparent magnitude of roughly −12) or on a darkened region of the moon (between the quarter-moon and new-moon phases)? What is the required pixel brightness in each case?
• What kind of laser would be required to make this stunt work?
• What would be the technical challenges involved in building such a laser?
Once you’ve tackled these yourself, you might pose these questions to your students as a thought experiment.
If any of your friends mention that they were disappointed about not seeing a logo on the moon, at least you’ll be able to explain why.
Image of the near side of the moon taken by the Clementine mission.
2008-03 March, Astronomy, Lasers, Optics history