If an atom absorbs a photon, it’s obvious that the photon must have been emitted by an atom or molecule at some earlier time. On the other hand, if an atom emits a photon, does that mean it must be absorbed by another atom at some point in the future? Intuitively that seems like a very odd idea, as if a flashlight couldn’t shine unless there was a wall for it to shine upon. And yet physics would seem to imply that if absorbed photons must be emitted, emitted photons must be absorbed. It all comes down to the time symmetry of physics, and it was a topic of Richard Feynman’s doctoral dissertation.
One of the basic ideas in physics is that simple interactions between objects are symmetric under time reversal. For example, suppose you made a video of two billiard balls colliding and bouncing off each other. If you played the video backwards, it would still look like two billiard balls colliding. For a simple interaction you have no way to determine which version of the video was the “forwards” one, and which was the “backwards” one. When you apply this to interactions between electrons and photons, the absorption of a photon looks exactly like the emission of a photon “played backwards.” This means if an absorbed photon must be emitted, by time symmetry an emitted photon must be absorbed.
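The billiard-ball analogy can be sketched in a few lines of code. This is a minimal toy model (my own illustration, not from the article): two equal-mass balls collide elastically in one dimension, then we “play the video backwards” by reversing the velocities and running the same number of steps. The dynamics retrace themselves, which is what time-reversal symmetry means here.

```python
def step(x1, v1, x2, v2, dt):
    """Advance both balls by dt; on contact, an equal-mass elastic
    collision simply swaps the two velocities."""
    x1 += v1 * dt
    x2 += v2 * dt
    if x1 >= x2:          # ball 1 starts to the left of ball 2
        v1, v2 = v2, v1   # equal masses: velocities exchange
    return x1, v1, x2, v2

def run(state, steps, dt):
    x1, v1, x2, v2 = state
    for _ in range(steps):
        x1, v1, x2, v2 = step(x1, v1, x2, v2, dt)
    return x1, v1, x2, v2

# Forward: ball 1 moves right toward a stationary ball 2.
start = (0.0, 1.0, 5.0, 0.0)
end = run(start, steps=1000, dt=0.01)

# "Played backwards": flip the velocities and run the same number of steps.
x1, v1, x2, v2 = end
back = run((x1, -v1, x2, -v2), steps=1000, dt=0.01)

# Up to the sign of the velocities (and small numerical error at the
# collision step), we recover the initial state.
print(back)
```

Watching either run in isolation, you couldn’t tell which one was the “forwards” video, which is exactly the point made above.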
This idea becomes even more bizarre if you consider astronomical distances. When you look up at the Andromeda Galaxy in the night sky, you see light that has traveled for about 2.5 million years. That means a particular photon emitted by a star in Andromeda 2.5 million years ago must have somehow “known” that it would reach your eye. In fact, because of relativity, from the photon’s perspective your eye absorbed it the instant it was emitted by the star. You might balk at that idea, but run it the other way and you’d have no trouble with the idea that the light you see was emitted 2.5 million years ago. In relativity the two descriptions are equivalent, since the time elapsed between emission and absorption depends upon your vantage point.
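The “from the photon’s perspective” claim can be made precise with a standard result from special relativity (a general fact about light, not something specific to absorber theory): the proper time elapsed along the path of anything moving at speed $c$ is zero.

```latex
% Proper time along a path, in one spatial dimension:
d\tau^2 = dt^2 - \frac{dx^2}{c^2}
% For light, dx = c\,dt, so every term cancels:
d\tau^2 = dt^2 - \frac{(c\,dt)^2}{c^2} = 0
```

So while 2.5 million years pass in our frame, the interval along the light ray itself is null: emission and absorption are, in that limiting sense, a single event.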
What Feynman showed was that despite its oddness, the requirement that emitted light be absorbed doesn’t violate causality. It came to be known as Wheeler–Feynman absorber theory (John Wheeler was Feynman’s advisor). There were some problems with the model, however. In particular, Feynman assumed that charges couldn’t self-interact. In other words, an electron couldn’t emit a photon only to reabsorb it later. Of course there’s no real reason why that should be forbidden, but if you allow it in the theory the interactions diverge and the model breaks down. This led Feynman to abandon the model eventually, but it was deeply influential in his development of quantum electrodynamics, for which he was awarded the Nobel Prize.
So is it the case that any emitted photon must be absorbed? We aren’t sure, but we can’t rule out the idea either. It might just be that when we observe light from the most distant galaxies, the photons we detect are simply arriving at the destination they had all along.
Paper: Wheeler, J. A.; Feynman, R. P. “Interaction with the Absorber as the Mechanism of Radiation.” Reviews of Modern Physics 17 (2–3): 157–161 (1945)