Answer: Pretty close.
In 1958, the U.S. realized it was losing the space race, a pivotal part of the Cold War with the Soviet Union. Just one year prior, the Soviets had launched Sputnik 1, mankind's first-ever satellite, and while the U.S. had launched its own satellite a few months later, it had nothing else to respond with. American technology was nowhere near ready to send people to the moon, but it was good enough to send something else there — say, a nuclear bomb.
Enter Project A119. The plan was to detonate a nuclear bomb — one far less powerful, it should be noted, than today's weapons — in a crater on the moon. The effect would have been twofold: allowing scientists to study the effects of the blast and demonstrating the power of U.S. nuclear weapons to the world. Obviously, the project never got off the ground, and it remained top secret until 2000, when a former NASA executive revealed it.
The scientists at the time thought it might have been possible to see the explosion from Earth with the naked eye, which would have been a propaganda boost for the government. As for radiation damage on the moon, they estimated that any potential negative impacts would be minimal.
“The amount of radiation that you’re going to be creating — or more specifically, the amount of contamination — would be relatively low. We’re talking about relatively low-yield nuclear weapons. There would be some contamination,” Alex Wellerstein, a nuclear historian at the Stevens Institute of Technology, told Digital Trends. “My recollection from the report is that they calculated that a fair amount of the radioactive byproducts would basically not end up staying on the moon. They would be ejected because of the lack of atmosphere and things like that. Is that true? We don’t know.”