1-2: The Danger of the Unknown

The documentary “White Light/Black Rain” (which can be found at www.cultureunplugged.com/play/6651/White-Light-Black-Rain–The-Destruction-of-Hiroshima-and-Nagasaki, in case it has not appeared on BlackBoard) focuses on the stories of the survivors of the atomic bombings of Hiroshima and Nagasaki, adding a powerful human element to the scientific achievement of weaponized nuclear fission. A key detail that stood out to me throughout the documentary was how novel everything to do with atomic bombs was when they were detonated over Japan during World War II. Not only were the pilots and crew members of the bombing missions shocked at the magnitude of the devastation caused by the Little Boy and Fat Man bombs, but the fallout from the bombings was also new territory.

The discovery of nuclear fission, as well as of the chain reaction that could lead to the creation of an extremely powerful bomb, was still relatively recent when nuclear weapons were first used in 1945. The United States conducted its first test of an atomic bomb on July 16, 1945, less than a month before the bombs were actually used against Japan. The effect of the bomb under test conditions was quite powerful, but the test was held in the desert, leaving no structures or people affected in the manner they would be in Hiroshima and Nagasaki.

After the shock of the near annihilation of these two cities showed nations what nuclear power was capable of, the world began to tread more carefully around this precarious issue, taking precautions to avoid the outbreak of nuclear war. However, the consequences of the atomic bomb could not be erased. Even though the United States helped rebuild Japan’s cities, neither the Japanese survivors nor the American aid groups fully comprehended the biological consequences of nuclear radiation, which would affect the survivors for years to come. Because they knew no better, Japanese citizens continued to eat irradiated food grown in the area and to suffer from mysterious medical conditions for which science had not yet developed a remedy. To this day, a stigma, which I have personally witnessed, exists against survivors whose health and scars remain lasting reminders of the powerful dangers of nuclear radiation.

As science develops at a rapid rate, what is the world’s responsibility in weaponizing it? Is it better to pursue new technologies and use them in a nascent state, even though this could lead to unintended, poorly understood consequences? Or is it more prudent to study these new technologies extensively, risking prolonged violence by delaying their use? Hiroshima and Nagasaki give us a complicated case study; had the United States delayed and researched more about what the aftereffects of the bomb would be, it would have risked another country discovering the technology first, or a public outcry against using the bomb at all. After seeing the documentary for this week, what are your opinions on the research and deployment of new weapons? — Nicole

9 thoughts on “1-2: The Danger of the Unknown”

  1. I’m going to play devil’s advocate here and propose that the corollary to your final question (what is our responsibility with regard to the weaponization of new technologies, and how quickly can we go about it?) is whether we have a choice at all. Can we prevent new technologies from being applied to weaponry, either at home or abroad? And, if not, does it increase risk if we don’t follow suit? To put it more concretely, to delay means to risk falling behind militarily. And to fall behind militarily might increase the chance of conflict, because it means affording other entities a distinct military advantage. Of course, this is in many ways the sort of alarmist thinking that may have led to overly aggressive policies with respect to nuclear weaponry. But that doesn’t necessarily mean it’s entirely wrong.

  2. I would argue that it is the responsibility of governments to research the effects of new weapons technology as thoroughly as possible before deploying it against military forces or civilian populations, both to avoid unintended consequences (which could have global implications) and to show respect for the lives that the use of such weaponry will affect. However, I am not sure whether knowledge of these effects would help create public resistance to the use of new technology in wartime scenarios. Such information is likely to be classified and would not become known to civilians until after such weapons had been deployed in a real-world scenario. Wartime situations also, as Nicole has pointed out, complicate the question of how much research, and how much time spent collecting it, is considered reasonable. For instance, during World War II, concern for the lives of American troops posed a very real and legitimate impetus for action on the part of the U.S. government. Even in such scenarios, however, a delicate balance needs to be struck between responsibility to domestic military forces and responsibility to foreign civilian populations.

  3. I think one of the most significant factors in technological development is the background situation. For example, by the time Szilard tried to patent the idea of a nuclear chain reaction, Adolf Hitler was already in power in Germany and political scientists were already discussing whether there would be another world war. Within a year of the discovery of nuclear fission, the foundation of the Manhattan Project had been laid. Research was escalated during wartime and focused mainly on the production of materials for weapons. Had nuclear fission been discovered in peacetime, I believe there would have been less immediate pressure to weaponize and use the technology, and thus a longer research period (which did in fact occur in the years after the war). Thermonuclear weapons, in contrast, were developed after WWII and have been tested extensively but never detonated in a war. Their presence was instead a significant part of the nuclear deterrent arsenal of the Cold War, showcasing power rather than using it.

    I feel that this is a pattern that will be difficult to change. Governments should consider themselves responsible for any and all possible outcomes, and so more time spent on research would be ideal. However, if a technology can be developed to end a war, a country will almost certainly use it, regardless of the consequences. That urgency is absent in peacetime, when more research can be done (which serves the double purpose of displaying a country’s power and deterring aggression without necessarily killing a lot of people).

  4. I think the questions you raise in the closing paragraph of your post can be broken into two distinct issues. In researching and developing new technologies, there will sometimes be unforeseen consequences or unintended effects that go beyond the original purpose or intent of the technology. The two issues then become at what point it is acceptable (ethically and otherwise) to utilize that new technology, and when it is prudent, from a public policy perspective, to do so. Imagine, for instance, that the new technology in question is something whose goal is fundamentally positive, such as an advance in medicine. In this case it is easier to appreciate the pressure felt by whoever is in charge of authorizing its use, given that the development has the potential to save hundreds or thousands of lives. While there are obvious differences between the two scenarios, particularly since the introduction of a new medicine is so well regulated nowadays, the general comparison stands. The medicine, however, might have unforeseen consequences that cause some individuals much more harm than good, yet still be a net positive in the overall goal of combating a given disease. Would those it harmed, and their families, wish that it had been more carefully researched and every possible side effect explored? Naturally. Those it did manage to help, though, may not have lived to use it had that additional time been taken.

    Obviously this comparison is somewhat flawed within its own parameters, especially because of the innate difference between the use of a medicine and that of an atomic bomb, but the point I am trying to make is that these questions must often come down to an analysis of costs and benefits. In a perfect world the atomic bomb’s effects would have been understood perfectly before the military even thought of using it, but given the information available at the time and the potential risks of holding on to it too long that you discussed in your post, it seems difficult to speak solely about the ideal, and important to discuss real-life situations where decisions must be made without perfect information and without the ability to predict the future.

  5. By 1945, the number of deaths from WWII reached into the tens of millions. Something needed to be done to end the war, and the atomic bomb was the technology to do so. Yet at the time, the aftermath of the technology was not understood. Captain Lewis wrote, “My God, what have we done?” after dropping the bomb on Hiroshima. This was only after seeing the immediate destruction from the air, before he could even see or know about the thousands of dead and deformed bodies and those who later suffered from radiation poisoning. As others have said, I think it is very valuable to conduct a significant amount of research on technologies before they are deployed, because it is important to understand not only how the technology will be used, but also the potential outcome and aftermath that will result. However, extensive research is not always possible during times of war. While further research could have been conducted on the atomic bomb in the US, this would have extended the length of the war (leading to many more deaths) and increased the chance that another country would use this technology first.

  6. Most people would insist, and reasonably so, that using new technologies in under-studied, nascent states is a mistake, and that research on such technologies should examine and exhaust all possible outcomes before they are used in potentially harmful ways, particularly if the results could be catastrophic.
    However, in the domain of large-scale weaponry, this principle falls a little flat, given that weapons of destruction are, by nature, intended to cause widespread harm; their fundamental objective is damage. Therefore, while the makers of a new handgun, for example, might halt the gun’s production if they learned that the effects of its being fired could indirectly kill thousands of people (rather than the one person who would directly and intentionally be killed by the shot), I’m not so certain that the same can be said for the makers of larger-scale weapons such as atomic bombs. The developers of Little Boy were very much aware that it would kill hundreds of thousands of people, even without accounting for the ~160,000 deaths that resulted from the long-term effects of radiation. I’m struggling to convince myself that their minds would have been changed had they dedicated more time to researching, and eventually discovering, the bomb’s indirect effects, particularly during wartime when the stakes were as high as they were.

  7. I think you bring up a good point that the creation of the atomic bomb and its rapid use in the Hiroshima and Nagasaki bombings were due to the pressure and urgency the U.S. felt during the war. I agree that more research can (and should) be done during peacetime, and I would more or less agree that a country is almost certain to develop a technology and use it without much deliberation during a war, regardless of the consequences. However, this certainty should not be an excuse to justify such irresponsibility. Whether the bombing of Hiroshima was a right act or not remains debated, but we know that it did happen and that it gave us an understanding of the severity of the danger of using such weapons. And the scarier part of the past is how easily states could use the more than 2,000 nuclear tests conducted between 1945 and 1988 as little more than political demonstrations, as Joe points out in the other blog post. But now that we know from the past that technology has created, and is capable of creating, even more lethal weapons that would be catastrophic to both the environment and societal security, it is ever more important to make sure that states do not take such actions lightly. Thus, international agreements and cooperation to prevent such risky scenarios from developing, along with accountability and thorough research within the scientific community, are more important than ever.

  8. I’d have to agree with ebde and argue that whether or not one should weaponize new technology is not really a choice that many countries can afford to have. Much like in the Prisoner’s Dilemma, we see very clearly that the ideal outcome is for both players to avoid betraying each other and cooperate, which could easily be interpreted here as two countries agreeing not to weaponize a technology (a rough payoff sketch of this structure appears after these comments). But human psychology is the one factor here that cannot easily be controlled, and behaviorally it would be incredibly difficult to prevent any kind of defection from this contract. However, this isn’t necessarily a bad thing, as self-interest may very well promote increased security: the knowledge of increased military security around the world would hamper impulsive attacks from hostile nations. Although the tragedy of the nuclear detonations in WWII is not to be downplayed, I think that to an extent it was an isolated event, and it served as a milestone in human history for realizing how much such an attack impacts global dynamics. Actually observing the colossal damage a nuclear detonation creates may have positive ramifications in the long run, as we are now much more cautious about testing nascent weapons. To an extent, militarization (and the knowledge that all other countries are doing the same) can create an uneasy alliance among nations to maintain the peace.

  9. Nuclear bombs, despite their potential for utter destruction, are better understood today as defensive, not offensive, weapons. In fact, I do not believe that the fundamental objective of “weapons of destruction” is actually destruction. The very concept of nuclear deterrence is the notion that, without an a-bomb ever having to be deployed, the mere prospect of a nuclear-armed country’s destructive capacity should, ironically, be intimidating enough to prolong peace. In other words, a nuclear weapon is defensive to the extent that, without actually being deployed, it could dissuade enemies from attacking (or, in Japan’s case, from continuing an uphill battle against the US).

    That said, the only time a nuclear weapon was actually used offensively was in Japan. But why should it have been offensive then, yet defensive today? Even in the case of WWII, I fail to see why the a-bomb could not have functioned as a “defensive” weapon, the mere threat of which (rather than its actual deployment) could perhaps have bullied Japan into surrendering. After all, it seems from the documentary that the Japanese were increasingly tired of war and less hopeful of victory. One man from the documentary said, “Even as kids, we knew that we were losing the war. Any fool could see it. We didn’t have anything.” Another man, who was in Nagasaki during the war, said, “Though the government kept saying we were winning [against the US], people realized Japan could not win.” Did a recently developed a-bomb really have to be dropped on Japan to earn its surrender, or could the bomb have been used “defensively”? That is, could the mere threat of it have sufficed to produce the same effect on Japan as actually dropping it did?
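To make the Prisoner’s Dilemma analogy raised in comment 8 concrete, here is a minimal payoff sketch. The country labels, payoff numbers, and function names are hypothetical and chosen only to reproduce the dilemma’s structure: mutual restraint is the best joint outcome, yet weaponizing is each side’s individually dominant choice, so the only stable pair of choices is mutual weaponization.

```python
# Hypothetical Prisoner's Dilemma payoffs for two countries deciding whether
# to weaponize a new technology. Numbers are illustrative only; higher is better.
from itertools import product

PAYOFFS = {
    ("restrain", "restrain"): (3, 3),   # both abstain: best joint outcome
    ("restrain", "weaponize"): (0, 4),  # Country A falls behind militarily
    ("weaponize", "restrain"): (4, 0),  # Country B falls behind militarily
    ("weaponize", "weaponize"): (1, 1), # arms race: costly, but neither side is exposed
}
CHOICES = ("restrain", "weaponize")

def best_response(player: int, other_choice: str) -> str:
    """Choice that maximizes this player's payoff, holding the other player's choice fixed."""
    if player == 0:  # Country A picks the first entry of the key
        return max(CHOICES, key=lambda c: PAYOFFS[(c, other_choice)][0])
    return max(CHOICES, key=lambda c: PAYOFFS[(other_choice, c)][1])  # Country B picks the second

# A pair of choices is stable (a Nash equilibrium) if neither side gains by switching unilaterally.
for a, b in product(CHOICES, repeat=2):
    stable = best_response(0, b) == a and best_response(1, a) == b
    print(f"A={a:9} B={b:9} payoffs={PAYOFFS[(a, b)]} stable={stable}")
# Only ("weaponize", "weaponize") prints as stable, even though mutual
# restraint would give both countries a higher payoff.
```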
