Environmental Review


  A Monthly Newsletter of Environmental Science and Policy



May 2000

Geology and Nuclear Waste Policy

Rodney C. Ewing

Introduction:

The radioactive waste generated by commercial nuclear power plants in this country is to be buried at Yucca Mountain, Nevada. This nuclear garbage dump is intended to keep the radioactive waste isolated from human contact for at least 10,000 years. Because of the backlog of spent fuel from nuclear power plants, Congress has directed that the waste site be built and licensed as soon as possible. The safety standards for the site are meant to ensure that people living twenty kilometers away will not receive harmful radiation exposure over the next 10,000 years.

In 1998 the Department of Energy completed a system performance assessment for the waste site and reported that there were no show stoppers, no insurmountable obstacles to building the repository. Work is proceeding toward an application in 2001 to license the site as a repository.

However, it is not clear that the performance assessment would have recognized a show stopper if there were one. By the same token, the uncertainties in the assessment are so great that a safe repository might not be recognized. Professor Rodney Ewing has argued in Science magazine that uncertainty about the long-term performance of the waste site could be reduced, and confidence in it increased, by applying the idea of multiple barriers to the assessment [2].

We spoke with Professor Ewing about his ideas to improve the safety of the Yucca Mountain repository.

ER: Professor Ewing, what is your training?

RE: I got my Ph.D. at Stanford University in Earth Sciences, and my specialty was mineralogy and materials science. The subject of my dissertation was radiation effects in minerals. At the time this was an esoteric topic of little interest to anyone. After my Ph.D. I took my first academic job at the University of New Mexico in the Geology Department. In New Mexico we're close to two national laboratories - Sandia National Laboratories and Los Alamos National Laboratory - so various seminar and colloquium speakers talked about radioactive waste management strategies. I was quite interested to learn what work had been done on radiation effects; I would ask that question and never received a very substantive answer.

In the U.S. there are approximately 100 nuclear power reactors, and each reactor generates between 20 and 30 metric tons of spent nuclear fuel each year

So I wrote a paper for Science and simply pointed out that there should be radiation effects, and that they could have substantial consequences for the waste forms used to immobilize nuclear waste [1]. With my background in mineralogy I moved into studies of radiation effects in materials used for immobilizing nuclear waste and in various reactor materials.

I became involved in studying the chemical durability of nuclear waste glasses, and for nearly ten years I spent part of every year at the Hahn-Meitner Institute in Berlin, where I worked with a number of German colleagues. Through their connections I began collaborations on other aspects of the problem, such as the corrosion of spent nuclear fuel. This work was first supported by the Swedish Nuclear Fuel and Waste Management Company. That led to collaborations in France and Japan, and now we also have major collaborations in China and Australia.

Three years ago I was offered a position at the University of Michigan in the Department of Nuclear Engineering and Radiological Sciences. I was pleased at the opportunity because I was quite interested to learn more about the nuclear fuel cycle and, I hoped, to enlarge the potential impact of our research.

At the University of Michigan, though, I also have appointments in Geological Sciences and in Materials Science and Engineering. The theme of our research is still the durability of materials in a wide variety of radiation fields, and this now encompasses the environmental sciences. In other research we use radiation to modify materials for technological applications, and we've studied radiation effects in geologic materials to determine their effects on age-dating techniques.

ER: Where do these wastes come from?

RE: Nuclear waste disposal, or management, has been a big concern in the United States for the last twenty years. There are two broad categories of nuclear waste in the United States. One category is the waste associated with the production of nuclear weapons: the wastes left behind after nuclear fuels were reprocessed to reclaim plutonium 239 and uranium 235. The high-level liquids that resulted contain the very radioactive fission products; those wastes are now stored at sites around the country.

The two most important locations are the Hanford site in Washington state and the Savannah River site in South Carolina. The total volume of high-level waste around the country is about 400,000 cubic meters, and the total activity is approximately 1 billion curies. [A curie is 37 billion disintegrations per second. Ed] Most of the waste at these sites will be immobilized in borosilicate glass - mixed into the glass and sealed in metal canisters. All of these wastes are destined for the proposed repository at Yucca Mountain, Nevada.
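
Editor's note: for readers who prefer SI units, the conversion implied by that editorial aside can be checked with a few lines of Python; the figures below are simply those quoted in the interview.

    # Convert the quoted total activity of defense high-level waste to SI units.
    BQ_PER_CURIE = 3.7e10          # one curie = 37 billion disintegrations per second
    defense_waste_curies = 1e9     # approximately 1 billion curies
    disintegrations_per_second = defense_waste_curies * BQ_PER_CURIE
    print(f"{disintegrations_per_second:.1e} disintegrations per second")  # about 3.7e+19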

That's the defense side, although it's worth noting there is a new defense waste stream associated with the dismantling of nuclear weapons. Based on agreements between the United States and the former Soviet Union, now Russia, approximately 100 metric tons of weapons plutonium is available for disposition. That's an important topic of international discussion now.

The other source of nuclear waste in the United States is from commercial power generation. In the United States there are approximately one hundred power reactors, and each reactor generates between twenty and thirty metric tons of spent nuclear fuel each year. In the United States we have approximately 35,000 metric tons of commercially-generated spent fuel presently stored by the utilities at seventy sites throughout the country; some of it is in the reactor pools and other fuel is in dry storage.

This fuel is destined for the proposed repository at Yucca Mountain, and the total activity associated with it is something like 20 billion curies. The spent nuclear fuel from commercial power generation represents approximately 95 percent of the total radioactivity proposed for the repository at Yucca Mountain.

ER: What is the output of nuclear reactor waste worldwide?

RE: There are 437 nuclear power plants around the world now, and the worldwide production is approximately 10,000 metric tons of spent fuel per year, so it's accumulating pretty quickly. I mentioned the plutonium from weapons, but you should also realize we generate between 70 and 100 metric tons of plutonium per year in commercial power reactors. The worldwide inventory of plutonium is now over 1,300 metric tons. That's really quite a lot, particularly if you consider that the critical mass of plutonium 239 is approximately ten kilograms. So weapons proliferation is a really important issue.
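
Editor's note: the scale of these numbers is easier to grasp with a little arithmetic. The Python sketch below uses only the figures quoted above (a worldwide inventory of roughly 1,300 metric tons of plutonium, 70 to 100 metric tons produced per year, and a critical mass of about ten kilograms of plutonium 239); it is an illustration, not an official inventory calculation.

    # Rough scale of the plutonium figures quoted above (illustrative only).
    KG_PER_METRIC_TON = 1000
    inventory_kg = 1300 * KG_PER_METRIC_TON      # worldwide plutonium inventory
    critical_mass_kg = 10                        # approximate critical mass of Pu-239

    print(f"Inventory: roughly {inventory_kg / critical_mass_kg:,.0f} critical masses")

    for tons_per_year in (70, 100):
        added = tons_per_year * KG_PER_METRIC_TON / critical_mass_kg
        print(f"At {tons_per_year} t/yr, about {added:,.0f} critical masses are added each year")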

The high-level waste from defense programs, the spent nuclear fuel from commercial power generation, the spent fuel from naval reactors, and the weapons plutonium from dismantled weapons, all of this radioactive material is proposed for disposal at the repository at Yucca Mountain. That's what makes Yucca Mountain such an important site. It's our only site.

ER: What is your involvement with these waste sites?

RE: In parallel with all of these scientific activities, I served on a number of committees for the National Research Council reviewing various aspects of Department of Energy programs. One was the Waste Isolation Pilot Plant in New Mexico, which is for transuranic waste. I was on that committee for over a decade. And through these committees I would say I gained some experience in the broader aspects of nuclear waste management.

Most recently I was on a Department of Energy committee composed of six scientists; we spent two years reviewing the performance assessment of the proposed repository at Yucca Mountain. That's where I began to become concerned about the usefulness of the performance assessments.

ER: What are performance assessments?

RE: The concept of a performance assessment is actually a very good one; that is, you look carefully at all of the parts of your repository system - the corrosion of the waste form, the corrosion of the canisters, the hydrology, the geochemistry - and you put all of this information together to identify where you lack knowledge and where you need to focus future research. You connect all of this information in the form of computer codes, one connected to the next, and then you run the code and it calculates, as an example, a dose to some person or group of people some years in the future - and that might be hundreds of years, tens of thousands of years, or millions of years.

This whole approach grows out of reactor safety analysis, where you estimate the probability of certain components - valves, for instance - failing, so it's a probabilistic analysis. If you don't know an important parameter - or, let's say, any parameter - in the performance assessment of the repository, you assign a probability distribution over a range of possible values. The performance assessment of the Waste Isolation Pilot Plant required approximately 1,600 input parameters, and there may be hundreds of small programs or codes used to calculate flow, solubility, corrosion rates, and so on; at the end of the simulation you get some estimate of the health effect or exposure.
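
Editor's note: the structure Professor Ewing describes - probability distributions assigned to uncertain parameters, sampled values propagated through chained sub-models - can be illustrated with a deliberately tiny Monte Carlo sketch in Python. The "sub-models" and every parameter range below are invented for illustration; a real performance assessment chains hundreds of codes and thousands of parameters.

    import random

    # Toy probabilistic performance assessment (illustrative only).
    N_REALIZATIONS = 10_000
    REGULATORY_PERIOD_YEARS = 10_000

    def canister_lifetime():
        # Hypothetical canister failure time, log-uniform from ~500 to ~50,000 years.
        return 10 ** random.uniform(2.7, 4.7)

    def travel_time():
        # Hypothetical groundwater travel time to the compliance point.
        return 10 ** random.uniform(2.0, 5.0)

    def release_rate():
        # Hypothetical fractional release per year after canister failure.
        return 10 ** random.uniform(-7.0, -4.0)

    releases = []
    for _ in range(N_REALIZATIONS):
        arrival = canister_lifetime() + travel_time()
        exposure_years = max(0.0, REGULATORY_PERIOD_YEARS - arrival)
        releases.append(release_rate() * exposure_years)

    releases.sort()
    mean = sum(releases) / len(releases)
    p95 = releases[int(0.95 * len(releases))]
    print(f"mean fractional release {mean:.2e}; 95th percentile {p95:.2e}")

Each realization draws one value from every distribution, pushes it through the chain of sub-models, and contributes one point to the final distribution of outcomes that the regulator would examine.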

As I looked at this I became very concerned that the uncertainty in these analyses was so large as to make them unusable. If you propagate the uncertainty in several thousand parameters through such an analysis, the uncertainty range becomes quite large. The article in Science wasn't directed so much at the performance assessment approach itself; my concern is that, under the proposed rulemaking by the Nuclear Regulatory Commission and the proposed regulations by the EPA, this type of analysis is going to be the sole quantitative criterion for determining whether a repository is approved for a license.

The worldwide inventory for plutonium is over 1,300 metric tons. So it's quite a lot particularly if you consider the critical mass of plutonium 239 is about ten kilograms.

It's such a complicated analysis that it's very difficult to review. For single parts of the code you'll have expert panels. It took our committee two years, and after two years of study our conclusion was that we couldn't tell whether the analysis was conservative or not. Our committee had six people from different disciplines: I was the materials scientist/geochemist, and there were experts in risk analysis, health physics, and so on. It was our collective judgment that, despite a superb effort - we couldn't fault the DOE's effort to pull everything together - the complexity of the analysis, the absence of important data, and the poor understanding of how all of these subsystems are coupled to one another meant that, once we were given an answer, we couldn't tell whether the resulting estimate of exposure was on the high side or the low side of the proposed standard. We couldn't even say, "They were very conservative, so it won't be any worse than this."

ER: Why?

RE: It has to do with the nonlinear coupling between the models. If you look in detail at the models, you find that the output parameters from one subroutine become the input parameters for another, and they are connected in a pretty complicated way, as you would expect for a model of a natural system. The analogy I used is to think of the performance assessment as a series of lenses, where each lens represents one of the subroutines. If each lens has a certain spherical aberration - an error that makes the image fuzzy - then when you put them in series you finally get an image that's so fuzzy - the uncertainty is so large - that you can't tell whether the result is useful or not.
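
Editor's note: the lens analogy can be put in numbers. In the hedged sketch below, each chained sub-model is reduced to a single multiplicative factor known only to within a factor of three of its nominal value, and the spread of the final product grows rapidly with the number of links. The factor-of-three spread and the number of sub-models are arbitrary choices made for illustration only.

    import random

    # How uncertainty compounds when sub-models are chained (illustrative only).
    random.seed(1)

    def output_spread(n_submodels, n_samples=20_000):
        """Ratio of the 97.5th to the 2.5th percentile of a product of
        n_submodels factors, each log-uniform between 1/3 and 3 of nominal."""
        results = []
        for _ in range(n_samples):
            product = 1.0
            for _ in range(n_submodels):
                product *= 3.0 ** random.uniform(-1.0, 1.0)
            results.append(product)
        results.sort()
        return results[int(0.975 * n_samples)] / results[int(0.025 * n_samples)]

    for n in (1, 3, 5, 10):
        print(f"{n:2d} chained sub-models: output spans a factor of about {output_spread(n):,.0f}")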

The thrust of the paper in Science was to object to the idea that probabilistic performance assessment would be the sole basis for determining whether a repository meets the license requirements. When people first started thinking about the difficulties of geologic disposal, the geologic community and various National Research Council committees pointed out early on that, when you extrapolate over long periods of time in natural systems, the uncertainties would be huge.

The answer to mitigating that uncertainty was the concept of multiple barriers, where you would have a good waste form, a good canister, a good geology, a long travel time, geochemical conditions that made certain radionuclides less mobile or insoluble in water, a whole series of multiple barriers.

Another point I wanted to make in the Science paper was that the proposed regulation says, in effect, that we're so confident in the advances we've made in our ability to model these systems and do a performance assessment that we will no longer impose any quantitative standards on the individual barriers. The reasoning was that this lets you optimize the performance of the repository as a whole. That is, if a regulation requires that a canister last 1,000 years, the whole repository may fail licensing because you can't make that canister or can't demonstrate its performance.

The present regulation leans heavily in the direction of eliminating specifications for individual barriers and just looking at the end result. My point is that the uncertainty in the end result is so large that I would much rather see multiple barriers - and barriers that can be analyzed specifically and separately, rather than folded into a single very large analysis.

ER: What is the endpoint for the model?

RE: The regulatory limit was set at 10,000 years, which is relatively short compared to, say, the half-life of plutonium 239, about 24,500 years. Because the uncertainties in this elaborate mathematical analysis grow very quickly as a function of time, the analysis is only useful for at most 10,000 years or so, and the regulatory period is therefore not on a geologic time scale.
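
Editor's note: the mismatch between the two time scales is easy to quantify. Using the half-life quoted above, the fraction of plutonium 239 still present after the 10,000-year regulatory period is 0.5 raised to the power 10,000/24,500 - roughly three quarters. A one-line check in Python:

    # Fraction of Pu-239 remaining after the 10,000-year regulatory period.
    half_life_years = 24_500
    regulatory_period_years = 10_000
    remaining = 0.5 ** (regulatory_period_years / half_life_years)
    print(f"about {remaining:.0%} of the original plutonium 239 remains")  # roughly 75%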

ER: It seems that this is an effort to get to a point where you can move forward and that's where politics comes in.

RE: Right. I would agree that just based on geologic observations and general knowledge of the behavior of certain geologic or repository environments, you could in fact pretty quickly write down what you would expect to happen. As an example, under reducing conditions the uranium dioxide in spent fuel is very insoluble. [Reducing in this case means a lack of oxygen. Ed] And so one could, using multiple barriers and common scientific sense, arrive at what I would call a good, convincing strategy for disposing of waste. On the other hand, you can imagine a poor strategy that would nevertheless pass an elaborate analysis. For example, when we looked in detail at the analysis for Yucca Mountain, we found that a lot of credit had been taken for the performance of the fuel cladding. The fuel has a cladding a few millimeters thick, made of an alloy called Zircaloy. It's not meant to be a barrier. It is corrosion resistant - if you just put it in water, it doesn't corrode very quickly - but the cladding becomes brittle under radiation and it's thin, so one can worry about whether taking that much credit is reasonable. In an elaborate analysis like this, by taking substantial credit for just a few barriers, the final result can look good.

As I looked at the performance assessments I became very concerned that the uncertainty in them was so large as to make them unusable

ER: So you can fudge your numbers to make the analysis look better than it probably should be.

RE: It's a matter of judgment. If, on the other hand, you're very conservative, the coupling between the models is such that a "good" repository might fail the standard. So you can imagine both possibilities: a poor strategy could pass the analysis because of optimistic assumptions hidden in it, or a good repository could fail simply because you were too prudent in the selection of your input variables. Unless you know what the uncertainty is at the end, you can't tell which of those possibilities you've arrived at.

ER: What would be a better strategy than relying on the performance assessment?

RE: Well, first I would much prefer to see multiple barriers included as an integral part of the approach - that there be a convincing analysis showing that even if the canisters fail early and the fuel cladding is brittle and fractured, the flow of water is slow enough that you don't expect significant release to the biosphere.

Yucca Mountain was initially attractive because of the argument that there is no water - it's an arid environment. But if you look at the site you'll find that there is water, and in particular there's evidence for rather rapid movement of water through the rock.

As an example, chlorine 36 from atmospheric testing of nuclear weapons has been detected at the repository level.

ER: Which is how far down?

RE: 300 meters.

ER: And that could only have gotten there...

RE: In the last fifty years.

ER: So we know water has moved 1,000 feet from the surface in the last fifty years.

RE: Right. One can argue about whether a large amount of water made that trip and whether it's significant, but the chlorine 36 signature is at the repository level. Now, another part of the analysis, also mentioned in the Science article, is that the performance assessment's point of compliance is twenty kilometers away from the repository. So when you analyze the repository's behavior - say you change canister materials or do something else in the repository to make it better - you evaluate that performance twenty kilometers away. The uncertainty introduced by transport over those twenty kilometers grows with distance and over time, and it obscures your ability to look at the details of how the repository itself is performing.

In addition, after the radionuclides travel twenty kilometers, the performance assessment calculates a dose to a population or a person. So all of the uncertainty in estimating how water will be used - how much will be pumped to irrigate a field that a family might farm - and all of the uncertainty in the pathways of exposure to a human being are included in the analysis. That makes it difficult to tell whether you have made any progress, and you can finally arrive at highly unrealistic conclusions. One such conclusion is that it doesn't matter whether you have a very durable waste form, because you can't see the effect twenty kilometers away. Well, you can't see the effect not because it doesn't matter, but because the uncertainty of the analysis is so large.

So I would like to see multiple barriers and a detailed, separate analysis of those barriers, rather than wrapping them all into a single analysis and calculating the total performance of the system twenty kilometers away for radiation exposure up to 10,000 years.

The regulatory limit was set at 10,000 years, which is relatively short as compared to the half-life of plutonium 239, which is 24,500 years.

ER: Is this push for probabilistic performance assessment a solution to a political rather than a scientific problem?

RE: That is not an entirely fair summary of the situation. I have many friends and colleagues who do risk assessments, and they are well informed and skilled in what they do, and they readily acknowledge the uncertainties in the analysis. There's a good scientific and engineering basis for the way these analyses are done and so I wouldn't fault that part of the procedure.

But, you're right in that once you have this analysis it takes on a political use. As an example, after the most recent performance assessment of Yucca Mountain, one of the common quotes was that they did the performance assessment and there were no show stoppers. My question is, what is the probability that you would be able to recognize a show stopper in an analysis that has 5,000 input parameters? How good a tool is probabilistic performance assessment? From the political side, this question is seldom asked.

The other aspect, which I think is probably the most egregious, is this: if you're managing such a project and you identify an important area where you need data - maybe field data - that data could be expensive and take many years to acquire. As an example, collecting field data to measure the hydrologic properties in the unsaturated zone, when there's no water, is really very challenging. It takes time. But the schedule, driven by politics and other factors, says that in the year 2002 they'll submit their license application. With the probabilistic assessment approach, one could say, "Well, I won't do the field study; I will invite experts to estimate the range of critical parameters." The experts will do this in good faith and with considerable skill. That's an efficient way to push the project forward - particularly efficient because there's no way to test the assumption.

The analogy I use is an airplane built in this way: smaller versions of the plane hadn't been test flown, but you were assured that good and competent engineers and scientists had modeled the plane's ability to fly. Would you fly on the first airplane, based on those analyses?

Yucca Mountain really is the first repository with a lot of radioactivity to be put in it. I'm convinced that the nuclear waste can be disposed of safely, but I'm also a careful and prudent person. I want to test things over and over again, and the performance assessment is structured so as not to produce testable hypotheses. At best you do confirmatory tests, but you seldom test the fundamental conceptual models that hold this whole analysis together.

ER: In hindsight, do you think we overlooked perhaps better sites in this country?

RE: Certainly.

ER: Would that be the salt layer under the Texas panhandle?

RE: No. I would compare Yucca Mountain with the Swedish repository program, where they are using fractured granite, but at a position below the water table where the conditions are reducing. There's a big difference in the reactivity of spent fuel under oxidizing conditions versus reducing conditions. So if it's a repository for spent nuclear fuel, which is essentially uranium dioxide, one should consider the geochemical behavior of uranium. Under reducing conditions spent nuclear fuel is much less soluble; it's a lot less mobile. Yucca Mountain, in contrast, is highly oxidizing. Now, that would be all right if there's no water. But in the probabilistic analysis, of course, you can change the climate, and you can change it a lot or a little. You can ask experts whether it will or won't change, and that adds considerable uncertainty to the analysis. The Yucca Mountain repository is in the unsaturated zone, which means that it's above the water table, so changing the climate has a real effect on the infiltration rate. If you wanted to simplify the analysis, you would put the repository in the saturated zone below the water table; then the climate can change all it wants and the repository will always be wet.

I didn't appreciate this when I first started my review of Yucca Mountain, but if you take a cubic meter of fractured granite, say from the potential sites in Sweden, and squeeze all the water out, and then go to Yucca Mountain, an arid region, take a cubic meter and squeeze all the water out, you'll get more water out of the rock at Yucca Mountain than out of the rock in Sweden. Even though Yucca Mountain is above the water table, the rock is relatively porous and water is held in its pores. The fractured granite has much less void space; even though that space is filled with water, the total is less than what you would get from the "dry" tuff at Yucca Mountain.
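
Editor's note: this comparison can be made concrete with assumed, representative numbers - they are illustrative, not measurements from either site. A welded tuff might have a porosity of ten to fifteen percent and be largely saturated, while fractured granite below the water table may have well under one percent of connected void space.

    # Water held per cubic meter of rock (porosity and saturation assumed for illustration).
    def liters_per_cubic_meter(porosity, saturation):
        return porosity * saturation * 1000.0  # 1 cubic meter = 1000 liters

    tuff = liters_per_cubic_meter(porosity=0.12, saturation=0.80)      # unsaturated welded tuff
    granite = liters_per_cubic_meter(porosity=0.005, saturation=1.00)  # saturated fractured granite
    print(f"tuff: about {tuff:.0f} liters/m^3, granite: about {granite:.0f} liters/m^3")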

ER: Are we going ahead with Yucca Mountain?

RE: That's the only show in town.

ER: So we have to do better at trying to develop some fairly long-term engineering solutions at this point.

RE: Well, I would say the major thrust in the project now is on engineered solutions in the near field. And I think there are things that can be done to make it better. It's just a pity that, for geologic disposal, geology matters so little.

ER: Are we going to put all of our nuclear eggs in this basket, or can we use this as a learning experience and do better next time?

RE: Well, there have been at various times proposals to consider monitored retrievable storage - the idea being, let's centralize the storage of spent fuel, let's not dispose of it permanently, keep it where we can watch it, and in the meantime develop a better knowledge base and a better strategy for final disposal. The site suggested for that is close to Yucca Mountain.

The public policy question is, if you build a monitored retrievable storage facility, does that become permanent? I would say it's a little bit like all the Quonset huts that were built after the Second World War at universities to handle the sudden increase in students. It's only recently that I think the last huts finally disappeared. Temporary things have a way of becoming permanent, simply due to the expense and risk of any further action.

ER: Do you get discouraged working on this?

RE: I'm actually optimistic that there are appropriate, useful, and compelling strategies for disposing of nuclear waste. But I think that if we're going to use nuclear power, it is in everyone's best interest to be critical and careful in how we go about doing this. If finally all decisions rest on politics, then nuclear power is too dangerous a technology to handle this way. The public policy decisions have to have a strong scientific and engineering basis. If we can't do that, then we shouldn't have nuclear power.

Literature Cited:

1.  Metamict mineral alteration: An implication for radioactive waste disposal. R.C. Ewing. 1976. Science 192:1336-1337.

2.  Less Geology in the Geological Disposal of Nuclear Waste. R.C. Ewing. 1999. Science 286:415-417.

The Yucca Mountain Website is http://www.ymp.gov


Environmental Review
6920 Roosevelt Way NE PMB 307
Seattle Washington 98115-6653