
kristopher

(29,798 posts)
Mon Jan 16, 2012, 04:12 PM Jan 2012

The unreliable nature of nuclear power

Oooops.

Power loss shuts down Kan. nuclear plant
By AP | January 14, 2012


BURLINGTON, Kan. (AP) — The operators of the Wolf Creek nuclear power plant say a loss of off-site power prompted an automatic shutdown at the northeast Kansas facility.

The shutdown happened Friday afternoon. Wolf Creek officials say the plant's two emergency diesel generators automatically started, supplying power to all safety-related equipment...

http://www.canadianbusiness.com/article/65955--power-loss-shuts-down-kan-nuclear-plant


"Wolf Creek officials say the plant's two emergency diesel generators automatically started, supplying power to all safety-related equipment."

But who was supplying power to all the people who were depending on the nuclear plant?
The unreliable nature of nuclear power (Original Post) kristopher Jan 2012 OP
Pretty funny... PamW Jan 2012 #1
Interesting comparison OKIsItJustMe Jan 2012 #2
There are people on this very forum who argue that we're ready for 100% renewables XemaSab Jan 2012 #9
We may be ready for 100% renewables, however, deploying them will take some time. OKIsItJustMe Jan 2012 #11
but a single incident can have disastrous consequences waddirum Jan 2012 #3
I'm familiar with Palisades PamW Jan 2012 #5
Palisades had 5 unexpected shutdowns last year. kristopher Jan 2012 #8
Spinning reserve is how they do it! PamW Jan 2012 #20
You are greatly mistaken if you think "anti-nuke activists" stopped Yucca Mountain waddirum Jan 2012 #10
How "massive" is 76 trillion Bequerels? PamW Jan 2012 #18
Well that's not very much CreekDog Jan 2012 #22
Why do you insist on using this conversion to grams? caraher Jan 2012 #24
Becquerels is INAPPROPRIATE!! PamW Jan 2012 #25
Your selective innumeracy is showing caraher Jan 2012 #26
OH COME ON!!! PamW Jan 2012 #32
Mass gives a perspective that is divorced from any relevant considerations caraher Jan 2012 #33
WRONG WRONG!!! PamW Jan 2012 #36
let's talk THOUSANDS of tons of spent fuel... waddirum Jan 2012 #27
The SFP at unit 4 has collapsed and holds no water? FBaggins Jan 2012 #28
WRONG - Chernobyl is still MUCH LARGER PamW Jan 2012 #31
by the way waddirum Jan 2012 #30
Wolfram Alpha is accurate caraher Jan 2012 #34
So let's use your number and make the comparison. FBaggins Jan 2012 #35
Nuclear plants are unreliable kristopher Jan 2012 #4
Systematic vs random failure PamW Jan 2012 #6
Reductio ad Absurdum based on a False Dichotomy OKIsItJustMe Jan 2012 #7
The 100% is not my invention! PamW Jan 2012 #19
“The idea of 100% solar is NOT my invention; it's what kristopher keeps touting as the future.” OKIsItJustMe Jan 2012 #29
You just got my vote... PamW Jan 2012 #37
Fukushima, Chernobyl CreekDog Jan 2012 #23
Why doesn't Lovins give a cite for his "1-2%" failure rate for wind and solar? Dead_Parrot Jan 2012 #12
Does he give citations for the other failure rates in this excerpt? OKIsItJustMe Jan 2012 #13
Yes, the others are cited Dead_Parrot Jan 2012 #14
“which suggests a 10% failure rate p.a. ” OKIsItJustMe Jan 2012 #16
Over multiple systems of different ages, yes. Dead_Parrot Jan 2012 #17
Well, solar panels don't meltdown CreekDog Jan 2012 #21
What are the NERC Equivalent Forced Outage Rates and Equivalent Availability Factors for nukes? badtoworse Jan 2012 #15

PamW

(1,825 posts)
1. Pretty funny...
Mon Jan 16, 2012, 05:30 PM
Jan 2012

Pretty funny coming from a solar proponent.

Nuclear power plants, like their fossil fuel counterparts, will from time to time experience a shutdown for some reason or other.

However, one can't say that they are unreliable due to a single incident. Do we say airliners are unreliable because there is the occasional crash?

NO - you look at the capacity factor for the plant. How well did it do over an entire year, for example.

This claim of unreliability for an occasional incident comes from a solar power proponent. A land-based solar plant is GUARANTEED to be down 50% of the time.

A solar proponent making the above claim is really the "pot calling the kettle black". Oops!

Evidently you don't understand that the power company has enough reserves to absorb the outage of a single plant.

PamW

OKIsItJustMe

(19,938 posts)
2. Interesting comparison
Mon Jan 16, 2012, 05:50 PM
Jan 2012

In the OP, we have an unexpected shutdown of a nuclear reactor.



Plant officials declared a "notification of unusual event," which is the lowest of four emergency classifications defined by the Nuclear Regulatory Commission.



In the second case, we have a completely expected shutdown of a solar plant: “A land-based solar plant is GUARANTEED to be down 50% of the time.”

The coming of night is not unexpected, nor is a cloudy day, and so, any system designer would take these into account. For example, a “concentrating solar plant” might include thermal storage.

http://www.eere.energy.gov/basics/renewable_energy/thermal_storage.html
[font face=Times,Times New Roman,Serif][font size=5]Thermal Storage Systems for Concentrating Solar Power[/font]

[font size=3]One challenge facing the widespread use of solar energy is reduced or curtailed energy production when the sun sets or is blocked by clouds. Thermal energy storage provides a workable solution to this challenge.

In a concentrating solar power (CSP) system, the sun's rays are reflected onto a receiver, which creates heat that is used to generate electricity. If the receiver contains oil or molten salt as the heat-transfer medium, then the thermal energy can be stored for later use. This enables CSP systems to be cost-competitive options for providing clean, renewable energy.

Several thermal energy storage technologies have been tested and implemented since 1985. These include the two-tank direct system, two-tank indirect system, and single-tank thermocline system.

…[/font][/font]



Nuclear proponents like to imply that nuclear plants are a source of constant “baseload” power, while renewables like solar and wind are completely unpredictable. Both are distortions.

XemaSab

(60,212 posts)
9. There are people on this very forum who argue that we're ready for 100% renewables
Tue Jan 17, 2012, 02:23 PM
Jan 2012

PV may have planned shutdowns every night, but that doesn't change the fact that there's nothing to take up the slack in that model.

OKIsItJustMe

(19,938 posts)
11. We may be ready for 100% renewables, however, deploying them will take some time.
Tue Jan 17, 2012, 02:54 PM
Jan 2012

You simply cannot convert the entire grid to renewables (or to nuclear fission for that matter) overnight.

Furthermore, a 100% renewable grid, created solely with today’s technology would not be a 100% solar powered grid (nor would it be a 100% wind powered grid.) Instead, it would include a mix of renewables.

As for the question of energy storage, we actually have multiple technologies in hand to do this, and others “on the drawing board,” but at this time, they represent significant additional cost, with little real need.
http://energy.gov/oe/technology-development/energy-storage

Please note, if our grid were 100% nuclear fission (which it won’t be) energy storage would still be awfully handy, since demand fluctuates throughout the day, while nuclear fission plants tend to supply electricity at a relatively constant level. Energy storage technologies would allow plants to store electricity generated during periods of low demand, and dispatch it at periods of high demand.

waddirum

(979 posts)
3. but a single incident can have disastrous consequences
Mon Jan 16, 2012, 06:08 PM
Jan 2012

Palisades nuke plant (Michigan) once had a fire that was caused by a faulty panel light in the control room. While attempting to replace the tiny light bulb, the electrician caused an arc flash which took down half the plant's power. The reactor scrammed properly. But certain instruments were powered up while others were not. It was a chicken-with-head-cut-off scenario, where certain systems were manually overridden and others still controlled by the instrumentation. Imagine having to manually operate a valve to keep the cooling system running, only to have your action reversed when the power returned to the control room. This was a 100% clusterfuck, which we're all fortunate didn't end up much worse.

Palisades is one of the oldest nuke plants still operating in the U.S. It recently completed its initial 40-year operating license, which the NRC extended for another 20 years. This despite the fact that the reactor pressure vessel steel shows the telltale signs of radiation fatigue.

And Palisades sits on a beach on Lake Michigan. The spent fuel is stored in a building located on a bluff of sand dunes.

Nuclear energy is safe and clean, until it isn't. The consequences of a single fuckup are too great.

PamW

(1,825 posts)
5. I'm familiar with Palisades
Tue Jan 17, 2012, 11:53 AM
Jan 2012

I'm very familiar with Palisades.

South Haven is my father's home town. I watched them build Palisades. I've visited the plant, as well as stood on the beach next to the plant's water intake structure.

As for the spent fuel being stored on the sand dunes, whose fault is that? By the time the spent fuel pool at Palisades was full, we were supposed to have a fuel repository at Yucca Mountain accepting fuel. The self-righteous anti-nukes opposed Yucca Mountain, and then they whine when the plants are forced to use dry cask storage. They created the problem they are complaining about. Typical.

PamW

kristopher

(29,798 posts)
8. Palisades had 5 unexpected shutdowns last year.
Tue Jan 17, 2012, 02:18 PM
Jan 2012

Five unexpected shutdowns that required the system to have in place - 24/7 - spinning reserves sufficient to replace at an instant's notice the entire generating capacity of the reactor.

How you plan for an *unexpected* outage versus how you plan for an expected outage is completely different, and that difference is yet another strike against nuclear power.

PamW

(1,825 posts)
20. Spinning reserve is how they do it!
Thu Jan 19, 2012, 11:47 AM
Jan 2012

Five unexpected shutdowns that required the system to have in place - 24/7 - spinning reserves sufficient to replace at an instant's notice the entire generating capacity of the reactor.
====================================

Spinning reserve is how they do it. Kristopher, we've had power plants for about a century, and they have periodic problems and the power industry has reserves. Evidently you "think" this is a problem, but it is a SOLVED problem that power grids take in stride. The power systems are designed so that even if the largest plant in the system fails, the grid can still stay up. What do you think power companies have been doing for the last century?

But in your ill-considered fantasy of having 100% solar, when nighttime comes, you don't lose a minority percentage, you lose 100% of your capacity. That's something the power companies have never faced before.

It's also why the National Academy of Science and Engineering says renewables can provide only about 20% of our energy. It's in the report that I've cited to you many, many, many times; but for some reason you just don't seem to be able to find it in the library.

I wonder why.

PamW

waddirum

(979 posts)
10. You are greatly mistaken if you think "anti-nuke activists" stopped Yucca Mountain
Tue Jan 17, 2012, 02:27 PM
Jan 2012

It failed on its own merits, for reasons of seismic and groundwater concerns.

Building Palisades (and DC Cook and all other nukes on the Great Lakes) was a mistake. They should have never been sited in such a sensitive area.

Consider the massive quantities of radioactive waste that have been discharged from Fukushima into the Pacific Ocean. Recent estimates of Pu-239 released from all units are now 76 trillion becquerels. Imagine a similar discharge into the Great Lakes. The risks are unacceptable.

PamW

(1,825 posts)
18. How "massive" is 76 trillion Bequerels?
Thu Jan 19, 2012, 11:18 AM
Jan 2012

Consider the massive quantities of radioactive waste that have been discharged from Fukushima into the Pacific Ocean. Recent estimates of Pu-239 released from all units are now 76 trillion becquerels.
===========================================

Why don't we see how "massive" 76 trillion becquerels is? Most people don't have a good understanding of how big, or how small, a becquerel is. So let's convert this to a more familiar unit.

The main radioisotope released at Fukushima was Iodine-131. Let's use the Wolfram Alpha scientific software to convert this to a more familiar unit.

http://www.wolframalpha.com/input/?i=mass+of+76+trillion+Bequerels+of+Iodine-131

If that 76 trillion becquerels is Iodine-131, we are talking about a total release of about 16.5 milligrams.
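
The conversion behind that Wolfram Alpha query can be sketched in a few lines of Python. This is a rough check only, assuming a pure isotope and the textbook relation A = λN; note that the claim quoted upthread was for Pu-239, whose much longer half-life gives a very different mass for the same activity:

import math

AVOGADRO = 6.022e23  # atoms per mole

def activity_to_mass_grams(activity_bq, half_life_s, molar_mass_g):
    # A = lambda * N  =>  N = A / lambda;  mass = N * M / N_A
    decay_constant = math.log(2) / half_life_s
    atoms = activity_bq / decay_constant
    return atoms * molar_mass_g / AVOGADRO

activity = 76e12  # 76 trillion becquerels

# I-131: half-life about 8.02 days -> roughly 0.0165 g, i.e. about 16.5 milligrams
print(activity_to_mass_grams(activity, 8.02 * 86400, 131))

# Pu-239: half-life about 24,100 years -> roughly 33,000 g, i.e. about 33 kg
print(activity_to_mass_grams(activity, 24100 * 365.25 * 86400, 239))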

Like most non-scientists, you don't realize how TRIVIALLY SMALL a becquerel is.

You get impressed by big numbers, but don't understand the context.

Let me impress you with the width of my office; it is about 3 BILLION nanometers wide.
Don't I have a massively big office, with a width in the BILLIONS???

PamW

CreekDog

(46,192 posts)
22. Well that's not very much
Thu Jan 19, 2012, 11:51 AM
Jan 2012

Let's just put it into your house.

Since it's a "trivially small" amount.

Now who's playing with the numbers?

caraher

(6,278 posts)
24. Why do you insist on using this conversion to grams?
Thu Jan 19, 2012, 11:59 AM
Jan 2012

Not to mention doing it incorrectly, since the quote explicitly says the isotope is Pu-239 and not I-131. The specific activity of Pu-239 is, according to the Argonne fact sheet on plutonium, 0.063 Ci/g. Since 76 TBq is about 2000 Curies, we're actually talking about a total release of 2000 Ci/0.063 Ci/g = 31,700 g - almost 32 kg of plutonium! That's certainly not enough to convert our planet into a lifeless radioactive hellscape, but it's a far cry from your estimate and not the kind of release one should pooh-pooh.

If you really want to educate, using an inappropriate unit to emphasize the "smallness" of a release is not the way to go. Would you willingly inhale even 0.1 milligrams of Pu-239? After all, that's "only" 6.3 microcuries. What matters, of course, is not the exact number of Bq but the combination of activity and type of exposure (external, ingestion, inhalation). According to the same Argonne fact sheet, inhaling Pu-239 carries a lifetime cancer mortality risk of 2.9 x 10^-8 per pCi. 6.3 microcuries is 6.3 x 10^6 pCi, so inhaling a tenth of a milligram of Pu-239 carries an 18% chance of causing a fatal cancer - roughly doubling your risk of cancer death. The release you characterize as small is 317,000 times as much plutonium as that.

Now I deliberately picked a worst-case exposure scenario - ingestion is about 100 times less dangerous because plutonium is not well-absorbed that way. But that only emphasizes my point - you cannot separate the dose units from information about the isotope and likely exposure scenarios, and mass is one of the least appropriate units to use.
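
Spelled out in a short sketch, taking the Argonne specific activity and inhalation risk coefficient quoted above at face value, and applying the risk coefficient linearly (which is itself an assumption):

BQ_PER_CI = 3.7e10                    # becquerels per curie
SPECIFIC_ACTIVITY_CI_PER_G = 0.063    # Pu-239, per the Argonne fact sheet
RISK_PER_PCI_INHALED = 2.9e-8         # lifetime cancer mortality risk per pCi inhaled, same source

release_ci = 76e12 / BQ_PER_CI                        # about 2,050 Ci
release_g = release_ci / SPECIFIC_ACTIVITY_CI_PER_G   # about 32,600 g, roughly 32-33 kg

inhaled_g = 1e-4                                      # one tenth of a milligram
inhaled_pci = inhaled_g * SPECIFIC_ACTIVITY_CI_PER_G * 1e12   # grams -> curies -> picocuries
lifetime_risk = inhaled_pci * RISK_PER_PCI_INHALED    # about 0.18, i.e. roughly an 18% risk

print(release_ci, release_g, lifetime_risk)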

PamW

(1,825 posts)
25. Becquerels is INAPPROPRIATE!!
Thu Jan 19, 2012, 12:20 PM
Jan 2012

Evidently you missed the whole point as it went over your head.

The becquerel is NOT the appropriate unit either. Becquerels are a unit of radioactive decay rate, and NOT directly related to biological damage.

If we do the same calculation for Pu-239 via Wolfram:

http://www.wolframalpha.com/input/?i=Mass+of+76+trillion+Bequerels+of+Pu-239+

It is indeed about 33 kg. So Fukushima added 33 kg of Pu-239 to the environment.

Do you know how much Plutonium is in the environment already due to nuclear weapons testing in the '50s and '60s?

About 10 metric tonnes; or 10,000 kg. So Fukushima increased this by 33 / 10,000 = 0.33%

Is all the falderal about Fukushima being so bad appropriate when it only added 1/3 of 1%?

Don't get me wrong; it's not good that it happened; but it's not the end of the world either.

Your calculation is also INAPPROPRIATE since you make the unwarranted assumption that all 33 kg is going to go to a single individual. A better approach would be to take that 33 kg, and spread it out over the globe as it is; and then see how much people are going to be exposed.

If you do that, since the Fukushima Pu is going to distribute like the weapons Pu has already distributed, then a single individual is going to have his/her exposure due to Pu increased by 0.33%. How much of the average person's radiation exposure is due to plutonium from weapons testing? Courtesy of the Health Physics Society at the University of Michigan:

http://www.umich.edu/~radinfo/introduction/radrus.htm

Under "Fallout" we see that nuclear weapons testing fallout is responsible for <0.03% of our background exposure due to Mother Nature.

Fukushima just increased that 0.03% by 0.3% of it: 0.003 × 0.03% ≈ 0.00009%.

So the Fukushima plutonium has increased the average person's radiation exposure by about 0.00009%.
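
That chain of percentages, spelled out as a sketch. It takes the 10-tonne existing inventory and the <0.03% fallout share at face value, and assumes the Fukushima plutonium spreads exactly like the existing weapons fallout:

added_kg = 33.0
existing_kg = 10_000.0     # plutonium already dispersed by weapons testing
fallout_share = 0.0003     # fallout is <0.03% of average background exposure

inventory_increase = added_kg / existing_kg            # about 0.0033, i.e. 0.33%
exposure_increase = inventory_increase * fallout_share
print(exposure_increase)   # about 1e-6, i.e. roughly 0.0001%, in line with the 0.00009% figure above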

Even with Fukushima, and with the weapons fallout from the '50s and '60s, the number one source of radiation exposure for the average human being is still Mother Nature.

PamW

caraher

(6,278 posts)
26. Your selective innumeracy is showing
Thu Jan 19, 2012, 12:33 PM
Jan 2012

My calculation was for inhalation of one tenth of a milligram, not 32 kg. You have my assumptions wrong.

I never said Bq alone was appropriate. In fact, if you re-read my post you'll find that I said the best characterization involves activity + isotope + circumstances of exposure.

The fact remains that your default way of characterizing releases, using mass alone, is an utterly inappropriate measure - it shares every defect of reporting Bq alone and adds to that its utter silence on the question of whether the material is actually emitting ionizing radiation.

I agree that the overall plutonium release is not going to measurably affect the global average background radiation dose. That doesn't make the release negligible, and doesn't prove there cannot be locally-significant effects.

PamW

(1,825 posts)
32. OH COME ON!!!
Fri Jan 20, 2012, 09:38 PM
Jan 2012

The fact remains that your default way of characterizing releases, using mass alone, is an utterly inappropriate measure - it shares every defect of reporting Bq alone and adds to that its utter silence on the question of whether the material is actually emitting ionizing radiation.
=============================

We're talking about the radioactive release from Fukushima in the form of radionuclides like Pu-239, I-131, Cs-137....
and you think it is a defect to quote the mass because it is silent on whether the material is emitting radiation.

What are we talking about if not the radiation?

The purpose for citing the mass is to give people some perspective as to the amount of material when they have little to go on when the amounts are quoted in Bq, for example.

There are people here and on other forums who think that tons and tons of material were blown into the atmosphere.

However, as you point out, the amount of Plutonium-239 is 32 kg, not a huge mountain of material.

I would point out that 32 kg of Pu-239, if formed into a sphere, would have a radius a little less than 3 inches.
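
A quick geometry check on that figure, assuming a solid sphere of metal at the 19.74 g/cc density used elsewhere in this thread:

import math

mass_g = 32_000.0
density_g_per_cc = 19.74
volume_cc = mass_g / density_g_per_cc                 # about 1,620 cc
radius_cm = (3 * volume_cc / (4 * math.pi)) ** (1 / 3)
print(radius_cm, radius_cm / 2.54)                    # about 7.3 cm, i.e. about 2.9 inches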

I think it is important that people realize that the ENTIRE Pu-239 inventory discharged by Fukushima could be held in the palm of one's hand. In fact, it wouldn't even be dangerous to do that, since Pu-239's alpha radiation can't even penetrate the dead layer of skin cells. In fact, the Soviet nuclear weapons program leader Kurchatov presented Stalin with the plutonium core of "Joe 1", and Stalin held it in his hands.

That gives a perspective. The Pu-239 discharge from Fukushima is not large; and pales in comparison to the amount of Pu-239 that is already in the environment; which is some 10 metric tonnes.

PamW

caraher

(6,278 posts)
33. Mass gives a perspective that is divorced from any relevant considerations
Fri Jan 20, 2012, 11:52 PM
Jan 2012

Nobody is terribly concerned with what a radioisotope weighs, or its volume, or the radius of a sphere of the material, and your anecdote proves the point. You're right that a small sphere of plutonium poses essentially zero radiological threat due to external dose. It's also a pretty safe bet that plutonium escaping the Fukushima reactors is not in the form of discrete chunks of metal like that. Again - activity, the isotope (and therefore the forms of radiation associated with its decay) and the manner of exposure are the three most significant factors one must take into consideration in assessing the danger associated with a radioactive material. You can't say anything sensible about how big or how small the risk is unless you address all three.

Incidentally, it appears that the "Stalin held the 'Joe 1' core" story may be a fabrication. (Though others have indeed held such objects...) Yulii Khariton reports in the May 1993 Bulletin of Atomic Scientists that the core went from its production site at Chelyabinsk-40 to Arzamas-16 to the test site at Semipalatinsk, and made no detour to Moscow for presentation to Stalin (p.28). He and his co-author speculate that the story was a mashup of two events concocted by members of Lavrentiy Beria's staff. (I found this through a Google Books hit using the search terms "stalin presented plutonium sphere;" DU chokes on the direct URL.)

Also... I wouldn't - couldn't! - go anywhere near a 32 kg sphere of Pu-239, because the critical mass of pure Pu-239 is about 11 kg. Cores for implosion-type weapons contain amounts of plutonium that are sub-critical at the ordinary density of the metal; they rely on the external implosion (as well as neutron reflections) to achieve criticality. The Fat Man bomb only had about 6 kg of plutonium. So another way of putting 32 kg in perspective is to say it's enough to build 5 fission bombs. Of course, nobody can do that because the plutonium missing from Fukushima is dispersed in some (largely unknown?) fashion, not accreted into little lumps. Which is why talk about the safety of such lumps is largely irrelevant.

PamW

(1,825 posts)
36. WRONG WRONG!!!
Sat Jan 21, 2012, 03:14 PM
Jan 2012

Last edited Sat Jan 21, 2012, 05:18 PM - Edit history (1)

Also... I wouldn't - couldn't! - go anywhere near a 32 kg sphere of Pu-239, because the critical mass of pure Pu-239 is about 11 kg. Cores for implosion-type weapons contain amounts of plutonium that are sub-critical at the ordinary density of the metal; they rely on the external implosion (as well as neutron reflections) to achieve criticality. The Fat Man bomb only had about 6 kg of plutonium.
=======================================================

Evidently you have the rather simplistic concept of "critical mass" that most people have. Just because you have a critical mass of a material doesn't mean you have a bomb or a self-sustaining chain reaction. Whether the system is critical is not dependent just on the mass, but on the temperature, because of Doppler broadening of resonance absorption cross-sections. You also have to have the neutrons, and they have to be in the "fundamental mode" distribution in space and energy, as you would get from solving the Boltzmann transport eigenvalue equation. (Neutron transport is my specialty.)

You are missing a couple of kilograms from your claimed 6 kg of Pu-239 in Fat Man. It's not 6 kg, but 8 kg. See under "Assembly" of:

http://en.wikipedia.org/wiki/Fat_Man

The Fat Man core or pit was 92 mm in diameter with a 21 mm hole in the middle for the urchin initiator.

The density of plutonium is 19.74 g/cc. If you do the calculation, Fat Man had about 8 kilograms of Pu-239.
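
That calculation, spelled out as a sketch. It assumes a solid sphere of pure metal at 19.74 g/cc with a spherical cavity for the initiator; the widely cited ~6 kg figure for the actual pit rests on different assumptions (the pit was reportedly an alloy at lower density), so the two numbers are not directly comparable:

import math

def sphere_volume_cc(diameter_mm):
    radius_cm = diameter_mm / 20.0        # mm diameter -> cm radius
    return 4.0 / 3.0 * math.pi * radius_cm ** 3

pit_volume_cc = sphere_volume_cc(92) - sphere_volume_cc(21)   # shell around the initiator cavity
pit_mass_kg = pit_volume_cc * 19.74 / 1000.0
print(pit_mass_kg)                        # about 8 kg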

In any case, Fukushima is really no worse than a SINGLE atmospheric nuclear test; and we conducted hundreds of them, along with hundreds by the Soviets, during the late '40s and '50s. The result is that we already have about 10 metric tonnes, or 10,000 kilograms, of Pu-239 in the environment, and Fukushima just added trivially to that.

Don't say that the radiation exposure due to all that Pu-239 is a significant source of radiation exposure. The radiation exposure due to ALL the radionuclides released as "fallout" is given by the following table courtesy of the Health Physics Society at the University of Michigan:

http://www.umich.edu/~radinfo/introduction/radrus.htm

The radiation exposure due to ALL the fallout is <0.03% of the average person's exposure.

Mother Nature remains the biggest source of radiation exposure in our lives, and swamps the amounts due to nuclear testing, and nuclear power, including Chernobyl and Fukushima.

PamW

waddirum

(979 posts)
27. let's talk THOUSANDS of tons of spent fuel...
Thu Jan 19, 2012, 05:51 PM
Jan 2012

... that is unaccounted for.

At Fukushima, there were 3,400 tons of fuel in seven spent fuel pools plus 877 tons of active fuel in the cores of the reactors.

We know that the Unit 4 Spent Fuel Pool has collapsed and holds no water. The conditions of the other SFPs are precarious. Where is all the spent fuel?

Fukushima is the largest industrial accident in history. It's bigger than Chernobyl, and is nowhere near finished. In Japan, they are incinerating radioactive debris and dumping the ash in Tokyo Bay.

FBaggins

(26,735 posts)
28. The SFP at unit 4 has collapsed and holds no water?
Thu Jan 19, 2012, 05:56 PM
Jan 2012

Your understanding of what "we know" appears to bear little relationship to reality.

And no... it barely holds a candle to Chernobyl.

PamW

(1,825 posts)
31. WRONG - Chernobyl is still MUCH LARGER
Fri Jan 20, 2012, 09:21 PM
Jan 2012

Fukushima is the largest industrial accident in history. It's bigger than Chernobyl, and is nowhere near finished. In Japan, they are incinerating radioactive debris and dumping the ash in Tokyo Bay.
------------------------------------------------------------

Chernobyl is still the worst nuclear accident. The Bhopal disaster is probably the largest industrial accident in terms of killing the most people.

However, as others have said, Fukushima doesn't hold a candle to Chernobyl.

See the testimony of the esteemed radiation epidemiologist Dr. John Boice to Congress:

http://hps.org/documents/John_Boice_Testimony_13_May_2011.pdf

from page 2 at top and identification as Slide 1:

Fukushima is not Chernobyl

PamW

waddirum

(979 posts)
30. by the way
Fri Jan 20, 2012, 07:25 PM
Jan 2012

Converting activity into mass depends on the half-life. Your conversion was for I-131 and not Pu-239. The half-life of Pu-239 is 24,100 years while the half-life of I-131 is only 8 days, so obviously the mass will be much less for iodine.

I don't know how much I trust the Wolfram Alpha website. But doing a query on 76 trillion Bq of Pu-239 gives 33 kg.

caraher

(6,278 posts)
34. Wolfram Alpha is accurate
Fri Jan 20, 2012, 11:54 PM
Jan 2012

I pointed out this mistake earlier and calculated essentially the same mass value for plutonium using the Argonne National Lab fact sheet on plutonium and its listed value for the specific activity (basically, disintegrations per second per unit mass).

FBaggins

(26,735 posts)
35. So let's use your number and make the comparison.
Sat Jan 21, 2012, 07:45 AM
Jan 2012

We'll ignore for the moment the incredibly poor source for the 76 trillion Bq of Pu-239 claim and take it as fact (it isn't).

That's almost exactly 1% of the estimated Plutonium released by Chernobyl.

Time to reassess that "bigger than Chernobyl" claim, eh?

kristopher

(29,798 posts)
4. Nuclear plants are unreliable
Mon Jan 16, 2012, 06:16 PM
Jan 2012

Let's look at the claims of nuclear proponents on this matter.
This is taken from an article by Amory Lovins (link below)

1) Nuclear proponents love to throw around the term capacity factor because it seems to the uninformed to favor nuclear power. That claim of superiority is used in the context of references to "baseload power", which they define as “the minimum amount of proven, consistent, around-the-clock, rain-or-shine power that utilities must supply to meet the demands of their millions of customers.”

You'll note that this is talking about aggregated demand from the entire user population, not supply from a single plant.

2) The claim is then made by nuclear proponents that this demand can only be met by three sources: "fossil fuels, hydro, and nuclear."


3) Nuclear proponents attempt to rule out wind and solar as "unreliable baseload" because they are intermittent—and subject to the availability of wind and sunshine.

Lovins:

The manifest need for some amount of steady, reliable power is met by generating plants collectively, not individually. That is, reliability is a statistical attribute of all the plants on the grid combined. If steady 24/7 operation or operation at any desired moment were instead a required capability of each individual power plant, then the grid couldn’t meet modern needs, because no kind of power plant is perfectly reliable. For example, in the U.S. during 2003–07, coal capacity was shut down an average of 12.3% of the time (4.2% without warning); nuclear, 10.6% (2.5%); gas-fired, 11.8% (2.8%). Worldwide through 2008, nuclear units were unexpectedly unable to produce 6.4% of their energy output. This inherent intermittency of nuclear and fossil-fueled power plants requires many different plants to back each other up through the grid. This has been utility operators’ strategy for reliable supply throughout the industry’s history. Every utility operator knows that power plants provide energy to the grid, which serves load. The simplistic mental model of one plant serving one load is valid only on a very small desert island. The standard remedy for failed plants is other interconnected plants that are working—not “some sort of massive energy storage [not yet] devised.”

Modern solar and wind power are more technically reliable than coal and nuclear plants; their technical failure rates are typically around 1–2%. However, they are also variable resources because their output depends on local weather, forecastable days in advance with fair accuracy and an hour ahead with impressive precision. But their inherent variability can be managed by proper resource choice, siting, and operation. Weather affects different renewable resources differently; for example, storms are good for small hydro and often for windpower, while flat calm weather is bad for them but good for solar power. Weather is also different in different places: across a few hundred miles, windpower is scarcely correlated, so weather risks can be diversified. A Stanford study found that properly interconnecting at least ten windfarms can enable an average of one-third of their output to provide firm baseload power. Similarly, within each of the three power pools from Texas to the Canadian border, combining uncorrelated windfarm sites can reduce required wind capacity by more than half for the same firm output, thereby yielding fewer needed turbines, far fewer zero-output hours, and easier integration.

A broader assessment of reliability tends not to favor nuclear power. Of all 132 U.S. nuclear plants built—just over half of the 253 originally ordered—21% were permanently and prematurely closed due to reliability or cost problems. Another 27% have completely failed for a year or more at least once. The surviving U.S. nuclear plants have lately averaged ~90% of their full-load full-time potential—a major improvement for which the industry deserves much credit—but they are still not fully dependable. Even reliably-running nuclear plants must shut down, on average, for ~39 days every ~17 months for refueling and maintenance. Unexpected failures occur too, shutting down upwards of a billion watts in milliseconds, often for weeks to months. Solar cells and windpower don’t fail so ungracefully.

Power plants can fail for reasons other than mechanical breakdown, and those reasons can affect many plants at once. As France and Japan have learned to their cost, heavily nuclear-dependent regions are particularly at risk because drought, earthquake, a serious safety problem, or a terrorist incident could close many plants simultaneously. And nuclear power plants have a unique further disadvantage: for neutron-physics reasons, they can’t quickly restart after an emergency shutdown, such as occurs automatically in a grid power failure.


From Amory Lovins
Four Nuclear Myths: A Commentary on Stewart Brand’s Whole Earth Discipline and on Similar Writings
Journal or Magazine Article, 2009
Available for download: http://www.rmi.org/rmi/Library/2009-09_FourNuclearMyths




Let's look at some of this more closely:

The manifest need for some amount of steady, reliable power is met by generating plants collectively, not individually.

That is, reliability is a statistical attribute of all the plants on the grid combined.

If steady 24/7 operation or operation at any desired moment were instead a required capability of each individual power plant, then the grid couldn’t meet modern needs, because no kind of power plant is perfectly reliable.

For example, in the U.S. during 2003–07, coal capacity was shut down an average of 12.3% of the time (4.2% without warning); nuclear, 10.6% (2.5%); gas-fired, 11.8% (2.8%).

Worldwide through 2008, nuclear units were unexpectedly unable to produce 6.4% of their energy output.
(What do you think that number was for 2011?)

This inherent intermittency of nuclear and fossil-fueled power plants requires many different plants to back each other up through the grid.

This has been utility operators’ strategy for reliable supply throughout the industry’s history.

Every utility operator knows that power plants provide energy to the grid, which serves load.

The simplistic mental model of one plant serving one load is valid only on a very small desert island. The standard remedy for failed plants is other interconnected plants that are working—not “some sort of massive energy storage [not yet] devised.”


...A broader assessment of reliability tends not to favor nuclear power. Of all 132 U.S. nuclear plants built—just over half of the 253 originally ordered—21% were permanently and prematurely closed due to reliability or cost problems.

Another 27% have completely failed for a year or more at least once.

The surviving U.S. nuclear plants have lately averaged ~90% of their full-load full-time potential—a major improvement for which the industry deserves much credit—but they are still not fully dependable.

Even reliably-running nuclear plants must shut down, on average, for ~39 days every ~17 months for refueling and maintenance.

Unexpected failures occur too, shutting down upwards of a billion watts in milliseconds, often for weeks to months. Solar cells and windpower don’t fail so ungracefully.

PamW

(1,825 posts)
6. Systematic vs random failure
Tue Jan 17, 2012, 11:59 AM
Jan 2012

This inherent intermittency of nuclear and fossil-fueled power plants requires many different plants to back each other up through the grid.
========================

We've discussed this before and you don't get it. It's the difference between "random" and systematic failures.

Nuclear, fossil, hydro... have "random" failures - failures every so often - so having more of the same kind of plant, called "redundancy", works to give you reliable power because they don't all fail at the same time.

Solar failures are "systematic". Suppose you have 100% solar. At nighttime, your plant goes down for lack of sunlight. However, you can't rely on its sibling solar plants, because they are also down for the same reason.

If you are 100% solar, you have a 100% power failure at night, since the plants all fail for a common (not random) reason.

Maybe this time you will understand the explanation. If not, please take a mathematics course in statistics.

PamW


OKIsItJustMe

(19,938 posts)
7. Reductio ad Absurdum based on a False Dichotomy
Tue Jan 17, 2012, 12:11 PM
Jan 2012
http://en.wikipedia.org/wiki/Reductio_ad_absurdum
http://en.wikipedia.org/wiki/False_dichotomy

We will not have a 100% solar power supply for several years (if ever.) So, arguments based on such a hypothetical power system are fallacious.


However, as I suggested before, even engineers are aware of the fact that it gets dark at night. So, if we were to go to a 100% solar solution, you may feel well assured that nighttime will not come as a complete surprise, and that some sort of energy storage would be integrated into the system.

At this time, since solar power represents a tiny minority of our electrical grid, there is no pressing need for energy storage.

See also: http://www.democraticunderground.com/11274183

PamW

(1,825 posts)
19. The 100% is not my invention!
Thu Jan 19, 2012, 11:39 AM
Jan 2012

We will not have a 100% solar power supply for several years (if ever.) So, arguments based on such a hypothetical power system are fallacious.


However, as I suggested before, even engineers are aware of the fact that it gets dark at night. So, if we were to go to a 100% solar solution, you may feel well assured that nighttime will not come as a complete surprise, and that some sort of energy storage would be integrated into the system.
===================================

The idea of 100% solar is NOT my invention; it's what kristopher keeps touting as the future. For some time, he stated that we could go 100% solar, and all we needed was "redundancy" to ensure reliability, because that's what the fossil and nuclear plants use.

As far as storage systems go, let's see what we need in the way of storage. Solar plants operate on about a 25% duty cycle. It's dark 50% of the time, and even when it's light, the angle of the Sun means that we don't get much energy from sunlight in the first few hours after sunup or the last few hours before sundown. The vast bulk of a solar plant's output comes in about 6 hours centered on local noon.

So a solar plant has to store about 75% of its daily output. What would it take to have a solar plant replace a 1 Gigawatt(e) fossil or nuclear installation? How much energy does a 1 Gw(e) power plant produce in a day? That's easy - it's a Gigawatt-Day; the product of a power and a time is always a unit of energy. However, let's use Wolfram Alpha to convert this to a more familiar unit:

http://www.wolframalpha.com/input/?i=Convert+1+gigawatt-day+to+kilotons

So a 1 Gw(e) power plant puts out 20.6 kilotons of TNT equivalent worth of energy per day, or about the amount of energy of the atomic bomb dropped on Nagasaki. The solar plant has to store 75% of this (see above), or about 15 kilotons, roughly the energy of the atomic bomb that vaporized Hiroshima.
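
The unit conversion behind those figures, as a sketch. The 25% duty cycle and the 75%-must-be-stored premise are the assumptions stated above, and the kiloton comparison is about total energy, not the rate at which a storage system would release it:

SECONDS_PER_DAY = 86_400
JOULES_PER_KILOTON_TNT = 4.184e12

daily_output_j = 1e9 * SECONDS_PER_DAY                 # 1 GW(e) running for one day
daily_output_kt = daily_output_j / JOULES_PER_KILOTON_TNT
stored_kt = 0.75 * daily_output_kt                     # the 75% that must be stored

print(daily_output_kt, stored_kt)                      # about 20.6 kt and 15.5 kt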

So to replace just a single 1 Gw(e) fossil or nuclear power plant, the solar plant has to be able to store amounts of energy about the same size as atomic bombs. Good luck with that! Also be aware of the safety implications if you have atomic bomb sized amounts of energy stored in your storage facility, and it fails for some reason - like earthquake or terrorism....

I have no problem with solar providing a minority percentage of our energy. The National Academy of Sciences says that renewables could produce about 20% of our energy. But what do we do for the remaining 80%?

PamW


OKIsItJustMe

(19,938 posts)
29. “The idea of 100% solar is NOT my invention; it's what kristopher keeps touting as the future.”
Fri Jan 20, 2012, 04:39 PM
Jan 2012

Shall we say that Kristopher’s views do not necessarily represent the norm?

Generally speaking, I would say that more advocate “100% Renewable” than “100% Solar.”

http://www.environmentalleader.com/2011/12/30/ithaca-goes-100-renewable/

[font face=Times, Serif]December 30, 2011

[font size=5]Ithaca Goes 100% Renewable[/font]

[font size=3]The City of Ithaca, N.Y., is to purchase 100 percent of its electricity consumption from renewable energy sources through a contract signed with Integrys Energy Services of New York Inc.

Beginning in January 2012, the city will be purchasing Green-e Energy-certified renewable energy certificates for all its electricity. These RECs will offset about 4,896 metric tons of CO2 emissions annually from conventional electricity production. The environmental benefit can be compared to not driving 12,000,000 miles in a car, or planting 1,460 acres of trees, according to the municipality.

The REC purchase was conducted through Municipal Electric and Gas Alliance Inc., a non-profit power aggregation alliance of which Ithaca is a member. MEGA uses its collective bargaining power to leverage competitive energy prices for its members.

Ithaca has sourced 5 percent of its energy from wind farms since 2006 and is targeting a carbon footprint 20 percent smaller than 2001 levels by 2016, and it is not the only U.S. municipality to go 100 percent renewable.

…[/font][/font]



As for the NAS, I believe their report was a tad more nuanced than you suggest:
http://www8.nationalacademies.org/onpinews/newsitem.aspx?RecordID=12619


Read Full Report

[font face=Times, Serif]Date: [font size=5]June 15, 2009[/font]
Contacts: Rebecca Alvania, Media Relations Officer
Luwam Yeibio, Media Relations Assistant
Office of News and Public Information
202-334-2138; e-mail <news@nas.edu>

FOR IMMEDIATE RELEASE

[font size=5]Renewable Energy Could Contribute to U.S. Electricity Needs, Yet Challenges Remain[/font]

[font size=3]…

Technological advancements will continue to be needed to reduce costs and make renewable electricity technologies more efficient, the report says, but even with current technologies, renewable resources could contribute more than they do now. With accelerated deployment, increases in transmission capacity, and other electric-grid improvements, non-hydroelectric renewables could technically contribute up to 10 percent of U.S. electricity by 2020, and 20 percent or more by 2035. However, major scientific advances, and changes to the way we generate, transmit, and use electricity, will be needed before renewables can contribute the majority of U.S. electricity. Necessary improvements include the development of intelligent, two-way electric grids; large-scale and distributed electricity storage; and significantly enhanced, yet cost-effective, long-distance electricity transmission.

Renewable-energy use can have numerous environmental and local impacts. Many of these impacts are positive: Using renewable energy lessens emissions of CO2, sulfur dioxide, nitrogen oxides, and mercury; consumes less water; and causes less water contamination compared with fossil fuel electricity. However, issues of land use and other local impacts (e.g., noise from wind turbines or potential effects on local weather) will become increasingly important as deployment of renewable technologies grows, the report says.

For renewable electricity to make a significant contribution to U.S. electricity generation, it is critical that there is an understanding of the scale of deployment that will be required. Large increases will be needed over current levels of manufacturing, employment, investment, and installation. The U.S. Department of Energy recently stated that for wind energy to contribute 20 percent of U.S. electricity it would require 100,000 wind turbines, $100 billion of additional capital investments and transmission upgrades, and employees to fill 140,000 jobs. The result would be the elimination of more than 800 million metric tons of CO2 emissions from the U.S. electricity sector. According to the committee that wrote the report, the U.S. could feasibly meet this goal by 2030, but the challenge will be great.

Achieving widespread adoption of renewable energy will also require long-term and consistent policies that encourage the generation of renewable electricity, the report adds. In most cases, electricity from renewables is more expensive to produce than electricity from fossil fuels. In the near term, policy incentives, such as the renewable production tax credit, would boost the use of renewable electricity. Continued research and development into renewable electricity generation could lead to more cost-effective technologies. Overall, technological developments and consistent policy will need to be coordinated with manufacturing capacity and access to capital in order to accelerate deployment of renewable electricity.

…[/font][/font]


So, in essence, the NAS study said that we could have 20% or more non-hydroelectric renewable electricity by 2035, but it would be a stretch. It did not say that we are limited to 20% for all eternity.

I keep saying, we cannot change the entire grid overnight (neither to 100% renewable, nor to 100% nuclear fission.)

[font size=4]T.T.T.[/font]

Put up in a place
where it's easy to see
the cryptic admonishment
T.T.T.

When you feel how depressingly
slowly you climb,
it's well to remember that
Things Take Time.

— Piet Hein

PamW

(1,825 posts)
37. You just got my vote...
Sat Jan 21, 2012, 07:42 PM
Jan 2012

Shall we say that Kristopher’s views do not necessarily represent the norm?
============================================

You just got my vote... for understatement of the year.

PamW

CreekDog

(46,192 posts)
23. Fukushima, Chernobyl
Thu Jan 19, 2012, 11:53 AM
Jan 2012

Rant, rave, lecture all you want.

Fukushima, Chernobyl.

Fukushima, Chernobyl.

And many of the plants that you defend are no safer, and are arguably less safe, than Fukushima was.

Dead_Parrot

(14,478 posts)
12. Why doesn't Lovins give a cite for his "1-2%" failure rate for wind and solar?
Tue Jan 17, 2012, 04:01 PM
Jan 2012

It wouldn't be an ass-pull, would it?

And how is this significantly better than the 2.5% for nuclear?

OKIsItJustMe

(19,938 posts)
13. Does he give citations for the other failure rates in this excerpt?
Tue Jan 17, 2012, 07:12 PM
Jan 2012

OK, so let’s do a quick thought experiment here:

Generally speaking, as complexity increases, failure rates tend to increase (simply because there are more pieces to fail.) Moving parts tend to fail more often than solid state.

  • Much of a PV system is solid state.
  • A wind turbine is more mechanical, a turbine, which turns a generator…
  • A nuclear fission plant has a reactor, which makes steam, to power a turbine, which turns a generator…

Which do you think is likely to fail more often?


As for a 1-2% failure rate -vs- a 2.5% failure rate, and whether or not the difference is significant:
  • If we call it a 2% failure rate for a moment, then a 2.5% failure rate would be 25% higher.
  • If we call it a 1.5% failure rate, then the 2.5% failure rate is 66% higher.
  • If we call it a 1% failure rate, then a 2.5% failure rate would be 150% higher.
(Even 25% seems significant to me.)
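
The same comparison as a short sketch, using the 2.5% nuclear figure quoted upthread and the 1-2% range Lovins claims:

nuclear_rate = 0.025
for renewable_rate in (0.02, 0.015, 0.01):
    # relative increase of the nuclear rate over the renewable rate: 25%, ~67%, 150%
    print(renewable_rate, (nuclear_rate - renewable_rate) / renewable_rate)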

Dead_Parrot

(14,478 posts)
14. Yes, the others are cited
Tue Jan 17, 2012, 08:32 PM
Jan 2012

Incidentally, the most recent NERC report (the data cited) gives 2.15% as the current failure rate for nuclear.

Solar is indeed mainly solid-state, but a google for inverter failure rates will show they're not 100% reliable: hard figures are tricky to come by (which is why a cite for the percentage would be nice), but according to this article, "The useful life of a central inverter typically does not exceed ten years", which suggests a 10% failure rate p.a.

Wind figures are also a little thin, but there are a few studies kicking around: This one suggests 1% for small (<500kW) turbines rising to 3-4% for large (>1MW) systems.

1-2% seems a little optimistic, to be honest.

I'm a little leery of applying percents to percents: You can't say that 1% failure is twice as good as 2%, without also saying that 99% reliable is twice as good as 98%.

OKIsItJustMe

(19,938 posts)
16. “which suggests a 10% failure rate p.a. ”
Wed Jan 18, 2012, 12:00 AM
Jan 2012

You are joking. (Right?)

You’re assuming a constant rate of failure over 10 years?

If I tell you that a tire has a practical lifespan of 10,000 miles, would you assume that the odds of failure are the same for any given mile?

Although they can die a premature death from a puncture, tires wear out over time. So do inverters.


I like to say, “The advantage of a single point of failure, is that when a failure occurs, you know where to look first!”

Practically speaking, this means that since the inverter is the weak point of a solar system, you can:

  • Minimize the number of inverters and replace them as “maintenance,” before they fail, or wait for the failure, and deal with it when it happens.
  • Use redundant or parallel inverters to reduce the impact of a failure.

    or…

  • Eliminate inverters (use a DC system!)

Dead_Parrot

(14,478 posts)
17. Over multiple systems of different ages, yes.
Wed Jan 18, 2012, 03:25 AM
Jan 2012

Obviously for a single system the probability follows a curve, but if you're looking at 1,000 installations built over the last 10+ years, you would expect to average 100 failures per year (assuming Parker's figures are accurate - they might be as big an ass-pull as Lovins').
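
A toy simulation of that fleet-averaging point (the 8-12 year wear-out life below is a made-up illustration, not Parker's data): once a fleet of 1,000 systems reaches steady state, replacements settle near 1,000 divided by the mean lifetime, about 100 per year, even though no individual inverter fails at a flat 10% per year.

import random

random.seed(0)

N_SYSTEMS = 1_000

def lifetime_years():
    # hypothetical wear-out model: inverters last 8-12 years (age-dependent, not a flat hazard)
    return random.uniform(8.0, 12.0)

# Fleet built up over a decade; track each system's next failure time.
next_failure = [random.uniform(0, 10) + lifetime_years() for _ in range(N_SYSTEMS)]

# Burn in past the initial transient, replacing on failure without counting.
for i in range(N_SYSTEMS):
    while next_failure[i] < 20:
        next_failure[i] += lifetime_years()

# Count replacements per year over a 20-year window.
failures_per_year = []
for year in range(20, 40):
    count = 0
    for i in range(N_SYSTEMS):
        while next_failure[i] < year + 1:
            next_failure[i] += lifetime_years()
            count += 1
    failures_per_year.append(count)

print(sum(failures_per_year) / len(failures_per_year))   # hovers around 100 per year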

One of the things getting a lot of attention recently is the use of microinverters - basically, having a separate inverter for two or three panels and hooking their outputs together. The advantage is that when one blows, you get a reduction rather than an outage; the disadvantage being, you get a lot more quality time with a screwdriver. Probably great for off-grid applications, though, and would certainly put a dent in the outage figures...

(FWIW, If I were doing an installation, I'd probably go down the DC route... )

 

badtoworse

(5,957 posts)
15. What are the NERC Equivalent Forced Outage Rates and Equivalent Availability Factors for nukes?
Tue Jan 17, 2012, 09:17 PM
Jan 2012

Those are standard definitions that are used throughout the electric power industry and are the only fair way to compare one technology with another from a reliability standpoint.
