For the year 1982, assuming coal contains uranium and thorium concentrations of 1.3 ppm and 3.2 ppm, respectively, each typical coal-fired plant released 5.2 tons of uranium and 12.8 tons of thorium into the environment that year.
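A minimal sketch of the per-plant arithmetic, assuming (as the NCRP figure below implies) that a typical plant burns about 4 million tons of coal per year:

```python
# Per-plant releases from trace concentrations in coal.
# Assumption: a typical plant burns ~4 million tons of coal per year.
COAL_BURNED_TONS = 4_000_000
U_PPM, TH_PPM = 1.3, 3.2  # uranium and thorium concentrations (ppm by mass)

uranium_tons = COAL_BURNED_TONS * U_PPM / 1e6   # -> 5.2 tons
thorium_tons = COAL_BURNED_TONS * TH_PPM / 1e6  # -> 12.8 tons
print(f"uranium: {uranium_tons} tons, thorium: {thorium_tons} tons")
```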
According to the National Council on Radiation Protection and Measurements (NCRP), the average radioactivity per short ton of coal is 17,100 millicuries/4,000,000 tons, or 0.00427 millicuries/ton. This figure can be used to calculate the average expected radioactivity release from coal combustion. For 1982, the total release of radioactivity from the 154 typical coal plants in the United States, which together burned about 616 million tons of coal, was therefore 2,630,230 millicuries.
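The same rate reproduces the 1982 total, up to rounding of the per-ton figure:

```python
# 1982 total radioactivity release from U.S. coal combustion.
MCI_PER_TON = 17_100 / 4_000_000  # NCRP rate: ~0.004275 mCi per ton of coal
PLANTS = 154                      # typical U.S. coal plants in 1982
TONS_PER_PLANT = 4_000_000        # tons of coal burned per plant per year

total_mci = PLANTS * TONS_PER_PLANT * MCI_PER_TON
print(f"{total_mci:,.0f} millicuries")  # ~2,633,400; the article's quoted
                                        # 2,630,230 reflects the rounded rate
```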
Thus, by summing annual U.S. coal combustion from 1937 (440 million tons) through 1987 (661 million tons) and on to a projected 2516 million tons in the year 2040, the total expected U.S. radioactivity release to the environment by 2040 can be determined: the expected combustion of 111,716 million tons of coal over that period would release 477,027,320 millicuries in the United States.
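Applied to the cumulative tonnage, the rounded per-ton rate yields the article's figure exactly:

```python
# Cumulative expected U.S. release, 1937 through 2040.
MCI_PER_TON = 0.00427        # rounded NCRP rate, mCi per ton of coal
TOTAL_COAL_TONS = 111_716e6  # 111,716 million tons burned over the period

total_mci = TOTAL_COAL_TONS * MCI_PER_TON
print(f"{total_mci:,.0f} millicuries")  # -> 477,027,320
```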
For comparison, according to NCRP Reports No. 92 and No. 95, population exposure from operation of 1000-MWe nuclear and coal-fired power plants amounts to 490 person-rem/year for coal plants and 4.8 person-rem/year for nuclear plants. Thus, the population effective dose equivalent from coal plants is about 100 times that from nuclear plants.
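The "100 times" figure is simply the rounded ratio of the two NCRP dose estimates:

```python
# Ratio of population dose from coal vs. nuclear plants (NCRP 92/95 figures).
coal_dose, nuclear_dose = 490, 4.8  # person-rem/year per 1000-MWe plant
print(f"{coal_dose / nuclear_dose:.0f}x")  # -> 102, quoted as ~100 times
```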
How does the amount of nuclear material released by coal combustion compare to the amount consumed as fuel by the U.S. nuclear power industry? According to 1982 figures, 111 American nuclear plants consumed about 540 tons of nuclear fuel, generating almost 1.1 × 10^12 kWh of electricity. During the same year, about 801 tons of uranium alone were released from American coal-fired plants. Add 1971 tons of thorium, and the release of nuclear components from coal combustion far exceeds the entire U.S. consumption of nuclear fuels.
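Scaling the per-plant releases to all 154 plants makes the comparison explicit:

```python
# 1982 comparison: nuclear material released by coal plants vs. fuel consumed.
PLANTS = 154
uranium_released = PLANTS * 5.2   # -> ~801 tons of uranium
thorium_released = PLANTS * 12.8  # -> ~1971 tons of thorium
nuclear_fuel_consumed = 540       # tons, 111 U.S. nuclear plants in 1982

total_released = uranium_released + thorium_released  # -> ~2772 tons
print(f"released by coal plants: {total_released:.0f} tons; "
      f"consumed as nuclear fuel: {nuclear_fuel_consumed} tons")
```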
In short, coal plants release more nuclear material each year than the entire U.S. nuclear power industry consumes as fuel.