We provide, and perform a risk-theoretic statistical analysis of, a dataset on nuclear incidents and accidents that is 75 percent larger than the previous best dataset, comparing three measures of severity: INES (the International Nuclear Event Scale), radiation released, and damage in US dollar losses. The annual rate, per plant, of accidents with damage exceeding 20 million US$ decreased from the 1950s onward, then dropped significantly after Chernobyl (April 1986). The rate is now roughly stable at 0.002 to 0.003, i.e., around one event per year across the current fleet. The distribution of damage values changed after Three Mile Island (TMI; March 1979): moderate damages were suppressed, but the tail became very heavy, and is well described by a Pareto distribution with tail index 0.55. Further, there is a runaway disaster regime, associated with the dragon-king phenomenon, that amplifies the risk of extreme damage. Indeed, the damage of the largest event (Fukushima; March 2011) equals 60 percent of the total damage of all 174 accidents in our database since 1946. In terms of dollar losses, we compute a 50% chance that (i) a Fukushima event (or larger) occurs in the next 50 years, (ii) a Chernobyl event (or larger) occurs in the next 27 years, and (iii) a TMI event (or larger) occurs in the next 10 years. Finally, we find that the INES scale is inconsistent: to be consistent with damage, the Fukushima disaster would need an INES level of 11, rather than the scale's maximum of 7.
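To make the headline numbers concrete, the following is a minimal sketch (not from the paper) of the arithmetic they imply. It assumes accident arrivals follow a homogeneous Poisson process, so that a 50% chance of at least one exceedance within T years implies an annual exceedance rate of ln(2)/T, and it assumes the Pareto tail takes the standard form P(X >= x) = (x/x_min)^(-0.55) above the 20 million US$ threshold; the horizons (50, 27, and 10 years), the tail index, and the threshold are the only inputs taken from the text.

```python
import math

# Implied annual exceedance rates, assuming a homogeneous Poisson process:
# P(at least one event >= x within T years) = 1 - exp(-lambda_x * T).
# Setting this probability to 0.5 and solving gives lambda_x = ln(2) / T.
horizons_years = {
    "Fukushima-scale or larger": 50,  # from the abstract
    "Chernobyl-scale or larger": 27,
    "TMI-scale or larger": 10,
}
for label, T in horizons_years.items():
    rate = math.log(2) / T  # implied events per year at this severity
    print(f"{label}: ~{rate:.4f}/yr (return period ~{1 / rate:.0f} yr)")

# A Pareto tail with index alpha = 0.55 < 1 has a diverging mean, which is
# consistent with a single event (Fukushima) dominating cumulative damage.
alpha = 0.55
x_min = 20e6  # US$, the abstract's damage threshold
for x in (1e9, 1e10, 1e11):
    tail = (x / x_min) ** (-alpha)  # P(damage >= x | damage >= x_min)
    print(f"P(damage >= ${x:.0e}) = {tail:.3f} of above-threshold events")
```

Note that a tail index below 1 implies an infinite theoretical mean loss, in line with the observation above that the single largest event accounts for 60 percent of total damage in the database.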