Comments: THIS Is Where The Math Went

Muy happy.

Posted by Amandasaurus at April 26, 2011 11:09 PM

I'll be the first to say I can't make hide nor hair of that math. None of my schools offered calculus. But I saw this: "Perhaps the linear dose relationship is correct for low exposures . . ."

Can you define "low exposures"?

see also:

"Noble gas releases from normal operations were estimated to be 6700 Ci/yr, primarily 133Xe from reactor building purges, and 0.01 Ci/yr of 131I. The site (i.e., TMI-1 and 2 combined) is allowed by technical specification to release as many as 220000 Ci/yr of noble gases (when calculated on a 133Xe dose equivalence basis) and 0.05 Ci/calendar quarter of 131I.

Projected release rates of radioactive material in liquid effluents were approximately 0.24 Ci/yr, excluding tritium and dissolved gases. TMI-1 and 2 combined are allowed by technical specification to release as many as 10 Ci/yr, excluding tritium and dissolved gases. The tritium release was estimated to be 550 Ci/yr."

-Three Mile Island
-REPORT TO THE COMMISSIONERS AND TO THE PUBLIC
-Volume II part 2
-(The Rogovin report)
-page 348 (pdf page 52)
http://www.threemileisland.org/resource/index.php?aid=00027

Posted by Some guy on the innernet at April 27, 2011 12:13 AM

Wow, a month off--that sounds almost French. And Chazelle was a math professor too--that's quite a coincidence at such a tiny site! Yet the French love nukes, I thought, nor do I hear any Bach playing in the background nor see anything about torture, apart once again from the sick plea for experimentation upon physicists. What is ze meaning of all zees?

Enjoy the rest, Messeur Aaron.

Posted by N E at April 27, 2011 06:55 AM

@Some guy - I will give you what sounds like a non-answer. The EPA divides health effects into stochastic (low) and non-stochastic (high). The math here applies to concentrations low enough that EPA would consider them to cause stochastic health effects.

The point of these recent posts, so far, is that we judge the safety of stochastic exposures using knowledge about non-stochastic health effects - basically, we don't do the math correctly.

To give a firm answer to your question requires another layer of mathematics which I haven't yet worked out. But, based upon what I've done so far, I would not be afraid to stand up in a room full of scientists and say that the linear extrapolation to low doses cannot possibly be correct.

Maybe by June I'll have the answer. It clearly relates to physics that I knew pretty well at one time. When I manage to piece it together, I'll post it and do my best to explain it clearly.

Posted by Aaron Datesman at April 27, 2011 09:26 AM

I think I'll go back to the equation I wrote in the previous thread, though I'll try to use your notation to the extent possible.

In your notation, Nv is the number of interaction volumes and Tau is the number of interaction times in a given time period.

I'll call Pc the chance that a cancer-inducing event occurs in one interaction volume in an interaction time (one second or whatever is biologically appropriate, which we don't know).

Then the chance of no cancer inducing events anywhere in the body in the given time period is

(1 - Pc) ^ (Nv Tau)

and the chance of one or more is

1 - (1-Pc)^(Nv Tau)

So again, for values where Nv*Tau*Pc is much less than one, this is going to reduce to

Nv Tau Pc.
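As a quick numerical sanity check of that approximation (a sketch only; the values of Nv, Tau, and Pc below are hypothetical placeholders, since we don't know the real ones):

```python
# Sketch: verify that 1 - (1 - Pc)**(Nv * Tau) is well approximated by
# Nv * Tau * Pc when the product is much less than one.
# All three values are hypothetical placeholders.
Nv = 10**4    # hypothetical number of interaction volumes (scaled down)
Tau = 10**3   # hypothetical number of interaction times in the period
Pc = 1e-10    # hypothetical per-volume, per-interval cancer probability

exact = 1 - (1 - Pc) ** (Nv * Tau)
approx = Nv * Tau * Pc

print(exact, approx)  # agree to well under 1% when Nv*Tau*Pc ~ 1e-3
```

The linear formula slightly overestimates, since it ignores the (tiny) chance of two or more events in the same period.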

Now Pc would be complicated. There's a chance that one decay could do it, but two decays close together in an interaction time would be much more likely to do it, and three more likely still, etc.

One decay doing it gives you a term proportional to the dose rate, two gives you a term proportional to the dose rate squared, and so forth. The term for n decays is proportional to the dose rate to the n, and not identical to it, because it's not guaranteed that n decays will induce cancer. It might kill the cell instead, for instance. So you end up with an expression at low dose rates which is

Probability of getting cancer =

Nv *Tau * (proportionality constant 1 times dose rate + proportionality constant 2 times dose rate squared + ...)

So at low dose rates haven't we just rediscovered the linear-quadratic model that BEIR talks about?

Posted by Donald Johnson at April 27, 2011 12:50 PM

@DJ - That's smart and a good reading. Approaching the problem from that angle is how I reached the conclusion I expressed.

I agree with what you wrote as long as Pc is really small, and it does reduce to the BEIR expression. This is a logical and correct result if you believe that there's no way a low background could result in a large dose in any interaction volume.

But the statistics - which must be applied if the background decay rate is low - say something different. What they appear to say is this: almost every interaction volume in the body will experience simultaneous disintegrations up to Nt-1 per second.

Using the example in the previous post, that's 100x the average rate when Nbar is 0.1/s and 50x when Nbar is 0.25/s. I think it's wrong to assume that Pc is small in these situations. If Pc is not small, you get something very different from the BEIR result.

I am coming to the opinion that low doses are cleaned up by repair mechanisms in the cell, as we are told (although I object to the practice of extrapolating this observation from the cellular level up to an entire organism). If this is true, then most or all of the danger lies in high-dose events, which overwhelm the cellular machinery.

Had I not been thinking about this a lot for like two months, it probably would not have occurred to me that high-dose events can result from low average decay rates. But they absolutely can. The statistics in this post just show how.
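To put rough numbers on the burst statistics (a sketch only; the mean rate Nbar and the number of volume-intervals are hypothetical), the Poisson distribution says how often n simultaneous decays should turn up somewhere:

```python
import math

# Sketch: Poisson probability of exactly n decays in one interaction
# interval at a low mean rate, and the expected number of
# volume-intervals showing that count. Nbar and trials are hypothetical.
def poisson_pmf(n, mean):
    return math.exp(-mean) * mean ** n / math.factorial(n)

Nbar = 0.1        # hypothetical mean decays per volume per interval
trials = 10**12   # hypothetical Nv * Tau, volumes times intervals

for n in range(1, 8):
    p = poisson_pmf(n, Nbar)
    print(n, p, trials * p)
```

The counts fall off very fast with n, so whether the big-burst intervals amount to "almost every volume" or only "at least one volume" depends entirely on how trials compares to 1/p.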

I really appreciate your help teasing out the technical details in the posts I have written. Thank you.

Posted by Aaron Datesman at April 27, 2011 02:02 PM

@Aaron: You infer from (*) that "EVERY interaction volume is highly likely to experience at least one interval with a decay rate of Ntau per sec." This is incorrect. The inference is that AT LEAST ONE interaction volume blah blah. Very different. I see you repeat that same mistake in your reply to DJ.

A relaxed version of DJ's argument does NOT require Pc to be small. The product Nv Tau Pc is simply the expected number of volumes with a cancer event in a year. Now if Pc is calculated by assigning a risk to every number of decays generated by a Poisson, then DJ's conclusion is ALWAYS correct: for 10 decays, multiplying the rate by 2 gives you a 1000-fold increase in that coefficient. (2^10 to be precise). This was precisely my point in an earlier comment. (Just looking at Nv Tau Pc as a mean also means you don't need all those inter-volume independence assumptions.)

LNT theory seems pretty safe -- at least from a statistical angle.
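That 2^10 figure is easy to check directly (a sketch; the value of lam is an arbitrary small mean, and the exact factor is 2^10 times exp(-lam), just under 1024):

```python
import math

# Sketch: for a Poisson with mean lam, doubling the rate multiplies
# the probability of exactly 10 decays by 2**10 * exp(-lam), which is
# just under 1024 for small lam. The value of lam is hypothetical.
def poisson_pmf(n, mean):
    return math.exp(-mean) * mean ** n / math.factorial(n)

lam = 0.01
n = 10
ratio = poisson_pmf(n, 2 * lam) / poisson_pmf(n, lam)
print(ratio)  # ~1013.8, i.e. 1024 * exp(-0.01)
```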

Posted by bobs at April 27, 2011 03:10 PM

@bobs -

This is astute, and I understood the comment before. It's not wrong, but the thing is kind of complex, and I don't think your statement captures the complexity.

I think this is what the math says:

There aren't any interaction volumes with Nt+1 decays.

There are a few interaction volumes with Nt decays.

I think you agree with this, right?

But then, consider - isn't P(N) a really fast function of N?

Doesn't it follow that there are a lot of interaction volumes with Nt-1 decays?

And really a lot with Nt-2 decays?

Since Nt>>Nbar, these are instantaneous doses far, far larger than the dose corresponding to the average decay rate.
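If the counts really are Poisson, the step-down ratios can be computed exactly (a sketch, reusing the Nbar = 0.1 and Nt = 10 figures from the earlier example):

```python
import math

# Sketch: for a Poisson with mean Nbar, P(n-1)/P(n) = n / Nbar, so
# each step down from Nt multiplies the probability by roughly
# Nt / Nbar. With Nbar = 0.1 and Nt = 10, each step is a factor of ~100.
def poisson_pmf(n, mean):
    return math.exp(-mean) * mean ** n / math.factorial(n)

Nbar = 0.1
Nt = 10
for n in (Nt, Nt - 1, Nt - 2):
    print(n, poisson_pmf(n, Nbar))

step = poisson_pmf(Nt - 1, Nbar) / poisson_pmf(Nt, Nbar)
print(step)  # Nt / Nbar = 100
```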

You are free to disagree, of course. Thanks for giving the argument a hearing.

Posted by Aaron Datesman at April 27, 2011 03:25 PM

Huh? First, you claim that every volume is likely to get Nt decays at some point in a year. That's false. Then you say the same thing to DJ but with Nt-1. That's still false, unless in your world 1 out of 10,000 is called "likely." Next you say Nt-2. Hey, since Nt is 10, if you keep playing that game eventually you'll be right, since the expected number of decays per volume is at least 10,000. But we're not doing poetry here. This is counting balls and bins, where statements have meaning and are either true or false. So not only do you keep adjusting your numbers when your claims are shown to be false, but more seriously, your approach is wrong. If you want to prove any statement of the form "Every volume is likely to..." your proof must factor out Nv. You should not be taking powers of anything with Nv in the exponent. Just fix your volume once and for all and see what happens over time. Although I do notice that in previous posts, it was all about "there exists a volume." Why are we now talking about "for almost all..."? Are you trying to figure it out as you go and using us as a debugging device?

And let's keep in mind that this has no bearing whatsoever on your supralinearity claim, which is completely wrong but for completely different reasons. Again I'll summarize your mistake in one sentence: you fixate on the concavity of the standard deviation as a function of the mean, but that has no relevance to the truth of the linear model.

OK, this time I am done!

Posted by bobs at April 27, 2011 06:33 PM

Aaron--I'm confused by your position, but I'll think about it when I get the chance. I'll be a little busy for a while, and you apparently may be going away soon.

On a more general note, though, I think that if I were going to construct simple models for how radiation induces cancer, first I'd try to become pretty well versed in what the radiobiologists say. I'd read the textbooks and some of the papers. Those guys may know something--stranger things have happened. But if I just wanted to forge ahead now, without knowing much of anything, I'd suggest a model like this--

Take one cell and assume it either dies or goes cancerous when it reaches a damage level D. (Assigning a probability to each of those possibilities is beyond my pay grade--I'd just call it P1 and P2.) Also assume that until it reaches that level it can repair the damage at rate R. Then assume a Poisson process supplies the damage. Set up the stochastic differential equation (as I assume it would be, but I never knew much about them) and solve. You'd be trying to find out how long it takes to reach D on average.

It's been a long time, but I think engineers and physicists do solve analogous problems. Maybe somebody cares about the heating caused by shot noise in a very tiny circuit, for instance. The temperature increase would be analogous to "damage" and the cooling off process would be analogous to "repair". So someone might have done the math already.
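A brute-force version of that model (a Monte Carlo sketch rather than the stochastic differential equation; every parameter value here is made up for illustration) might look like:

```python
import math
import random

def sample_poisson(lam, rng):
    # Knuth's multiplication method; fine for the small rates used here
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def time_to_threshold(lam, R, D, rng, max_steps=10**6):
    # Steps until accumulated damage first reaches D, with Poisson
    # damage arrivals at mean lam per step and repair R per step.
    damage = 0.0
    for t in range(1, max_steps + 1):
        damage += sample_poisson(lam, rng)
        if damage >= D:
            return t
        damage = max(0.0, damage - R)
    return max_steps

rng = random.Random(0)
# hypothetical parameters: hit rate lam, repair rate R, damage threshold D
times = [time_to_threshold(lam=0.1, R=0.5, D=3, rng=rng) for _ in range(200)]
print(sum(times) / len(times))  # rough mean first-passage time
```

With repair faster than the average arrival rate, the threshold is essentially only reached by bursts of several hits in quick succession, which is the regime the whole thread is arguing about.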

Posted by Donald Johnson at April 27, 2011 09:37 PM