Saturday, June 4, 2011

This Week's Worry: Do Cellphones Cause Cancer?

The news this week was all a-twitter about the World Health Organization declaring the electromagnetic fields from a cell phone a “potential” cancer agent, classifying it alongside diesel fumes, dry cleaning fluid, talcum powder and automobile exhaust.  Like all radios, cellphones emit non-ionizing radiation, totally unlike atomic sources.  It is simply radio, at a frequency (coincidentally enough) fairly close to where microwave ovens operate.  If cellphones are dangerous, all radios are. 

Fair enough: is radio dangerous?  Certainly some of it is.  We all know for sure that if we put Mr. Hamster in the microwave oven, Bad Things happen.  But is that transmitter down the road safe?  How do we know?

Let me start by pointing out my biases.  I am an electrical engineer who specializes in radio frequency design.  I'm also a radio amateur, a ham radio operator.  Between the two, I work with, and am physically immersed in, radio fields most of the time.  I consider myself a fairly careful guy, and if I thought it was dangerous or would cause a long-term problem like cancer, I'd get out of it and do something else.  Years ago, though, I worked with a guy who would walk far out of his way to avoid a piece of test equipment I was using, so that he could go outside to smoke.  I guarantee you that the nanowatts of energy he might have gotten from that equipment were much, much less hazardous than what he was getting from his cigarettes! 

Electromagnetic radiation exists in a spectrum whose frequency extends from below audio frequencies (less than a cycle per second) through radio, through light, x-rays, gamma rays and beyond.  All common broadcast radio, cable TV, satellite TV, data, Bluetooth, WiFi, 3G, 4G and other services use radio waves.  It has been pointed out that we live in a constant electromagnetic “smog” of all those EM sources (and more).
(image: xkcd's electromagnetic spectrum chart)

I find that many people don't understand the difference between the two major types of radiation: ionizing and non-ionizing.  Ionizing radiation is energetic enough to, as the name says, knock electrons off an atom and form an ion.  A photon's energy is tied directly to its wavelength (E = hc/λ: the shorter the wavelength, the higher the energy), and all ionizing radiation is extremely short in wavelength: far ultraviolet, x-rays, gamma rays and beyond.  For example, x-ray wavelengths are centered around 1 nanometer (a billionth of a meter) and gamma rays around 10 picometers (trillionths of a meter), while cellphone radiation is centered around 20 centimeters: 200 million times longer in wavelength, and that much lower in photon energy, than even the less energetic x-rays.  X-rays and gamma rays are genuinely dangerous, so the point isn't that electromagnetic energy can't be dangerous; the point is that the longer the wavelength, the less energetic the wave, and radio-wavelength photons simply don't carry enough energy to break chemical bonds or knock electrons loose.
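
To put rough numbers on that, here is a minimal back-of-the-envelope sketch; the wavelengths are just the illustrative figures from the paragraph above, and the ~4 eV "bond energy" is a round number for a typical chemical bond, not anything from the WHO announcement.

```python
# Photon energy vs. wavelength: E = h*c / lambda, expressed in electron-volts.
# Wavelengths are the rough figures from the text; the ~4 eV "bond energy"
# is just a round number for a typical chemical bond.

H = 6.626e-34     # Planck's constant, J*s
C = 2.998e8       # speed of light, m/s
EV = 1.602e-19    # joules per electron-volt

def photon_energy_ev(wavelength_m):
    return H * C / wavelength_m / EV

sources = {
    "cellphone (~20 cm)": 0.20,
    "x-ray (~1 nm)":      1e-9,
    "gamma ray (~10 pm)": 10e-12,
}

BOND_ENERGY_EV = 4.0   # rough energy needed to break a chemical bond

for name, lam in sources.items():
    e_ev = photon_energy_ev(lam)
    print(f"{name:20s} {e_ev:10.2e} eV  ({e_ev / BOND_ENERGY_EV:.1e} of a ~4 eV bond)")
```

A cellphone photon carries a few millionths of an electron-volt, roughly a million times too little to disturb a chemical bond, while x-ray and gamma photons carry thousands to hundreds of thousands of electron-volts apiece.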

The only absolutely proven effects of radio frequency or power line emissions (besides electric shock from contact) are heat related, and the National Cancer Institute is in agreement on that.  The classic heat effect is cataracts: the lens of the eye has essentially no blood circulation (nutrients reach it by diffusion), so if your head is irradiated by a strong enough radio field, the lens can't shed the heat, cooks, and turns opaque.  If you were exposed to a radio field on the order of the one inside a microwave oven, you would cook as easily as any other hunk of meat.  Reduce the power in the field, though, and there are no demonstrated, medically proven effects from radio fields besides those thermal effects.  Studies show up from time to time that offer glimpses of other effects, even though there is no known mechanism that could explain them.  Follow-up studies usually don't reproduce these effects, leading to the tentative conclusion that the effect was some sort of statistical noise, or was really correlated with something else.  These studies bother some people, but they don't bother me.  Why?  Simply put: if you have many people looking for an effect and it behaves like this, either there is no link, it's a very weak link, or the link is really to something else that isn't present in every study. 

For example, over 20 years ago I read of a study claiming that black-and-white televisions were correlated with leukemia.  Black-and-white television, not color?  Even 20 years ago, the only people still using black-and-white TVs were pretty low income.  Poverty, it turns out, is associated with all sorts of negative health impacts.  It's easier to come up with an explanation of how poverty could lead to leukemia than of how black-and-white TV sets could.  You will see the same argument about proximity to power lines causing leukemia or other problems: could it be that property values under the power lines are low, so mainly lower-income people live there?  Again, it's easier to see how poverty could cause health problems than how 60 Hz fields could. 

You regularly read about people who claim to be sensitive to EM fields in general, and who claim health effects of all kinds.  The thing is, as I mentioned above, everyone in modern society is immersed in a soup of radio signals from AM broadcast up through microwaves; they're in the near field of the AC power line, exposed to TV transmitters, radars… and don't forget that the sun bombards us with radio signals, too.  The strengths vary widely, but everyone in the western world is within one wavelength of an AC power line (and that wavelength is five million meters, about 3,100 miles, for US 60 Hz power). 
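
That one-wavelength figure is just the speed of light divided by the line frequency; a quick sanity check (60 Hz is the US grid frequency, 50 Hz most other places):

```python
# Wavelength of power-line fields: lambda = c / f.
C = 2.998e8  # speed of light, m/s

for f_hz in (60.0, 50.0):
    lam_m = C / f_hz
    print(f"{f_hz:.0f} Hz -> {lam_m / 1000:.0f} km ({lam_m / 1609.34:.0f} miles)")
```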

There’s a saying in toxicology that the dose makes the poison.  The same concept applies here: stick your head in a running microwave oven and harm is guaranteed, but at low doses I see no evidence of harm from radio fields (indeed, radio fields are used therapeutically, as diathermy in physical therapy and as local heaters for treating prostate cancer).  In the case of a nearby transmitter, the dose is moderated by distance, and the power density of a radio field drops off very rapidly with distance because the energy spreads out in all directions.  Think of a candle flame, radiating light in all directions and lighting the inside of a sphere centered on it.  Double the distance to the candle and the same light spreads over four times the area, so the intensity drops to one quarter.  This is called an inverse-square law.  If you're far enough from the antenna that it looks like a point, the field drops off that way; if the antenna looks like a line (say, a power line seen from a few hundred yards), the field drops off in proportion to distance, not distance squared. 

Let’s consider an example: an FM broadcast transmitter, one of the big ones, at 50 kW.  Now 50 kW is a lot of power, but because of the inverse-square law you have to be very close to the antenna to be hurt by it.  Imagine standing at the base of the tower and pretend the antenna can radiate you at full strength down there (it can't; it's mounted far up the tower and aimed toward the horizon).  To get a feel for how fast the field thins out: an ideal isotropic receiving antenna 200 feet away would capture only about 760 milliwatts of that 50 kilowatts.  The number the exposure standards actually care about, though, is power density.  The standards organization that made the standards for radio exposure (based on heating) says 200 microwatts per square centimeter should be the limit at this frequency.  A sphere 200 feet in radius has a surface area of 502.7 thousand square feet, or 467 million square centimeters; spread 50 kW over that and you get roughly 107 microwatts per square centimeter.  So even this deliberately worst-case picture sits about a factor of two under the limit, and since a real FM antenna is hundreds of feet overhead and beaming at the horizon, the levels on the ground are far lower still.  Sounds pretty safe!

The inverse-square relationship means that if you cut the distance to the antenna in half, the power density goes up by a factor of four.  At 100 feet in that worst-case picture you'd be around 430 microwatts per square centimeter, over the general-public limit but still under the occupational limit of 1 milliwatt per square centimeter; at 50 feet, about 1.7 milliwatts per square centimeter.  Put another way, you'd have to get within a couple hundred feet of the radiating element itself, and in the real world that element is hundreds of feet up the tower.  You almost have to be holding the antenna to be in trouble (almost…). 
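
If you want to play with the numbers, here is a minimal sketch of that spreading calculation.  It assumes an isotropic 50 kW source radiating at you from ground level (which, again, overstates real ground-level exposure) and uses the 200 µW/cm² public figure from above along with the 1 mW/cm² occupational figure.

```python
# Inverse-square power density around an assumed isotropic 50 kW source,
# compared with heating-based exposure limits in the FM band
# (0.2 mW/cm^2 general public, 1.0 mW/cm^2 occupational). Worst-case sketch only.
import math

P_TX_W = 50_000.0          # transmitter power, watts
FT_TO_M = 0.3048

PUBLIC_LIMIT_MW_CM2 = 0.2
OCCUPATIONAL_LIMIT_MW_CM2 = 1.0

def power_density_mw_per_cm2(p_watts, dist_m):
    """Free-space power density S = P / (4*pi*r^2), converted to mW/cm^2."""
    s_w_per_m2 = p_watts / (4.0 * math.pi * dist_m ** 2)
    return s_w_per_m2 * 1000.0 / 10_000.0   # W/m^2 -> mW/cm^2

def distance_for_density_m(p_watts, s_mw_cm2):
    """Distance at which the density falls to a given level."""
    s_w_m2 = s_mw_cm2 * 10_000.0 / 1000.0
    return math.sqrt(p_watts / (4.0 * math.pi * s_w_m2))

for feet in (200, 100, 50):
    s = power_density_mw_per_cm2(P_TX_W, feet * FT_TO_M)
    print(f"{feet:4d} ft: {s * 1000:7.0f} uW/cm^2")

for name, limit in (("public", PUBLIC_LIMIT_MW_CM2),
                    ("occupational", OCCUPATIONAL_LIMIT_MW_CM2)):
    d = distance_for_density_m(P_TX_W, limit)
    print(f"{name} limit reached at ~{d / FT_TO_M:.0f} ft")
```

It reports roughly 107, 430, and 1,700 µW/cm² at 200, 100, and 50 feet, with the public and occupational limits crossed at roughly 145 and 65 feet from the source.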

The story is a bit more complicated than that.  The amount of power you absorb depends on how big you are compared to the wavelength: structures near resonance (roughly a quarter to a half wavelength in size, depending on whether they're grounded) absorb quite efficiently, while much smaller or much larger ones absorb less.  There are two reasons I chose an FM transmitter for this thought experiment.  First, a standing adult is close to resonant size at VHF frequencies like the FM band; sit down and you change your size in wavelengths and absorb less power.  Second, the FM band falls where the FCC exposure limits are at their lowest, precisely because of that body resonance.  By the time you get up to cellular phone frequencies, the allowed levels are a few times higher. 
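
A quick way to see where the body sits relative to the wavelength at different services; the 1.75 m body height is just an assumed typical figure, and the frequencies are representative picks:

```python
# How tall is a typical person, measured in wavelengths, at various services?
# The 1.75 m body height is an assumption for illustration.
C = 2.998e8        # speed of light, m/s
BODY_HEIGHT_M = 1.75

services = {
    "AM broadcast (1 MHz)":   1e6,
    "FM broadcast (100 MHz)": 100e6,
    "cellular (900 MHz)":     900e6,
    "PCS / 3G (1.9 GHz)":     1.9e9,
}

for name, f_hz in services.items():
    wavelength = C / f_hz
    print(f"{name:24s} lambda = {wavelength:8.3f} m, "
          f"body = {BODY_HEIGHT_M / wavelength:6.2f} wavelengths")
```

At AM frequencies the body is electrically tiny, at FM it is in the resonant neighborhood of a half wavelength, and at cellular frequencies it is many wavelengths long, so whole-body resonant absorption is no longer a factor.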

More complications: as frequency goes up, radio penetrates conductors less and less deeply, an effect known as the skin effect.  In a wire, the energy travels in a thin outer skin whose thickness depends on the frequency, which is why hollow tubes carry radio currents as well as solid wire does.  The effects also depend on how much cooling the body part being irradiated gets.  You don't think of your body as a liquid-cooled fuel cell, but it is: blood circulation carries away local heating.  A hand gets a lot of circulation, and the body's trunk gets even more.  And getting efficient power transfer of radio signals into anything is far from trivial; being exposed to a field is very different from being affected by it.  
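
On the conductor side of that, the classical skin-depth formula, δ = sqrt(ρ / (π f μ)), shows how thin that conducting skin gets as frequency rises.  Here is a small sketch using textbook values for copper:

```python
# Classical skin depth in a conductor: delta = sqrt(rho / (pi * f * mu)).
# Copper values assumed; shows why RF rides on the surface of a conductor
# and why a hollow tube works as well as solid wire.
import math

RHO_COPPER = 1.68e-8        # resistivity of copper, ohm*m
MU0 = 4.0e-7 * math.pi      # permeability of free space, H/m

def skin_depth_m(f_hz, rho=RHO_COPPER, mu=MU0):
    return math.sqrt(rho / (math.pi * f_hz * mu))

for f in (60.0, 1e6, 100e6, 1.9e9):
    d = skin_depth_m(f)
    print(f"{f:12.3e} Hz -> skin depth {d * 1e6:10.2f} micrometers")
```

At 60 Hz the skin depth in copper is about 8 mm; by FM and cellular frequencies it is down to microns, so essentially all the current rides on the surface.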

You can do the same sort of path-loss and power calculations for TV stations, AM radio stations, cell phone towers and so on, and you'll find that none of them will expose anyone to more than the allowed levels, usually by a factor of thousands.  No surprise: they were designed to meet the regulations that limit exposure! 

So what about your phone?  What about the video of the iPhone power detector in the girl's hand?  The effect she's demonstrating is this: by wrapping her hand tightly around the antenna she absorbs signal (which is why reception gets bad), less of her signal reaches the base station, and the base station commands her phone to increase power.  That's exactly what the system is designed to do. 


The maximum power out of a handset is less than 2 watts.  Exactly how much of that you're exposed to, and how much is absorbed, from a phone held to your ear is a computational nightmare, and I'm convinced that computational difficulty is one reason the “science isn't settled”.  If anyone could point to consistent, repeatable calculations and measurements showing that the power transferred to your ear and brain is below the recommended limits, the hype might have died down by now.  Remember, you can't absorb all of the power radiating out of the phone; if you absorbed even most of it, not enough would reach the other end and the call would drop. 

When you see people ask, “Does your ear feel warmer after you use your phone?”, consider that the phone itself gets warmer because the power amplifier isn't 100% efficient; it's more like 20% or less.  That means that to get 1 watt out, you put at least 5 watts into the final amplifier, and the other 4 watts warm the phone, along with anything in contact with it. 
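
That arithmetic fits in a couple of lines; the 20% figure is the one from above, and the 40% and 60% cases are just hypothetical comparisons:

```python
# Heat dissipated in a power amplifier: DC input = RF output / efficiency;
# everything that doesn't leave as RF becomes heat.
def amplifier_heat_w(rf_out_w, efficiency):
    dc_in_w = rf_out_w / efficiency
    return dc_in_w - rf_out_w

for eff in (0.20, 0.40, 0.60):
    print(f"1 W out at {eff:.0%} efficiency -> {amplifier_heat_w(1.0, eff):.2f} W of heat")
```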

During normal operation, the output power of the phone varies depending on how good the signal path is from your location.  In an elevator or other metallic box, the phone is more likely to run at full power; in a place with a better radio path, the power will be lower.  The accepted value for how much power you'll be exposed to is around 1 mW per square centimeter, which is approximately the limit.  When I first started in radio design, the limit was 10 mW per square centimeter; lowering it wasn't driven by some massive new body of studies, it was done largely to bring US limits into line with other countries' limits.   

With cell phones as widespread as they are, if there were any sort of clear link between their use and cancer, it would have been found by now.  Instead, what you get is the “we need more research” chorus.  The simple fact is that the only known effect caused by radio fields is heating.  Any researcher who wants to say otherwise needs to show a plausible mechanism for the injury.  I believe this should have been put to bed years ago.  I don't believe there is any health risk from radio other than the thermal effects.  Don't stick your head in the microwave, don't stand in front of a search radar, and don't go grabbing the antennas on operating transmitters putting out more than 10 watts or so.  Other than that, don't worry.  

The most dangerous thing about a cellphone is talking on it while driving.  I say that as someone who, while riding a bicycle, was run down by a small pickup truck whose driver was distracted by their phone.

2 comments:

  1. I've had that XKCD "spectrum" hanging on the wall of my office for years now. Nobody seems to think it's as funny as I do...

    (present company excluded of course)

  2. It's great - as is most of his work. Hadn't thought of making it a wall hanging, though. Need to fire up that printer.
