The Biological Effects of Weak Electromagnetic Fields: Problems and Solutions
Andrew Goldsworthy, March 2012
I am a retired lecturer from Imperial College London, which is among the top three UK universities after Oxford and Cambridge and is renowned for its expertise in electrical engineering and health matters. I spent many years studying calcium metabolism in living cells and also how cells, tissues and organisms are affected by electrical and electromagnetic fields.
In this article, I will try to explain in lay-person’s language how weak electromagnetic fields from cell phones, cordless phones and WiFi can have serious effects on human and animal health. These include damage to glands resulting in obesity and related disorders, chronic fatigue, autism, increases in allergies and multiple chemical sensitivities, early dementia, DNA damage, loss of fertility and cancer.
All this happens at levels of radiation that the cell phone companies tell us are safe because the radiation is too weak to cause significant heating. This is the only criterion that they use to assess safety. In fact, the direct electrical effects on our cells, organs and tissues do far more damage to us at energy levels that may be hundreds or thousands of times lower than those that cause significant heating. These are termed non-thermal effects. As yet, our governments and health authorities are doing nothing to protect us from them.
This need not be so. By understanding the mechanisms of these non-thermal effects, it is possible to put most of them right, as I will show in the following article.
Many of the reported biological effects of non-ionising electromagnetic fields occur at levels too low to cause significant heating; i.e. they are non-thermal. Most of them can be accounted for by electrical effects on living cells and their membranes. The alternating fields generate alternating electric currents that flow through cells and tissues and remove structurally-important calcium ions from cell membranes, which then makes them leak.
Electromagnetically treated water (as generated by electronic water conditioners used to remove lime scale from plumbing) has similar effects, implying that the effects of the fields can also be carried in the bloodstream. Virtually all of the non-thermal effects of electromagnetic radiation can be accounted for by the leakage of cell membranes.
Most of them involve the inward leakage of free calcium ions down an enormous electrochemical gradient to affect calcium-sensitive enzyme systems. This is the normal mechanism by which cells sense mechanical membrane damage. They normally respond by triggering mechanisms that stimulate growth and repair, including the MAP-kinase cascades, which amplify the signal.
If the damage is not too severe or prolonged, we see a stimulation of growth and the effect seems beneficial, but if the exposure is prolonged, these mechanisms are overcome and the result is ultimately harmful. This phenomenon occurs with both ionising and non-ionising radiation and is called radiation hormesis. Gland cells are a good example of this, since short term exposures stimulate their activity but long term exposures cause visible damage and a loss of function. Damage to the thyroid gland from living within 100 metres of a cell phone base station caused hypothyroidism and may be partially responsible for our current outbreak of obesity and chronic fatigue.
Secondary effects of obesity include diabetes, gangrene, cardiac problems, renal failure and cancer. Cell phone base station radiation also affects the adrenal glands and stimulates the production of adrenalin and cortisol. Excess adrenalin causes headaches, cardiac arrhythmia, high blood pressure, tremors and an inability to sleep, all of which have been reported by people living close to base stations. The production of cortisol weakens the immune system and could make people living near base stations more susceptible to disease and cancer.
Inward calcium leakage in the neurons of the brain stimulates hyperactivity and makes the brain less able to concentrate on tasks, resulting in attention deficit hyperactivity disorder (ADHD). When this happens in the brains of unborn babies and young children, it reduces their ability to concentrate on learning social skills and can cause autism. Leakage in the cells of the peripheral nervous system in adults makes them send false signals to the brain, which results in the symptoms of electromagnetic intolerance (aka electromagnetic hypersensitivity). Some forms of electromagnetic intolerance may be due to cell phone damage to the parathyroid gland, which controls the calcium level in the blood; a lowered blood calcium level makes cell membranes more inclined to leak. Further exposure could then tip such people over the edge into the full symptoms of electromagnetic intolerance.
Cell phone radiation damages DNA indirectly, either by the leakage of digestive enzymes from lysosomes or the production of reactive oxygen species (ROS) from damaged mitochondrial and plasma membranes. The results are similar to those from exposure to gamma rays from a radioactive isotope.
Effects of DNA damage include an increased risk of cancer and a loss of fertility, both of which have been found in epidemiological studies. The effects of cell phone and WiFi radiation have also been determined experimentally using ejaculated semen. The results showed the production of ROS, and a loss of sperm quality and, in some cases, DNA fragmentation.
The inward leakage of calcium ions from electromagnetic fields also opens the various tight junction barriers in our bodies that normally protect us from allergens and toxins in the environment and prevent toxic materials in the bloodstream from entering sensitive parts of the body such as the brain. The opening of the blood-brain barrier has been shown to cause the death of neurons and can be expected to result in early dementia and Alzheimer’s disease. The opening of the barrier in our respiratory epithelia by electromagnetic fields has been shown to increase the risk of asthma in children, and the opening of the blood-liver barrier may be partially responsible for the current outbreak of liver disease. The opening of other barriers, such as the gut barrier, allows foreign materials from the gut to enter the bloodstream, which may also promote allergies and has been linked to autoimmune diseases.
Cell membranes also act as electrical insulators for the natural DC electric currents that cells use to transmit power. Mitochondrial membranes use the flow of hydrogen ions to couple the oxidation of food to the production of ATP. The outer cell membrane uses the flow of sodium ions to couple the ATP produced to the uptake of nutrients. If either of these membranes leaks or is permanently damaged, both of these processes will be compromised, leading to a loss of available energy, which some people believe to be a contributory factor in chronic fatigue syndrome.
The mechanism underlying electromagnetically-induced membrane leakage is that weak ELF currents flowing through tissues preferentially remove structurally important calcium ions, but they have been shown to do so only within certain amplitude windows, above and below which there is little or no effect. This means that there is no simple dose-response curve, which many people find confusing, but a plausible theoretical model can account for it. The mechanism also explains why certain frequencies, especially 16 Hz, are particularly effective.
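The notion of an amplitude window, rather than a monotonic dose-response curve, can be illustrated with a toy model. This is only an illustrative sketch: the Gaussian shape, centre and width below are assumptions for demonstration, not measured values from the calcium-release experiments.

```python
import math

def window_response(amplitude, centre=1.0, width=0.3):
    """Toy amplitude-window response: a biological effect occurs only
    within a window around `centre`; above and below it, little happens.
    All numbers here are illustrative, not measured values."""
    return math.exp(-((amplitude - centre) / width) ** 2)

# The effect rises, peaks inside the window, then falls again:
for a in [0.2, 0.6, 1.0, 1.4, 2.0]:
    print(f"field amplitude {a:.1f} -> relative effect {window_response(a):.2f}")
```

Such a shape explains why simply increasing or decreasing the exposure does not produce a proportional change in effect, and why experiments run at different amplitudes can appear to contradict one another.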
Living cells have evolved defence mechanisms against non-ionising radiation. These include pumping out surplus calcium that has leaked into the cytosol, the closure of gap junctions to isolate the damaged cell, the production of ornithine decarboxylase to stabilize DNA and the production of heat-shock proteins, which act as chaperones to protect important enzymes. However, this is expensive in energy and resources and leads to a loss of cellular efficiency. If the exposure to the radiation is prolonged or frequently repeated, any stimulation of growth caused by the initial ingress of calcium runs out of resources, and growth and repair become inhibited. If the repairs fail, the cell may die or become permanently damaged.
To some degree, we can make our own electromagnetic environment safer by avoiding ELF electrical and magnetic fields and radio waves that have been pulsed or amplitude modulated at ELF frequencies. The ELF frequencies that give damaging biological effects, as measured by calcium release from brain slices and ornithine decarboxylase production in tissue cultures, lie between 6Hz and 600Hz. It is unfortunate that virtually all digital mobile telecommunications systems use pulses within this range. The Industry clearly did not do its homework before letting these technologies loose on the general public and this omission may already have cost many lives.
Even now, it may be possible to reverse their effects by burying the pulses in random magnetic noise, as proposed by Litovitz in the 1990s, or by cancelling out the pulses using balanced signal technology, but, at present, the Industry does not seem to be interested in either of these.
Until the mobile telecommunications industry makes its products more biologically friendly, we have little alternative but to reduce our personal exposure as far as possible by using cell phones only in emergencies, avoiding DECT cordless phones and replacing WiFi with Ethernet. The only DECT phones that are even remotely acceptable are those that automatically switch off the base station between calls; e.g. the Siemens Gigaset C595 operating in Eco Plus mode. If you are highly electromagnetically intolerant, you may need to screen your home, or at the very least your bed, from incoming microwave radiation and sleep as far away as possible from known sources of ELF.
There have been many instances of harmful effects of electromagnetic fields from cell phones (aka mobile phones), DECT phones (aka cordless phones), WiFi, power lines and domestic wiring. They include an increased risk of cancer, loss of fertility, effects on the brain and symptoms of electromagnetic intolerance. Many people still believe that, because the energy of the fields is too low to give significant heating, they cannot have any biological effect. However, the evidence that alternating electromagnetic fields can have non-thermal biological effects is now overwhelming. See www.bioinitiative.org and www.neilcherry.com. The explanation is that it is not a heating effect, but mainly an electrical effect on the fine structure of the electrically-charged cell membranes upon which all living cells depend.
Alternating electromagnetic fields can induce alternating currents to flow through living cells and tissues. These can interfere with the normal direct currents and voltages that are essential for the metabolism of all cells. Virtually every living cell is a seething mass of electric currents and electrical and biochemical amplifiers that are essential for their normal function. Some have tremendous amplifying capacity; e.g. it is claimed that a dark adapted human eye can detect a single photon (the smallest possible unit of light) and the human ear can hear sounds with energies as low as a billionth of a watt. We should therefore not be too surprised to find that our cells can detect and respond to electromagnetic fields that are orders of magnitude below the strength needed to generate significant heat.
My main objective here is to show how most of the adverse health effects of electromagnetic fields can be attributed to a single cause; that being that they remove structurally-important calcium ions (electrically-charged calcium atoms) from cell membranes, which then makes these membranes leak. I will explain the scientific evidence leading to this conclusion and also how we can put matters right, but still keep on using cell phones and other wireless communications. I have included key references that should enable the more inquisitive reader to delve deeper. In many cases, you should be able to find the abstract of the paper in question by copying into Google its entry in the list of references.
Electromagnetic fields affect many but not all people
Many of the experiments on the biological effects of alternating electromagnetic fields appear to give inconsistent results. There are many reasons for this, including differences in the genetic make-up, physiological condition and the history of the test material. In humans, reported effects include an increased risk of cancer, effects on brain function, loss of fertility, metabolic changes, fatigue, disruption of the immune system, and various symptoms of electromagnetic intolerance.
Not everyone is affected in the same way and some may not be affected at all. However, there is increasing evidence that the situation is getting worse. Our electromagnetic exposure is rapidly increasing and previously healthy people are now becoming sensitised to it. In this study, I am concentrating on the cases where there have been definite effects; since this is the most efficient way in which we can find out what is going wrong and what can be done to prevent it.
The frequency of the fields is important
The fields that give the most trouble are in the extremely low frequency range (ELF) and also radio frequencies that are pulsed or amplitude modulated by ELF. (Amplitude modulation is where the strength of a carrier wave rises and falls in time with a lower frequency signal that carries the information.)
Why microwaves are particularly damaging
The frequency of the carrier wave is also important. Higher frequencies, such as the microwaves used in cell phones, WiFi and DECT phones, are the most damaging. Our present exposure to man-made microwaves is about a billion billion (one followed by eighteen zeros) times greater than our natural exposure to these frequencies. We did not evolve in this environment and we should not be too surprised to find that at least some people may not be genetically adapted to it. As with most populations faced with an environmental change, those members that are not adapted either become ill, die prematurely or fail to reproduce adequately. Ironically, those who are electromagnetically intolerant may be better equipped to survive since they are driven to do whatever they can to avoid the radiation.
The main reason why microwaves are especially damaging is probably because of the ease with which the currents that they generate penetrate cell membranes. Cell membranes have a very high resistance to direct currents but, because they are so thin (about 10nm), they behave like capacitors so that alternating currents pass through them easily. Since the effective resistance of a capacitor to alternating current (its reactance) is inversely proportional to its frequency, microwave currents pass through the membranes of cells and tissues more easily than radio waves of lower frequencies and can therefore do more damage to the cell contents.
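The frequency dependence of capacitive reactance can be made concrete with a short calculation. The sketch below uses the textbook figure of about 1 µF/cm² for the specific capacitance of a thin cell membrane; the membrane patch area is an arbitrary illustrative choice, and the comparison frequencies (50 Hz mains vs a 1.8 GHz GSM carrier) are simply convenient examples.

```python
import math

# Typical specific capacitance of a ~10 nm cell membrane (textbook value):
C_PER_M2 = 1e-2          # farads per square metre (about 1 uF/cm^2)
AREA = 1e-9              # m^2: an arbitrary illustrative patch of membrane
C = C_PER_M2 * AREA      # capacitance of that patch, in farads

def reactance(freq_hz, capacitance=C):
    """Magnitude of capacitive reactance, X_c = 1 / (2*pi*f*C)."""
    return 1.0 / (2 * math.pi * freq_hz * capacitance)

x_elf = reactance(50)       # mains (ELF) frequency
x_mw = reactance(1.8e9)     # a GSM microwave carrier frequency
print(f"Reactance at 50 Hz:   {x_elf:.3g} ohms")
print(f"Reactance at 1.8 GHz: {x_mw:.3g} ohms")
print(f"Microwave current passes roughly {x_elf / x_mw:.2g} times more easily")
```

Because reactance is inversely proportional to frequency, the ratio of the two reactances is simply the ratio of the two frequencies, here 1.8 GHz / 50 Hz, i.e. tens of millions of times lower opposition to microwave-frequency currents.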
Calcium loss from cell membranes explains most of the adverse health effects
I became interested in this topic when I was working on the biological effects of physically (magnetically) conditioned water, which is widely used to remove lime scale from boilers and plumbing. It is made by allowing tap water to flow rapidly between the poles of a powerful magnet or by exposing it to a weak pulsed electromagnetic field from an electronic water conditioner. Water treated in this way can remove calcium ions (electrically charged calcium atoms) from surfaces, and the effect on the water can last for several days. I was following up some Russian and Israeli work that had shown that magnetically conditioned water could increase the growth of crops, but it turned out to be far more important than that. The underlying principle turned out also to explain the mechanisms by which weak electromagnetic fields can damage living cells, and what can be done to stop it.
Magnetically conditioned water and electromagnetic fields have similar effects
Probably, our most important discovery was that when tap water was conditioned by weak electromagnetic fields, the treated water gave similar effects in yeast to those from exposing the yeast itself, amongst which was an increased permeability of their cell membranes to poisons (Goldsworthy et al. 1999). Since it had been known since the work of Bawin et al. (1975) that weak electromagnetic fields could remove calcium ions from the surfaces of brain cells, it seemed likely that both the conditioned water and the electromagnetic fields were working in the same way; i.e. by removing structurally-important calcium ions from cell membranes, which then made them leak. We now know that membrane leakage of this kind can explain most of the biological effects of both conditioned water and of direct exposure to electromagnetic fields.
The effects on growth depend on the length of the conditioning treatment
We also showed that the effects of conditioned water on the growth of yeast cultures depended on the length of the conditioning process. Less than 30 seconds of conditioning stimulated growth but more than this inhibited growth. It was as if the conditioning process was steadily generating one or more chemical agents in the water. A low dose from the shorter conditioning period stimulated growth, but longer conditioning periods gave higher doses, which were inhibitory. This toxic effect of heavily conditioned water, where the water is recycled continuously through the conditioner, has now been exploited commercially to poison blanket weed in ornamental ponds (www.lifescience.co.uk/domestic_blanketweed.htm). By the same token, blood continually circulating for prolonged periods under the pulsating fields from a cell phone or similar device could become toxic to the rest of the body. This means that no part of the body, from the brain to the liver and gonads, can be considered to be safe from the toxic effects of pulsed electromagnetic fields.
Many people have shown similar dual effects with direct exposure to both ionising and non-ionising radiation. Small doses of otherwise harmful radiation often stimulate growth and appear to be beneficial (a phenomenon known as radiation hormesis) but larger doses are harmful. It also explains why small doses of pulsed magnetic fields are effective in treating some medical conditions such as broken bones (Bassett et al. 1974) but prolonged exposure (as we will see later) is harmful.
It also explains some of the apparent inconsistencies found when comparing different experiments and why meta-analysis of the data should be treated with caution. Clear positive and clear negative results (depending on the dose and the condition of the material) when taken together could be mistaken for no effect, but with a high degree of variability.
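The point about meta-analysis can be shown numerically. In the hypothetical scenario below, half the experiments happen to catch the stimulatory (low-dose) phase and half the inhibitory (high-dose) phase; the individual effect sizes are invented for illustration. Pooling them yields a mean near zero with a large spread, which could easily be misread as "no effect, noisy data".

```python
from statistics import mean, stdev

# Hypothetical effect sizes from ten experiments: five caught the
# stimulatory (low-dose) phase, five the inhibitory (high-dose) phase.
# The numbers are invented for illustration only.
stimulated = [+0.9, +1.1, +1.0, +0.8, +1.2]   # clear positive effects
inhibited  = [-1.0, -0.9, -1.1, -1.2, -0.8]   # clear negative effects

pooled = stimulated + inhibited
print(f"pooled mean effect: {mean(pooled):+.2f}")   # close to zero
print(f"pooled std dev:     {stdev(pooled):.2f}")   # but large spread
```

Two internally consistent sets of results, one positive and one negative, thus cancel when averaged, even though neither group on its own shows "no effect".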
Cells have tremendous powers to amplify and respond to weak signals
We now know that electromagnetic growth stimulation is almost certainly due to electrochemical amplification followed by the activation of the MAP kinase cascades by free calcium ions leaking into the cytosol (the main part of the cell). The inward leakage of calcium ions is the normal mechanism by which a cell senses that it has been damaged and triggers the necessary repair mechanisms. This involves huge amplification processes so that even minor leakage (e.g. due to membrane perforation or weak electromagnetic fields) can give rapid and often massive responses.
The first stage in the amplification is due to the calcium gradient itself. There is an enormous (over a thousand fold) concentration difference for free calcium between the inside and outside of living cells. In addition, there is a voltage difference of many tens of mV acting in the same direction. This means that even a slight change in the leakiness of the cell membrane can permit a very large inflow of calcium ions. It’s like a transistor, where a slight change in the charge in the base can allow a massive current to flow through it under the influence of a high voltage gradient between the emitter and collector.
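The size of this inward driving force can be estimated with the Nernst equation. The calculation below uses the thousand-fold concentration ratio stated above; the resting membrane potential of -70 mV is a typical textbook figure, assumed here for illustration.

```python
import math

R = 8.314     # J/(mol*K), gas constant
T = 310.0     # K, approximately body temperature
F = 96485.0   # C/mol, Faraday constant
z = 2         # charge number of the Ca2+ ion

ratio = 1000.0   # outside:inside free-calcium ratio (from the text)
Vm = -0.070      # V: a typical resting membrane potential (assumed value)

# Nernst equilibrium potential for Ca2+ across the membrane:
E_ca = (R * T) / (z * F) * math.log(ratio)

# Net inward driving force: how far the membrane potential sits from
# the calcium equilibrium potential.
driving_force = E_ca - Vm
print(f"Nernst potential for Ca2+:  {E_ca * 1000:.0f} mV")
print(f"Total inward driving force: {driving_force * 1000:.0f} mV")
```

Even with these round numbers, the combined chemical and electrical gradient amounts to well over 150 mV pushing calcium inwards, so any increase in membrane leakiness produces a disproportionately large calcium influx.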
The next stage in the amplification is due to the extremely low calcium concentration in the cytosol so that even a small ingress of calcium ions makes a big percentage difference, to which many enzymes within the cell are sensitive.
Even more amplification comes from the MAP-kinase cascades. These are biochemical amplifiers that enable tiny amounts of growth factors or hormones (perhaps even a single molecule) to give very large effects. They consist of chains of enzymes acting in sequence so that the first enzyme activates many molecules of the second enzyme, which in turn activates still more of the third enzyme etc. The final stage then activates the protein synthesising machinery needed for cell growth and repair.
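The multiplicative nature of such a cascade is easy to sketch. The gain per stage and the number of stages below are illustrative assumptions, not measured properties of any particular MAP-kinase pathway.

```python
def cascade_output(input_molecules, stages=3, gain_per_stage=100):
    """Toy enzyme cascade: each active enzyme molecule activates
    `gain_per_stage` molecules of the next enzyme in the chain, so the
    overall gain is gain_per_stage ** stages. The gain figure is an
    illustrative assumption, not a measured value."""
    out = input_molecules
    for _ in range(stages):
        out *= gain_per_stage
    return out

# Even a single triggering molecule yields a massive downstream response:
print(cascade_output(1))   # 100 * 100 * 100 = 1,000,000
```

With three stages of hundred-fold gain, one input molecule produces a million activated molecules at the output, which is why even a tiny calcium leak can have a large downstream effect.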
At least some of these cascades need calcium ions to work (Cho et al. 1992) so the inward leakage of calcium through damaged cell membranes will increase the rate of these processes to stimulate growth and repair. However, these repairs can make deep inroads into the cell’s energy and resources, and its ability to make good the damage will depend on its physiological and nutritional condition. This means that, if the damage is prolonged or persistent, sooner or later it runs out of resources and gives up, which is when we see the inhibitory phase, perhaps followed by apoptosis (cell death) or the loss of some of the cell’s normal functions. We are now seeing this loss of function increasingly after prolonged human exposure to cell phone base station radiation; e.g. the loss of thyroid gland function after six years of exposure (Eskander et al. 2012).
Effects on Glands
Gland cells are particularly sensitive to radiation
Gland cells may be particularly sensitive to radiation because their secretions are normally produced in internal membrane systems, which can also be damaged. Their secretions are usually released in vesicles (bubbles of membrane) that fuse with the external cell membrane and disgorge their contents to the outside (exocytosis). The vesicle membrane then becomes part of the external membrane. The resulting excess external membrane is counterbalanced by the reverse process (endocytosis) in which the external membrane buds off vesicles to the inside of the cell, which then fuse with the internal membranes. In this way, an active gland cell may internalise the equivalent of its entire surface membrane about once every half an hour. This means that if the surface membrane is damaged directly by the fields, or by electromagnetically conditioned blood, the damaged membrane rapidly becomes part of the internal membrane system, upon which its normal activity depends. If the damage is too severe, the whole gland may lose its normal function.
Electromagnetic effects on the endocrine system and obesity
Although electromagnetic fields frequently stimulate glandular activity in the short term, long term exposure is often harmful in that the gland ceases to work properly. This is particularly serious for the glands of the endocrine system (those that coordinate our bodily functions) since it can affect many aspects of metabolism and throw the whole body out of kilter. For example it may be responsible, at least in part, for the current outbreak of obesity and the many other illnesses that stem from it.
A good example of this is the thyroid gland, which is in an exposed position in the front of the neck. Rajkovic et al. (2003) showed that after three months exposure to power line frequencies, the thyroid glands of rats showed visible signs of deterioration. They also lost their ability to produce the thyroid hormones, which they did not recover even after the fields were switched off. Esmekaya et al. (2010) found a similar visible deterioration of the thyroid gland in rats exposed to simulated 2G cell phone radiation for 20 minutes a day for three weeks. Eskander et al. (2012) found that people living for six years within 100 metres of a cell phone base station showed a significant reduction in the release into the blood of a number of hormones, including ACTH from the pituitary gland, cortisol from the adrenal glands, and prolactin and testosterone from organs elsewhere. However, the most highly significant loss was in their ability to produce the thyroid hormones. The expected consequence of this is hypothyroidism, the most frequent symptoms of which are fatigue and obesity. It may not be a coincidence that about a quarter of a million UK citizens are now suffering from what is being diagnosed as chronic fatigue syndrome, and about eight out of ten are either overweight or clinically obese.
The incidence of obesity may be exacerbated by effects on the release of the appetite regulating hormones ghrelin and peptide YY. Ghrelin is synthesised in the stomach wall and makes us feel hungry, whereas peptide YY is made in the intestine wall and makes us feel full. In normal people the level of ghrelin in the blood is high before a meal and goes down afterwards whereas peptide YY goes up, so we go from feeling hungry to feeling full, which stops us overeating.
However, in obese people the level of both hormones stays roughly the same throughout so that they never feel completely full and eat in an unregulated manner (Le Roux et al. 2005, Le Roux et al. 2006). If prolonged exposure to electromagnetic fields limits the release of these hormones in the same way as they affect the release of ACTH, cortisol, prolactin, testosterone and the thyroid hormones, it may explain why so many people find it difficult to stop eating and end up being clinically obese.
If you are affected in this way, you may be forced to go on a life-long diet, undergo gastric bypass surgery to drastically reduce the size of your stomach or risk the many serious diseases that stem from obesity AND IT MAY NOT HAVE BEEN YOUR FAULT. Think twice before you use a cell phone or install a cordless phone or WiFi. The consequences are only now becoming apparent; neither the Government nor the telecommunications industry will tell you what they are, but they are not good.
Obesity can trigger many other illnesses
The consequences of obesity include diabetes, gangrene, high blood pressure, cardiac problems, renal failure and cancer. Between them, they cause a great deal of human suffering and cost the nation’s economy a great deal of money. The annual cost of obesity and related illnesses to the UK economy has been estimated at around £6.6–7.4 billion (McCormick et al. 2007). The annual cost of chronic fatigue syndrome is about $20,000 per affected person in the USA (Reynolds et al. http://www.resource-allocation.com/content/2/1/4) and about £14,000 in the UK (McCrone et al. 2003), so a fair estimate of the total annual cost of chronic fatigue syndrome to the UK economy would be somewhere in the region of £3.5 billion. The total annual cost of both conditions together is about £10 billion. If part of this is due to microwave telecommunications, measures need to be taken to minimise their effects, and it would be only fair to ask the Industry to pay for this.
Electromagnetic effects on the adrenal gland
Cortisol: Augner et al. (2010), in a double-blind study (where neither the subject nor the person recording the results knows whether the radiation is switched on or off), showed that short-term exposure to the radiation from a 2G (GSM) cell phone base station increased the cortisol level in the saliva of human volunteers. Cortisol is a stress hormone that is normally produced in the cortex of the adrenal glands and is controlled by the calcium level in its cells (Davies et al. 1985), so electromagnetically-induced membrane leakage letting more calcium into the cytosol should also have this effect.
Cortisol is part of a mechanism that puts the body into a “fight or flight” mode, in which more sugar is released into the blood, sensitivity to pain is reduced and the immune system is suppressed. In fact, cortisol and its relatives are used medicinally to relieve pain and also to suppress the immune system after transplant surgery. However, when exposure to base station radiation does it, it is not good news, since the suppression of the immune system will also increase the risk of infection and of developing tumours from precancerous cells that might otherwise have been destroyed.
Adrenalin: Buchner and Eger (2011) studied the effect of a newly installed 2G cell phone base station on villagers in Bavaria and found that it caused a long-lived increase in the production of adrenalin. This is an important neurotransmitter which acts on adrenergic receptors to increase the calcium concentration in the cytosol. It is also synthesised in the adrenal medulla in response to signals from the sympathetic nervous system. Adrenalin too puts the body into fight or flight mode by diverting resources from the smooth muscles of the gut to the heart muscle and the skeletal muscles needed for flight or combat. In addition, it stimulates the production of cortisol by the adrenal cortex, which indirectly reduces the activity of the immune system, lowers resistance to disease and increases the risk of getting cancer.
Some people get pleasure from the “adrenalin rush” caused by doing energetic or dangerous things, and this could be a contributory factor to the addictive nature of cell phones. On the down side, however, known effects of excess adrenalin include headaches, cardiac arrhythmia, high blood pressure, tremors, anxiety and an inability to sleep. These results confirm and explain some of the findings of Abdel-Rassoul et al. (2007), who found that people living near cell towers (masts) had significant increases in headaches, memory loss, dizziness, tremors and poor sleep.
Effects on the Brain
Calcium leakage and brain function
Normal brain function depends on the orderly transmission of signals through a mass of about 100 billion neurons. Neurons are typically highly branched nerve cells. They usually have one long branch (the axon), which carries electrical signals as action potentials (nerve impulses) to or from other parts of the body or between relatively distant parts of the brain (a nerve contains many axons bundled together). The shorter branches communicate with other neurons where their ends are adjacent at synapses. They transmit information across the synapses using a range of neurotransmitters, which are chemicals secreted by one neuron and detected by the other.
Calcium ions play an essential role in brain function because a small amount of calcium must enter the cytosol of the neuron before it can release its neurotransmitters (Alberts et al. 2002). Electromagnetically-induced membrane leakage would increase the background level of calcium in the neurons so that they release their neurotransmitters sooner. This improves our reaction time to simple stimuli but it can also trigger the spontaneous release of neurotransmitters to send spurious signals that have no right to be there, which makes the brain hyperactive and less able to concentrate.
Possibly, the greatest damage to the brain from microwaves is when it is first developing in the foetus and the very young child, where it can lead to autism. Dr Dietrich Klinghardt has shown a relationship between microwaves and autism; a summary of his work can be found at http://electromagnetichealth.org/media-stories/#Autism.
What is autism?
Autism is a group of life-long disorders (autistic spectrum disorders or ASD) caused by brain malfunctions and is associated with subtle changes in brain anatomy (see Amaral et al. 2008 for a review). The core symptoms are an inability to communicate adequately with others and include abnormal social behaviour, poor verbal and non-verbal communication, unusual and restricted interests, and persistent repetitive behaviour. There are also non-core symptoms, such as an increased risk of epileptic seizures, anxiety and mood disorders. ASD has a strong genetic component, occurs predominantly in males and tends to run in families.
Genetic ASD may be caused by calcium entering neurons
It has been hypothesised that some genetic forms of ASD can be accounted for by known mutations in the genes for ion channels that result in an increased background concentration of calcium in neurons. This would be expected to lead to neuronal hyperactivity and the formation of sometimes unnecessary and inappropriate synapses, which in turn can lead to ASD (Krey and Dolmetsch 2007).
Electromagnetic fields also let calcium into neurons
There has been a 60-fold increase in ASD in recent years, which cannot be accounted for by improvements in diagnostic methods and can only be explained by changes in the environment. This increase corresponds in time to the proliferation of mobile telecommunications, WiFi, and microwave ovens as well as extremely low frequency fields from household wiring and domestic appliances. We can now explain at least some of this in terms of electromagnetically-induced membrane leakage leading to brain hyperactivity and abnormal brain development.
How membrane leakage affects neurons
Neurons transmit information to one another in the form of chemical neurotransmitters that pass across the synapses where they make contact. Their release is normally triggered by a brief pulse of calcium entering the cytosol. If the membrane is leaky due to electromagnetic exposure, the cell will already have a high internal calcium concentration as calcium leaks in from the much higher concentration outside. This puts the cells into hair-trigger mode so that they are more likely to release neurotransmitters and the brain as a whole may become hyperactive (Beason and Semm 2002; Krey and Dolmetsch 2007; Volkow et al. 2011). The brain then becomes overloaded with sometimes spurious signals, leading to a loss of concentration and attention deficit hyperactivity disorder (ADHD).
How does this impact on autism?
Before and just after its birth, a child’s brain is a blank canvas, and it goes through an intense period of learning to become aware of the significance of its new sensory inputs, e.g. to recognise its mother’s face, her expressions and eventually other people and their relationship to him/her (Hawley and Gunner 2000). During this process, the neurons in the brain make countless new connections, the patterns of which store what the child has learnt. However, after a matter of months, connections that are rarely used are pruned automatically (Huttenlocher and Dabholkar 1997) so that those that remain are hard-wired into the child’s psyche. The production of too many spurious signals due to electromagnetic exposure during this period will generate frequent random connections which, because they are in constant use, will escape pruning even though they may make no sense. It may be significant that autistic children tend to have slightly larger heads, possibly to accommodate the unpruned neurons (Hill and Frith 2003).
Because the pruning process in electromagnetically-exposed children may be more random, it could leave the child with a defective hard-wired mind-set for social interactions, which may then contribute to the various autistic spectrum disorders. These children are not necessarily unintelligent; they may even have more brain cells than the rest of us and some may actually be savants. They may just be held back from having a normal life by a deficiency in the dedicated hard-wired neural networks needed for efficient communication.
Autism costs the UK economy more than the tax income from cell phones
The increase in the incidence of autism has occurred in parallel with the increase in electromagnetic pollution over the last thirty years. The chance of having an autistic child may now be as high as one in fifty. Apart from the personal tragedies for the affected children and their families, autism is of enormous economic importance. In the UK alone, the annual cost to the Nation in care and lost production exceeds the annual tax revenue from the entire cell phone industry, which is about 20 billion UK pounds (http://www2.lse.ac.uk/newsAndMedia/news/archives/2009/05/MartinKnappAutism.aspx). If it were all due to cell phones, the Government could close down the entire industry and actually show a profit! There may be ways in which the modulation of the signal can be changed to avoid this (see later), but in the meantime, we should do whatever we can to minimise our exposure to information-carrying microwaves, including those from cell phones, DECT phones, WiFi and smart meters. Failure to do this could be very costly.
Electromagnetic intolerance (aka electromagnetic hypersensitivity or EHS)
Electromagnetic intolerance is a condition in which some people experience a wide range of unpleasant symptoms when exposed to weak non-ionising radiation. About 3 percent of the population suffers in this way at present, although only a small proportion of these are as yet so badly affected that they can instantly tell whether a radiating device is switched on or off. At the other end of the scale, there are people who are sensitive but do not yet know it because they are chronically exposed to electromagnetic fields and accept their symptoms as being perfectly normal. Electromagnetic intolerance is in fact a continuum with no clear cut-off point. In some cases there may only be relatively mild symptoms on or after using a cell phone, but in severe cases it can prevent people from living a normal life and force them to live in almost total isolation. There is every reason to believe that prolonged exposure will increase the severity of the symptoms, so if you suffer from any of them you should do whatever possible to minimise further exposure.
Symptoms of electromagnetic intolerance
Symptoms include skin rashes, cardiac arrhythmia, headaches (sometimes severe), pain in muscles and joints, sensations of heat or cold, pins and needles, tinnitus, dizziness and nausea. A more complete list can be found at http://www.es-uk.info/info/recognising.asp . Most if not all of these can be explained by the radiation making cells leak.
When skin cells leak, it is perceived by the body as damage to the tissue. This increases the blood supply to the area to repair the damage and causes the rash.
When the cells of the heart muscle leak it weakens the electrical signals that normally control its contraction. The heart then runs out of control to give cardiac arrhythmia. This is potentially life threatening.
When sensory cells leak, they become hyperactive and send false signals to the brain. We have a variety of sensory cells, but they all work in much the same way. Whenever they sense what they are supposed to sense, they deliberately leak by opening ion channels in their membranes. This reduces the natural voltage across these membranes, which makes them send nerve impulses to the brain. Electromagnetically induced cell leakage would have the same effect, but this time it would make them send false signals to the brain to give the false sensations of electromagnetic intolerance. This could also be exacerbated by the nerve cells involved being made hyperactive due to calcium ingress.
When leakage occurs in the sensory cells of the skin, it can give sensations such as heat, cold, tingling, pressure etc., depending on which types of cell are most sensitive in the individual concerned.
When leakage occurs in the sensory hair cells of the cochlea of the ear it gives tinnitus, which is a false sensation of sound. When it occurs in the vestibular system (the part of the inner ear that deals with balance and motion) it results in dizziness and symptoms of motion sickness, including nausea.
Hypocalcaemia, electromagnetic intolerance and the parathyroid gland
Symptoms of hypocalcaemia are very similar to those of electromagnetic intolerance and include skin disorders, pins and needles, numbness, sensations of burning, fatigue, muscle cramps, cardiac arrhythmia, gastro-intestinal problems and many others. A more comprehensive list can be found at http://www.endotext.org/parathyroid/parathyroid7/parathyroid7.htm . It is possible that some forms of electromagnetic intolerance occur in people who already have low levels of calcium in the blood. Electromagnetic exposure would then remove even more calcium from their cell membranes, pushing them over the edge and giving the symptoms of electromagnetic intolerance.
The amount of calcium in the blood is controlled by the parathyroid hormone secreted by the parathyroid gland, which is in the neck, close to where you hold your cell phone. It is adjacent to the thyroid gland and, if it were to be damaged by the radiation in the same way, the production of the parathyroid hormone would go down, the amount of calcium in the blood would be reduced and the person concerned would become electromagnetically intolerant.
Effects on DNA
Cell phone radiation can damage DNA
Lai and Singh (1995) were the first to show this in cultured rat brain cells, but it has since been confirmed by many other workers. A comprehensive study on this was in the Reflex Project, sponsored by the European Commission and replicated in laboratories in several European countries. They found that radiation like that from GSM (2G) cell phone handsets caused both single and double stranded breaks in the DNA of cultured human and animal cells. Not all cell types were equally affected and some, such as lymphocytes, seemed not to be affected at all (Reflex Report 2004).
In susceptible cells, the degree of damage depended on the duration of the exposure. With human fibroblasts, it reached a maximum at around 16 hours (Diem et al. 2005). However, it would be unwise to assume that exposures of less than 16 hours are necessarily safe, since DNA damage may give rise to genetically aberrant cells long before it becomes obvious under the microscope. It would also be unwise to assume that the damage is restricted to the immediate vicinity of the handset since, as described earlier, the effects of the radiation can be transmitted in the bloodstream in the form of magnetically conditioned blood, so nowhere is safe, not even the sex organs.
How the DNA is damaged
Because of the very high stability of DNA molecules, they are unlikely to be damaged directly by weak radiation. The most plausible mechanism is that DNase (an enzyme that destroys DNA) and other digestive enzymes leak through the membranes of lysosomes (organelles that digest waste) that have been damaged by the radiation. Other mechanisms involve the leakage of reactive oxygen species (ROS) such as hydrogen peroxide from damaged peroxisomes and superoxide free radicals from damaged mitochondrial membranes and NADH oxidase in the plasma membrane. According to Friedman et al. (2007), the first to respond to non-thermal cell phone frequencies is the NADH oxidase in the plasma membrane, which is activated within minutes of exposure.
However, all of these ROS can initiate peroxidation chain reactions in the polyunsaturated phospholipids of cell membranes (the same thing that makes fats go rancid), which disrupts the membranes further and exacerbates the effect. Only one molecule of ROS is needed to initiate a domino-effect chain reaction, in which each damaged lipid molecule generates a free radical that damages the next one. The process normally stops when it reaches an anti-oxidant molecule, which sacrifices itself by combining with the free radical in such a way that it does not generate a new one. Most of our anti-oxidants come from our diet (e.g. vitamin E) but the most important one that we make ourselves is melatonin. It is unfortunate that the production of melatonin by the pineal gland is also disrupted by electromagnetic fields (Henshaw and Reiter 2005), which makes matters worse.
These ROS are highly reactive and can also damage DNA. In fact, much of the damage done to cells by ionising radiation such as gamma rays is due to damage to cell membranes and DNA by free radicals from the radiolysis of water. There may therefore be little difference between holding a cell phone to your head and holding a radioactive source of gamma rays. Both can damage cell membranes, cause the fragmentation of DNA and also do considerable collateral damage to other cellular components, which may either kill the cells or make them lose their normal function over time.
Cell phones increase the risk of cancer
If similar DNA fragmentation were to occur in the whole organism, we would expect an increased risk of cancer, since essential genes that control cell division may be either damaged or lost. Recent studies on the incidence of brain cancer are already beginning to show this. Heavy cell phone use roughly doubles the risk of getting brain cancer in adults on the side of the head used for the phone, and for younger people the risk may be as much as five times higher (Hardell and Carlberg 2009). Since brain cancers normally take decades to develop, it is too soon to assess the final impact of the radiation, but the World Health Organisation has already classified cell phones as a Group 2B carcinogen (possibly carcinogenic to humans), the same category as DDT. Other head cancers are also on the increase, including cancers of the parotid salivary gland (next to where you hold your cell phone) and the thyroid gland, which is in the neck.
Cell phones reduce male fertility
We might expect DNA damage in the cells of the germ-line (the line of cells starting in the embryo that eventually gives rise to eggs and sperm) to result in a loss of fertility. A number of epidemiological studies have shown significant reductions in sperm motility, viability and quantity in men using cell phones for more than a few hours a day (Fejes et al.2005; Agarwal et al. 2006) and the subject was reviewed by Desai et al. (2009). A common finding is that these effects were associated with the production of reactive oxygen species (ROS) which can damage many cellular components, including cell membranes and DNA.
More recently, Agarwal et al. (2009) found in controlled experiments that ejaculated sperm from healthy donors showed reduced viability and motility and an increase in ROS after one hour’s exposure to a cell phone in talk mode. More recently still, Avendano et al. (2012) found that exposing ejaculated semen to a WiFi laptop for four hours gave a decrease in sperm motility and an increase in DNA fragmentation as compared with samples exposed to a similar computer with the WiFi switched off.
A similar relationship between sperm quality and electromagnetic exposure has also been found for low frequency alternating magnetic fields (Li et al. 2010). It is therefore advisable for men to avoid strong magnetic fields, restrict their cell phone calls to a minimum and keep their phones switched off (or in airplane mode if they have this facility). Otherwise, the phones transmit regularly at full power to the base station, even when not in use. If they have to be switched on for any reason, men should at least keep them out of their trouser pockets.
Possible effects on female fertility
We do not yet know the effects of cell phone use on human female fertility, but Panagopoulos et al. (2007) showed that exposing adult Drosophila melanogaster (an insect widely used in genetic experiments) to a GSM phone signal for just six minutes a day for six days fragmented the DNA in the cells that give rise to their eggs, and half of these eggs died. We humans should therefore exercise caution since, although our sperm are produced in their countless billions and take about three months to mature, all the eggs that a woman will ever have were in her ovaries before she was born and will be exposed to the radiation (and electromagnetically conditioned blood) throughout her life. There could therefore be considerable cumulative damage, both to the eggs and to the follicle cells that nourish and protect them. Damage to either, beginning when the child is in the womb, can be expected to cause a loss of fertility. Pregnant mothers should avoid all present forms of microwave telecommunications, including cell phones and WiFi. The child could be damaged by their radiation, but the damage may not become apparent until she reaches puberty and wants a child herself.
Effects on tight junction barriers
Tight junction barriers are layers of cells where the gaps between them are sealed by tight-junctions to prevent materials leaking around their sides. They protect all of our body surfaces from the entry of unwanted materials and often protect one part of the body from being unduly influenced by the others. For example, the blood-brain barrier prevents toxins entering the brain from the bloodstream. Normally, these barriers are closed but they are programmed to open if calcium ions enter their cells. This was demonstrated by Kan and Coleman (1988) who showed that the calcium ionophore A23187 (an antibiotic that kills bacteria and fungi by letting calcium ions leak into their cells) opened tight junction barriers in the liver. The electromagnetic opening of the blood-liver barrier could be a contributory factor to the current outbreak of liver disease in the UK among the under forties (the cell phone generation), which is at present being blamed on alcohol abuse. Since all tight junction barriers have basically the same design, unscheduled calcium entry resulting from electromagnetic exposure is likely to open all of them in much the same way. The opening of our tight junction barriers by electromagnetic fields can account for many modern illnesses, ranging from asthma to multiple allergies and Alzheimer’s disease.
The blood-brain barrier and early dementia
The blood-brain barrier normally prevents possibly toxic large molecules from the bloodstream entering the brain. The radiation from cell phones, even at one hundredth of the permitted SAR value, can open the blood brain barrier in rats so that protein molecules as large as albumin could enter their brains (Persson et al. 1997). Later experiments by Salford et al. (2003) showed that this was associated with the death of neurons. We would not expect an immediate effect because the brain has spare capacity, but prolonged or repeated exposure to cell phone or similar radiation would be expected to cause a progressive loss of functional neurons and result in early dementia and Alzheimer’s disease in humans. The extreme sensitivity of the blood-brain barrier to the radiation could mean that even sitting close to someone using a cell phone could affect you too. It may not be too surprising to find that early onset Alzheimer’s disease is now on the increase in modern society.
The respiratory barrier and asthma
Di et al. (2011) showed that exposure to weak ELF electromagnetic fields during pregnancy increased the risk of asthma in the offspring (they did not test microwaves). This can be explained by the radiation removing structural calcium from the cells of the tight junction barrier lining the respiratory tract, which then opens. This is supported by the findings of Chu et al. (2001), who showed that either low levels of external calcium or the addition of EGTA, both of which would remove structural calcium ions from cell surfaces, caused massive increases in its electrical conductance (a measure of its permeability to ions) and also in its permeability to much larger virus particles. We would therefore expect many allergens to enter by the same route and predispose the child to asthma. There are about 5.4 million people with asthma in the UK and the estimated annual cost to the NHS alone is about £1 billion (http://www.asthma.org.uk/news_media/news/new_data_reveals_hig.html).
The skin barrier, allergies and multiple chemical sensitivities
The skin tight junction barrier is in the stratum granulosum, which is the outermost layer of living skin cells just underneath the many layers of dead cells (Borgens et al. 1989). Furuse et al. (2002) showed that mutant mice deficient in Claudin-1 (a vital component of the sealing mechanism) died within a day of birth and their skin barriers were permeable to molecules as large as 600 Da, which is enough to admit many unwanted foreign materials, including potential allergens. In humans, this could be the basis of multiple chemical sensitivities, where people have become allergic to a wide range of chemicals, although they leave most of us unaffected. People suffering from multiple chemical sensitivities are often also electromagnetically intolerant and many of their symptoms are very similar.
Virtually all of our body surfaces are protected by cells with tight junctions, including the nasal mucosa (Hussar et al. 2002), the lungs (Weiss et al. 2003) and the lining of the gut (Arrieta et al. 2006). An electromagnetically-induced increase in the permeability of any of these would allow the more rapid entry into the body of a whole range of foreign materials, including allergens, toxins and carcinogens.
Loss of barrier tightness can trigger autoimmune diseases
An electromagnetically-induced increase in the permeability of any of the tight-junction barriers has been linked to the occurrence of autoimmune diseases, in which the immune system attacks the body’s own components as if they were foreign materials or pathogens.
The immune system is quite complicated but basically lymphocytes (a type of white blood cell) are trained and selected before they mature to recognise the body’s own cells, which are normally present in the bloodstream, by virtue of chemical patterns on their surfaces (the major histocompatibility complexes).
B-lymphocytes make specific antibodies that combine with foreign cells and substances that do not have this pattern, marking them for eventual ingestion and digestion by phagocytes (another type of white blood cell). T-lymphocytes kill the body’s own cells if they are infected with a virus, which is normally displayed on the cell surface. In both cases, the presence of the foreign material or infected cells triggers the rapid multiplication of a clone of lymphocytes that recognise them. They can then attack in force.
However, if the substance concerned belongs to the body itself but is normally prevented from entering the bloodstream by a tight-junction barrier such as the blood-brain barrier, the opening of that barrier increases the likelihood of unfamiliar materials leaking into the bloodstream and triggering an autoimmune response. For example, Grigoriev et al. (2010) showed that 30 days of exposure to unmodulated 2450MHz microwave radiation triggered a small but significant increase in anti-brain antibodies in the blood of rats. In other words, the radiation had sensitised the body’s immune system to one or more components of its own brain, which could then result in an autoimmune attack on the brain and/or nervous system. A well-known example of an autoimmune attack on the endocrine system is Graves’ disease, in which antibodies affect the thyroid gland.
In addition, an increase in the permeability of the gut barrier has been linked to several other autoimmune diseases, including type-1 diabetes, Crohn’s disease, celiac disease, multiple sclerosis and irritable bowel syndrome (Arrieta et al. 2006).
Cell membranes as current generators and electrical insulators
Cell membranes not only keep apart materials that must not be allowed to mix, they also act as electrical insulators for the natural electric currents upon which all of our cells depend.
Natural electric currents are important in power and information transfer
Almost every living cell is a seething mass of electric currents and amplifiers. For example, these currents are important in energy production in mitochondria (the cell’s power stations) and in cell signalling (the transfer of information within and between cells). They are carried as flows of ions, which are the normal ways in which electricity is carried through water and through living cells.
These natural currents are generated by cell membranes.
Natural electric currents are normally generated by molecular ion pumps in cell membranes. These are proteins that use metabolic energy to transport specific ions, usually one or two at a time, from one side of the membrane to the other. This generates a voltage across the membrane (the membrane potential) and a chemical imbalance between the concentrations of ions on either side. Their combined effect gives an electrochemical gradient, which provides energy for other functions.
Mitochondria use electrochemical gradients to transmit power
Mitochondria are tiny structures, about the size of bacteria, inside almost all of our cells. They evolved when an aerobic bacterium, which used oxygen to metabolise its food, was engulfed by an anaerobic organism, which could not do this but was more efficient in other respects. From then on they lived together symbiotically, but they are still separate in that the mitochondria are surrounded by two membranes: the inner one belonging to the original bacterium and the outer one to its host.
The inner membrane does the electrical work by a process known as chemiosmosis. The inside of the mitochondrion contains enzymes that convert materials from our food into forms that can combine with oxygen. This combination with oxygen occurs using enzymes actually within the membrane, and the released energy is used to expel hydrogen ions to create an electrochemical gradient between the inside and the outside of the mitochondrion. They are then allowed back through another enzyme in the membrane called ATP synthase that uses the gradient to make ATP, which is the main energy currency of the cell. The cycle then repeats to give an electrical circuit with hydrogen ions carrying the electricity from where it is made to where it is used, with the membrane being the insulator (Alberts et al. 2002).
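The chemiosmotic circuit described above can be summarised numerically. The proton-motive force across the inner membrane combines the membrane voltage with the pH difference; the values below are typical textbook figures assumed for illustration, not measurements from this article:

```python
# Illustrative calculation of the mitochondrial proton-motive force
# (chemiosmosis). The membrane voltage and pH difference below are
# typical textbook values (assumptions), not figures from this article.
import math

R = 8.314      # gas constant, J/(mol*K)
T = 310.0      # body temperature, K (37 C)
F = 96485.0    # Faraday constant, C/mol

delta_psi = -0.150   # voltage across the inner membrane, V (assumed)
delta_pH = 0.5       # matrix ~0.5 pH units more alkaline (assumed)

# Proton-motive force: dp = delta_psi - 2.303*(R*T/F)*delta_pH
dp = delta_psi - 2.303 * (R * T / F) * delta_pH
print(f"proton-motive force ~ {dp * 1000:.0f} mV")  # roughly -180 mV
```

A proton-motive force of this size is what ATP synthase draws on; a leak anywhere in the membrane dissipates it directly, which is the "short circuit" described in the next section.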
What happens if the mitochondrial membrane is damaged?
Damage to the inner mitochondrial membrane can have two main effects. If it just leaked it would short circuit the system, reduce ATP synthesis and deprive the cell of energy. If the damage were also to include the oxidising enzymes, they could release free radicals, which are normal intermediates in the process. This would damage both the inside of the mitochondrion (including its DNA) and also the rest of the cell. Mitochondrial dysfunction of this sort is thought to be a possible cause of chronic fatigue syndrome.
Other membranes also use ion currents to transfer energy
Most other cell membranes use ion currents as a source of energy. For example, enzymes in the outer membrane of each cell (the plasma membrane) use energy from ATP to pump positively charged sodium ions out of the cell. This generates its own membrane potential, which typically makes the inside of the cell about 70-100mV negative to the outside. This provides energy for the active transport of other materials across the membrane against a concentration gradient. In this case, the sodium ions that have been expelled are allowed back in through transporter enzymes, but they carry with them nutrients from the outside by a process called ion co-transport (Alberts et al. 2002). If this membrane leaks, it will short circuit the voltage across it and reduce nutrient uptake as well as a number of other processes that use this voltage as a source of energy.
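As a rough check on the 70-100mV figure quoted above, the voltage that a single ion species would generate across a leak-free membrane is given by the Nernst equation. The potassium concentrations below are common textbook values assumed here for illustration, not figures from this article:

```python
# Hedged sketch: estimating a resting membrane potential with the
# Nernst equation. Concentrations are typical textbook values for a
# mammalian cell (assumptions, not data from this article).
import math

R, T, F = 8.314, 310.0, 96485.0  # J/(mol*K), K, C/mol

def nernst_mV(z, conc_out_mM, conc_in_mM):
    """Equilibrium potential (inside relative to outside), millivolts."""
    return 1000.0 * (R * T / (z * F)) * math.log(conc_out_mM / conc_in_mM)

# Potassium: ~5 mM outside, ~140 mM inside a typical animal cell
e_k = nernst_mV(+1, 5.0, 140.0)
print(f"E_K ~ {e_k:.0f} mV")  # roughly -89 mV, inside negative
```

The result falls in the range the text quotes, and it makes the point about leakage concrete: the potential exists only because the membrane keeps the two concentrations apart, so any leak drags the voltage towards zero.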
Ion channels in cell membranes are used for cell signalling
Ion channels are pores in cell membranes that can let large quantities of specific ions through very quickly, but only down their own electrochemical gradient. They normally open and close in response to specific stimuli; e.g. changes in voltage across the membrane or the presence of other chemicals. They can be thought of as amplifiers by which a tiny stimulus can cause a very large current to flow almost instantly to give a rapid biological effect. An example of this is the coordinated opening and closing of sodium and potassium channels that continuously amplify nerve impulses and enable them to travel from one end of the body to the other, both rapidly and without loss.
The mechanisms of cell membrane leakage.
We have known since the work of Suzanne Bawin and her co-workers (Bawin et al. 1975) that electromagnetic radiation that is far too weak to cause significant heating can nevertheless remove radioactively labelled calcium ions from cell membranes. Later, Carl Blackman showed that this occurs only with weak radiation, and then only within one or more ‘amplitude windows’, above and below which there is little or no effect (Blackman et al. 1982; Blackman 1990).
The apple harvester: an explanation for amplitude windows
A simple way to explain the selective removal of divalent ions is to imagine trying to harvest ripe apples by shaking the tree. If you don’t shake it hard enough, no apples fall off, but if you shake it too hard, they all fall off. However, if you get it just right, only the ripe ones fall off and are ‘selectively harvested’.
We can apply the same logic to the positive ions bound to cell membranes. Alternating voltages try to drive these ions off and then back onto the membranes with each cycle. If the voltage is too low, nothing happens. If it is too high, all the ions fly off, but return when the voltage reverses. However, if it is just right, it will tend to remove only the more strongly charged ones, such as divalent calcium with its double charge. If the frequency is low, at least some of these divalent ions will diffuse away and be replaced at random by other ions when the field reverses. There will then be a net removal of divalent ions with each successive cycle until enough have been removed to cause significant membrane leakage and give a biological effect, but only within a narrow range of field strength, giving an amplitude window. Pulses are more effective than smooth sine waves because their rapid rise and fall times catapult the ions quickly away from the membrane and leave more time for them to be replaced by different ions before the field reverses.
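The selective "harvesting" argument can be put into a toy model. Everything in it, the binding strengths and the detachment rule, is my own illustrative assumption rather than a quantitative claim from this article; it only shows how a force proportional to charge produces a window of amplitudes that dislodges divalent ions while leaving monovalent ones in place:

```python
# Toy model (illustrative sketch, not from the article) of the
# amplitude-window idea: the electric force on a bound ion scales with
# its charge, so a field strong enough to dislodge a divalent ion may
# still be too weak to dislodge a monovalent one. All numbers are
# arbitrary illustrative units.

def dislodged(charge, binding_strength, field_amplitude):
    """An ion detaches when the peak electric force exceeds its binding."""
    return charge * field_amplitude > binding_strength

CA_CHARGE, CA_BINDING = 2, 1.5  # divalent calcium (arbitrary units)
K_CHARGE, K_BINDING = 1, 1.0    # monovalent potassium (arbitrary units)

for amplitude in (0.5, 0.85, 1.2):
    ca = dislodged(CA_CHARGE, CA_BINDING, amplitude)
    k = dislodged(K_CHARGE, K_BINDING, amplitude)
    if ca and not k:
        regime = "window: selective removal of divalent ions"
    elif ca and k:
        regime = "too strong: everything comes off and returns"
    else:
        regime = "too weak: nothing happens"
    print(f"amplitude {amplitude}: {regime}")
```

In this sketch, only the middle amplitude falls in the window where calcium is stripped off while potassium stays bound, mirroring the narrow range of field strengths described in the paragraph above.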
Frequency windows and resonance effects
If a molecule or structure has a natural resonant frequency, it may respond selectively to that frequency. For example, if you keep giving a pendulum a gentle push at just the right time at the end of its travel, the energy of each push builds up and is stored in the ever increasing violence of its motion. If you were suddenly to stop it by putting your hand in the way, the combined energy of each push is released in one go and could do more damage to your hand than the energy you gave it from each individual push.
In the same way, if an electrically charged atom or molecule has one or more natural resonant frequencies and you give it an electromagnetic pulse at that frequency, it may store the combined energy of each pulse as some sort of vibration. This could enable it to bring about a chemical reaction that would not have been possible from the energy of each pulse alone, but only at its resonant frequency. Some frequencies are especially effective in giving biological effects. An example is 16Hz, which is the ion cyclotron resonance frequency of potassium ions in the Earth’s magnetic field.
Ion cyclotron resonance occurs when ions move in a steady magnetic field such as that of the Earth. They are deflected sideways by the magnetic field and go into orbit around its lines of force at a frequency that depends on the charge to mass ratio of the ion and the strength of the steady field (see Liboff et al. 1990). If they are simultaneously exposed to an alternating field at this frequency, they absorb its energy and increase the diameter of their orbits, which increases their energy of motion and chemical activity. Potassium resonance is particularly important because potassium is the most abundant positive ion in the cytosols of living cells, where it outnumbers calcium by about ten thousand to one. It is therefore the ion most likely to replace any calcium that has been lost by electromagnetic exposure. An increase in the chemical activity of potassium will therefore increase its ability to replace calcium and so increase calcium loss from the membrane and further reduce its stability.
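The 16Hz figure quoted above can be checked against the standard cyclotron-frequency formula f = qB/(2πm). The physical constants are standard; the geomagnetic field varies from place to place (roughly 25-65 microtesla), so the 40 microtesla value used below is an assumption chosen for illustration:

```python
# Back-of-envelope check of the potassium ion cyclotron frequency in
# the Earth's magnetic field. Constants are standard; the 40 uT field
# strength is an assumed illustrative value (the geomagnetic field
# varies roughly between 25 and 65 uT).
import math

Q = 1.602176634e-19   # elementary charge, C
U = 1.66053907e-27    # atomic mass unit, kg
M_K39 = 38.9637 * U   # mass of a potassium-39 ion, kg

def cyclotron_hz(charge_C, mass_kg, field_T):
    """Ion cyclotron frequency f = qB / (2*pi*m)."""
    return charge_C * field_T / (2.0 * math.pi * mass_kg)

f = cyclotron_hz(Q, M_K39, 40e-6)  # K+ in a 40 uT geomagnetic field
print(f"K+ cyclotron frequency ~ {f:.1f} Hz")  # close to 16 Hz
```

So for field strengths around 40 microtesla the potassium resonance does come out near 16Hz, which is consistent with the frequency the text highlights; at stronger geomagnetic fields the figure would be somewhat higher.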
Calcium loss and leaky membranes underlie many biological effects.
We have seen how the loss of calcium from cell membranes is enhanced at the 16Hz potassium resonant frequency. Any metabolic consequences of this calcium loss may be similarly enhanced, so a bioelectromagnetic response that peaks or troughs at 16Hz is evidence that it stems from divalent ion depletion in membranes. In fact, many biological responses do appear to peak at 16Hz. These include stimulations of the growth of yeast (Mehedintu and Berg 1997) and higher plants (Smith et al. 1993), changes in the rate of locomotion in diatoms (McLeod et al. 1987), and the especially severe neurophysiological symptoms reported by electrosensitive people exposed to the radiation from TETRA handsets (which is pulsed at 17.6Hz). All of this supports the notion that a large number of the biological responses to weak electromagnetic radiation stem from the loss of calcium (and possibly other divalent ions) from cell membranes.
How calcium removal makes cell membranes leak
Positive ions strengthen cell membranes because they help bind together the negatively charged phospholipid molecules that form a large part of their structure. Calcium ions are particularly good at this because their double positive charge enables them to bind more strongly to the surrounding negative phospholipids by mutual attraction and hold them together like mortar holds together the bricks in a wall. Monovalent ions, by contrast, are less able to do this (Steck et al. 1970, Lew et al. 1998, Ha 2001). Therefore, when electromagnetic radiation replaces calcium with monovalent ions, it weakens the membrane and makes it more likely to tear and form temporary pores, especially under the stresses and strains imposed by the moving cell contents. Normally, small pores in phospholipid membranes are self-healing (Melikov et al. 2001) but, while they remain open, the membrane will have a greater tendency to leak. This can have serious metabolic consequences as unwanted substances diffuse into and out of cells unhindered, and materials in different parts of the cell that should be kept separate become mixed.
Both extremely low frequencies and radio waves that have been amplitude modulated at extremely low frequencies give biological effects, but unmodulated radio waves are relatively (but not completely) innocuous. This implies that living cells can demodulate a modulated signal to extract the biologically active ELF. Furthermore, if they are to respond to cell phone and WiFi signals, they must be able to do it at microwave frequencies, but how do they do it?
The most likely explanation lies in asymmetric electrical properties of ion channels in cell membranes imposed by the membrane potential between the inside and outside of the cell. They will behave like electrically biased point contact Schottky diodes in which electricity passes more easily in one direction than the other. This is all that is needed to rectify and demodulate the signal. A non-biological example of this effect is a radio set that was made from a single carbon nanotube (see http://tinyurl.com/m4u75o ). The asymmetry induced by applying a DC voltage between its ends allowed it to demodulate and even to amplify radio signals, including those at microwave frequencies.
The nanotube has a similar diameter to a typical ion channel in a cell membrane, so it seems likely that the ion channels in cell membranes could perform a similar function, powered by the cell’s membrane potential. The low-frequency component would then appear across the membrane, where it could do most damage. In as much as our tight junction barriers have a similar trans-barrier potential (around 70mV for the skin barrier with the inside of body positive) the ion channels of the whole barrier could act in concert to demodulate the signal, the damaging low frequency components of which could then be applied to and affect the whole body.
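As a rough illustration of the demodulation idea, the sketch below simulates a one-way conductor (the diode-like behaviour the text attributes to biased ion channels) acting on an amplitude-modulated carrier. All the frequencies are hypothetical, scaled-down values chosen purely to make the simulation practical; a real microwave carrier would behave the same way in principle. Rectification followed by simple low-pass smoothing recovers the slow 16Hz envelope from the fast carrier:

```python
import numpy as np

fs = 100_000                            # sample rate (Hz) - illustrative
t = np.arange(0, 0.5, 1 / fs)
f_carrier, f_mod = 5_000, 16            # hypothetical carrier, 16 Hz ELF envelope

envelope = 1 + 0.8 * np.sin(2 * np.pi * f_mod * t)       # slow ELF modulation
signal = envelope * np.sin(2 * np.pi * f_carrier * t)    # amplitude-modulated carrier

# One-way conduction (half-wave rectification), like a biased diode:
rectified = np.maximum(signal, 0.0)

# Crude moving-average low-pass filter (4 carrier cycles) removes the carrier:
win = 4 * int(fs / f_carrier)
recovered = np.convolve(rectified, np.ones(win) / win, mode='same')

# The recovered waveform tracks the original ELF envelope (scaled by 1/pi)
r = np.corrcoef(recovered[win:-win], envelope[win:-win])[0, 1]
print(f"correlation with original envelope: {r:.3f}")
```

The point of the sketch is only that asymmetric conduction plus the natural sluggishness of biological responses is mathematically sufficient to extract the low-frequency component, which is what the Schottky-diode analogy claims.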
Natural defence mechanisms
The body is able to detect electromagnetic radiation and deploy defences that minimise the resulting damage. This ability probably evolved over countless millions of years to mitigate the effects of ionising radiation from cosmic rays and non-ionising radio frequencies from lightning during thunderstorms. Some of these defence mechanisms are as follows:
Calcium extrusion: - The concentration of free calcium in the cytosols of living cells is normally kept extremely low by metabolically-driven ion pumps in the cell membrane. Under normal circumstances, the entry of free calcium ions is carefully regulated, and small changes in their concentration play a vital role in controlling many aspects of metabolism. These processes can be disrupted if electromagnetically-induced membrane leakage lets extra and unscheduled amounts of calcium into the cell, either from the outside or from calcium stores inside. To compensate for this, the mechanism that normally pumps surplus calcium out can go into overdrive. However, its capacity to do this is limited because, if the pumping were too effective, it would mask the small changes in calcium concentration that normally control metabolism.
Gap junction closure: - If calcium extrusion fails and there is a large rise in internal calcium, it triggers the isolation of the cell concerned by the closure of its gap junctions (protein channels that directly connect the interiors of adjacent cells) (Alberts et al. 2002). This also limits the flow of electric currents through the tissue and so reduces the effects of the radiation.
Ornithine decarboxylase (ODC)
The activation of the enzyme ornithine decarboxylase is triggered by calcium leaking into cells through damaged membranes and by nitric oxide produced by damaged mitochondria. This enzyme leads to the production of chemicals called polyamines that help protect DNA and the other nucleic acids needed for protein synthesis. One such polyamine is spermine, which normally protects the DNA of sperm and is also responsible for the characteristic smell of semen.
Heat shock proteins
These were first discovered after exposing cells to heat, but they are also produced in response to a wide variety of other stresses, including weak electromagnetic fields. They are normally produced within minutes of the onset of the stress and combine with the cell’s enzymes to protect them from damage and shut down non-essential metabolism (the equivalent of running a computer in "safe mode").
When the production of heat shock proteins is triggered electromagnetically, it needs 100 million million times less energy than when triggered by heat, so the effect is truly non-thermal (Blank & Goodman 2000). Their production in response to electromagnetic fields is activated by special base sequences (the nCTCTn motif) in the DNA of their genes which, when exposed to electromagnetic fields, initiate the gene's transcription to form RNA, the first stage in the synthesis of the protein (Lin et al. 2001). The job of these heat-shock proteins is to combine with vital enzymes, putting them into a sort of cocoon that protects them from damage. However, this stops the enzymes working properly and also drains the cell's energy and resources, so it is not an ideal solution either.
Our defences protect us from thunderstorm radiation but not from cell towers, DECT phones and WiFi
As we can see, our natural defence mechanisms try to limit the electromagnetically-induced damage, but they cannot be deployed without using extra energy and disrupting the cell's normal functions. They originally evolved to protect us from occasional weak natural radiation, such as that from thunderstorms. However, prolonged or repeated exposure, such as that from cell towers, WiFi and most DECT base stations (which normally run continuously), disrupts metabolism for long periods and is correspondingly expensive in bodily resources.
These resources have to come from somewhere. Some may be drawn from our physical energy, making us feel tired; some may come from our immune systems, making us less resistant to disease and cancer. There is no hidden reserve. As it is, our bodies are constantly juggling resources to put them to best use. For example, during the day they are directed towards physical activity, but during the night they are diverted to the repair of accumulated damage and to the immune system. Day and night irradiation from cell phone towers (which run continuously) will affect both, with little or no chance to recover. In the long term, this is likely to cause chronic fatigue, serious immune dysfunction (leading to an increased risk of disease and cancer) and many of the neurological symptoms frequently reported by people living close to mobile phone base stations (see Abdel-Rassoul et al. 2007).
How can we make our electromagnetic environment safe?
Firstly, there may be no need to give up our domestic electrical appliances or cell phones. It is possible to make most of them much safer. All that is needed with domestic wiring is low-tech electromagnetic hygiene. As for cell phones, the operators have known for over a decade how to modify the radiated signal to make it safe; they have just chosen not to do so. I will deal with these one at a time.
It is easy to screen the electric field from wiring by enclosing it in earthed metal conduits or by using cable with an earthed screen. We cannot screen the magnetic field in this way, but by careful design of the circuits we can make the magnetic fields of the live and neutral wires cancel each other out. To do this, all you need to do is make sure that the live and neutral wires to any device are as close together as possible (preferably twisted together), with each device having its own connection to the main distribution panel. The cheap UK practice of using ring mains (where many plug sockets are connected in a ring, beginning and ending in the distribution panel) should be made illegal. This is because differences in the resistance of the conductors mean that the electricity flowing to any plug socket may not flow back the way it came, so the magnetic fields do not cancel and there is an unnecessarily high field surrounding the whole ring.
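A rough calculation shows why keeping live and neutral close together is so effective. The field of a single long wire falls off as 1/r, but the net field of a closely spaced pair carrying equal and opposite currents falls off roughly as d/r², where d is the wire separation. The figures below (10 A load, observer 1 m away, wires 3 mm apart) are illustrative values, not taken from the article:

```python
import math

MU0 = 4 * math.pi * 1e-7   # permeability of free space (T*m/A)

def field_single_wire(current, r):
    """Magnetic field (tesla) of one long straight wire at distance r."""
    return MU0 * current / (2 * math.pi * r)

def field_wire_pair(current, r, separation):
    """Net field of live and neutral carrying equal, opposite currents."""
    return field_single_wire(current, r) - field_single_wire(current, r + separation)

single = field_single_wire(10, 1.0)          # lone wire: 2 microtesla at 1 m
pair = field_wire_pair(10, 1.0, 0.003)       # twisted pair ~3 mm apart
print(f"single wire: {single:.2e} T, pair: {pair:.2e} T, "
      f"reduction factor: {single / pair:.0f}x")
```

With these numbers the pair radiates roughly 300 times less field than a lone conductor at the same distance, which is the whole rationale for twisting live and neutral together and for avoiding ring circuits where the return current takes a different path.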
Another source of problems is the use of unearthed double insulated appliances. Although there is very little risk of shock, they still emit strong magnetic fields and electric fields at about half the supply voltage, which some people find intolerable.
While we can block or cancel the electromagnetic fields associated with domestic wiring, we cannot do this with cell phones or DECT phones, which depend on radio-frequency transmissions if they are to work. However, we can make this radiation much less biologically active. There are at least two ways to do this. The first was devised, tested and patented by Theodore Litovitz, working at the Catholic University of America in the 1990s. All you have to do is add low frequency electromagnetic noise to the signal.
The theory behind Litovitz's method
His idea was to add a random ELF (noise) magnetic field to the regularly repeating fields from power lines or cell phones. It works on the principle that most of the biological effects of electromagnetic fields are due to the relatively slow but progressive loss of calcium from cell membranes, which then makes them leak. However, the effect on any cell takes place only within certain amplitude windows, as I described earlier. We may not be able to prevent this leakage just by reducing the power of the field. All this might do is to put other cells (perhaps nearer the source) into their amplitude windows and we may be no better off.
However, if we add a second magnetic field with a randomly varying amplitude, cells are constantly driven in and out of their amplitude windows and never stay in them long enough to lose significant amounts of calcium. Any calcium lost floods back and there is no biological effect. This theory has been tested in several biological systems and found to work.
Much of Litovitz's work used the production of the enzyme ornithine decarboxylase (ODC) by tissue cultures as an indicator of radiation damage to living cells. The activity of this enzyme increases several fold when the cells are exposed to electromagnetic fields (Byus et al. 1987). ODC is part of a defence mechanism against the radiation, so an increase in its production is taken as an indication that damage is occurring. Conversely, if the random signal prevents its production, it is an indication that damage is not occurring.
Work in Litovitz's laboratory was mainly concerned with mitigating the effects of 60Hz power line frequencies. He found that adding a random (noise) magnetic field of about the same strength completely reversed their effects on ODC production in mouse tissue cultures (Litovitz et al. 1994b) and also the deformities induced by 60Hz fields in chick embryos (Litovitz et al. 1994a).
They then went on to study the effect of modulation frequency on the ability of 845MHz microwave radiation to stimulate ODC production in mouse tissue cultures. They found that constant modulation frequencies between 6 and 600Hz were harmful, as measured by ODC production. Simple amplitude-modulated speech (which is more random) did not stimulate ODC production, and neither did frequency-modulated microwaves or frequency-modulated analogue phone signals. Continuous microwaves had only a slight effect.
Most microwave pulse frequencies are harmful
Penafiel et al. (1997), working in Litovitz's laboratory, concluded that there were only serious health problems when the microwaves were modulated to give pulses of a standard height (amplitude) generated at frequencies between 6 and 600Hz. There was virtually no effect above 600Hz. This corresponds to Blackman et al.'s (1988) observation that calcium release from brain tissue did not occur above 510Hz.
It would appear that the mobile telecommunications industry did not do its homework before selecting the pulse frequencies for its digital communications, since they virtually all fall within this biologically active range; e.g. 2G GSM cell phones (217Hz), TETRA (17.6Hz), DECT phones (100Hz), WiFi (10Hz), and 3G UMTS signals with time division duplex (100Hz and 200Hz), all of which are potentially harmful. There could be other harmful effects of the radiation that do not trigger ODC production or calcium release but, at the very least, these pulse frequencies should not have been used if the cell phone industry had exercised due diligence.
However, Litovitz (1997) found that even these could be made safe by superimposing a low frequency magnetic field on the signal. He found that it prevents the production of ornithine decarboxylase (ODC) by mouse tissue cultures in response to digital cell phone signals. For example, a random field between 30 and 100Hz with an RMS strength of 5 microtesla completely inhibited the ODC production induced by a cell phone signal with an SAR of about 2.5 W/kg. A coil within the handset could easily deliver a random magnetic field of this magnitude and would probably protect the user from the harmful effects of its radiation.
Similarly, Lai (2004) showed that a 6 microtesla random noise field completely reversed the deleterious effect on rat memory of 2450 MHz continuous waves with an SAR of 1.2 W/kg. In none of the above experiments did the random noise have any effect in its own right and, by these criteria, it is harmless in itself.
Balanced signal technology
While Litovitz's method might protect the user from the radiation, magnetic fields dissipate rapidly as you move away from the source, so it may not protect other people nearby who are out of range of the protective random field. By the same token, random low frequency magnetic fields emitted by a cell phone base station would not be able to protect most users. For this you may need something like a system that I devised myself, to which I gave the name "Balanced Signal Technology". I am not claiming any patent rights and anyone who wants to test and use it can do so free of charge.
The principle is very simple and involves transmitting two complementary mirror-image signals on different carrier frequencies; i.e. when one has a pulse, the other has a gap. The base station would have no problem with this since they would look like two separate phone calls. However, living cells would be unlikely to distinguish between the two carrier frequencies, so the pulses on each would cancel and the combination would look like a relatively harmless continuous wave. It would need very little extra bandwidth since only one signal of each pair needs to carry the call; the complementary signals are effectively thrown away and could all be dumped on the same frequency. In theory, this technology could be applied to both handsets and base stations, but it has not yet been tested.
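The cancellation can be sketched numerically. In this toy model (the sample rate and record length are illustrative choices), one envelope pulses at 217Hz, the GSM rate mentioned above, and its mirror image fills the gaps. Individually each envelope carries a strong spectral line at the pulse frequency, but their sum is perfectly flat, i.e. the biologically active ELF component vanishes:

```python
import numpy as np

fs = 10_000                       # samples per second (toy model)
t = np.arange(0, 1.0, 1 / fs)
f_pulse = 217                     # GSM-like pulse rate from the article (Hz)

# Envelope A: 1 during each burst, 0 in the gaps (50% duty square wave)
a = (np.sin(2 * np.pi * f_pulse * t) > 0).astype(float)
b = 1.0 - a                       # complementary mirror-image envelope

# Envelope A alone has a large spectral line at the pulse frequency...
freqs = np.fft.rfftfreq(len(t), 1 / fs)
spectrum_a = np.abs(np.fft.rfft(a))
line_a = spectrum_a[np.argmin(np.abs(freqs - f_pulse))]

# ...but the summed envelope is constant, so the ELF component cancels
combined = a + b
print(f"217 Hz line in A: {line_a:.0f}, ripple in A+B: {np.ptp(combined)}")
```

This only models the envelopes, of course; the claim in the text is that cells, unable to resolve the two different carriers, would effectively see the sum.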
The cell phone companies should know about both methods of making cell phones safer, but there is no evidence that they are interested, possibly because implementing them would cost money with no extra benefit to themselves. It looks very much as if they would prefer many people to become sick and perhaps die, rather than admit that their safety rules are based on false premises and that their current technologies are not yet safe.
What can we do about it ourselves?
Very few people would want to give up their cell phones, but if you have one, for your own personal safety, keep your calls on it short and infrequent so that your body has a chance to recover in between times. Use text (which takes seconds to transmit) rather than voice calls and avoid unnecessary Internet downloads. The choice is yours, but spare a thought for the people living near the base stations. Some may be badly affected by their continuous radiation but they have no choice. Your cell phone calls will contribute to their problems, so your restraint may help them too.
Also, don’t forget your own personal sources of continuous radiation such as WiFi routers and DECT phone base stations, which can be even more harmful since they are closer. Avoid using WiFi altogether. Ethernet connections via cable are not only safer, but faster, more reliable and offer greater security. Various “Homeplug” devices that connect the Ethernet socket of your computer to the router via the household electricity supply are a second-best alternative. They are not perfect since there is still some radiation from the wiring, especially with those offering faster speeds.
DECT phones should also be avoided if at all possible. But, if you must have one, a reasonable compromise is to use only one that switches off its base station automatically between calls. At the time of writing, the only DECT phones that do this are the Eco Plus models manufactured by Siemens; e.g. the Siemens Gigaset C595. However, make sure they are programmed to work in the Eco Plus mode since this is not the default setting.
Screening and its limitations
Many electromagnetically intolerant people will want to screen themselves from the fields but we need to understand a little about them to get the best results.
An alternating electromagnetic field consists of an electric field and a magnetic field. The electric field is produced by a voltage gradient and is measured in volts per metre. The magnetic field is generated by a flow of current and is measured in tesla. When you are close to the source (typically within one wavelength) you are in the near-field, where the electric and magnetic fields are mainly separate.
At power line frequencies, the wavelengths run into thousands of miles, so you are bound to be in the near field for power lines. For example, standing under an alternating power line would expose you to a voltage gradient due to the difference between the voltage of the line (set by the power company) and the Earth. You would also be exposed to a magnetic field proportional to the current actually flowing through the line, which depends on consumer demand. Both the magnetic and the electrical fields can induce electric currents in your body and are potentially harmful, but the magnetic field is worse because it penetrates living tissues more easily, goes through most walls and aluminium foil as if they were not there, and is very difficult to screen.
The far field
However, as you move away from the source, the two fields feed on each other’s energy and combine to give photons of radio waves. This is usually complete within a few wavelengths, after which you are in the so-called far-field, where all the power takes the form of radio waves. Your exposure to these is usually measured in units of power density (e.g. microwatts per square metre) or as the associated voltage gradient (e.g. volts per metre).
The importance of this as far as we are concerned is that radio waves are like light waves and are relatively easy to absorb and reflect. This can be done using earthed metal foil or other electrically conductive materials such as carbon-based paints and metallised textiles. For practical purposes, this means that you can screen yourself against the radiation from a cell tower, WiFi router, or DECT phone base station if they are several wavelengths away (several tens of centimetres) but not from a cell phone held against your head, where you are in the near field and the raw magnetic component will penetrate deep into your brain.
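The near-field/far-field distinction comes down to wavelength, λ = c/f. A quick calculation (the frequencies are typical examples, not values from the article) shows why you are always in the near field of mains wiring, but can be several wavelengths from a WiFi router within a metre or so:

```python
# Wavelength determines whether you are in the near or far field of a source.
C = 299_792_458.0   # speed of light in vacuum (m/s)

def wavelength(freq_hz):
    """Free-space wavelength in metres for a given frequency in hertz."""
    return C / freq_hz

for name, f in [("50 Hz mains", 50.0),
                ("900 MHz GSM band", 900e6),
                ("2.45 GHz WiFi band", 2.45e9)]:
    print(f"{name}: wavelength = {wavelength(f):,.3f} m")
```

At 50Hz the wavelength is about 6,000 km, so no one on Earth is ever in the far field of a power line, whereas at 2.45GHz it is about 12 cm, so "several wavelengths" is well under a metre.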
To give an idea of the hazard, magnetic fields lower than one microtesla (a millionth of a tesla) can produce biological effects, yet using a 2G (GSM) cell phone or a PDA exposes you to low frequency magnetic pulses that peak at several tens of microtesla (Jokela et al. 2004; Sage et al. 2007). These come mainly from the battery circuits and are well over the minimum needed to give harmful effects. Added to the damaging effects of the microwave fields themselves, this makes such devices potentially the most dangerous sources of electromagnetic fields and radiation that the average person possesses.