No more bipolar disorder?

Our world is replete with diseases of all sorts, illnesses of all kinds, ailments countless in number. Modern medicine views these in isolation, and therefore also attempts to treat them in isolation: we have a headache, we take an aspirin; we have high blood sugar, we take insulin injections; we have high cholesterol, we take statin drugs to disrupt the manufacturing of cholesterol in the liver; we have cancer, we are given toxic poisons that kill our cells in the hope that the cancer will be weakened; we have arthritis or multiple sclerosis, and we are given immune suppressants because it is thought that our own immune system has turned against us, attacking the very body it is intended to protect. We have no idea why, but this is what we do, and this is also what we believe we should be doing.

In psychiatry, we treat so-called mental illnesses. But because we are even more clueless about the subtle functioning of the brain and mind than we are about the subtle functioning of the body and its organs, we look for drugs that suppress the behaviours which are symptomatic of the “illness” we have been diagnosed with. It’s very simple: we take uppers and stimulants when we are down and low, and downers and sleeping pills when we are high and excited. Because we all do it, we think it’s perfectly normal.

When we take a close look, we see that there are no diseases, no illnesses, no ailments that are not caused by biochemical imbalances; we see that all of our health problems are rooted in problems in the biochemistry; and we see that the functioning of the body and the functioning of the mind cannot be considered independently, because they are both nothing other than the functioning of the whole body-mind.

Surely a most striking example of this is the now almost forgotten disease condition called pellagra. The name comes from the contraction of the Italian pelle (skin) and agra (sour), and was first used by Francesco Frapolli, who treated sufferers in Italy in the 1770s; by the late nineteenth century, more than 100,000 Italians were afflicted. But this wasn’t unique to Italy. The same was true in Spain and in France in the late 19th century. In the US, it reached epidemic proportions in the American south, where it was estimated that between 1906 and 1940, more than 3 million people were affected, and more than 100,000 actually died from it. Can you imagine that? That many people, millions of people, in quite a restricted region, walking around in manic states, delusional states, paranoid states, seeing and hearing things, talking or even yelling to themselves and others around them, completely incoherent and, in addition, covered in red, sore, flaking and bleeding skin on their arms, neck and face? What a nightmare it must have been.

In all countries and all cases, pellagra was associated with poor nutrition, and more specifically, associated with corn-based diets in which the maize was not treated with lime in the traditional way. Similarly, in all countries and all cases, it was found that a nutritious diet based on fresh animal foods very quickly resolved the problems that afflicted the sufferers of this disease. So, even in the late nineteenth century, they had figured out how to treat and prevent it. But they didn’t know why people got better when the corn and starches were replaced with meats and vegetables.

Pellagra would usually first manifest as skin problems: eczema- and psoriasis-like irritations and lesions. Then, it brought about anxiety, depression, irritability and anger. And eventually, periods of full-blown mania, visual and auditory hallucinations, extreme fear, paranoia, bipolar and schizophrenic behaviours.


Now, if you know someone, if you have been close to someone diagnosed with bipolar disorder, with schizophrenia, with anxiety disorders, depression, or paranoia, you will immediately recognise in this list of symptoms those you saw in this person, surely to different degrees, and surely at their most extreme during a full-blown crisis. Without a doubt, at least for bipolar disorder, these symptoms are all present, often simultaneously, and sometimes in close succession. And do you know what pellagra is? It’s vitamin B3 deficiency.

Yes, pellagra, this terrible disease that caused such awful skin conditions and outright madness in people, this disease that made these poor people act in ways indistinguishable from those of manic-depressives and schizophrenics, was a simple vitamin B3 deficiency.

When this was understood, niacin fortification was mandated, and the epidemic affecting millions of people in the southern United States was resolved almost instantly. After decades of rampant “mental illness” among so many, so much fear, so much anxiety, so much terror within families and communities, so much pain and suffering, and more than a hundred thousand deaths, a little added niacin ended this national disaster that was the pellagra epidemic almost overnight. The fact that you have most likely never heard of pellagra goes to show how effective niacin fortification has been in preventing it. But something else happened.

Following the introduction of niacin fortification, half the patients held in psychiatric wards were discharged. Just like that, they got better, and went home. There was at least one psychiatrist who noticed this remarkable coincidence: his name was Abram Hoffer. He wondered why so many got better, but also why only half? What about the other half? Could it be that they just needed a little more niacin? Hoffer was an MD, a board-certified psychiatrist, and a biochemistry PhD. He was also the Director of Psychiatric Research for the province of Saskatchewan in Canada, a position he held from 1950, when he was hired and appointed by the department of public health, until 1967, when he opened a private practice.

What he did to check this hypothesis—that maybe more of the psychiatric patients were not mentally ill at all, but just in need of greater amounts of niacin—was to conduct a study. He chose schizophrenics because they are among the most difficult to treat, and also because, together with bipolar patients, they have many of the symptoms associated with pellagra. The results were stunning: 80% of the schizophrenics given B3 supplementation recovered. And these results aren’t anecdotal—the word often used in a pejorative or derogatory manner to dismiss important observations or evidence that fall outside the narrow realm of the conventionally accepted. These were the results of the first double-blind placebo-controlled nutrition study in the history of psychiatry.

What double-blind placebo-controlled means is that he took a group of people diagnosed with schizophrenia and randomly split them into two equally sized arms, blindly both on the patients’ end and on his own, giving half of them 3000 mg of flush-less niacin per day in three doses (niacin has a flushing effect that would be noticed, but either inositol hexanicotinate or niacinamide can be used instead). He gave the other half a placebo: a pill that looked identical, but contained no niacin or anything else that could have any significant effect on them (like powdered sugar or a starch of some kind). At the end of the trial, when they looked at which patients got what, they found that 80% of the niacin-treated patients had recovered, whereas none in the placebo group showed significant improvement.
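
To make the mechanics of randomisation and blinding concrete, here is a minimal sketch in Python. It is purely illustrative—not Hoffer’s actual protocol—showing only how patients can be randomly assigned to coded arms that neither they nor the assessing clinician can decode until the trial ends:

```python
import random

def assign_double_blind(patient_ids, seed=42):
    """Randomly split patients into two equally sized, coded arms."""
    rng = random.Random(seed)
    ids = list(patient_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    # Coded labels keep both patients and assessors blind; a third party
    # holds the key saying which code means niacin and which means placebo.
    return {"A": ids[:half], "B": ids[half:]}

arms = assign_double_blind(range(1, 31))
print(len(arms["A"]), len(arms["B"]))  # 15 15 -- two equally sized groups
```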

Over the years, Hoffer treated thousands of people with remarkable success. With simple vitamin B3 supplementation he continued to successfully treat people suffering from schizophrenia, but also people suffering from attention deficit disorder (ADD), general psychosis, anxiety, depression, obsessive-compulsive disorder (OCD), and bipolar disorder. In fact, he considered pellagra, bipolar disorder, and schizophrenia to be the manifestation of niacin deficiency on different scales, and the sufferers to be niacin-dependent to different extents. Obviously, this is the only natural conclusion he could have drawn given how effectively niacin resolved psychiatric symptoms in these people, but also in light of the fact that each individual seemed to need somewhat different amounts to have these positive effects.

The expression niacin-dependent was used to emphasise that they needed to take it on a daily basis. Naturally, an essential vitamin is not only essential in the sense that it is absolutely needed, but also in the sense that it needs to be consumed regularly because it is not manufactured within the body-mind. Deficiencies develop when the diet lacks these essential nutrients, and grow more severe as time goes on. When the nutrients are then reintroduced, the deficiencies can be corrected. Some nutrients are abundant, some are rare. Some are easily absorbed, some are not. Some are more easily stored, and some cannot really be stored at all.

In addition, in any given population there is always—for the very same essential nutrient—a range of nutritional needs that varies between individuals, based both on their genetic predispositions and on what they do. Beyond this, countless other factors influence and affect the amounts of essential nutrients that each of us needs to be healthy. These include the various kinds of injuries the body-mind, and in particular the gut where absorption of nutrients takes place, may have incurred at one point or another: an infection, a virus, a bacterium, a bad diarrhoea we had when we were babies, a childhood disease we don’t even remember, and really anything that could have damaged a specific part of the intestine where a specific family of nutrients is absorbed.

Any such injury could result in a greatly increased need for a particular nutrient that, without our knowing it, could not be supplied in adequate amounts from diet alone. The result would inevitably be a progressively more severe deficiency whose effects on the body-mind would eventually appear as dysfunctions with physical as well as psychological or psychiatric manifestations. Why? Because there is no body that functions independently of the mind, and there is no mind that functions independently of the body. There is only this single body-mind.

Niacin and B vitamins in general are water-soluble. This means that we pee most of them out, and therefore it also means that we need to take them every day, or nearly every day, in order to prevent the development of deficiencies. The experience from the last decades of the nineteenth and the first five decades of the twentieth century in Spain, Italy, France, and in the US showed that a single vitamin deficiency, a simple niacin deficiency, could cause extreme symptoms that included severe psychiatric dysfunctions. It also showed that even very small amounts of B3 added to the otherwise nutrition-less white bread that was eaten as a staple could cure millions of pellagra sufferers, and prevent the disease from developing in the bulk of the population.
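
To see why daily intake matters for a water-soluble vitamin, here is a minimal sketch of first-order elimination. The one-day half-life is an assumed, illustrative number, not a measured pharmacokinetic value:

```python
def residual_fraction(days, half_life_days=1.0):
    """Fraction of a dose still in the body after `days`, assuming
    simple first-order (exponential) elimination."""
    return 0.5 ** (days / half_life_days)

# With an assumed ~1-day effective half-life, a single dose is mostly
# gone within a few days -- hence the need for daily intake.
for d in (1, 2, 3, 7):
    print(d, round(residual_fraction(d), 3))  # 0.5, 0.25, 0.125, 0.008
```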

Unexpectedly, niacin fortification coincided with a large number of psychiatric ward patients getting well enough to go home. This observation prompted a study of niacin supplementation, which showed that in 80% of the schizophrenia patients treated with niacin, symptoms disappeared just as they had in pellagra sufferers, only with higher doses of niacin. A similarly high cure rate was seen in people suffering from ADD, psychosis, anxiety, depression, OCD, and, the point we want to emphasise in this article, bipolar disorder. In almost all cases, niacin supplementation resolved the dysfunctional behaviours and psychiatric symptoms. What varied were the amount of vitamin B3 needed to achieve recovery, and the speed with which symptoms would come back upon interruption of the supplementation.

Therefore, whether you are among the lucky people who never were niacin deficient, among the lucky people who need very little niacin, or among the less lucky ones who are deficient, who do need more of it than most, or who are suffering from anxiety or depression, schizophrenia or bipolar disorder, doesn’t it make sense to just start taking a little bit of extra B3 each day? Doesn’t it make sense to give your body-mind the amount of vitamin B3 it needs, recognising that for each one of us this amount may be different, and that for some it will be a lot more than for others, while resting in complete assurance that no ill effects will come from it, because niacin supplementation is harmless even in large doses, and that its only disadvantage is that we need to take it daily?

And given how inexpensive any form of niacin is, shouldn’t we be giving it in large amounts to every patient in every hospital, psychiatric ward, and medical institution? We should, but this will probably never happen. What we can do is take care of ourselves, of those people closest to us like our children and spouses, siblings and parents, of those people we care about like our friends, colleagues, and even simple acquaintances who come to us for advice or just to share their concerns about a health issue. And one of the simplest and most effective things we can do to improve our own health and the health of those around us is to take a little B3 supplement every day. It could just make you feel more relaxed, more focused, calm and at ease, as it does for me, or it could completely transform your world, bringing you from a state of hyper-anxious, paranoid, delusional and hallucinatory mania back to a relaxed, helpful and trusting, conscientious and reasonable self, giving you the gift of your own life back to yourself.

Could it really be this simple and this amazingly miraculous? No more pellagra, no more schizophrenia, no more bipolar disorder, just with a little B3 supplementation? Well, maybe. You try it, and let me know.

Vitamin C is not vitamin C

Several years ago now, when I read The Calcium Lie, I found out that vitamin C and whole food vitamin C complex were not the same thing. I wasn’t surprised in the least, because this is obviously the case for most supplements: an extract is not the whole food. But a few days ago, I saw a short video presentation that forced upon me the realisation that there is a huge functional difference between what is sold as vitamin C and the vitamin C complex we find in whole foods.


The distinction may seem trivial at first—it has on the whole clearly been missed—but it is rather important: ascorbic acid, which has been equated with and sold as vitamin C, is only the substance that forms the thin antioxidant shell protecting the many constituents of the vitamin C complex as it is found in food. Ascorbic acid can be produced in a lab, whereas the whole vitamin C complex can only be extracted from real food. So this is naturally what has been done: manufacture ascorbic acid and sell it as vitamin C.

This makes sense, of course, because none of the supplement manufacturers would be inclined to emphasise this point. It would be kind of like shooting themselves in the foot. But also because, given the proven biochemical and physiological value of antioxidants, it’s not a far stretch to convince oneself that the usefulness of vitamin C is, in fact, derived from the effects of the ascorbic acid shell. For this reason, when I read Dr Thompson’s comments on vitamin C, I made a point to pile on the red peppers, broccoli and lemons in our diet at home, but nonetheless kept on taking ascorbic acid supplements, and do to this day. This is about to change.

Dr. Darren Schmidt is an American chiropractor who works at the Nutritional Healing Center of Ann Arbor and, like most chiropractors, practices natural medicine, treating thousands of patients each year, most of them suffering from the same kinds of complaints, aches, pains and disorders as everywhere else. The talk was about heart disease: the number one killer in the US and very prominent in all industrialised countries. To make it as clear and simple as possible and get the message across, he described how heart disease arises from the gradual filling of the coronary arteries supplying blood to the heart with arterial plaques that in time grow to block the way completely or almost completely, and how this leads to a heart attack. We covered this topic in detail in the article At the heart of heart disease.

The main point he wanted to get across is that plaques in the arteries and blood vessels develop because of an injury to the tissues lining the vessels, just like a scab does on the surface of the skin when we accidentally scratch, scrape or cut it, and that a well-functioning organism will fix that injury in the same way as it does the surface of the skin: the scab forms, the skin repairs itself underneath, and when it is healed, the scab falls off. Plaques are like scabs.

He explained that, fresh out of university in the early 1990s, he had heard someone at a conference speak of the work of a great pioneer in nutritional medicine of the first half of the twentieth century, Dr Royal Lee, a friend and colleague of the other great pioneer, Dr Weston Price. Dr Lee was the man who made the first food supplement, and the first concentrated whole food vitamin C supplement. In 1929, he founded the Vitamin Products Company, which later became Standard Process, Inc. Lee taught that this concentrated food in tablet form was like a pipe cleaner for arteries. Hearing this, the young chiropractor thought to himself: if it worked then, it should work now. And he began to prescribe vitamin C to all his heart disease patients. For a decade he prescribed vitamin C, and for a decade he failed to see significant improvement or any sign of reversal of atherosclerosis in his heart disease patients. But he had missed something.

Frustrated and disappointed, he looked again at the original research and writings of Drs Lee and Price about nutrition and disease, and in particular about vitamin C, and began prescribing only Standard Process vitamin C. What he found, invariably, was a quick improvement in his patients, whose chest pains and complaints would disappear, and who would gradually feel better and better. Since then, he has repeated this on thousands of people with such success that he now teaches, he now repeats what Dr Royal Lee taught almost a century ago: that the cure for heart disease, for disease of the arteries and atherosclerosis, is vitamin C. And that vitamin C is not ascorbic acid, but whole food vitamin C complex.

Schmidt is neither handsome nor charismatic. He does not speak eloquently. He is far from refined in his choice of words and speaking style. He doesn’t come across as a brilliant doctor or scientist, and not even as a bright guy, really. But the clinical experience and observations on which his statements and claims are based are undeniably impressive and clearly unambiguous in the information they convey: ascorbic acid has no effect on healing injured tissues and in allowing for the body to clean up and remove the plaques from the arteries and blood vessels; whole food vitamin C complex does, and it does so remarkably well and efficiently in everyone who takes it.

The implication is that other than providing antioxidant effects, ascorbic acid is useless for aiding and promoting healing of tissues. In this case, the concern is the health of the arteries, but it’s not a far stretch to conclude that this applies to all injured tissues in general. What is needed is whole food vitamin C complex, which we eat in whole foods or take in supplements that are made from whole foods. Therefore, it’s a no-brainer: if you are interested in keeping your arteries clean and your heart and brain healthy and well-functioning for as long as possible, take a whole food vitamin C complex supplement, and pile on the vitamin C rich foods in your diet (superfoods highest in vitamin C include Camu Camu, Acerola and Goji; regular foods highest in C include bell peppers, broccoli, brussels sprouts, strawberries and kiwi).

There is one last crucial point to this story, and I was happily surprised to hear it mentioned during the presentation. It is something that is explained by Gary Taubes in Good Calories, Bad Calories, but that is very rarely heard or mentioned anywhere. Vitamin C enters cells through the same channel as sugar does. But for evolutionary reasons, glucose always takes precedence over it (and all other nutrients). Therefore, as long as there is sugar to be shuttled into the cell, vitamin C stays out and waits: it does not enter the cell. So, what does he suggest for the diet? Can you guess? No sugars (simple carbohydrates), no starches (starchy carbohydrates) because they become sugars, lots of fat, adequate protein from healthy animal sources, and lots of green veggies. Sounds familiar? And, of course, whole food vitamin C concentrated in supplement form.

Finally, I promise to write about these and other great pioneers of nutritional medicine. I feel that these people, who were greatly ahead of their times and usually suffered greatly for it, deserve more recognition than they get. They deserve more recognition than they ever will get. But still, I would like to do my part. I don’t know when, but I will.


The colour of your skin

Skin colour is the most obviously visible manifestation and expression of our evolutionary history, a history carried over hundreds of thousands of generations spanning millions of years. What we have to understand is that each one of us—as an individual, a person, a self—has nothing to do with the colour of our skin, the colour of our skin has nothing to do with us, and we have no choice in the matter. What we must also understand is that to be optimally healthy, we have to live and eat in accordance with the colour of our skin and the information it carries about our ancestry. All of this is true for you, and it is true for everyone of every colour in the magnificent spectrum of human skin colours as it exists on the planet today. Let me explain why.


The Sun, like every other star in the universe, formed from the gravitational collapse of a huge cloud of gas. This happened about 5 billion years ago. All the planets, like every other planet everywhere in the universe, formed from the leftover debris that wasn’t needed or used in making the Sun, and that remained orbiting around it in a large, flat accretion disk consisting of 99% hydrogen and helium gas and only 1% solid dust particles. In the blink of an eye, a million years or so, the disk was replaced by a large number of planetesimals. An additional couple hundred million years or so, and the planets of our Solar system were formed.

Beyond the snow line—the radius from the Sun past which water can only exist as ice and where the temperature is below −120°C—volatiles froze into crystals, and around massive icy cores formed the gas giants: Jupiter (the king, at 320 times the mass of the Earth), Saturn, Uranus and Neptune. Within the snow line formed the rocky planets: Mercury, Venus, Earth and Mars. About 4.5 billion years ago the Solar system was in place. In place, but not quite as we know it today. It was fundamentally different in several ways, especially in regards to what concerns us here, which is how the Earth was: a fast-spinning burning inferno of molten rock spewing out of volcanos everywhere and flowing all over the globe, completely devoid of water, oxygen, carbon and other volatile species.

The Earth formed more or less simultaneously with a very close neighbour about the size of Mars. Inevitably, soon after their formation, they collided. This apocalyptic encounter tilted the Earth off its original axis and destroyed the smaller planet, which, in the collision, dumped its iron core into the Earth and expelled about a third of our planet into the atmosphere. Most of the stuff rained back down, but some of the material clumped into larger and larger lumps that eventually became the moon, our moon. When it formed, the moon was a lot closer—it would have looked twice as large as it does now—and the Earth was spinning approximately five times faster than it does today: a day back then would have lasted only 5 hours. Because of the proximity between them, huge tidal forces would have deformed the liquid Earth on a continuous cycle driven by its super short 5-hour days. This would have heated the Earth tremendously, squeezing its insides first from one side and then from the other, and caused massive volcanic activity all over the globe.

But this inelastic gravitational interaction, this drag of the moon on the Earth, worked, as it still does, to sap rotational energy from the Earth and transfer it to the smaller and far less rotationally energetic moon. This made, and continues to make, the Earth slow down and the moon speed up, and therefore drift out into a progressively larger orbit. The moon’s drag on the Earth continues to make the Earth’s spin slower and the moon’s orbit larger, but at an increasingly slower rate, currently 3.8 cm per year. This will continue until there is no more rotational energy to be transferred from the Earth to the moon, at which point we will be tidally locked with the moon: not only will we always see the same side of the moon as we do today, but the moon will also always see the same side of the Earth. For what it’s worth, this will happen well after the Sun has come to the end of its life, more than 5 billion years from now. So, for now, this is definitely not a major issue.
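
To put that 3.8 cm per year in perspective, here is a quick back-of-the-envelope calculation—assuming, unrealistically, that the rate stays constant—showing just how slow this drift is:

```python
recession_cm_per_year = 3.8      # current measured rate of lunar recession
years = 1_000_000_000            # one billion years

# Convert centimetres to kilometres over a billion years.
drift_km = recession_cm_per_year * years / 100 / 1000
print(drift_km)  # 38000.0 km -- only about 10% of today's ~384,400 km orbit
```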

Besides this important difference in the Earth’s spin rate and its relationship with the moon, there were a lot of leftovers from the Sun’s formation, clumped up in asteroids and comets whirling around in all sorts of regular and irregular orbits that had them sweeping across the Solar system from the furthest reaches and most distant places to the inner regions near the Sun and the rocky planets. The Late Heavy Bombardment lasted for a period of approximately 500 million years, from about 4.3 to 3.8 billion years ago. During this tumultuous early history of our Solar system, many of these asteroids and comets flying past the Earth and the other rocky inner planets were gravitationally captured and pulled in towards the planet to crash on the surface or just swoop down into the atmosphere, leaving behind all or some of their mostly volatile constituents: water and carbon compounds. The Earth would have been regularly bombarded by massive asteroids, and the energy dumped by the impacts would have made it a hellish place covered in flowing lava, obviously without any crust: only molten rock flowing everywhere, and volcanos spewing out noxious gases and spilling out more molten rock that merged into the already flowing streams of lava. Very inhospitable.

But with these brutal hundreds of millions of years of bombardment from asteroids and comets, water and carbon compounds were brought to our planet. Given how hot it was, the water was in the atmosphere as vapour, and so were the carbon monoxide and dioxide as well as methane. However, these were now bound to the planet gravitationally and couldn’t escape back into space. Once the bulk of the randomly orbiting solar system debris had been cleared out and incorporated into the various planets onto which they had fallen, the bombardment came to an end, and the Earth started cooling down. It is believed that the last major sterilising impact would have hit the Earth around 3.9 billion years ago.

Cooling over a few thousand years allowed the formation of a thin crust. Further cooling then brought on thousands of years of rain that dumped most of the water vapour from the atmosphere onto the surface. This formed vast planet-spanning oceans. The whole planet was at this point still super hot, but also super wet, and therefore super humid, with the surface practically entirely underwater, lots of active volcanos all over the place, but otherwise no mountains. Nevertheless, there would have been some slightly more elevated places, like the flanks of volcanos, that would have been dry at least some of the time, leaving some spots where water could accumulate in ponds and stagnate. As soon as these conditions were present, around 3.8 billion years ago, the Earth saw its first microbial life emerge.

Claims for the earliest evidence of life at 3.8, 3.7 or 3.5 billion years are still controversial, but it is well established that hydrogen cyanide dissolved in water produces a diversity of essential biological molecules like urea, amino acids and nucleic acid bases; that formaldehyde in slightly alkaline water polymerises to form a range of different sugars; that amino acids, sugars and nucleic acid bases as well as fatty acids have been found in carbonaceous meteorites; and that by 3 billion years ago, prokaryotes (organisms made of cells without a nucleus) were widespread.

There was a major problem, a major impediment to life, that had to be overcome. This was the fact that the entire surface of the Earth was exposed during the day to the Sun’s UV radiation, and UV rays destroy biological structures and DNA. The cleverest of tricks would have been to find a way to absorb these energetic photons and use the energy for something.

Nature is very clever: by 3.5 billion years ago, chlorophylls, believed to have developed in order to protect the proteins and DNA of early cells, had appeared, and chlorophyll-containing cyanobacteria—the oldest living organisms, and the only prokaryotes that can do this—had developed the ability to absorb light, use that energy to split water molecules, and use the freed electron from the hydrogen atom to sustain their metabolism, spewing out the oxygen in the process. Oxygen accumulated in the crust for a billion years before the crust became saturated with it and unable to absorb any more. Evidence for increasing oxygen levels in the atmosphere is first seen at around 2.5 billion years ago. By 2.2 billion years ago, oxygen concentrations had risen to 1% of what they are today.

Increasing concentrations of reactive and corrosive oxygen were devastating for all forms of life which, at this stage, were all anaerobic: the oxygen was combining with everything it came in contact with, creating all sorts of reactive oxygen species (free radicals) that went around causing damage, exactly as they do in our bodies and those of all animals today, and which, in the absence of antioxidants to neutralise them, accelerated ageing and death. This was the only hand that these simple anaerobic organisms were dealt.

Nevertheless, for another reason entirely, atmospheric oxygen was a blessing because it turned out to be an excellent UV shield. Not only that, but the splitting of oxygen molecules (O2) into oxygen atoms promoted the recombination of these free-floating oxygens into ozone (O3), which turns out to be an even better UV-absorbing shield. So, the more photosynthesis was taking place on the surface, the greater the concentration of atmospheric oxygen grew. The more molecular oxygen there was in the atmosphere, the more ozone could be formed. And the more ozone there was to protect and shield the surface from the harsh UV radiation from the Sun, the more complex and delicate structures could develop and grow. Pretty cool for a coincidence, wouldn’t you say?

By 2 billion years ago—within 200 million years—the first eukaryotes appeared (organisms made of cells with a nucleus). This makes good sense considering that these simple organisms and independently-living organelles had a great survival advantage in getting together in groups to benefit from one another and protect each other behind a membrane, while making sure the precious DNA needed for replication and proliferation was well sheltered inside a resilient nucleus. Note here that they would have been trying to protect themselves both from the damaging UV radiation streaming down from the Sun (it’s estimated that DNA damage from UV exposure would have been about 40 times greater than it is today), and from the corrosive oxygen floating in the air (imagine how much more oxidising it is today, with concentrations 100 times greater than they were). And in there, within each of these cells, there were chloroplasts—direct descendants of the first UV absorbers and converters, the cyanobacteria—whose job was to convert the photons from the sun into useful energy for the cell.

In all likelihood unrelated to this biological and chemical evolution of the Earth’s biosphere and atmosphere, a long period of glaciation between 750 and 600 million years ago transformed the planet into an icy snow and slush ball. And with basically all water on the surface of the globe frozen over, and all organisms under a thick layer of ice and snow, photosynthetic activity must have practically or completely ceased. Fortunately, without liquid water in which to dissolve the atmospheric carbon dioxide into the carbonic acid that in turn dissolves the silicates in the rocks over which it streams, carrying them down to the ocean floor for recycling by the active tectonic plates, all the carbon dioxide sent into the atmosphere by the volcanos simply accumulated. It is believed to have reached a level 350 times higher than it is now. This is what saved the planet from runaway glaciation.

Thanks to this powerful greenhouse of CO2, the ice and snow eventually melted back into running streams and rivers, and flowing wave-crested seas and oceans. With water everywhere and incredibly high concentrations of CO2, plant life exploded. And soon after that, some 540 million years ago, complex animals of all kinds—molluscs, arthropods and chordates—also burst into existence in an incredible variety of different body plans (morphological architectures), and specialised appendages and functions. This bursting into life of so many different kinds of complex animals, all of them in the now already salty primordial oceans, is called the Cambrian Explosion. Complex plant life colonised the land by about 500 million years ago, and vertebrate animals crawled out of the sea to set foot on solid ground around 380 million years ago.

Clearly, all plant life descends from cyanobacteria, the first to develop the ability to absorb UV radiation; and without complex plant life, it is hard to conceive of a scenario for the evolution of animal life. The key point in this fascinating story of evolution of the solar system, of our Earth and of life on this planet, as it pertains to what we are coming to, is that the light and energy coming from the Sun are essential for life while being at the same time dangerous for the countless living organisms that so vitally depend on it. In humans and higher animals this duality is most plainly and clearly exemplified by the relationship between two essential micronutrients without which no animal can develop, survive and procreate. These vital micronutrients are folate and vitamin D.

What makes folate (folic acid or vitamin B9) and vitamin D (cholecalciferol) so important is that they are necessary for proper embryonic development of the skeleton (vitamin D), and for the spine and neural tube as well as for the production of spermatozoa in males (folate). Vitamin D transports calcium into the blood from the intestinal tract making it available to be used in building bones and teeth; folate plays a key role in forming and transcribing DNA in the nucleus of cells, making it crucially important in the development of all embryonic cells and quickly replicating or multiplying cells (like spermatozoa).

Here’s the catch: vitamin D is produced on the surface of the skin (or fur) through the photochemical interaction of the sun’s UV-B rays with the cholesterol in the skin; folate is found in foods, mostly leafy greens (the word comes from the Latin folium, which means leaf), but it is broken down by sunlight.

What this translates to is this: too little Sun exposure of the skin leads to vitamin D deficiency, which leads to a deficiency in the available and useable calcium needed to build bones, which in turn leads to a weak, fragile and sometimes malformed skeletal structure—rickets; too much Sun exposure leads to excessive breakdown of folate, which leads to folate deficiency, and which in turn leads to improper development of the quickly replicating embryonic cells of the nervous system and consequent malformation of the neural tube—spina bifida.

The most important thing of all for the survival of a species is the making and growing of healthy babies and children, so that they can make and grow other generations of healthy babies and children. This is true for all living beings, but it is not just true: it is of the highest importance, and it has been—taking evolutionary precedence over everything else—since the dawn of life on Earth. Here is how the biochemistry of the delicate balance between these two essential micronutrients evolved.

Six to seven million years ago, our oldest ape-like ancestors walked out of the forest and into the grassy savannah, most probably to look for food. (Isn’t this what also gets you off the couch and into the kitchen?) It is most probably the shift in climate towards hotter and dryer weather and, in response to that, the shrinking of their woodlands, that pushed them to expand their foraging perimeter out into the plains that were growing as the forests were shrinking.

Our first australopith ancestors, descended from the last common ancestor we share with modern chimpanzees, would in all likelihood have been covered in hair with pale skin underneath (just as chimps are today), their exposed skin growing darker in time with exposure to sunlight. Having left the forest cover, they were now exposed to the hot scorching Sun most of the day, walking around looking for food before going back to the forest’s edges to sleep in the trees.

Natural selection would now favour the development of ways to stay cool and not overheat. This meant more sweat glands to increase cooling by evaporation of water on the surface of the skin. It also meant less hair for the cooling contact of the air with the wet skin to be as effective and efficient as possible. But less hair implied that the skin was now directly exposed to sunlight. To protect itself from burns and DNA damage, but also to protect folate, natural selection pushed towards darker skin: more melanocytes producing more melanin to absorb more photons and avoid burning and DNA damage.

In these circumstances, the problem was never too little sun exposure; it was too much exposure, and thus sunburns and folate deficiency. So these early hominids gradually—and by gradually is meant over tens of thousands of years—became less hairy and darker-skinned. They also became taller and leaner, with narrow hips and long thin limbs: this gave less surface area exposed to the overhead sun but more skin surface area for sweating and cooling down, together with better mechanical efficiency in walking and running across what would appear to us very long distances of tens of kilometres every day, day after day, foraging and hunting, always under a blazingly hot sunshine. This process, described here in a few sentences, took place over millions of years, at least 3 or 4 and most probably 5 or 6 million years. The Turkana boy, a 1.6-million-year-old fossilised skeleton, is definitive proof that by that time hominids were already narrow-hipped and relatively tall.

From an evolutionary standpoint it couldn’t be any other way. While keeping in mind that we are still talking about ancient human ancestors, and not modern homo sapiens, did you, as you were reading these sentences, start to wonder who today would fit such a physical description of being hairless, dark-skinned, tall, lean and narrow-hipped? Naturally: savannah-dwelling modern hunter-gatherers, and, of course, the world’s best marathon runners. It makes perfect sense, doesn’t it?

Taking all currently available archaeological, paleontological, anthropological, as well as molecular and other scientific evidence as a coherent whole brings us to the most plausible scenario: all humans on the planet today descend from a single mother who was part of a community of people living somewhere on the western coast of Africa; it is this group of modern homo sapiens that first developed and used symbolic language to communicate and transmit information and knowledge acquired through their personal and collective experiences; and it was descendants of these moderns who migrated in small groups, in a number of waves, first into Asia and later into Europe, starting 70 to 100 thousand years ago.

It is very interesting that we also have evidence that moderns had settled areas of the Middle East, in today’s Israel and Palestine region, as early as 200 thousand years ago, and that these moderns shared the land and cohabited with Neanderthals for at least 100 thousand years, using the same rudimentary tools and technologies, without apparently attempting to improve upon the tools they had. Meanwhile, this other group of western African coast moderns had far more sophisticated tools that combined different materials (stone, wood, bone), as well as decorative ornaments and figurines.

Thus, although equal or close to equal in physical structure, appearance, dexterity and skills—a deduction based on fossils and evidence that newer and better tools were immediately adopted and replicated in manufacture by moderns to whom they were introduced by other moderns—it is clear that different and geographically isolated communities of moderns ate differently, lived differently, developed differently and at different rates.

This is not surprising, really. Some children start to speak before they turn one, while others do not until they are two, two and a half or even three. Some children start to walk at 10 or 11 months, while others just crawl on the ground or even drag their bum in a kind of seated crawl until they are three or more. And this is for children who watch everyone around them walking all day long, and listen to everyone around them speaking complex language, also all day long. Now, what do you think would happen if a child grew up without being exposed to speech? Why would they ever, how could they ever start to speak on their own, and to whom would they speak if nobody spoke to them?

Fossil evidence shows that the structures in the ear and throat required for us to be able to make the sounds needed for refined speech and verbal communications were in place (at the very least 200 thousand years ago) tens and even hundreds of thousands of years before the first evidence of symbolic thought (70-50 thousand years ago) and together with it, it is assumed, advanced language.

Symbolic thinking in abstract notions and concepts is the most unique feature of our species. It is the hallmark of humans. And it is the most useful and powerful asset we have in the evolutionary race for survival. Sophistication in symbolic thought can only come with sophistication in language and in the aptitude for language: it is only by developing and acquiring more complex language skills that more complex symbolic thinking can come about, and more sophisticated symbolic thinking naturally leads to developing a more sophisticated and refined language in order to have the means to express it.

It’s surely essential to recognise that this is as true for our ancestors, those that developed that first symbolic language, as it is for you and me today. The difference is that then, the distinction was between those few moderns that used symbolic language and those that didn’t, whereas today, the distinction is more subtle because everyone speaks at least one language to a greater or lesser extent. Nonetheless, anyone can immediately grasp what is described here by listening to Noam Chomsky lecture or even just answer simple questions in the course of an interview.

As they moved northward, settling in different places along the way, staying for thousands or tens of thousands of years, then leaving their settlements behind, either collectively or in smaller groups, and moving on to higher latitudes before settling again somewhere else, these people encountered a wide range of different climates and geographical conditions: usually colder, sometimes dry and sometimes wet, sometimes forested and sometimes open-skyed, sometimes mountainous and sometimes flat. In all cases, they were forced to immediately adapt their living conditions, building suitable dwellings and making adequate clothing. This much we know for sure, because they would simply not have survived otherwise, and it is only those who did survive that are our direct ancestors.

Evolutionary adaptation through natural selection of traits and characteristics arising from small—and, on their own, insignificant and typically unnoticeable—random genetic mutations also took place, as it does in every microsecond and in every species of animals and plants. But this, we know, is a slow process, measured on timescales of tens of thousands of years (10, 50, even 100 thousand). Now, consider the evolutionary pressure—the ultimate evolutionary pressure—of giving birth to healthy and resilient offspring that will grow up to learn from, take care of, and help their parents. The most pressing evolutionary need at these higher latitudes was for the body to more efficiently make and store vitamin D from the incoming UV-B rays, which (and this is an important detail, often overlooked or underappreciated) make it to the surface only when the Sun is high in the sky, so that its rays have less atmosphere to go through. This stringent restriction on the few hours near midday when UV-B can make it to the surface is both constraining and life-saving: it is constraining because only during those hours can the essential vitamin D be made, and it is life-saving because continual exposure to this energetic, DNA-damaging UV radiation would in time sterilise the surface of the entire planet.

The higher the latitude, the lower the Sun’s path in the sky throughout the year, especially during the winter months, and therefore the shorter the season during which UV-B rays reach the surface and vitamin D can be produced in the skin or fur of animals. The only solution to this severe evolutionary pressure is as little body hair and as little pigmentation as possible (think of the completely white polar bears, arctic wolves, foxes and rabbits). As an aside, what else do you think is advantageous in the cold? The opposite of what is in the hot sun: more volume for less surface area; a smaller and stockier build that keeps heat better, exactly as we see in the cold-adapted Neanderthal.
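
To make the latitude effect concrete, here is a small sketch computing the Sun’s maximum (noon) elevation over the year. The noon-elevation formula is standard spherical astronomy; the often-quoted rule of thumb that meaningful UV-B reaches the surface only when the Sun is above roughly 45 degrees is an approximation, not an exact cutoff:

```python
def max_solar_elevation(latitude_deg, declination_deg):
    """Sun's elevation above the horizon at solar noon, in degrees."""
    return 90.0 - abs(latitude_deg - declination_deg)

# Solar declination swings between +23.44 deg (June solstice) and
# -23.44 deg (December solstice).
for lat in (0, 30, 45, 60):
    summer = max_solar_elevation(lat, +23.44)
    winter = max_solar_elevation(lat, -23.44)
    print(f"lat {lat:2d}: noon Sun {summer:5.1f} deg in summer, {winter:5.1f} deg in winter")
```

At 60 degrees latitude, the noon Sun never climbs above about 7 degrees in midwinter: by the ~45-degree rule of thumb, months go by with essentially no vitamin D production at all.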

Settled in a place that provides what we need to live relatively comfortably, we tend to stay there. This has always been true, and even if it has changed in the last few generations in industrialised western countries, we have witnessed this phenomenon up until very recently on islands like Sardinia, Crete, or Okinawa, in remote valleys in the Swiss Alps, the Karakoram, Himalayas or Andes, and in other geographically isolated pockets of people with genetic characteristics homogenous amongst themselves but distinct with respect to other human populations. And thus across the world we find a whole spectrum—a rainbow—of different colours and shades of skin, different colours of hair and eyes, different amounts and textures of body hair, different physical builds and morphologies, and different metabolic and biochemical sensitivities, all on a continuum, all dependent upon the evolutionary history of the subpopulation in which particular characteristics are present or absent to a greater or lesser extent. And all of this was driven by the evolutionary pressure to adapt and maximise the survival probability of our offspring, our family, our clan, our species, by optimising the amounts of folate and vitamin D through the delicate balance between too little vitamin D from under-exposure to the UV-B rays that produce it, and too little folate from excessive exposure to the same UV-B rays that destroy it.

What this tells us is that, for one thing, we have absolutely nothing to do with the colour of our skin, eyes and hair, and nothing to do with any of the physical and biochemical characteristics we have inherited. It tells us that this has nothing to do with our parents or grandparents either, really, because these are particularities that have evolved over tens of thousands of years in a very long line of ancestors who settled in a place, stayed put, and lived at a particular latitude in a particular geographical setting with a particular climate. It tells us, in the most obvious manner, that because this is so, discrimination based on colour or physical features is not just unfounded, but simply absurd.

If you’re black, you’re black. If you’re white, you’re white. If you’re chocolate or olive-skinned, then you’re chocolate or olive-skinned. If you are “yellow” or “red”, then that’s just how it is. And who cares how you phrase it, or whether you try to be “politically correct” and avoid speaking of it—that’s just silly. All of it is simply just the way it is. In the same way, if you’re short or tall, hairy or not, thin or stocky, it is just the way it is. However you are and whatever features you consider, there is never anything more or less about any of them: they are an expression of our genetic ancestry going back not just a few but hundreds of thousands of generations.

What this also tells us is that we have to take this information into account in everything we do, especially in regards to what we eat, where we live, and how much or how little we expose ourselves to the Sun’s vitally important UV-B rays. Disregard for these fundamentally important details leads to what we see in the world in this modern era, where we all live more or less wherever we want, and find ourselves with our olive or dark brown skin living at high northern latitudes, or with our fair or milk-white skin living near the equator with strong overhead sun all year round, with the consequent high rates of vitamin D deficiency and rickets in our dark-skinned northern dwellers, together with the similarly high rates of folate deficiency and spina bifida in our fair-skinned southern dwellers.

In general, if you are dark-skinned you need to expose your skin to the sun a lot more than if you are fair-skinned, because you will both produce less vitamin D and store less. If you are fair-skinned you need less exposure and will tend to store the vitamin D more efficiently for longer periods of time. As for folate, we all need to eat (or drink) leafy greens (i.e., foliage) and green veggies.

However, there is an additional complication that makes matters worse (far worse). That complication is that in this day and age, we all live inside, typically sitting all day facing a computer screen, and sitting all evening eating supper and then watching TV. Not everyone, of course… but most people. Not only that, but most of us all over the world now eat more or less the same things: highly processed packaged foods, usually high in processed carbs and low in good, unprocessed fats, high in chemicals of all kinds and low in nutrients, and hardly any leafy and green veggies or nuts and seeds. And boy do we love our Coke, our daily bread, our fries and potatoes, our pizzas and big plates of pasta, and our sweets and desserts! Not everyone, of course… but most people. Consequently, we are all as deficient in folate as we are in vitamin D. We are all as deficient in unprocessed fats and fat-soluble vitamins as we are in all other essential micronutrients. How depressing.

But once we know this, once we have been made aware of this situation, we can correct the problem by switching to a diet of whole foods—of real foods—rich in folate and fat-soluble vitamins like A, D, E and K2 (the Inuit, for example, get all their vitamin D and the other fat-soluble vitamins from the fat of the whales and seals they eat), and supplementing adequately to maintain optimal levels of both vitamin D (80-100 ng/ml or 200-250 nmol/L) and folate (>5 ng/ml or >11 nmol/L), especially during conception, pregnancy and early childhood, but also throughout life and into old age.
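
For reference, converting between the two sets of units quoted above is simple arithmetic; the molar masses below are those of 25-hydroxyvitamin D (the form measured in blood tests) and folic acid:

```python
def ng_ml_to_nmol_l(ng_per_ml, molar_mass_g_per_mol):
    """Convert a blood concentration from ng/mL to nmol/L."""
    return ng_per_ml * 1000.0 / molar_mass_g_per_mol

# 25-hydroxyvitamin D ~400.6 g/mol; folic acid ~441.4 g/mol.
print(round(ng_ml_to_nmol_l(100, 400.6), 1))  # ~249.6 nmol/L of 25(OH)D
print(round(ng_ml_to_nmol_l(5, 441.4), 1))    # ~11.3 nmol/L of folate
```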

There’s one last thing I wanted to mention before closing, and in which you might also be interested: can we ask whether one is more important than the other, folate or vitamin D, and do we have a way to answer this question from an evolutionary standpoint? Well, here is something that suggests an answer: in all races of humans on Earth, women are on average about 3% lighter in skin colour than men of the same group. For decades, researchers (mostly old men, of course) were satisfied with the conclusion that this was the result of sexual selection, in the sense that men preferred lighter-skinned women and so this is how things evolved over time. Of course, most of you will agree with me now that this just sounds like a cop-out, or at best a shot in the dark from a possibly sexist male perspective.

Most of you will surely also agree that considering the question from the perspective of the importance of vitamin D versus folate is clearly more scientific in spirit than claiming sexual selection to explain the difference. And if women are lighter than men no matter where we look on Earth, this strongly suggests that it is either more difficult to build up and maintain good levels of vitamin D to ensure healthy offspring, or that it is more important. In today’s world, it certainly is far easier to have good levels of folate, because even if you stay inside all day, as long as you eat leafy greens or drink green juice, your folate levels will easily be higher than the optimal minimum of 5 ng/ml, and probably much higher, like mine, which are five times higher than that at 25 ng/ml.

So, for us today, especially if we eat greens, there is no question that we have to pay much closer attention to our vitamin D levels that tend to be way too low across the board all over the world. We can hypothesise that if we continue evolving over millennia following this indoors lifestyle that we have, humans everywhere will continue to lose both body hair and pigmentation, even those who live in sunny countries, because they don’t expose themselves to the Sun. I would like to encourage you to instead expose your skin to the amount of sunlight that is in accord with your complexion, drink green juice, monitor your vitamin D levels at least once per year, and take supplements to ensure both stay in the optimal range (I recommend taking A-D-K2 together to ensure balance between them, better absorption and physiological action). That alone, even if you don’t do anything else, will be of great benefit to you, and, if you are a soon-to-be or would-like-to-be mother, of even greater benefit to your child or children.

And next time you go out, and each time after that, pay attention, look and appreciate the amazing richness and beauty of all the different skin colours and unique physical features of all the people you see all around. What you will be seeing is the inestimable richness and incalculable pricelessness of our collective human ancestry expressing itself vividly and openly, nothing held back and nothing hidden, for everyone to see and appreciate.

If you are interested in reading more about the topics touched upon in this article, its contents draw from the books Life in the Universe, Rare Earth, Masters of the Planet, The Story of the Human Body and the Scientific American special issue Evolution that features the article, entitled Skin Deep, that prompted me to write this post. And please share this post: we all need to do what we can to help overcome discrimination based on race and appearance.


Which supplements exactly?

Supplements can be, on the one hand, extremely important, crucial even, especially for recovering or rebuilding our health. On the other hand, they are exactly what their name says: supplements—something extra. The foundation of natural health is and always must be food. Without putting food as medicine first, we will never thrive, because supplements cannot really make up for the damage caused by inadequate nutrition.

Supplements are useful to correct an existing deficiency, prevent one from developing, or enhance and improve bodily functions in some way. And the best supplements are those that are most easily and naturally absorbed by the body. Because we as a species have, until the last few decades, only ever eaten whole foods derived directly from nature, whole foods will always be dramatically easier to absorb than isolated substances. Secondly, isolated substances extracted from whole foods will always be dramatically easier to absorb than synthetically produced ones. Finally, some substances will be intrinsically more bio-available than others.

I will not engage here in an extensive description of supplements, where they come from, what they do, and why I take them. The global knowledge base accessible through the internet is far better suited to provide you with all the information you may want about any particular supplement. I will simply share with you which supplements I buy for the three of us at home. Some we take every day, others for periods at different times; some are taken only in the morning, and others twice a day.

The primary deficiencies we have either corrected or continue to work on are B12, A, D, K2, iron, zinc, magnesium and iodine. The first five were caused by our two decades of vegetarianism (my wife and I). The last two are quasi-universal due to heavily mineral-depleted soils everywhere. Otherwise, the focus is on antioxidants like vitamins A, D, C and B3, astaxanthin, ubiquinol, turmeric and Megahydrate, substances that neutralise free radicals and prevent the damage they can cause; and on superfoods like chlorella and spirulina (in pellets or powder), wheat grass juice powder (in green juices), maca (in puddings), and the like.

General:

  • Unrefined North-Atlantic Sea Salt (2-3 tsp per day in drinks and food)
  • Concentrace Mineral drops (10 drops per litre, for a total of about 30 drops per day)
  • Green superfood blend in morning green juice (Food Matters, MegaNutrition, Biotona, Vitamineral Green, or equivalent)

Transdermal (baths):

  • Magnesium Chloride
  • Sodium Bicarbonate

Transdermal (patches by Dr David or Healthy Habits):

  • B12 (+ 10 supporting)
  • Glutathione (+ other antioxidants)

Morning (all water soluble and not requiring food):

  • Probiotics (1 cap, the best I have tried up to now are by Prescript-Assist)
  • Tulsi (1 cap, either extract by Source Naturals or 2 caps whole leaf by Organic India)
  • Iodoral (start with 1 cap of 12.5 mg for 1 week, increase to 2 thereafter; Optimox)
  • Liposomal vitamin C (1 g; Mercola)
  • Vitamin B3 (niacinamide, 2 x 500 mg; Source Naturals)
  • Chlorella (start with 5 mini pellets, increase by 1 every 3 days up to 25; HealthForce, Mercola, or equivalent)
  • Spirulina (start with 1 large pellet, increase by 1 every 5 days up to 5; Nutrex Hawaii or equivalent)

After lunch (many fat-soluble and/or taken on a full stomach):

  • Astaxanthin (12 mg; Bioastin by Nutrex Hawaii)
  • Krill oil (2 caps; Mercola)
  • Ubiquinol (1 cap; Mercola)
  • Turmeric (2 caps; Organic India)
  • Tulsi (1 cap, either extract by Source Naturals or 2 caps whole leaf by Organic India)
  • Iodoral (25 mg; Optimox)
  • Liposomal vitamin C (1 g; Mercola)
  • Vitamin B3 (niacinamide, 2 x 500 mg; Source Naturals)
  • Hydrogen Boost or Megahydrate (500 mg; same product, different name)
  • Whole Food Multi (MegaFood)
  • L-Carnosine (500 mg; Source Naturals)

Occasionally (usually 3 months on, 3 months off, or as needed):

  • Silicic Acid (15 ml in water, first thing in the morning, 3 months on, 3 months off)
  • D3/K2 (liquid form with green juice or pudding during the day; Thorne Research)
  • A, D3, K2 (after lunch; DaVinci)
  • Bio B12 (after lunch; Thorne Research)
  • Zinc (after a meal; 3 months on, 3 months off and when needed; MegaFood)
  • Iron Bisglycinate (after a meal; 1 month on, 3 months off; Thorne Research)
  • Enzyme complex (on an empty stomach at least 30 min before eating; HealthForce, LivingWell, or a similar high-quality broad-spectrum product)

There is information about all of these on the web, in more than one place. There is also a lot of information about most of them on Mercola’s website, often including long explanatory videos that present most of what you need to know about a particular supplement.

Naturally, as you should suspect, this list has evolved and continues to evolve over time, but it has been pretty stable for the last few years. If you are not taking supplements and feel overwhelmed by this, start with the ones in bold.

It’s not possible to determine your personal deficiencies without complete blood work and lengthy exchanges about all sorts of things relating to your past and health history, but none of these supplements will cause you harm; on the contrary, all of them will help you enhance your health. I hope you find this helpful.


Updated recommendations for magnesium supplementation

Daily magnesium supplementation is more than a no-brainer: it is really very important, and this, for everyone. I hope that I managed to convey just how important it really is in Why you should start taking magnesium today, Treating arthritis, as well as in At the heart of heart disease. In terms of supplementation, however, I would like to refine my recommendations.

Nigari, or magnesium chloride, is excellent because it is inexpensive and easily absorbed. I continue to stand by this, and also continue to use it very regularly. However, I now only use it transdermally (on the skin), and recommend you do the same. The reason is very simple. Taking it internally is fine, but because absorption goes through the digestive system, the most that will be absorbed is estimated at 25%, and the rest will be eliminated.

And how will it be eliminated? Naturally, through the stools. After using a 2% nigari-water solution orally for several months (even with some breaks, as recommended by proponents of this manner of magnesium supplementation), I found that my colon gradually became more and more irritated (which could be felt when passing stools and wiping). When I stopped supplementation for a few days, the irritation would go down; when I started again, it would come back. After checking this a couple of times, it became clear that oral supplementation with magnesium chloride was indeed the cause of the irritation in the colon.

But why even bother taking magnesium chloride orally when it is far better absorbed through the skin? Magnesium oil (a 20-30% nigari-water solution), which you leave on for 30 minutes, works great, but the most pleasant method is definitely a 30-minute bath spiked with a cup of nigari flakes. This is without a doubt the most effective and most agreeable way to supplement, while ensuring maximum absorption of the magnesium ions so crucially needed by cells in tissues throughout the body.

Having said that, I recognise that having baths every day is time-consuming, only really tempting when the weather is cool, and also wasteful in terms of water usage. Therefore, we don’t have baths when it is hot, and restrict them to a maximum of three per week in the cold season, using the least amount of water, and having really short showers on the days in between in order to keep water consumption as reasonable as we can. In the end, magnesium oil is far more environmentally friendly, because it works all year round and does not result in increased water consumption.

As an aside relating to hot water usage and energy efficiency: because heat loss is always directly proportional to the difference in temperature between ‘inside’ and ‘outside’, we should set the temperature on our hot water heater to the minimum useable temperature. This minimises heat loss, and consequently, energy consumption for water heating. I have determined that temperature to be 41-42 C. These temperatures are also perfect for washing the dishes, washing your hands or face, and showering, and also for running a bath that is hot (but not too hot) when you get in, and that after 25-30 minutes is still hot enough for you to feel comfortable in the water without any hint of cold, yet not so hot that you can’t stand it any longer, or find yourself sweating for half an hour after you’ve gotten out. (Actually, 40 C is perfect for a shower, dishes, hands and face, etc; we need 1 or 2 degrees more for a bath due to heat losses into the tub and the air.)
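
To make the proportionality concrete, here is a minimal back-of-the-envelope sketch in Python. It is purely illustrative: the 60 C reference setting and the 20 C ambient temperature are my assumptions for the example, not figures from any standard.

```python
# Back-of-the-envelope standby heat loss for a hot water tank, assuming
# loss is simply proportional to (tank temperature - ambient temperature).
# All temperatures in degrees Celsius; the numbers are illustrative only.

AMBIENT_C = 20.0  # assumed air temperature around the tank

def relative_standby_loss(tank_temp_c: float, reference_temp_c: float = 60.0) -> float:
    """Standby heat loss at tank_temp_c, relative to a typical 60 C setting."""
    return (tank_temp_c - AMBIENT_C) / (reference_temp_c - AMBIENT_C)

for t in (60.0, 50.0, 42.0, 41.0):
    print(f"{t:.0f} C -> {relative_standby_loss(t):.0%} of the standby loss at 60 C")

# 42 C comes out at 55%: under this simple proportional model, lowering
# the setting from 60 C to 42 C cuts standby heat loss by nearly half.
```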

Naturally, the exact ideal hot water temperature is a personal thing that depends on many factors, surely most importantly on body composition and especially basal body temperature, which in turn depends on metabolism. In my case, basal body temperature is as low as can be, since my metabolism runs almost exclusively on fat, and you’ll remember that fat burns cool while carbs and protein burn hot. Anyway, you need to experiment a little, but I’m pretty sure that you will find your ideal hot water temperature between 40 and 43 C.

Because magnesium is water soluble and used up as needed every day throughout the day, it is necessary to supply the body with it on a daily basis. Naturally, eating foods rich in magnesium is essential (almonds and greens are the best), but this is typically not enough, and oral supplementation is quick and easy. Fortunately, the perfect magnesium supplement is now available. This is ReMag, designed and marketed by Dr. Carolyn Dean (the doctor who wrote The Magnesium Miracle), who guarantees that it is 100% absorbed by the cells, because it is in a form small enough to pass through the 400-500 picometre ion channels that regulate mineral absorption and excretion through cell membranes, and therefore that none of it is eliminated through the digestive system as are most forms of magnesium supplements. (You can read her e-book about it here, and watch this recent video on Mercola’s site.)

So, these are my updated recommendations for magnesium supplementation:

  1. Magnesium oil on the skin for a couple of months to quickly replenish cellular magnesium levels,
  2. Bath with 1-2 cups of nigari flakes, once or twice a week, and
  3. L-Threonate (liposomal) or ReMag (pico sized) taken orally.

This is really important for everyone, but crucial for any person suffering from any kind of illness or disease condition whatsoever.


B12: your life depends on it

There are very few nutrients as crucial to our well-being as vitamin B12, because it is essential for cellular energy metabolism, gene transcription, and nervous system function. This vital role at the cellular level is not restricted to some tissues and organs: it is vital for every single cell of every tissue and every organ.

For the nervous system—both the central nervous system, meaning the brain and spinal cord, and the peripheral nervous system, meaning the entire network of nerves connected to them and coursing through the whole body—vitamin B12 is essential in building, maintaining and repairing the myelin sheath that covers every nerve to ensure protection and proper nerve signalling. It is, in fact, the consequences of B12 deficiency on the nervous system that most often betray this very serious problem.

Everyone should supplement and maintain blood levels of B12 in the range from 600 to 2000 pg/ml in order to avoid and, where necessary, help recover from the wide range of problems that result from B12 deficiency or insufficiency. Health care practitioners: this is the first thing you should check for every patient who comes in, independently of their age or condition.

What is vitamin B12 and how is it absorbed?

B12, or cobalamin, is a large molecule whose central atom is cobalt, around which various other chemical groups are arranged. To be active in the body, the cobalamin molecule must be in one of two enzyme forms, methylcobalamin or adenosylcobalamin, both of which require the cobalt to be in a charge state of +1. Even though cobalamin can exist in two other charge states, +2 and +3, neither of these is bio-active. Its most powerful antagonist is nitrous oxide (N2O; laughing gas), which continues to be commonly used as an anaesthetic agent during surgical operations: it inactivates the molecule by changing the cobalt ion from a charge state of +1 to either +2 or even +3.

Cobalamin is produced in the gut of animals by specific bacteria that form part of the intestinal flora. Although this can also be true for humans, we have mostly relied on animals, both by eating them and by eating products derived from them, like eggs and dairy. In animal foods, cobalamin is always bound to protein, from which it needs to be separated in order to be used. This separation can only begin in the highly acidic environment of a well-functioning stomach that secretes enough hydrochloric acid, as well as enough intrinsic factor and pepsin.

Cobalamin is carried into the duodenum—the first part of the small intestine—bound to salivary B12-binding proteins that are then broken down by pancreatic protease. This frees the B12 to attach to intrinsic factor and make its way to the ileum—the very last part of the small intestine—where it penetrates the mucosal wall for absorption. Finally, the free cobalamin latches onto the plasma transport protein transcobalamin II, whose function is to carry it to cells throughout the body. Any excess, unneeded at any given time, is carried to the liver, where it is stored.

Where do we get B12?

That herbivores like sheep, goats and cows, which thrive when they eat only grass, do not suffer from B12 deficiency, while most of us humans tend to (estimates from various large-scale studies range between 40 and 80%), points to two key issues at the heart of this problem:

One, we have evolved and survived as a species over several million years by eating animals. Some believe that it was, in fact, the very eating of animal foods, maybe specifically bone marrow, that allowed the brain to grow in size over a relatively short evolutionary period, setting us apart from our primate ancestors and cousins. Bone marrow was, on the one hand, the only leftover once carnivore predators like lions, and then all the other scavengers (but predators for us) like wolves and jackals, had eaten all they could; and on the other hand, it was the one thing that only humans could get to, by breaking apart the bones. Whatever the case may be, the human organism as a whole grew accustomed to, and became reliant on, an external supply of vitamin B12 from animal sources.

Two, it is most certainly the case that even with the healthiest, let’s even say ideal or perfect, intestinal flora, we humans will have a very different flora from those of the herbivores we domesticated, and ours will arguably always be much less capable and much less efficient at producing cobalamin from any of the plant foods we do eat. Moreover, even if B12 is manufactured by some of the bacteria in a perfectly healthy colon—the large intestine—it will still not easily make it into circulation because, as we saw, absorption of cobalamin takes place in the ileum, at the very end of the small intestine, which is upstream from the large intestine. The manufactured B12 would somehow have to migrate backwards from the colon to the ileum, most likely a very difficult thing to do.

The first point is supported by ample archaeological, anthropological and evolutionary biological evidence. In fact, it turns out that our hominid ancestors most certainly lived, for the bulk of our evolutionary history, through periods of glaciation when the land over most of the Earth’s surface was covered in ice. This implies a marked absence of plant life in most places on Earth, and therefore an absolute reliance on animals for survival: eating virtually only animals, which in turn ate virtually only other animals and fish, which ate smaller fish, and on down the food chain to those feeding on sea-borne plant foods. The Inuit, who basically live on whale blubber, are the perfect example of such a scenario. But this could well have been the scenario for a lot of the humans that populated the Earth, and for a good portion of our history spanning the last 2.5 million years.

The second point is hypothetical, but on firm footing, given that the gut flora of a herbivore is indisputably different—substantially different—from ours, and that we simply cannot survive for very long on greens alone as do sheep, goats, cows and all other herbivores. Furthermore, in actual fact, most humans have a dysfunctional digestive system, with heavily compromised and impaired intestinal flora. As a consequence, even those who eat adequate or even large amounts of B12-rich animal foods usually cannot benefit from them, because the cobalamin simply doesn’t make it into the bloodstream, for any one of several possible impediments along the ingestion-breakdown-absorption chain.

This is not to say that our digestive flora cannot produce some B12 from plant-based foods, but the evidence shows us that it definitely cannot produce enough, whatever the reason: studies have shown that although B12 deficiency is of the order of 40% in the general omnivore population, it is 50% in vegetarians, and up to a staggering 80% in long-term vegans (see Chapter 6 of Could it be B12? and references therein).

Why is B12 deficiency such a big deal?

Well, let’s ask another question instead: What would happen if the myelin sheath that covers the nerves in our body—peripheral, spinal and brain—were to deteriorate?

Neurological symptoms would include: numbness, tingling and burning sensations in the hands, fingers, wrists, legs, feet, or truncal areas; Parkinson-like tremors and trembling; muscle weakness, paraesthesia and paralysis; pain, fatigue and debility labelled as chronic fatigue syndrome; shaky legs and unsteadiness; dizziness and loss of balance; weakness of the extremities, clumsiness, twitching, muscle cramps, lateral and multiple sclerosis-like symptoms; visual disturbances, partial loss of vision or blindness. And the list goes on.

Psychiatric symptoms? Confusion and disorientation, memory loss, depression, suicidal tendencies, dementia, Alzheimer’s, delirium, mania, anxiety, paranoia, irritability, restlessness, manic depression, personality changes, emotional instability, apathy, indifference, inappropriate sexual behaviour, delusions, hallucinations, violent or aggressive behaviour, hysteria, schizophrenia-like symptoms, sleep disturbances, insomnia. And here again, the list goes on.

At the cellular level, every cell would be unable to adequately produce energy, be it from glucose or from fat. We can easily extrapolate and imagine what it would mean for the organism as a whole to suffer a lack of, or severe deficit in, the energy available to it at the cellular level, and this, for the trillions of cells throughout. It would have a most profound effect on everything that we do, and everything that the body does, throughout the day and night.

Now consider a yet deeper level: the nucleus of every cell, where genes are protected and cared for, and where the delicate operations of gene transcription and replication—necessary and vital for the continual renewal, repair and reproduction of cells—must and do take place throughout our life, at phenomenal speeds, over a timescale that is very long at the cellular level even though its infinitesimal instants are almost universally absent from our conscious perception. Vitamin B12 is absolutely essential for this too. And if it’s missing? Unintended, unplanned and unwanted genetic mutations from errors in transcription. This means problems; very serious problems.

Who should be concerned about all this?

The short answer is: everyone. This means you, but also your kids as well as your parents. It means infants, toddlers, children, teenagers, young adults, mature adults, the middle aged, the elderly, and the oldest among us: absolutely everyone.

For the longer answer: we appear to be, or at least should be, born with a good B12 reserve, and as it is used over time, the amount in the body and blood slowly decreases until the reserves are depleted. Some consider this to be the normal state of affairs. This inevitably implies that those at greatest risk of suffering from B12 deficiency are the oldest, and that the older we get, the greater our chances of becoming victims of the effects of this deficiency. And this is indeed what we find: practically everyone above the age of 60 is B12 deficient, and more often than not, severely so (serum B12 < 200 pg/ml).

It is therefore not really surprising that every single behavioural characteristic—intellectual, psychological, emotional, physiological and physical—associated with ageing and its multiple manifestations in the elderly, the ‘senior moments’ in all their different forms (memory problems, disorientation, inability to concentrate or even pay attention, frailty, weakness, unsteadiness, loss of balance, and so on), is a typical symptom of B12 deficiency.

Could it be that all these characteristics of old age are actually the characteristics of B12 deficiency? Could it be that if we didn’t let B12 levels drop below 600 pg/ml and actively maintained them around 1000 pg/ml throughout life, that seniors would simply not manifest any of these signs of old age? Maybe. Maybe even most probably. What an entirely different world it would be: strong and healthy, energetic and vibrant, sharp and alert old people. Sounds great, doesn’t it? And hard to imagine, isn’t it? But wouldn’t that be wonderful, for everyone, and especially for the elderly themselves?

As alluded to a moment ago, we should be born with a large B12 reserve. It is particularly important to have a plentiful supply of B12 throughout our development in the womb, during infancy, and up to about 7 years of age. Why is it so important? Because our nervous system develops fastest while we are in our mother’s womb, and then during infancy and toddlerhood, until it reaches maturity by the time we are about 7, and because cobalamin is essential for this development.

The complication, however, a point of crucial importance, is that only the B12 consumed by the pregnant mother at first, by the breast-feeding mother afterwards, and finally by the toddler, can be used to ensure optimal development and the building of a healthy brain and nervous system. Even if the mother had good B12 levels before, during and after pregnancy, only fresh B12 can be used by the developing child. So, if she doesn’t consume much or any during this critical period, the unborn child and infant will have only a meagre or non-existent supply of cobalamin, and consequently, impaired—often severely impaired—brain and nervous system development.

This is a very serious matter. In fact, for many infants, it is a matter of life or death. Or, just barely less dramatic but maybe even worse in some respects, it can make the difference between a normally healthy brain and nervous system and permanent developmental disability, both physical and intellectual, right down to a full or partial vegetative state for a whole lifetime.

All of this shows why B12 deficiency tends not only to be transmitted, but to worsen in severity from one generation to the next, with all the negative consequences that come with it, most notably those affecting the brain and all cognitive functions. Terribly sad and unfortunate as it is, numerous studies and reports on the babies of vegetarian, but especially vegan and macrobiotic, mothers have shown very serious neurological problems and developmental delays, as severe as stunted brain growth and death; they have also shown that even mild deficiencies in infancy are associated with seriously impaired cognitive performance in adolescence and adulthood. I cannot stress this enough: B12 deficiency is really very serious.

Now, between the oldest and the youngest there is everyone else. If we are born with an excellent B12 status, then we are lucky and likely to make it to old age without any apparent problems in this regard. If we are born B12-deficient, then we are almost certain to suffer from it greatly, and sooner rather than later. And if we are born with anything in between, an intermediately good or bad B12 status, then problems will appear later in life, or sooner, depending on many other factors, but most importantly on how much cobalamin we consume and how well it is absorbed. Consequently, manifestations of cobalamin deficiency can appear at the age of a few months or a few years; as a child or teenager; as a young adult or person in their prime; near retirement or in old age; or they may never become apparent at all. Unfortunately, this condition is continuously growing in importance, the people it affects growing in number, and the reported cases growing in severity.

Unfortunately, and extremely sadly for the far too many people whose bodies, minds and lives are destroyed by an undiagnosed deficiency, B12 is not something that doctors routinely check or know much about. Most of them believe that it will show up in the complete blood count (CBC) panel, either as enlarged red blood cells (megaloblastic anaemia) or as fewer of them (as in pernicious anaemia). But by the time you get there, you have been suffering the ravages of B12 deficiency for a while already, and have thus almost certainly also already suffered permanent neurological damage. So, for your sake, don’t wait for your doctor to notice this. Instead, teach them about it. You will be doing them and their patients an immense favour.

Closing with the good news

It is really easy to prevent and avoid becoming cobalamin deficient, and also to correct a deficiency that exists, even one that has persisted for years or decades, whether you eat animal products or not, want to or not, think that you should or not. We must, very simply, check our B12 status regularly by measuring three markers—serum B12, plasma homocysteine (Hcy), and urinary methylmalonic acid (MMA)—and make sure to supplement in order to raise and maintain B12 levels in the range between 600 and 2000 pg/ml, with concentrations of Hcy and MMA as low as possible. Pregnant and nursing mothers should maintain levels above 1000 pg/ml to ensure healthy nervous system development in their children.

(Both Hcy and MMA are toxic byproducts of protein metabolism that must be converted to benign and/or useable forms, the amino acid methionine, for example, by the action of B6, folic acid (B9) and especially B12. Here is a good, information-dense compilation of B12/Hcy/MMA publications, and a transcript of an interview with John Dommisse, a psychiatrist and B12 expert, who published the above-quoted serum B12 range as optimal in the authoritative paper cited in Could it be B12?, where I read about it.)

Supplementation should be with methylcobalamin—not cyanocobalamin—and should be as aggressive as needed depending on the result of the assessment. In cases where B12 levels are below 200 pg/ml, we should request methylcobalamin injections to be administered daily for 5-6 days, and then weekly until B12 reaches 2000 pg/ml. It should be maintained there at least until Hcy and urinary MMA have dropped significantly, and then monitored and maintained around 1000 pg/ml.

For anything between 200 and 600 pg/ml, and/or elevated Hcy or MMA, methylcobalamin patches are an effective way to get B12 levels up. In addition, oral supplementation, although the least effective of the three routes, still works surprisingly well compared to other supplements, and obviously cannot hurt; it can only help. I recommend doing both patches and oral supplements until levels are around 1000 pg/ml, and then maintaining them with either one.
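
To pull the thresholds of the last few paragraphs together, here is a minimal sketch in Python of the decision logic as I have described it. The cut-off values are the ones quoted in this article; a real assessment would also weigh Hcy and MMA, and this is an illustration, not medical software.

```python
# Sketch of the B12 supplementation logic described above, keyed on the
# serum B12 thresholds quoted in this article (pg/ml). Illustrative only:
# elevated Hcy or urinary MMA should also prompt supplementation.

def b12_recommendation(serum_b12_pg_ml: float) -> str:
    """Map a serum B12 result to the action suggested in the text."""
    if serum_b12_pg_ml < 200:
        return ("Severe deficiency: methylcobalamin injections daily for 5-6 days, "
                "then weekly until serum B12 reaches 2000 pg/ml; maintain around "
                "1000 pg/ml once Hcy and MMA have dropped significantly.")
    if serum_b12_pg_ml < 600:
        return ("Insufficiency: methylcobalamin patches plus oral supplementation "
                "until around 1000 pg/ml, then maintain with either one.")
    if serum_b12_pg_ml <= 2000:
        return "Within the 600-2000 pg/ml target range: monitor and maintain."
    return "Above the quoted range: no known toxicity, but supplementation can be eased."

print(b12_recommendation(450))  # prints the patches-plus-oral recommendation
```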

Finally, it is very important to know that you cannot overdose on methylcobalamin B12: not one negative physiological side effect has been reported or is known from methylcobalamin supplementation. You cannot do yourself or anyone else any harm by taking B12 as methylcobalamin in large quantities for a long time; you can only do yourself and others harm by allowing a deficiency, however mild, to develop or linger. This applies to everyone.


Why you should start taking magnesium today

Because magnesium is maybe the most important mineral for plant and animal life on Earth. Because magnesium is certainly one of the essential minerals most deficient in our food. And because we are all magnesium deficient.

Magnesium was the key element in the evolution of plant life on Earth, as it is the heart, the central ion, of chlorophyll—the plant’s photosynthesising lifeblood. I was amazed when I learnt that chlorophyll and haemoglobin have strikingly similar molecular structures, chlorophyll with magnesium at its heart, haemoglobin with iron. This does indeed seem amazing at first, but upon reflection it seems quite natural, as we can be pretty sure this is not an evolutionary coincidence: simple cellular life came first, then plant life—obviously dependent on the simplest forms of life—and then animal life, which is completely dependent on plant life.

The human body is about 70% water by weight, with about 2/3 of it inside our cells and 1/3 outside; the dry weight of a 70 kg person is thus about 20 kg, made up of various arrangements of naturally occurring elements. But of the 92 naturally occurring elements, a mere 7 make up 99% of the body’s total mineral content. These essential macrominerals are, in order of abundance: calcium, phosphorus, potassium, sulphur, sodium, magnesium, and chloride (the ionic form of chlorine).

Calcium is the most abundant, and it must be in balance primarily with phosphorus for proper physiological function, but also with magnesium. Phosphorus is the second most abundant and, being present in every cell of the body, plays a role in almost every chemical reaction. Potassium and sodium work together, most notably to transport nutrients into cells and metabolic waste out of them; hence potassium is the most abundant element inside the cell, in the intracellular fluid, while sodium is the most abundant outside, in the extracellular fluid. Sodium is also the primary element on which the kidneys rely for regulating the amount of water in the blood and bodily fluids in general. Chloride works with its siblings potassium and sodium in their role as fluid and acid-base regulators, but it is also the essential element in the hydrochloric acid secreted in the stomach to break down proteins into amino acids. Sulphur is necessary for the formation of hair, nails, cartilage and tissue; it is needed for metabolism and a healthy nervous system, and it aids bile secretion in the liver.

Why so important?

Among these 7 macrominerals, however, magnesium is king. It is the second most abundant element inside cells after potassium, and even though it totals only around 25 g in the average 70 kg human body (more than half of it stored in bones and teeth, the rest in muscle and soft tissues), it plays a role akin to that of a conductor, regulating the absorption and excretion of many of its sibling macrominerals, both in the intestines and in our cells. Among its multitude of functions, magnesium is a necessary co-factor on which more than 300 essential metabolic enzymatic reactions depend; it is crucially needed for the structural function of proteins, nucleic acids and mitochondria; it regulates the production, transport, storage and utilisation of energy in cells; it regulates DNA and RNA synthesis, cell growth and cell reproduction; and it regulates nerve function throughout the body.

But certainly most noteworthy, and indeed very important for the vast majority of us magnesium-deficient humans, is that magnesium is what allows muscles to relax: every single muscle cell in our body depends on magnesium to release a contraction instigated by calcium, magnesium’s antagonist brother. Going further, only magnesium can inhibit calcium-induced cell death: only magnesium regulates entry, and can thus prevent calcium from flooding a cell to trigger apoptosis (programmed cell death). It is for these two reasons that magnesium is so much more important than calcium. Sadly, we are as over-calcified—caked stiff with calcium from the inside out—as we are magnesium deficient. And that’s bad news because the more over-calcified the body grows, the more magnesium deficient it becomes. In addition, as important as it is to optimise vitamin D status, it is now clear that this cannot be done without at the same time optimising magnesium status (1).

And in practical terms, what does this mean for you? It means that most modern diseases and conditions are either a direct consequence of, or severely aggravated by, magnesium deficiency. It means that of all the heart attacks and strokes that claim the lives of most people in industrialised countries, more than half are estimated to be caused by magnesium deficiency. It means that hypertension, poor circulation, water retention, osteoporosis, kidney stones and kidney disease are all caused or severely aggravated by magnesium deficiency. It means that arterial plaque buildup (atherosclerosis), arterial wall thickening and stiffening (arteriosclerosis), cardiac arrhythmia and palpitations, headaches and migraines, anxiety, irritability, insomnia and depression are all caused or severely aggravated by magnesium deficiency. It means that health problems ranging from the seemingly most benign (the occasional involuntary twitching of the eye, or the cramp in your foot, calf or hamstring that seems like a brief nuisance unworthy of attention), to the cardiac arrest or stroke caused by a prolonged spasm of a coronary or cerebral artery that can claim your life in a few instants or leave you paralysed and debilitated for the rest of your life, to chronic anxiety, occasional panic attacks, recurring depression, and bipolar or schizophrenic disorders: all of these and hundreds more are caused or severely aggravated by magnesium deficiency. Insulin resistance, metabolic syndrome and diabetes are also intimately related to magnesium deficiency, as it is this mineral that allows insulin to transfer its cargo of glucose from the bloodstream into the cell.

Like many other realities of our world in the realm of medical science and the treatment of disease, that this can be so—that we can be in such a dire situation of global magnesium deficiency—is truly mind-boggling, given the ease with which it can be both prevented and remedied. But for this, as for so many other logic-defying realities in today’s medical and health sciences, ignorance is the major hurdle, and the power of the politics of profit should not be underestimated, ignored or overlooked.

Why so magnesium-deficient?

Very unfortunately for us, agriculture is not, and to a great extent never has been, what it rightly should be: feeding and enriching the soils and the land while producing from them foods with the perfect balance of minerals, vitamins and phytonutrients, in a positive-balance process ultimately based on a remarkably efficient harnessing of the Sun’s energy by the grass and soil. Instead, we have an agricultural system that globally pollutes the waters with toxic runoff and depletes the soils with chemical herbicides, pesticides and Nitrogen-Phosphorus-Potassium (NPK) fertilisers, all of which slowly but surely sterilise the earth’s surface.

Now, to give you a sense of the scale of the problem of soil mineral depletion: as far back as 1936, the following statement was made in a document presented to the 74th US Congress:

“Do you know that most of us today are suffering from certain dangerous diet deficiencies which cannot be remedied until depleted soils from which our food comes are brought into proper balance? The alarming fact is that foods now being raised on millions of acres of land that no longer contain enough of certain minerals are starving us—no matter how much of them we eat. Our physical wellbeing is more directly dependent upon the minerals we take into our systems than upon the calories or vitamins or upon the precise proportions of starch, protein or carbohydrates we consume (my italics). Laboratory tests prove that the fruits, the vegetables, the grains, the eggs, and even the milk and the meats of today are not what they were a few generations ago. No man today can eat enough fruits and vegetables to supply his stomach with the mineral salts he requires for perfect health.”

And you can be sure that the situation has gotten worse since then—much, much worse. Just to illustrate the point, all chemicals, whether they are those found in fertilisers, in herbicides or in pesticides, contribute to magnesium wasting. Pollutants in the air that fall back down in the form of acid rain waste magnesium stores because it is simultaneously a potent acid buffer and the most water-soluble of the macrominerals. Therefore, it is also the most affected by acid rain and runoffs saturated with agricultural chemicals.

To make matters worse, any processing of a food from its natural form most effectively depletes its magnesium content. Here again, this is due to magnesium’s super water solubility: with every step of processing, more magnesium is lost from the already magnesium-deficient food. The result is that all processed foods are basically devoid of it. Fluoride, the reactive industrial by-product and poison that is put into many municipal drinking waters under the false pretence that it is good for the teeth, seeks out minerals like magnesium and, by binding to them, makes them impossible for the body to absorb or use. (This is just one of the many well researched and well documented negative effects of water fluoridation. See the Fluoride Action Network for plenty more details.)

And the last straw in this magnesium-depleting scenario is our own evermore stressful lifestyle. Always more stress: stress related to the economic situation in our country; stress related to the stability of “The Market”; stress related to the economic stability of our company; stress related to the security of our own job; stress related to our professional and therefore social status; stress related to worries about our kids’ wellbeing, happiness, social development, about their future; stress related to all those deadlines we have to meet, and to those that we set ourselves for our personal projects that somehow always slip to the bottom of the pile of books sitting collecting dust next to your bed; stress about how to save money for hard times, and about where we will go on our next holiday; and on and on and on. Incredible but true: the more time passes, the more technological advances are made, the more stuff we are able to make and use and buy, the more stress there seems to be in our lives.

And what does stress have to do with magnesium? Very simply: stress depletes magnesium, and magnesium deficiency magnifies stress. How do we know this? From a simple experiment in which adrenaline is introduced into the bloodstream intravenously, and the levels of magnesium are seen to drop immediately, together with those of calcium, potassium and sodium. Stop the adrenaline and they start to make their way back up, but unfortunately, it takes magnesium the longest to recover to physiological concentrations. The fact is that every time we feel any kind of stress, adrenaline triggers our fight-or-flight response: the heart starts pumping harder, digestion stops as blood is diverted from the digestive system to the arms and legs, the blood thickens with the release of clotting factors to prevent excessive blood loss in case we get injured, and glycogen stores are released from the liver to be made available as glucose for immediate energy use in the heart, lungs and muscles. All of these processes are intensely magnesium-dependent and, at the same time, intensely magnesium-depleting.

In short, almost all soils on agricultural land everywhere are magnesium deficient, some totally depleted, others just greatly depleted. All foods grown in these soils are inevitably also magnesium deficient, in some cases even more so due to the excess potassium in chemical fertilisers, which prevents the plant from taking up magnesium. All processing of food further depletes magnesium, and our crazy and sickly addiction to stress delivers yet another blow—a final blow. We—all of us—really are magnesium deficient, and many of us severely so. For this reason we all need magnesium supplementation, and the sooner we start, the better off we’ll be. If you want to know how magnesium deficient you are, order an RBC Mg test (red blood cells hold about 40% of the body’s stores of Mg): the lab’s reference range can be anywhere from 3.5 to 7 mg/dL, but you want to be at 6.5 mg/dL.

Remarkably easy, extremely safe and incredibly inexpensive

There are several forms of magnesium supplements. Magnesium chloride is the most completely ionised (with a stability constant of 0), and therefore the most easily absorbed by our cells in its ionic form. This also means that it is super hydrophilic (water-loving) and dissolves instantly on contact with even a drop of water, so it needs to be kept very dry in a well-sealed bag or container. All the better for us, it also turns out to be very inexpensive (about 6 euros/kg) in the form of the white, brittle flakes called Nigari, which are used to make tofu.

To drink your magnesium, dissolve 20 g (4 teaspoons, and 10 cents’ worth!) in a 1 litre bottle, or 30 g (6 teaspoons) in a 1.5 litre bottle. (This makes a 2% solution of magnesium chloride.) Take 50 ml on an empty stomach when you get up in the morning, and again at bedtime. You can dilute this in as much water as you want, because it is the total quantity of magnesium that counts, not the concentration of the solution you drink. At first, or when you feel you need more (stressful day, weakness, cold coming on), take another 50 ml in the late afternoon, when the body is most in need of it. This will supply 360 mg if you take it three times per day, and 240 mg if you take it twice. (Magnesium chloride is 12% magnesium by weight: dissolving 20 g in 1 litre gives 2.4 g of ionic magnesium, and dividing this litre into twenty 50 ml doses yields 120 mg per dose. Three doses therefore give 360 mg, and two doses give 240 mg.)

To absorb your magnesium through the skin, dissolve 20 g in 80 ml of water. (This gives a 20% solution of magnesium chloride—ten times more concentrated than the drinking solution.) Naturally, you can dissolve more magnesium chloride in more water, keeping the same proportions, and store the solution in a spray bottle. With just 6 sprays on each arm and leg, as well as 6 on your chest and back, you can take up as much as 600 mg of magnesium every day. This is a much more effective way to absorb magnesium because, instead of going through the digestive system (from which as little as 25% and at most 75% of the magnesium will be absorbed, depending on many factors but primarily the state of health of your digestive system, which in most of us is appalling), almost all of the magnesium is absorbed through the skin and into the bloodstream in about 30 minutes. We use both methods at home.
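
For those who like to check the numbers, here is a small Python sketch of the arithmetic behind the two solutions. It simply takes the 12%-magnesium-by-weight figure quoted above at face value; the code is an illustration, not a dosing tool.

```python
# Arithmetic behind the 2% drinking solution and the 20% skin solution
# described above, taking nigari as ~12% elemental magnesium by weight.

MG_FRACTION = 0.12  # elemental magnesium fraction of nigari, by weight

def elemental_mg(nigari_g: float) -> float:
    """Milligrams of elemental magnesium in a given mass of nigari (grams)."""
    return nigari_g * MG_FRACTION * 1000

# Drinking solution: 20 g of nigari in 1 litre of water (a 2% solution).
per_litre = elemental_mg(20)        # 2400 mg of elemental Mg per litre
per_dose = per_litre / (1000 / 50)  # the litre is split into twenty 50 ml doses
print(f"per 50 ml dose: {per_dose:.0f} mg")  # 120 mg
print(f"two doses: {2 * per_dose:.0f} mg, three doses: {3 * per_dose:.0f} mg")

# Skin solution: 20 g of nigari in 80 ml of water is 20 g per 100 g of
# solution, i.e. 20% by weight, ten times the drinking concentration.
```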

Finally, supplementing with magnesium is extremely safe, for the simple reason that it is extremely water soluble: the magnesium ion binds water so tightly that it forms a hydration shell around itself, resulting in a radius 400 times larger than in its dehydrated form. This is unlike any of its macromineral siblings. For this reason, it is also exceedingly easy for the body to excrete any excess magnesium, either through the urine or in the stools. There is therefore virtually no chance of overdosing on magnesium, and no possible negative side effects.

So please, for your own good, for the good of your sons and daughters, husband or wife, ageing mother and father, buy some Nigari at your local natural food store, and start magnesium supplementation for all of them. And for the good of your friends and colleagues, tell them about it and send them this article if they need convincing. (In France, Spain and probably other European countries, you can find the Celnat brand 1 kg bag of Nigari. I’ve bought it at Bio-coop stores in Paris, and at Eco-centro in Madrid.)

Conclusion: Main points to remember

  1. We are all magnesium deficient, and many of us dangerously so. This is due to the severe lack of magnesium in soils everywhere, and therefore in the foods we eat; to the fact that processing strips most if not all of the magnesium present in the unprocessed food; to the fact that our diet is excessively rich in calcium, which must be balanced with magnesium in order not to accumulate in our tissues and stiffen everything from our organs to our arteries to our brain; and finally to the excessive stress that we all know to be the most remarkable feature of our modern lifestyle.
  2. Magnesium is absolutely essential for relaxing muscle cells, including—and maybe most importantly—the endothelial cells that line our blood vessels. Stiff blood vessels cause high blood pressure. This puts great stress on the kidneys and sets off a chain of negative consequences that moulds into a vicious cycle of kidney deterioration, eventually leading to failure. In addition, stiffness causes blood vessels to suffer much greater damage, especially at bifurcations where the arteries split into finer and finer arterioles. This damage leads to the buildup of plaque, and then to cardiovascular disease, heart attacks, strokes, Alzheimer’s and dementia.
  3. We all need magnesium supplementation, and fortunately it is easy, cheap and safe, because Nigari is an inexpensive, food-grade magnesium chloride salt that is easy to buy in natural food stores, and because magnesium’s ultra water solubility makes it very easy for the body to excrete any excess in the urine and stools, which guarantees that it cannot accumulate excessively. On the other hand, this also means that it takes several months to replenish intra-cellular magnesium levels, and that we need to take it daily.