Chemicals and Toxins — What Is Safe?

One of the most common questions I get from SquintMom readers is along the lines of is item/substance/compound XYZ toxic? I’d like to go ahead and answer this once and for all: YES, it is.

Now let me explain what I mean, and how I can answer this very generic question in a catch-all way without specifying the item/substance/compound to which I refer. Because he said it so well that it needs no rephrasing, I’ll quote the Renaissance-era physician and alchemist Philippus Aureolus Paracelsus, who said:

All substances are poisons; there is none that is not a poison. The right dose differentiates a poison from a remedy.

Phrased more generally: any substance can be either safe or toxic; the dose (quantity) to which one is exposed is what makes the difference. I’ve mentioned in previous posts (like this one about oxybenzone in sunscreen) that the notoriously jumpy Environmental Working Group (EWG) systematically fails to recognize this principle; they have a tendency to vilify anything that proves toxic in any dose, under any conditions. This attitude, however well intentioned, leads us to some interesting places. Pause for a moment and check out the cautionary website devoted to the dangers of dihydrogen monoxide (DHMO). Note that this highly toxic substance is associated with cancer (it’s found in every tumor ever identified), has serious environmental impact (it’s a major greenhouse gas, and overexposure is associated with thousands upon thousands of deaths every year), and, per the website:

[DHMO’s] basis is the highly reactive hydroxyl radical, a species shown to mutate DNA, denature proteins, disrupt cell membranes, and chemically alter critical neurotransmitters.

Sounds horrid, doesn’t it? No doubt we should ban it. Except that the site is a joke, and dihydrogen monoxide is the almost-never-used formal chemical name for water.

None of the site’s information is false, which is what makes it both amusing and apropos to this discussion. Water does, in fact, directly result in many deaths. Not only through “overexposure” via flooding and/or drowning, but also through overconsumption. For instance, in 2007, a radio station held a contest (“Hold your wee for a Wii”), the idea of which was to drink as much water as possible without a bathroom break; the caller who drank the most would win a coveted Wii game console. Contestant Jennifer Strange won (and then lost) by consuming more than 2 gallons of water in the space of less than an hour. She died shortly thereafter of hyponatremia, a condition in which there is an insufficient concentration of sodium in the body fluids to support life (sodium is critical to cellular function, neural conduction, muscular contraction, brain function, and so forth). This is not the only incident of water toxicity on record; similar cases have resulted from fraternity hazings, bizarre diet plans, and overconsumption of water during endurance sporting events like marathons.

On the other hand, there are substances we typically consider highly toxic that are, in the right dose, of great medicinal utility. Clostridium botulinum is a species of bacteria that produces botulinum toxin, generally considered the deadliest substance on Earth. The average 150-pound man would have a 50:50 chance of survival if exposed to merely 341 ng (less than a millionth of a gram) of pure botulinum toxin. Nevertheless, marketed under the trade name Botox, botulinum toxin is used for cosmetic purposes (wrinkle treatment and prevention). Of perhaps greater medical importance, it’s also used to ease the painful symptoms of temporomandibular joint syndrome (TMJ) and other spasmodic disorders, and to mitigate the symptoms of diabetic neuropathy (damage to peripheral nerves, often in the feet, due to diabetes).
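For the numerically inclined, the figures above lend themselves to a quick back-of-the-envelope calculation. This is a sketch only: the 341 ng figure is the post’s, and the code simply re-expresses it per kilogram of body weight.

```python
# Back-of-the-envelope dose arithmetic using the figures quoted above.
# NOTE: the 341 ng median lethal dose for a 150-pound man is the post's
# figure; this sketch just converts it to a per-kilogram dose.

LB_PER_KG = 2.20462  # pounds per kilogram

def ld50_per_kg(total_dose_ng: float, body_weight_lb: float) -> float:
    """Convert a whole-body median lethal dose into ng per kg of body weight."""
    return total_dose_ng / (body_weight_lb / LB_PER_KG)

dose = ld50_per_kg(341, 150)
print(f"{dose:.1f} ng/kg")         # ≈ 5 ng per kilogram of body weight
print(f"{341 / 1e9:.2e} g total")  # 341 ng is indeed well under a millionth of a gram
```

Even expressed per kilogram, the lethal dose is measured in billionths of a gram, which is exactly why the dose, not the mere presence, of a substance is what matters.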

Further complicating matters, our perception that “natural” substances are somehow safer or better for us than “artificial” substances is misinformed. A simple example is the flavoring agents found in many foods. While the common perception is that natural flavors come from the food of which they taste (strawberry flavor, for instance, comes from strawberries), nothing could be further from the truth. In reality, natural and artificial flavors are generally identical chemicals, collected or produced in different ways.* Natural almond flavor, for instance, isn’t a mixture of “natural substances” that come from almonds. Instead, it’s a chemical called benzaldehyde that is extracted from peach pits. Artificial almond flavor is also benzaldehyde, but unlike natural almond flavor, the artificial stuff is made in the lab. Funnily enough, benzaldehyde made in the lab can be obtained far purer than benzaldehyde extracted from peach pits. Further, the stuff that comes from peach pits — the natural almond flavor, remember — contains small amounts of deadly cyanide that occurs naturally in those same peach pits (one of many reasons it’s not wise to eat the pits of stone fruit).

*Eric Schlosser’s excellent book Fast Food Nation contains a very interesting chapter on this topic, for further reading.

Where does this leave us, in trying to avoid toxins? First, as a chemist, let me just say that the word toxin is very often misused in popular sources and conversation, and the word chemical is almost always misused. “Chemicals” are not bad things that cause harm and should be avoided. Instead, they are matter; they are what makes up the physical universe. Nothing that has mass and occupies space — nothing we touch, eat, drink, breathe — is not chemical. There’s no such thing as chemical-free bread, shampoo, or paint. Water is a chemical (and — let’s not forget — a toxic one at that). With regard to toxins, the word is used too often in a vague, handwaving sense on the Interwebs. I see pop-authors (who are generally trying to sell something) write about how Product X contains “toxins,” and should therefore be avoided, or Product Y (which they’re selling) contains no toxins.* I’m not sure what these folks mean when they say “toxins” (and since they rarely name said toxins, I’m not sure they know either); after all, let’s not forget that all substances are toxic in the right dose.

*Or worse yet, Product Y (which they’re selling) is a detoxifying agent. This is ridiculous; almost all humans (with the exception of a few with significant disease) are possessed of one of the most powerful detoxifying mechanisms known to man — a liver. Livers work really well, particularly when they’re left alone to do their job.

This is not to say that we should all go about our business with no concern whatsoever for the things we touch/eat/drink/breathe; it’s simply to say that we simultaneously worry too much and worry too little about “chemicals.” To take one particular example, a few scare-articles about bisphenol A (BPA) have some of us so worried (and confused) that we’re willing to shell out extra cash for BPA-free diaper wipe containers, toys, and even a bath toy organizer. In reality, if BPA has any effect at all in doses to which we’re routinely exposed (which has not yet been established), it would require significant physical contact with the compound to absorb it. Holding, playing with, or storing one’s bath toys in a BPA-containing item would not be a problem, particularly given that while the absorption rate of BPA through human skin hasn’t been thoroughly evaluated or established, it nevertheless appears to be significantly lower than the (already modest) rate of absorption through the skin of other animals (Marquet et al). Based upon the current research, might it be worth avoiding storing food in BPA-containing plastics? Possibly. This is because food might leach BPA out of the plastic in sufficient quantities to possibly have some effect on people (because we eat the food, which gives it an easy route into the system). Is it worth it to avoid all BPA in our houses, however? Simply, no. And on that note, it particularly amuses me to watch women with painted nails shopping for BPA-free toys for their daughters (also with painted nails), given that the exposure to potentially harmful substances (like toluene) is much greater when one physically paints said chemicals on one’s body.*

*For those who are curious, I do paint my nails, because I really don’t think this is that big a deal. But it’s certainly a more significant exposure to chemicals (ew! chemicals!) than touching a rubber ducky in the tub.

So, we worry too much. But we also worry too little. In our desire for the “natural” (whatever that means), we choose the cyanide-laced flavoring agent over the one made under strict conditions and control in the lab. We go to the natural foods store and buy herbs to treat our ailments — which are essentially unregulated for either safety or efficacy, and which may interact unsafely with prescription and over-the-counter drugs or be toxic in their own right — rather than using the “unnatural chemicals” prescribed by medical professionals, despite the fact that the latter have undergone many years of pre-marketing research, followed by decades of post-marketing surveillance. We’re more willing to expose our children to the 1/330 risk of death due to the measles than the 1/3000 risk of a moderate side effect of measles vaccination (e.g. seizure with no permanent effects, mild rash), and immeasurably small risk of serious side effect. We further eschew the vaccination because, in a complete failure to understand the mechanics of human immunity, we have come to believe that “natural” immunity from disease is superior to “artificial” immunity from vaccination. When it comes to the “natural” versus the “toxic” and/or “chemical,” we’re chasing flies out of the chicken coop while the foxes sneak in.
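The vaccination comparison above can be made concrete with trivial arithmetic. Both risk figures (1/330 and 1/3000) are the post’s; this sketch only performs the division.

```python
# Comparing the two risk figures quoted above; the numbers (1/330 and
# 1/3000) are taken from the post, not independently verified here.
from fractions import Fraction

measles_death_risk = Fraction(1, 330)      # quoted risk of death from measles
vaccine_moderate_risk = Fraction(1, 3000)  # quoted risk of a moderate vaccine side effect

ratio = measles_death_risk / vaccine_moderate_risk
print(f"The disease risk is about {float(ratio):.1f}x the vaccine risk")  # about 9.1x
```

In other words, the risk being accepted (death from measles) is roughly nine times larger than the risk being avoided (a moderate, transient side effect), before even weighing severity.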

So what do we do about it? This is difficult. We know that all substances are toxic in the right (wrong?) dose, but when it comes to many substances, we still don’t know what that dose is. Some exposures are unavoidable (by virtue of living in a city, for instance, one is going to be exposed to a certain amount of benzene from exhaust, industrial processes, etc). Some exposures are avoidable, but avoiding them reduces quality of life (no one HAS to eat foods containing coloring agents, for instance, many of which are of questionable safety, but the complete avoidance of these would make for a spartan existence, particularly for children). In most cases, when it comes to toxic chemicals (and once more, all substances are chemicals, and all chemicals are toxic when one is exposed to them…all together now…in the right dose), one must do a risk-to-benefit analysis. Some cases are relatively clear. Is codeine toxic? Yes, in the right dose. Is it worth the risk to take codeine for recreational purposes? Probably not. Is it worth the risk to take codeine after a painful surgery? Probably. Is water toxic? Yes, in the right dose. Is it worth the risk to drink water when one is thirsty? Absolutely. Is it worth the risk to drink water to win a contest? Probably not. Other cases are less clear-cut, as with the previous example of BPA. With the evidence still equivocal, financial means and convenience likely become a large part of the decision. Those of greater means or with greater willingness to be inconvenienced might buy the BPA-free rubber ducky, the BPA-free cabinet safety locks. Others might decide to buy the BPA-free food storage, but be content with the plain old, BPA-containing bath caddy.
Regardless of these personal decisions when it comes to substances of yet-unknown safety, it’s worth remembering that the media, the product manufacturers, and the fad-authors capitalize upon the lucrative combination of public confusion and fear, and that the words “chemical,” “toxic,” “artificial,” and “natural” are as powerful as they are misused and misunderstood.


Marquet et al. In vivo and ex vivo percutaneous absorption of [14C]-bisphenol A in rats: a possible extrapolation to human absorption? Arch Toxicol. 2011 Sep;85(9):1035-43. Epub 2011 Feb 2.

Organic Versus Conventional Milk: Health Issues And Environmental Perspectives (Guest Post at Science of Mom)

I’m guest-posting today! Alice at Science of Mom has recently featured two articles about conventional versus organic milk; the first claimed that milk from rBST-treated cows was the same as (or even preferable to) milk from non-rBST-treated cows, while the second claimed that conventional milk was just as good as organic. As a chemist with a special interest in environmental and social issues, I have a different take. Here are the major points/conclusions:

  • Small, idyllic-sounding conventional family dairy farms (like the one described in this recent guest post on Science of Mom) sound lovely. If everyone raised dairy cattle like she does, there’d be little reason to consider organic milk. However, farms like this one are rare exceptions in the U.S. dairy industry. The vast majority of U.S. dairy cows are housed in animal feeding operations (AFOs), and specifically in concentrated animal feeding operations (CAFOs). By EPA definition, AFOs confine animals in crowded conditions, and CAFOs are major sources of environmental pollution.
  • Milk from dairy cows, regardless of how they’re raised, is free from antibiotics. However, antibiotic overuse — meaning prophylactic use of antibiotics, as well as their use to treat diseases spread through unnecessary husbandry practices — promotes the development of antibiotic-resistant bacteria. Because conventional operations, including CAFOs, promote the development of antibiotic-resistant bacteria (through antibiotic overuse) that then proliferate in the environment, it’s not necessary to have contact with or consume a conventionally-raised animal or product to be negatively impacted by these practices.
  • CAFOs produce tremendous amounts of concentrated environmental waste. There’s far too much of it for the land to absorb, so it runs off into the surface water (lakes and rivers) and leaches into the groundwater (aquifers that feed municipal supplies and wells). Excess nitrogen in the water is associated with acid rain, fish-kills, blue-baby syndrome (methemoglobinemia), and global warming.
  • Conventional farming practices result in dairy cattle consuming large amounts of chicken feces and chicken feed, which contains cattle meat. This cannibalization of cattle by cattle increases risk of spreading BSE (mad cow disease) in the U.S.
  • Conventional farms that use rBST increase the likelihood that their cows will suffer mastitis (an animal welfare issue).
  • Conventional dairy cattle have less access to pasture, which results in a different (and less healthy) fatty acid profile in the milk. Organic milk is higher in heart-healthy omega-3 fatty acids, while conventional milk is higher in pro-inflammatory omega-6 fatty acids.
  • In the end, organic milk is healthier for everyone: your family, the cows producing the milk, humanity as a whole, and the planet.

Read the full article at Science of Mom.


When Is The Best Time To Introduce Solids?

The decision to start solids is both an exciting one (your baby is growing up!) and a difficult one for many parents. The latter is because there’s so much conflicting information floating around (“Starting solids sooner will make your baby sleep better!” “Starting solids too soon will give your baby allergies!”). The purpose of this post is to summarize the research that addresses when to start solids in a baby that is breast- and/or formula-fed.

If you’re confused by all the seemingly conflicting information out there regarding when to start solids, you’re in good company; the American Academy of Pediatrics (AAP) is split on this issue. The AAP’s Breastfeeding Initiatives state that it’s best to wait until an infant is 6 months of age, while the AAP’s nutrition division suggests that it’s fine to introduce solids around 4 months of age. There is no research to suggest that there’s any benefit associated with introducing solids before 4 months of age, and there is quite a bit of research suggesting that such early introduction of solids is associated with increased risk of allergies and eczema (see, for instance, Greer et al, Tarini et al, Zutavern et al). Waiting until 6 months of age to introduce solids decreases the risk of atopic diseases (allergies, eczema, and asthma). Researchers are split on introduction of the most allergenic foods (including eggs, shellfish, and nuts). Some studies (including Filipiak et al) suggest that there’s no benefit associated with waiting beyond the sixth month to introduce these foods (in non-chokable form), while other studies (such as Fiocchi et al) suggest waiting to introduce dairy, egg, nuts, and seafood. Given the split nature of research findings on delayed introduction of highly allergenic foods, it may be worth delaying such foods in families with a history of atopic disease. Highly allergenic foods aside, the preponderance of evidence suggests that the best time to introduce first solid foods falls somewhere between 4 and 6 months of age. The question, then, is whether to shoot for closer to the beginning of that window, or closer to the end.

There are several arguments often made for adding solids to the diet earlier, rather than later. None of these, however, are supported by science. Perhaps the most common assertion is that adding solids will improve infant sleep. Several studies have examined this issue, and have found no sleep improvement with added solids (see, for instance, Macknin et al, Oberlander et al.) The Oberlander study looked at newborns, comparing sleep after a randomly assigned meal of water, carbohydrate, or formula. Water-fed infants slept less than formula-fed infants, while carbohydrate-fed infants (contrary to the common maxim) didn’t sleep as well as formula-fed infants. The Macknin study examined the effects of adding infant cereal to the nighttime bottle (a common practice thought by some to promote sleep) of 5-week-old and 4-month-old infants. The sleep durations of the infants given cereal were compared to the sleep durations of same-age infants given formula with no cereal; the researchers found no increased quantity or quality of sleep with cereal. There is no research support for beginning solids as a means of improving sleep.

Another argument used to support introducing solids at closer to 4 months than 6 months of age is that the older infants are (according to their caregivers) no longer satisfied by breast milk or formula alone. Because 4- to 6-month-olds have very limited communication ability, this is largely based upon speculation. For instance, some caregivers interpret a 4-month-old’s sudden interest in the food on an adult’s plate (or silverware) as an interest in eating. Given the opportunity, many 4-month-olds will grab food off an adult’s plate and place it in their own mouth, interpreted by some caregivers to mean the baby wants to (and/or is ready to) eat solids. However (and I recognize this is not a scientific statement), 4-month-olds also put rocks, garbage, and anything else they can find into their mouths. Around 4 months of age, an infant’s attention begins to turn to the outside world. The infant also increasingly possesses the ability to control his hands, allowing him to grasp objects of interest and bring them to his mouth for exploration. Infants don’t differentiate “food” from “non-food” with regard to what they taste; they simply use oral investigation as one of their means of gaining information about the world. It is a misattribution of intent to suggest that a 4-month-old who grabs food off his mother’s plate wants to eat solids. More scientifically, there is no evidence to suggest that an infant younger than 6 months of age needs anything more than breast milk (with supplemental vitamin D if indicated, see this article for more information) or formula. Further, there is ample scientific evidence showing that infants thrive on nothing but breast milk for the first 6 months (see, for instance, Carruth et al, Dewey, Nielsen et al). There is also evidence showing that introducing solids after 4, but before 6 months of age doesn’t positively affect growth (Cohen et al), because infants fed solids consume less milk or formula. 
Even infants given as many nursings (this study was conducted on breastfed infants) as they’d been given prior to introduction of solids consumed less milk per nursing when given supplemental solids. This demonstrates that a 4-month-old can’t be made to increase his caloric intake by giving him solids, as he’ll take less milk in response. Of particular concern is the case of the breastfed infant; there is no substance as nutritionally complete or suited to the digestive tract of the young infant as breast milk. Thus, since the breastfed infant responds to solids by decreasing milk consumption, supplementing with solids prior to 6 months of age actually decreases the quality of the breastfed infant’s diet. Given that formula is designed to mirror the nutritional qualities of breast milk as much as possible, we can reasonably extrapolate that it is the best second choice for feeding a non-breastfed infant (or supplementing an infant whose mother is not exclusively breastfeeding) until 6 months of age, and that introduction of complementary solids displaces a higher-quality source of nutrition.

If waiting until 6 months to introduce solids is good, then, is waiting longer than 6 months even better? Apparently not. There’s research that suggests rather strongly that delaying the introduction of solids beyond the 6-month point does not further decrease the risk of allergies (see, for instance, Filipiak et al, Greer et al, Zutavern et al), and may even increase the risk (Nwaru et al). Further, breast milk and formula are no longer sufficient to support increasing nutrient needs beyond 6 months of age (Dewey). As an isolated (but not unique) example, breast milk is quite low in iron (there is a great article about this at Science of Mom), and complementary foods can be used to increase iron in the diet (there’s another great article from Science of Mom here). The most nutritionally-complete diet for a 6-month-old (or older) infant should consist of mainly breast milk (or formula), with carefully-selected complementary solid foods.


Science Bottom Line:* There is ample research to support waiting until after 4 months of age to begin complementary solids, and there is a modest amount of research to support waiting until 6 months of age, particularly in the case of a breastfed infant. There is no evidence of any nutritional or behavioral benefit conferred by solids between 4 and 6 months of age. Research does not support (and, in fact, opposes) waiting beyond 6 months of age to introduce complementary solids.


When did you/will you introduce solids, and why?



Carruth et al. Addition of supplementary foods and infant growth (2 to 24 months). J Am Coll Nutr. 2000 Jun;19(3):405-12.

Cohen et al. Effects of age of introduction of complementary foods on infant breast milk intake, total energy intake, and growth: a randomised intervention study in Honduras. Lancet. 1994 Jul 30;344(8918):288-93.

Dewey, K. Nutrition, Growth, and Complementary Feeding of the Breastfed Infant. Pediatr Clin North Am. 2001 Feb;48(1):87-104.

Filipiak et al. Solid food introduction in relation to eczema: results from a four-year prospective birth cohort study. J Pediatr. 2007 Oct;151(4):352-8. Epub 2007 Aug 23.

Fiocchi et al. Food allergy and the introduction of solid foods to infants: a consensus document. Ann Allergy Asthma Immunol. 2006 Jul;97(1):10-20; quiz 21, 77.

Greer et al. Effects of Early Nutritional Interventions on the Development of Atopic Disease in Infants and Children: The Role of Maternal Dietary Restriction, Breastfeeding, Timing of Introduction of Complementary Foods, and Hydrolyzed Formulas. Pediatrics. 2008 Jan;121(1):183-91.

Macknin et al. Infant sleep and bedtime cereal. Am J Dis Child. 1989 Sep;143(9):1066-8.

Nielsen et al. Adequacy of Milk Intake During Exclusive Breastfeeding: A Longitudinal Study. Pediatrics. 2011 Oct;128(4):e907-14. Epub 2011 Sep 19.

Nwaru et al. Age at the Introduction of Solid Foods During the First Year and Allergic Sensitization at Age 5 Years. Pediatrics. 2010 Jan;125(1):50-9. Epub 2009 Dec 7.

Oberlander et al. Short-term effects of feed composition on sleeping and crying in newborns. Pediatrics. 1992 Nov;90(5):733-40.

Tarini et al. Systematic Review of the Relationship Between Early Introduction of Solid Foods to Infants and the Development of Allergic Disease. Arch Pediatr Adolesc Med. 2006 May;160(5):502-7.

Night Nursing and Cavities

Extended nursing is loosely defined. In the United States, where only about a third of babies are exclusively breastfed until 3 months of age and fewer than a sixth are exclusively breastfed until 6 months of age (per the CDC), one could reasonably claim that breastfeeding beyond a year is “extended.” The American Academy of Pediatrics recommends breastfeeding for at least a year (with complementary foods after six months of age), while the World Health Organization recommends at least two years. It goes without saying, then, that a baby breastfed per the recommendations of these organizations will still be breastfeeding when teeth have come in. Some lucky parents have babies who start sleeping through the night at only a few months of age, while other mothers find themselves nursing once, twice, or even multiple times per night well beyond a baby’s first birthday. Certain sources, including La Leche League, suggest that breast milk isn’t cariogenic (cavity-causing), and even protects the teeth. Others, however, suggest that breast milk pooling in a baby’s mouth leads to early cavities, which can have significant ramifications for later oral health. What does the science say about night nursing and cavities?

One problem with finding a scientific answer to this question is that it’s difficult research to do. Case studies — reports of medical findings in a given individual — provide a limited amount of information, but aren’t a strong platform from which to derive inductive generalizations. This is because it’s difficult or impossible to establish causality in the case of an individual. As such, while there are reports in the literature of nursing caries associated with breastfeeding, these don’t support the conclusion that night nursing causes cavities.

Stronger evidence that night nursing either is or is not associated with cavity formation comes from population-level analysis. Dentist Brian Palmer, who studies ancient human skulls, concludes that there’s no connection between night nursing and cavities, on the grounds that 1) there isn’t evidence of cavities in ancient skulls of children, and 2) these children were probably breastfed for an extended period of time. Unfortunately, there are several problems with his reasoning. First, he has no proof that children were nursed at night (yes, they probably were…but he has no proof). Second, he does not take into account other aspects of diet that could significantly impact dental health. The conclusion he can reasonably draw from his research is that nursing didn’t cause cavities in children 500-1000 years ago, but it’s not possible to generalize this conclusion to today’s children because of significant dietary and lifestyle differences.

A few studies have looked at populations of modern children in an attempt to determine whether night nursing correlates with cavities. A study of children in Tehran found an association between bottle-feeding with milk at night and cavity development, but no association between breastfeeding at night and cavity development (Mohebbi et al). A study of Swedish children found that it was the intake of cariogenic food that was most associated with early cavity formation (Hallonsten et al). This weakens the applicability of the Mohebbi findings to children in the U.S. and other Western industrialized nations, given the significant dietary differences between the populations. Hallonsten also found, interestingly enough, that children who engaged in extended breastfeeding were more likely to consume cariogenic foods and have other cavity-promoting dietary habits than those who weaned at younger ages. A study of Dutch children found that frequency of breastfeeding and lack of fluoride were most associated with development of cavities (Weerheijm et al).

Note that the experimental design in the studies above is not one that allows determination of causality, only correlation. It’s possible that parents who breastfeed at night also engage in or encourage behavior x (whatever that might be), which predisposes their children to (or helps prevent) cavities.

Cavities are, of course, complicated things. There are a multitude of factors that make them more likely (bacterial colonization of the mouth, intake of cariogenic foods), as well as factors that make them less likely (dental hygiene, fluoride). Perhaps the most important question to answer in order to inform the night nursing/cavities association is whether human milk itself is cariogenic. La Leche League claims it is not, but this appears not to be supported by any particular scientific evidence, as they cite no direct research on the cariogenicity of human milk. Research evidence, in contrast, suggests that human milk is mildly cariogenic, though far less so than sugar water or soda (Bowen et al). The researchers ranked the cariogenicity of various tested substances as follows: table sugar, 1; soda (cola), 1.16 (the acid probably contributed to the increased cariogenicity as compared to table sugar); honey, 0.88; human breast milk, 0.29; cow’s milk, 0.01; distilled water, 0. The authors speculated that the increased cariogenicity of human milk as compared to cow’s milk may be due to the greater concentration of lactose in human milk, and (likely more important) the much lower concentration of dental health-supporting minerals (such as calcium and phosphate) in human milk. Based upon this research, it is unreasonable to suggest that human milk is non-cariogenic.
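The Bowen et al. ranking is easier to take in when normalized against table sugar. A minimal sketch, with the scores transcribed directly from the figures quoted above:

```python
# The cariogenicity scores quoted above (Bowen et al.), normalized to table sugar.
cariogenicity = {
    "table sugar": 1.0,
    "cola": 1.16,
    "honey": 0.88,
    "human milk": 0.29,
    "cow's milk": 0.01,
    "distilled water": 0.0,
}

for food, score in sorted(cariogenicity.items(), key=lambda kv: -kv[1]):
    relative = score / cariogenicity["table sugar"]
    print(f"{food:16s} {relative:4.0%} as cariogenic as table sugar")

# Human milk at 0.29 is the source of the "roughly one-third as cariogenic
# as table sugar" figure; cow's milk barely registers by comparison.
```

The normalized view makes the practical conclusion obvious: human milk sits well below sugar and soda, but well above cow’s milk and water, i.e. mildly but measurably cariogenic.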


Science Bottom Line:* Human milk is approximately 1/3 as cariogenic as table sugar, and should be treated as a mildly cariogenic food. It’s probably reasonable to consider brushing a child’s teeth after a night nursing session, or at least wiping them off with gauze.


What do you do to help prevent nursing cavities in your night-nursing baby or toddler?



Bowen et al. Comparison of the cariogenicity of cola, honey, cow milk, human milk, and sucrose. Pediatrics. 2005 Oct;116(4):921-6.

Hallonsten et al. Dental caries and prolonged breast-feeding in 18-month-old Swedish children. Int J Paediatr Dent. 1995 Sep;5(3):149-55.

Mohebbi et al. Feeding habits as determinants of early childhood caries in a population where prolonged breastfeeding is the norm. Community Dent Oral Epidemiol. 2008 Aug;36(4):363-9.

Palmer; B. Breastfeeding and infant caries: No connection. ABM News and Views 2000; 6(4): 27,31.

Palmer B. The Influence of Breastfeeding on the Development of the Oral Cavity: A Commentary. J Hum Lact 1998;14:93-98.

Weerheijm et al. Prolonged demand breast-feeding and nursing caries. Caries Res. 1998;32(1):46-50.

Are Megadoses of Vitamins Healthy and Safe?


Megavitamin therapy is the use of very large doses of vitamins to prevent or treat illness or some symptom thereof. While not the first major proponent of megavitamin therapy, Linus Pauling is perhaps the best known; he advocated using huge doses of vitamin C (many grams per day) to treat and prevent disease. As a chemist, I have enormous respect for Pauling. His publication record is impressive, and his work in quantum chemistry helped lay the foundation of that field. Oh, and he was a front-runner in the race to determine the structure of DNA, which was kind of the Holy Grail of chemistry (though Watson and Crick famously got there first). As much as I hold him in esteem as a chemist, however, I have to wonder what business he thought he had dabbling in nutrition and medicine; he had training in neither of those fields. In any case, there was and is no evidence to support any of Pauling’s theories regarding megadoses of vitamin C. Neither is there evidence to support use of other vitamins in megadoses. The popularity of megavitamins is a “more is better” fallacy. Here’s the science bottom line: we need vitamins in small amounts. They serve a variety of critical roles in the body, and we experience illness, disease, and loss of function in the case of deficiency. More of a vitamin than the body needs, however, does it no good. Depending upon the vitamin, it’s either excreted or builds up and becomes toxic.

Let’s stick with vitamin C as an example. Among its functions, vitamin C helps maintain the immune system; you’ll become more susceptible to disease (among other symptoms) if you’re vitamin C deficient. However, taking more vitamin C than recommended (the USDA currently recommends 90 mg/day for adult men, and 75 mg/day for non-pregnant, non-lactating adult women) doesn’t “supercharge” the immune system or help it function any better than it otherwise would. Think about it like this: if you’re trying to wash your hair in the shower, you need shampoo. If you use none, your hair doesn’t get clean.
If you use a teeny, tiny amount, your hair gets a little cleaner. Use more, and your hair gets cleaner…up to a point. Once you’re using sufficient shampoo (usually anywhere from a dime-size to a quarter-size dollop, depending upon how much hair you have and how dirty it was), using more won’t get your hair any cleaner. It won’t do anything at all…except go down the shower drain. The same is true of vitamin C. Consume none, and you have problems. Consume some (but less than you need), and you have less severe symptoms of deficiency. Get what you need, and you achieve normal function (where it relates to vitamin C). If you take more vitamin C than your body needs to maintain function, however, the excess is excreted and goes down the toilet (or the shower drain, I suppose, depending upon your habits). Some vitamins aren’t as forgiving. For instance, vitamin A is quite toxic in megadoses. In any case, while there’s all sorts of scientific evidence to support getting your recommended daily dose of each vitamin, there’s just no evidence for — and in many cases, there’s evidence against — using megavitamins.
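The shampoo analogy above describes a saturating dose-response: benefit rises with intake until the requirement is met, then flatlines. Here's a deliberately crude sketch of that idea in a few lines of Python — the saturation shape is a cartoon, not a measured pharmacological curve, though the 90 mg/day figure is the cited recommendation for adult men:

```python
def benefit(intake_mg, requirement_mg=90):
    """Fraction of full vitamin-C benefit achieved at a given daily intake.

    Cartoon model only: benefit rises linearly to the requirement,
    then saturates -- extra intake does nothing (it's excreted).
    """
    return min(intake_mg / requirement_mg, 1.0)

# Megadoses buy nothing beyond the requirement:
for dose in (0, 45, 90, 500, 2000):
    print(f"{dose:>5} mg/day -> {benefit(dose):.0%} of full benefit")
```

Note that a dose of 2,000 mg/day scores exactly the same as 90 mg/day in this sketch, which is the "more is better" fallacy in a nutshell (and for vitamins like A, the flat part of the curve eventually turns downward into toxicity).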


What’s your vitamin strategy?



USDA Dietary Guidelines. Accessed 11 Nov 2011.


Nitrates, Cancer, Lunch Meat, and Celery — Should You Worry?

Nitrates and nitrites are chemically related to one another, and are commonly used as preservatives in a variety of food items. Bacon is perhaps the most notable example, but many packaged, processed meats — including many lunch meats — are among those that contain nitrates and nitrites. Even “natural” lunch meats, which don’t list nitrates or nitrites on the ingredients label, can contain dried celery juice. This is a natural source of nitrates and nitrites.

First and foremost, nitrates and nitrites, while not chemically identical, are implicated in similar health consequences. This is because a chemical reaction converting nitrate into nitrite takes place in human saliva. Approximately 5% of ingested nitrate is converted to nitrite in the saliva of adults and children, while approximately 10% of ingested nitrate is converted to nitrite by infants (Spiegelhalder et al). The problem is that nitrites then go on to engage in a variety of undesirable chemical reactions. For one thing, they can react with chemicals called secondary amines to produce new compounds called nitrosamines. Nitrosamines are carcinogenic (cancer-causing) in animals (Swann et al), and there's very good evidence — in fact, the relatively conservative Linus Pauling Institute of Oregon State University goes so far as to say there's "an enormous amount of evidence" — to suggest they're carcinogenic in humans as well (though of course, for obvious ethical reasons, no controlled scientific studies have been done). Secondary amines, put very succinctly, are found throughout proteins, which in turn make up a major portion of the structural and functional machinery of every living cell. In short, living creatures have an abundance of secondary amines in the body, meaning that there's no barrier to formation of carcinogenic nitrosamines upon the consumption of nitrites or nitrates. Meats are also made up of protein, which means a meat preserved with nitrate or nitrite salt contains all the necessary ingredients for nitrosamine formation.

Cancer-causing nitrosamines aren’t the only reason to be concerned about nitrates and nitrites; they can cause methemoglobinemia (“Blue-Baby Syndrome”) as well. This results from the reaction of nitrites with hemoglobin, which is the protein in red blood cells that binds to oxygen and carries it to the tissues. The reacted hemoglobin, called methemoglobin, can’t carry oxygen as efficiently. While one of the major causes of Blue-Baby Syndrome is consumption of formula made with nitrate-rich water by infants under 6 months of age (often water that is polluted by fertilizer runoff and other sources of nitrates), there have been reports in the literature of nitrites from vegetables leading to the syndrome (Sanchez-Echaniz et al). The Sanchez-Echaniz study reported on cases of methemoglobinemia in infants up to 13 months of age, which is troubling, but the vegetables in question were homemade purees that had been stored for some time (as opposed to fresh vegetables). Because there’s considerable evidence to suggest that nitrites and nitrosamines can cross the placenta (see, for instance, Gruener et al, Althoff et al), pregnant women are generally advised to avoid nitrate- and nitrite-containing foods during pregnancy. Interestingly enough, however, maternal consumption of high-nitrate water during lactation doesn’t appear to increase nitrate levels in breast milk (Dusdieker et al). The study did not report on nitrite or nitrosamine levels in breast milk, however.

Because a certain amount of exposure to nitrates and nitrites is unavoidable — they're naturally occurring, such that even avoiding processed food isn't a mechanism for completely eliminating them from the diet — it's more useful to discuss how much nitrate and nitrite is safe, rather than whether nitrate and nitrite are safe. The EPA standards for nitrates and nitrites in drinking water are set at no more than 10 mg/L and 1 mg/L, respectively. The dose of nitrate considered "safe" by the EPA is 1.6 mg/kg daily, while that for nitrite is 0.1 mg/kg daily. For a 15-pound baby (perhaps an average 6-month-old), that corresponds to no more than 10.9 mg of nitrate and 0.68 mg of nitrite daily. Dietary nitrate and nitrite intake vary significantly with eating habits, but vegetables are the largest source of dietary nitrate for most individuals (White), providing just over 80% of the total daily intake. Spinach, raw lettuce, and cooked beets are among the highest in nitrate concentration (van Velzen et al), with celery only slightly behind (White). All in all, an "average" adult likely gets about 86 mg/day of nitrate and 0.2 mg/day of nitrite from vegetables, in addition to 16 mg/day of nitrate and 3.92 mg/day of nitrite from cured meat. Given the roughly 5% conversion of nitrate in vegetables and meat to nitrite, the vegetables contribute a total of about 4.5 mg/day of nitrite, while cured meats contribute a total of about 4.72 mg/day. This suggests that cured meats, while not the most significant source of dietary nitrate, are nevertheless the most significant source of dietary nitrite in most adults. If an average adult gets about 9.22 mg/day of nitrite from all sources (which is above the "safe" dose of 6.8 mg/day for a 150-pound adult), then clearly the average diet is a bit too high in nitrite for absolute safety's sake. 
Because of the many health benefits associated with consuming vegetables, however, it isn’t reasonable to suggest reducing vegetable consumption (or even consumption of high-nitrate vegetables, like spinach) on these grounds. Cured meats, however, do not serve a unique and important dietary function, and it is therefore reasonable to suggest limiting cured meat consumption. Consuming a normal quantity of vegetables on a daily basis while reducing cured meat consumption to no more often than every other day is a reasonable mechanism for staying within EPA-suggested nitrate and nitrite limits.
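For readers who want to check the arithmetic above, it can be reproduced in a few lines — a rough sketch using the EPA reference doses, the intake estimates cited from White and van Velzen et al, and the ~5% salivary conversion from Spiegelhalder et al. The function names and the pound-to-kilogram shortcut are mine, not from any EPA tool:

```python
LB_TO_KG = 0.4536
CONVERSION = 0.05  # fraction of ingested nitrate converted to nitrite (adults)

# EPA "safe" reference doses, mg per kg of body weight per day
SAFE_NITRATE_PER_KG = 1.6
SAFE_NITRITE_PER_KG = 0.1

def safe_daily_doses(weight_lb):
    """Return (nitrate_mg, nitrite_mg) daily 'safe' doses for a body weight."""
    kg = weight_lb * LB_TO_KG
    return SAFE_NITRATE_PER_KG * kg, SAFE_NITRITE_PER_KG * kg

def effective_nitrite(nitrate_mg, nitrite_mg, conversion=CONVERSION):
    """Nitrite reaching the body: direct nitrite plus salivary-converted nitrate."""
    return nitrite_mg + conversion * nitrate_mg

# 15-lb baby: about 10.9 mg nitrate and 0.68 mg nitrite per day
baby_nitrate, baby_nitrite = safe_daily_doses(15)

# Average adult intakes cited in the text
veg = effective_nitrite(nitrate_mg=86, nitrite_mg=0.2)    # about 4.5 mg/day
meat = effective_nitrite(nitrate_mg=16, nitrite_mg=3.92)  # about 4.72 mg/day
total = veg + meat                                        # about 9.22 mg/day

adult_limit = safe_daily_doses(150)[1]                    # about 6.8 mg/day
print(f"baby limits: {baby_nitrate:.1f} mg nitrate, {baby_nitrite:.2f} mg nitrite")
print(f"adult nitrite: veg {veg:.2f} + meat {meat:.2f} = {total:.2f} mg/day "
      f"(safe dose {adult_limit:.1f} mg/day)")
```

Running the numbers this way makes the conclusion plain: even though vegetables supply most of the nitrate, cured meat's direct nitrite content is what pushes the total past the 150-pound adult's safe dose.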

The final question is whether nitrates and nitrites are any healthier if they come from celery as opposed to from added nitrate and nitrite salts. This is a bit difficult to quantify, because meat companies don't report the amount of nitrate and nitrite in their product, but it's reasonable to assume that the total quantity of preservative is probably similar, regardless of whether it comes from celery (as in "natural" meats) or from added sodium nitrate and sodium nitrite. Compared to a meat preserved with sodium nitrite, meat preserved with celery juice is probably safer, because celery contains nitrate rather than nitrite, and only a small fraction of ingested nitrate is converted to nitrite. Compared to a meat preserved with sodium nitrate, however, a meat preserved with celery juice likely has a very similar total nitrate concentration, and as such, there would be little difference between the two. These are speculations, but they're reasonable and measured ones.


Science Bottom Line:* Don’t cut down your vegetable consumption because you’re worried about nitrates and nitrites, but consider eating lunch meat no more than every other day. Be aware that there are nitrate and nitrite preservatives in some other foods as well (generally processed ones), so read packages carefully. Celery-preserved meats are probably better than nitrite-preserved meats, but may be quite similar to nitrate-preserved meats.


Do you watch for nitrates and nitrites in your diet?



Althoff et al. Transplacental effects of nitrosamines in Syrian hamsters: I. Dibutylnitrosamine and nitrosohexamethyleneimine. Z Krebsforsch Klin Onkol Cancer Res Clin Oncol. 1976 May 3;86(1):69-75.

Dusdieker et al. Does increased nitrate ingestion elevate nitrate levels in human milk? Arch Pediatr Adolesc Med. 1996 Mar;150(3):311-4.

EPA Drinking Water Contaminants. Accessed 1 Nov 2011.

EPA Nitrate. Accessed 1 Nov 2011.

EPA Nitrite. Accessed 1 Nov 2011.

Gruener et al. Methemoglobinemia induced by transplacental passage of nitrites in rats. Bull Environ Contam Toxicol. 1973 Jan;9(1):44-8.

Linus Pauling Institute Nitrosamines and Cancer. Accessed 1 Nov 2011.

Sanchez-Echaniz et al. Methemoglobinemia and consumption of vegetables in infants. Pediatrics. 2001 May;107(5):1024-8.

Spiegelhalder et al. Influence of dietary nitrate on nitrite content of human saliva: Possible relevance to in vivo formation of N-nitroso compounds. Food Cosmet Toxicol. 1976 Dec;14(6):545-8.

Swann et al. Nitrosamine-induced carcinogenesis. The alkylation of nucleic acids of the rat by N-methyl-N-nitrosourea, dimethylnitrosamine, dimethyl sulphate and methyl methanesulphonate. Biochem J. 1968 Nov;110(1):39-47.

van Velzen et al. Relative significance of dietary sources of nitrate and nitrite. Toxicol Lett. 2008 Oct 1;181(3):177-81. Epub 2008 Aug 3.

White, J. Relative significance of dietary sources of nitrate and nitrite. J Agric Food Chem. 1975 Sep-Oct;23(5):886-91.

Fish Oil And Health

I wanted to follow up last week's post on DHA supplementation with a look into the research on fish oil supplementation. Fish oil is a common source of supplemental DHA, though some supplements contain pure DHA instead (as opposed to the normal mix of fats present in fish oil). While I concluded that there isn't scientific research to support supplementing with pure DHA, there's a fair amount of work that supports fish oil supplementation.

Certain benefits associated with fish oil supplementation begin during pregnancy. Thorsdottir (ok, sorry for the commentary, but that is a REALLY cool name when you say it out loud!) and colleagues found that Icelandic women who consumed the lowest quantities of fatty fish had smaller babies than those who consumed larger quantities of fish. Interestingly enough, however, those who consumed the most fish (containing more than a tablespoon of fish oil daily) also had shorter babies with smaller head circumferences. These women were getting three times the recommended daily vitamin A, and twice the recommended vitamin D, as a result of their very high fish intake, which the researchers speculated might have had something to do with the results (both vitamins A and D are toxic in excessive quantities). Thorsdottir and colleagues recommended moderate fish and/or fish oil consumption during pregnancy (though "moderate" to an Icelandic research team is probably not the same as "moderate" to an American, given dietary norms). Olsen and colleagues found that moderate fish oil supplementation helped prevent pre-term delivery of a singleton baby in a high-risk (earlier pre-term delivery) mother, though the fish oil didn't prevent pre-term delivery of twins. The researchers noticed no negative effects of fish oil on either mother or infant. Fish oil supplements during pregnancy appear to extend their effects into the first six months of lactation (Dunstan et al, 2007). Polyunsaturated fatty acids (PUFAs) from fish oil clearly pass into breast milk. The key PUFAs are eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA), which typically represent a large fraction of the total fish oil in a supplement capsule. In the Dunstan study, women who took fish oil during pregnancy (but not continuing into lactation) had higher levels of DHA in breast milk at three days, six weeks, and six months postpartum. 
The differences between the women who were supplemented and those who were not supplemented disappeared after six months postpartum. Infants of women with higher PUFA levels in breast milk had higher PUFA status themselves, and scored higher on a variety of developmental tests (both physical and cognitive) at 2.5 years. On the other hand, a very large study by Makrides et al failed to find much benefit associated with fish oil supplementation during pregnancy; women taking fish oil did not have lower rates of postpartum depression, nor did their infants score better on developmental assessments. A commentary on that same study (Oken et al) points out that some of the observational studies suggesting PUFA intake benefits (such as the Icelandic study by Thorsdottir) are based upon intake of whole fish, rather than fish oil supplements. Whole fish could contain more biologically active PUFAs, minerals that affect PUFA action, or other unidentified compounds. Oken and colleagues also note that the Makrides study tested infant development, while other fish oil studies tested toddlers and preschoolers. They point out that tests might not be sensitive enough to detect differences in infant development, and that differences might not become apparent until the babies were older. A further commentary on the Makrides study (Suzuki) points out that there were some differences in postpartum depression levels between women supplemented with fish oil and those receiving a placebo, but that the depression score cutoff value that Makrides et al chose did not allow for detection of those differences. Suzuki suggests that fish oil supplementation may play a role in reducing cases of subclinical (or less easily detected) depression.

Continued fish oil supplementation during lactation also appears to have benefits. Supplementation appears to increase levels of IgA (a type of antibody passed from mother to baby through breast milk) (Dunstan et al, 2004). There’s also evidence that it helps to reduce the risk of allergies (see, for instance, Dunstan et al 2003, Furuhjelm et al). Direct supplementation of infants may also confer benefits; a study by Damsgaard and colleagues noted that infants supplemented with fish oil had healthier blood lipid (fat) profiles at a year of age. Since blood lipid profile is a marker for heart disease risk, this is a potentially important finding.

Developmental benefits aside, fish oil supplements have also been associated with a reduction in several inflammatory disease processes, including rheumatoid arthritis (Kremer et al), asthma (Nagakura et al), ulcerative colitis (Hawthorne et al), and cardiovascular disease (see, for instance, Nestel et al, Geleijnse et al). Many of the studies on the benefits of fish oil refer to doses in the neighborhood of 3-4 grams of fish oil (containing 1-2 grams each of EPA and DHA) a day, with the caveats that while some fish oil is better than none, higher dosages show diminishing returns (and possibly harm).

It’s worth noting that while the benefits above are all conferred by DHA-containing fish oil capsules (and while many of the benefits are directly linked to the DHA in those capsules), supplementation with fish oil isn’t the same as supplementation with pure DHA. This is because fish oil is a blend of many different fats, of which DHA and EPA are only two. Research has shown repeatedly that separating out, purifying, and supplementing with a single compound suspected to be the “active” agent in a healthful food can have unintended (and sometimes detrimental) consequences. This is, for instance, what Miller and colleagues found in their work on vitamin E, which had previously (Knekt et al) been touted as having heart disease-reducing properties. In the case of vitamin E, it’s likely that by separating out a single form (alpha-tocopherol) of a vitamin that occurs in nature as a mixture of several forms, the supplemental vitamin E could be sending an unintended biological signal. Further support for the notion that there’s more to fish than DHA lies in the observation (Oken et al) that habitual fish-eaters note more predictable fish-related benefits than those taking fish oil supplements. With regard to supplements, there are two things to keep in mind: more is not better, and purifying the active ingredient isn’t necessarily an improvement over seeking out a source of that beneficial ingredient.


Science Bottom Line:* There is a multitude of evidence to support using fish oil (or better yet, eating fatty fish regularly!) if you’re…human. And especially if you’re a human who is pregnant, nursing, growing, and/or affected by an inflammatory disease process. No research suggests that moderate fish oil supplementation is harmful, and since fish oil is typically manufactured from small fish like anchovies, there’s absolutely minimal risk of mercury contamination in commercial capsules (meaning you don’t really need to seek out algae-based capsules, and probably shouldn’t, since the research is largely on fish, as opposed to algae, oil).


What has been your experience with fish oil?



Damsgaard et al. Fish oil affects blood pressure and the plasma lipid profile in healthy Danish infants. J Nutr. 2006 Jan;136(1):94-9.

Dunstan et al. Fish oil supplementation in pregnancy modifies neonatal allergen-specific immune responses and clinical outcomes in infants at high risk of atopy: a randomized, controlled trial. J Allergy Clin Immunol. 2003 Dec;112(6):1178-84.

Dunstan et al. The effect of supplementation with fish oil during pregnancy on breast milk immunoglobulin A, soluble CD14, cytokine levels and fatty acid composition. Clin Exp Allergy. 2004 Aug;34(8):1237-42.

Dunstan et al. The Effects of Fish Oil Supplementation in Pregnancy on Breast Milk Fatty Acid Composition Over the Course of Lactation: A Randomized Controlled Trial. Pediatr Res. 2007 Dec;62(6):689-94.

Furuhjelm et al. Fish oil supplementation in pregnancy and lactation may decrease the risk of infant allergy. Acta Paediatr. 2009 Sep;98(9):1461-7. Epub 2009 Jun 1.

Geleijnse et al. Blood pressure response to fish oil supplementation: metaregression analysis of randomized trials. J Hypertens. 2002 Aug;20(8):1493-9.

Hawthorne et al. Treatment of ulcerative colitis with fish oil supplementation: a prospective 12 month randomised controlled trial. Gut. 1992 Jul;33(7):922-8.

Knekt et al. Antioxidant vitamin intake and coronary mortality in a longitudinal population study. Am J Epidemiol. 1994 Jun 15;139(12):1180-9.

Kremer et al. Effects of high-dose fish oil on rheumatoid arthritis after stopping nonsteroidal antiinflammatory drugs. Clinical and immune correlates. Arthritis Rheum. 1995 Aug;38(8):1107-14.

Makrides et al. Effect of DHA Supplementation During Pregnancy on Maternal Depression and Neurodevelopment of Young Children: A Randomized Controlled Trial. JAMA. 2010 Oct 20;304(15):1675-83.

Miller et al. Meta-Analysis: High-Dosage Vitamin E Supplementation May Increase All-Cause Mortality. Ann Intern Med. 2005 Jan 4;142(1):37-46. Epub 2004 Nov 10.

Nagakura et al. Dietary supplementation with fish oil rich in omega-3 polyunsaturated fatty acids in children with bronchial asthma. Eur Respir J. 2000 Nov;16(5):861-5.

Nestel et al. Fish oil and cardiovascular disease: lipids and arterial function. Am J Clin Nutr. 2000 Jan;71(1 Suppl):228S-31S.

Oken et al. Fish, Fish Oil, and Pregnancy. JAMA. 2010 Oct 20;304(15):1717-8.

Olsen et al. Randomised clinical trials of fish oil supplementation in high risk pregnancies. Fish Oil Trials In Pregnancy (FOTIP) Team. BJOG. 2000 Mar;107(3):382-95.

Suzuki T. Maternal Depression and Child Development After Prenatal DHA Supplementation — A Reply. JAMA. 2011 Jan 26;305(4):359-60; author reply 360-1.

Thorsdottir et al. Association of fish and fish liver oil intake in pregnancy with infant size at birth among women of normal weight before pregnancy in a fishing community. Am J Epidemiol. 2004 Sep 1;160(5):460-5.
