Thursday, September 09, 2010

Lack of evidence for lactose remedies?

My friend Rob recently emailed me an article synopsis distributed by the Canadian Medical Association. It said “Lack of evidence for lactose remedies” (!), in reference to Systematic Review: Effective Management Strategies for Lactose Intolerance by Shaukat, Levitt, Taylor et al. This synopsis said: “Good evidence does not exist for lactose-reduced milk, lactase supplements, or probiotics to reduce gastrointestinal symptoms of lactose intolerance in people older than 4 years. Most studies were of low quality and did not find a benefit over placebo”. Well, at least they flat out admit that the studies were of low quality! I was flabbergasted to hear that some people – scientists, even – were actually saying that they didn’t see the benefits of treatments for lactose intolerance, when I live with it every day and see the benefits first-hand. How could this be?

So I read the article itself to find out more. It is a government-funded literature review, not an original study with participants. That in and of itself is a problem, because the authors admit that the different studies reviewed had not only different protocols, but different definitions of lactose intolerance. The authors also found that participants could drink a cup of milk without a problem, or even more if it was taken with food. While it is true that I can digest lactose more easily if there is other food involved, there’s no possible way I can drink a cup of regular milk and not get really sick. There are different degrees of lactose intolerance, though, so I figured perhaps the patients in the studies reviewed were only mildly lactose intolerant – in which case, it’s too bad the sample wasn’t more representative. But it turns out that the authors flat out say that most of the subjects suffered from lactose malabsorption rather than lactose intolerance (malabsorption just means the lactose isn’t fully digested; intolerance means it actually causes symptoms). One of the studies cited actually said: "Participants were not required to have symptoms compatible with lactose intolerance before enrollment". So I think the authors shouldn't be generalizing their findings to the LI population and saying that products like lactase are ineffective in our case, because that’s not helping anyone! I mean, how can the study be about lactose intolerance if none of the participants are lactose intolerant? How does this get funded and subsequently published?

Another problem (as if I needed to dig any further) is that different brands of lactose-reduced milk might have different methods of removing lactose, and therefore different levels of lactose remaining in the product that is sold. I’ve had issues with that since moving here a month ago (I'm trying different brands now, since my Natrel is not available here, but I think the last one I tried still had too much lactose in it for my body). The same is true of lactase pills: different brands have different strengths and act more or less quickly, so whatever brand was used in those studies may not have been appropriate for the participants anyway. And on the subject of probiotics: the bacteria that help break down lactose are (or at least include) Lactobacillus acidophilus and Lactobacillus bulgaricus. But when a product like yogurt claims to have probiotics or active cultures, it doesn’t necessarily have those particular ones! So a yogurt with probiotics has better odds of being digestible for LI sufferers than one without (probably because it’s then more likely to be real yogurt, made the old-fashioned way with bacteria that make our life easier), but there’s no way to know for sure without either knowing exactly what’s inside or actually trying it and seeing how our body reacts. (By the way, here’s what Steve Carper, from Planet Lactose, said about yogurt and probiotics: Part 1 and Part 2).

Regarding Systematic Review: Effective Management Strategies for Lactose Intolerance, my friend Rob says: “I am biased in thinking that it is a good article, since it is published in a reputable journal, although even the best journals make mistakes. I think the key point I take from this is the surprising paucity of well-done research on the subject. I would have imagined that there would be more and better studies, especially considering the relatively high prevalence (especially amongst certain ethnicities) of lactose intolerance.” Personally, I think that the reason there is so little research done on the subject is that lactose intolerance is not really a disease – if anything, it’s actually the norm for most of the world’s adults, with the main exception of populations with a long history of dairying (Northern Europeans in particular). No one will die from it and it is easily treatable, so the potential to get rich by finding a cure for it is limited. (I’ve heard claims by Lactagen that they sell a cure, but I keep thinking that if it really worked, the medical community would know about it and I wouldn’t have to find out through a pop-up ad online.)

I emailed Planet Lactose’s Steve Carper (who literally wrote the book on lactose intolerance) to seek his opinion on this article. Here’s what he had to say: "I see that the [paper] is from the NIH Conference on the state of lactose intolerance. I actually attended that conference and did commentary on every paper for my blog, back in March. If you read - or, realistically, scan - those entries, you'll see that every single paper that touched on this issue reads the same way. Scientists can't reproduce in their labs the symptoms that people report in the real world. If you can't reproduce the symptoms, then you're not going to get good results from testing symptom relief. However, most of the studies are themselves not terribly good, for reasons I go into. I'm fairly sure that lactase pills work for many people in many real world situations. But I can't find very good scientific proof of that."

The authors of Systematic Review: Effective Management Strategies for Lactose Intolerance also presented other papers that were variations on the same theme at the NIH conference. There’s one called Effective Strategies for the Management of Individuals With Diagnosed Lactose Intolerance; Steve Carper reviewed it here and basically said: “The literature burped up a grand total of 37 studies for managing lactose intolerance. Almost all of them showed nothing of interest or were based on such small and bad samples that they added up to nothing. The limp conclusion: using lactose-reduced milk reduced symptoms in the lactose intolerant.” There was also another author, Savaiano, who wrote The Tolerable Amount of Lactose Intake in Subjects with Lactose Intolerance, reviewed here. In a nutshell, the author said that nobody gets symptoms from lactose. To which Steve Carper says: “Ridiculous, right? Ludicrous, even. This whole blog is about lactose intolerance. My books are about LI. The conference was the state-of-the-science on LI. I've received thousands of letters and emails and posts from people telling me about their LI symptoms. [The presenter is a researcher who's spent his entire career] writing about LI. Something's totally nuts here. I wish I knew what. […] I didn't get it then, and I don't get it today. I'm reporting what the medical journal evidence says. […] But I'll put the concluding paragraph here: ‘We stress the importance of additional scientific investigations to provide evidence-based and culturally sensitive recommendations about the amount of daily lactose intake that can be tolerated by lactose-intolerant individuals, with special emphasis on pediatric and adolescent populations and pregnant and lactating women.’ That's the biggest ‘We don't understand what the hell's going on, give us some funding money’ you'll ever see in scientific language.”

I’m always one to complain about the fact that journalists often misreport information from scientific publications, because they usually just take the title or conclusion and run with it, without bothering to read the article or be critical about it in any way. So I guess this is one case where I should be grateful that the only member of the press at the NIH conference last spring was Steve Carper – though to me that only illustrates the media’s pathetic lack of interest in this condition.

4 comments:

The Engineer said...

Studies on lactose intolerance that 1) do not have lactose-intolerant test subjects, and 2) have no methodology for making meaningful symptom measurements are dumb!

Anonymous said...

I must offer one clarification, though.
The article title and summary are fairly clear that they are referring to a paucity of evidence rather than to evidence that these treatments don't work.

The technique for conducting a systematic review involves looking through all the literature on a topic in a very thorough and precise way and then combining all the data and results from the different studies that are judged to be sufficiently scientifically rigorous (i.e., you don't include studies that are done poorly or are too subject to bias). Statistical methods allow the different results to be combined, and a new analysis is done using the larger sample. In stats, larger sample sizes allow you to more accurately detect effects of treatment or differences between treatments/groups.
The meta-analysis technique also tends to be somewhat conservative, as any dramatic study effects get washed out when added to other studies. Similarly, the authors are supposed to take into account factors that may affect one paper but not another, which further reduces the chance that the meta-analysis finds a treatment effect when none is actually present (a Type I error). This is why well-conducted meta-analyses are the highest level of evidence in medicine: you get lots of data from different studies and different labs (to reduce bias or unusual findings associated with specific labs or locations), you can control for several factors (publication bias, study quality, etc.), and unusual findings are diluted by the total sum of evidence.
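If it helps to make the pooling step concrete, here is a rough sketch (my own illustration, with entirely made-up effect sizes and standard errors, not numbers from the review) of the inverse-variance weighting that a simple fixed-effect meta-analysis uses to combine study results:

```python
# Purely illustrative: inverse-variance (fixed-effect) pooling of three hypothetical studies.
# The effect sizes are imaginary symptom-score differences (treatment minus placebo);
# none of these numbers come from the Shaukat et al. review.
import math

studies = [(-0.40, 0.30), (-0.10, 0.25), (0.05, 0.50)]  # (effect estimate, standard error)

weights = [1 / se**2 for _, se in studies]               # weight = precision = 1 / variance
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"Pooled effect: {pooled:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```

The pooled confidence interval ends up narrower than any single study's, which is the whole point of combining them; but when the input studies are few, small, or measure symptoms differently, the pooled answer just inherits that weakness (or, as in this review, the studies can't be combined at all).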

When a meta-analysis reveals very few decent quality and relevant studies, the results are much more likely to be close to the null hypothesis (no treatment effect). In fact, the authors state that the data they found was so miserable and different from one study to the next that they could not combine the data for a better analysis. A meta-analysis like this is important not because it says that lactose intolerance strategies don't work (which it never says) but rather because it says that there is insufficient evidence to prove that lactose intolerance strategies work. This is why they emphasize the need for further research and hopefully this article may spur that along.

As an aside, I don't think one needs to worry about the funding cost for a study like this. Meta-analyses are typically done very cheaply because all you need is a computer, some free or relatively cheap software, and a bit of time. Not a huge expense.

Altogether, I think the point to remember when looking at this article and summary is a key line from evidence-based medicine: absence of evidence does not equal evidence of absence.

Amélie said...

I totally agree with you about the use of the meta-analysis technique in general. The authors of the systematic review actually mentioned the number of studies they were unable to use because of their stricter criteria, so I'm not saying they did a bad job. I was referring more to the funding of studies with actual participants whose experiments are poorly planned, which is usually evident early on.

I'm just frustrated that all scientists seem able to say is that there's not enough evidence and they need more funding. I've had a scientific education myself, so I know that most authors actually WANT to end a paper saying "please give us more money", because then they can keep on doing their job, but at the same time... Can't we just create a solid methodology for a study that could prove the efficacy of treatments for LI? Like get participants who have been diagnosed with LI, measure how much lactose they normally tolerate and what/how many symptoms they have beyond that threshold, and THEN see how they respond to a treatment? If so many people with LI see the treatment work in their everyday life, how hard can it be to reproduce that in a lab?
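Just to show the kind of design I mean, here's a rough sketch (all the numbers are invented for illustration, this isn't data from any real trial) of how the within-subject comparison could be analyzed:

```python
# Hypothetical sketch of the design described above: each diagnosed LI participant is
# challenged with a lactose dose above their own measured threshold, once with a lactase
# pill and once with a placebo, and their symptom scores are compared within subjects.
# Every number here is made up for illustration.
import math
import random
import statistics

random.seed(1)
n = 30  # hypothetical number of diagnosed, symptomatic LI participants

# Simulated symptom scores on a 0-10 scale for each participant's two challenges
placebo_scores = [random.gauss(6.0, 1.5) for _ in range(n)]
lactase_scores = [max(0.0, p - random.gauss(2.0, 1.0)) for p in placebo_scores]  # assume ~2-point relief

diffs = [p - t for p, t in zip(placebo_scores, lactase_scores)]
mean_diff = statistics.mean(diffs)
se = statistics.stdev(diffs) / math.sqrt(n)
t_stat = mean_diff / se  # paired t-statistic, to compare against a t distribution with n-1 df

print(f"Mean symptom reduction: {mean_diff:.1f} points, paired t = {t_stat:.1f} (n = {n})")
```

Of course, the analysis is the easy part; the hard part is recruiting properly diagnosed, genuinely symptomatic participants and measuring their symptoms consistently, which seems to be exactly where the studies in the review fell down.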

Otherwise, we just end up with a bunch of studies that say that more evidence is needed one way or another, which makes me feel like they were poorly constructed from the start - like the one cited where participants were not required to even have any symptoms before enrolment... Are you kidding me with that?

Anonymous said...

I would agree that there are definitely poorly done studies included and a paucity of high quality studies.
In my mind, there are three reasons:
1) LI is not a "sexy" disorder like some others (I will refrain from saying which in order to avoid more controversy), and it does not have a group of people who are strongly advocating for new treatments and more research like there are for other conditions.
2) The food companies have no interest in conducting research because they are already selling products that they claim are helpful (some likely are and some likely are not). As soon as something moves from food to medicine, the rules become much more stringent about how it is produced and distributed. The drug companies probably see a saturated market that would not be worth investing billions into for R&D.
3) I've come to the conclusion that the majority of medical research is done by med students, residents, and junior staff who are still practicing being researchers and are unable to conduct the big studies that would be needed to prove that LI treatments work. In my field, a study with 100 people in it total is considered large, and if I can do a study involving more than 30 people, that's an accomplishment. The truth is that physicians sometimes make good researchers, but more often than not, MDs don't have the training that grad students do. With practice we get better, but it takes time and, unfortunately, there is very little money to be made in research (especially if you are not a university prof, and very few MDs are).

In the end, I do agree that a study should be easy to do, but doing it well (properly controlled and planned out with sufficient sample size and double blinding) is more difficult and much more expensive. On the other hand, some of the studies in the meta-analysis are just crap. Research, like everything else, is 90% garbage and 10% useful.

I forgot what my point was...