Holford tries to respond to questions raised by BBC documentary. He fails.

[Image: a cat jumping across a gap in a kitchen, about to miss its target; captioned 'About to Fail']

Having just posted about Professor Patrick Holford of Teesside University’s curious relationship with the mainstream media, we were fascinated to see Patrick Holford responding to the Radio 4 programme The Rise of the Lifestyle Nutritionists. From what he writes, it sounds like he does feature in Part 2 of the series. I haven’t heard Part 2 yet – it’s scheduled to be broadcast on March 31 at 8pm – but it’s already clear that Holford fails to offer an adequate response to the questions raised. His responses range from dodging the questions asked, to giving clearly incorrect answers, to answers so gobsmackingly wrong that they fail even to qualify as wrong. Now I’m really looking forward to the radio programme: Holford digs himself in deep enough without having heard it, but I’m sure that the BBC’s research skills will allow them to provide a JCB or two to join Holford in his hole.

Holford complains that:

Two weeks ago I was approached by Radio 4 saying they were doing a programme, presented by Ben Goldacre, on nutritional therapy and wanted to do a pre-recorded interview, that would be edited.

I declined the interview for the reasons given below but I provided Radio Four with comments in answer to their questions to be read out verbatim on the programme. They would not agree to reading these out unedited however and so did not use them

So, maybe Holford has extrapolated from his own editing and treatment of Dr John Marks’ remarks. Mean old BBC, right? Except that Holford’s “short and concise” response to the BBC runs to over 1,000 words; there is considerably more on the Food for the Brain site. Keep in mind that the radio programme has roughly a 27-minute slot, and it would take over 10 minutes to read out Holford’s response – and the questions he is responding to – at any vaguely sensible speed. And it is unlikely that the programme is all about him.

Anyway, given the quality of these responses, I’m sure that the BBC would have loved to be able to quote from them. I can’t cover them all here, so I will just pick out a few choice examples. What I find especially interesting is that not only is this an epic failure on Holford’s part, but he seems entirely unaware of the fact, claiming victory even as he digs himself in deeper.

Holford’s unique approach to research is especially clear in Food for the Brain’s (FFTB’s) attempts to defend their widely reported Child Survey: the one that we explored and discussed in detail, in coverage that has been described as ‘mercilessly thorough’. FFTB acknowledge a number of problems with the research:

  • While the survey is touted as including 10,222 children, Holford admits that he ended up comparing just two of the groups of children analysed; these groups contained only 32 and 42 members respectively. Most researchers would see this as a problem: a handful of extended families, for example, could constitute all of one or both of these groups and thus completely skew the results. Nonetheless, Food for the Brain are still proudly insisting that – for comparisons between these two tiny groups – p<0.05. Do you think it is appropriate to pronounce on the optimal diet for UK children based on 74 children from an unrepresentative sample of 10,222? Some of us would call those groups outliers.
  • In the survey, “The ‘effect sizes’ were calculated by looking at the prevalence of respondents in each (high or low) consumption band giving a ‘very good’ rating.” This is a highly non-standard – to the best of my knowledge, completely unique – way to calculate effect size. While one may need to introduce novel approaches to statistics from time to time, these need to be argued for: for Holford to blithely calculate and report ‘effect size’ in such an idiosyncratic way, without mentioning this anywhere in the survey, is simply wrong. And wrong at a very basic level. (An illustrative sketch of both of these points follows this list.)

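To make both points concrete, here is a minimal sketch in Python (using numpy and scipy). Every count and score in it is hypothetical, since we do not have FFTB’s raw data; it simply shows how easily a p<0.05 comparison between groups of 32 and 42 can evaporate, and what a conventional standardised effect size such as Cohen’s d requires, as opposed to a simple difference in the proportion of ‘very good’ ratings.

# A sketch with made-up numbers, not FFTB's data. Requires numpy and scipy.
import numpy as np
from scipy import stats

# --- 1. How fragile a p < 0.05 comparison between 32 and 42 children is ---
# Suppose 22 of the 32 children in the 'high consumption' band were rated
# 'very good', against 14 of the 42 in the 'low consumption' band.
table = np.array([[22, 10],   # high band: 'very good' / other ratings
                  [14, 28]])  # low band:  'very good' / other ratings
_, p = stats.fisher_exact(table)
print(f"Fisher's exact p = {p:.3f}")          # comfortably below 0.05

# Re-rate three children in each band (six of the 74) and it vanishes.
shifted = np.array([[19, 13],
                    [17, 25]])
_, p_shifted = stats.fisher_exact(shifted)
print(f"p after re-rating six children = {p_shifted:.3f}")   # back above 0.05

# --- 2. What a conventional standardised effect size looks like ---
# Cohen's d needs each group's mean and standard deviation on the outcome
# measure, which is exactly why we asked FFTB for the SDs.
def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    pooled_sd = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

print(f"Cohen's d = {cohens_d(102, 15, 32, 96, 15, 42):.2f}")   # hypothetical scores

# The survey's 'effect size' is just a difference in the proportion of
# 'very good' ratings: a reportable number, but not an effect size in the
# usual standardised sense.
print(f"Proportion difference = {22/32 - 14/42:.2f}")

None of this tells us what FFTB’s numbers actually show; it only illustrates why a p value derived from two groups this small, and an ‘effect size’ that is really just a proportion, settle very little.
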
One more thing to note about the survey – again, it uses technical language in an idiosyncratic and unclear way, without explaining why – is a simple Q&A that will stand by itself:

Q. The word “variance” seems to be used throughout not in the usual statistical sense of the word. Am I right in supposing that it is being used here as a synonym for ‘difference’?
A. Yes.

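For anyone wondering why this matters, here is a minimal illustration, with made-up rating scores, of the gap between ‘variance’ in its statistical sense and a plain difference between two group summaries:

# Hypothetical rating scores; nothing to do with FFTB's actual data.
import statistics

high_band = [5, 4, 5, 3, 4, 5, 4, 4]
low_band = [3, 4, 2, 3, 4, 3, 2, 3]

# Variance in the statistical sense: the average squared deviation of a
# group's scores from that group's own mean.
print(statistics.variance(high_band))                          # 0.5

# 'Variance' as the survey apparently uses it: simply the difference
# between two group summaries, here the gap between the band means.
print(statistics.mean(high_band) - statistics.mean(low_band))  # 1.25

Using a term with a precise technical meaning as a loose synonym for ‘difference’, without saying so, is exactly the kind of thing that makes the survey’s write-up so hard to evaluate.
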
In Holford’s ‘full response’ on his own site he – entertainingly – continues in his habit of confusing food allergies and intolerances. To quote from (just one) part of the response where he does this:

While the conventional view is that IgE antibodies are responsible for most immediate onset allergies, there is growing evidence that IgG antiobidy [sic] mediated reactions, may indeed be responsible for more ‘hidden’ allergies.

The type of IgG-mediated intolerances that Holford refers to are quite different from IgE-mediated allergies: while food intolerances are inconvenient and may be unpleasant, allergies can kill (for example, by causing anaphylactic shock). And – as we’ve shown – the type of IgG blood tests recommended by Holford are not an effective way to diagnose food allergies or intolerances. (I would also be wary of taking advice on allergies from someone who spells ‘antibody’ as ‘antiobidy’.)

Holford was also asked for “[h]is views on a famous, some might say infamous, paper by Bjelakovic et al. [PDF] which describes the possible risks of taking certain antioxidants. Mr Holford’s article was called ‘Antioxidant Review is a Stitch Up’.”

Given his own rather classic series of blunders in both the literature review and the statistical analysis of the FFTB Child Survey 2007, it is with breathtaking and rather touching bravado that Holford offers criticism of the Bjelakovic et al. meta-analysis:

[my] two main criticisms of this paper were that the one study, by a Dr Correa from the pathology department at the Louisiana State University Health Sciences Centre, that apparently skewed results for antioxidants overall towards a negative, showed a clear protective effect of antioxidant supplements against gastrointestinal cancer. I decided to contact Dr Correa and he was ‘amazed’, he said, because his research, ‘far from being negative, had shown clear benefit from taking vitamins.’ Correa told us, there was no way the study could show anything about mortality. ‘Our study was designed for evaluation of the progress of precancerous lesions,’ he said. ‘It did not intend, and did not have the power, to study mortality and has no value to examine mortality of cancer.’

Also, the summary of this study states ‘treatment with beta carotene, vitamin A, and vitamin E may increase mortality’ creating the impression these antioxidants are no good. What it failed to say in the summary, all of which are clearly stated in the results, is that ‘vitamin C given singly, or in combination with other antioxidants did not affect mortality, and selenium given singly or in combination with other antioxidant supplements may reduce mortality’. It also fails to say that ‘beta-carotene or vitamin A did not show increase in mortality if given in combination with other antioxidants’, or that ‘vitamin E given singly or combined with 4 other antioxidants did not significantly influence mortality’.

Now that is interesting: we addressed this paper in HolfordWatch’s first ever post. Aw, I’m all misty-eyed with nostalgia… Firstly, one might note that Holford’s ‘Antioxidant Review is a Stitch Up’ piece complained bitterly – and rather strangely – about quite a range of things. For example, Holford objects that:

The next way to investigate whether an analysis is a stitch up is to see if all trials are included, how trials are excluded, and what the trials actually say. Two classic primary prevention studies, where vitamin E is given to healthy people, are those of Stampfer et al, published in the New England Journal of Medicine, the first of which gave 87,200 nurses were given 67mg of vitamin E daily for more than two years. A 40 per cent drop in fatal and non-fatal heart attacks was reported compared to those not taking vitamin E supplements (1). In another study, 39,000 male health professionals were given 67mg of vitamin E for the same length of time and achieved a 39 per cent reduction in heart attacks (2). Guess what? They are not included.

However, as noted in the original post, these two studies aren’t included in the Bjelakovic et al. meta-analysis of RCTs on antioxidants because, um, they’re not RCTs: both are observational cohort studies in which participants chose for themselves whether to take vitamin E. Holford’s complaint about this suggests some confusion as to what an RCT is, which is rather odd in someone whom Teesside University has raised to professorial status. Happily, Holford now acknowledges that arguing for these studies to be included was incorrect (though it would have been nice if he had let us know about this, and thanked us for pointing it out, when we posted on the topic about a year ago). As will be shown below, though, it is unfortunate that Holford failed to take on board our other criticisms.

As for the criticisms of the Bjelakovic et al. article that Holford does cleave to in his response to the BBC, I will quote from this blog’s first post, where they were all addressed:

The abstract of the JAMA article says that “The potential roles of vitamin C and selenium on mortality need further study.” It therefore does say that there’s a need for further study on the role of vitamin C and selenium supplementation, and that they may reduce (or increase) mortality… The meta-analysis doesn’t explicitly say that different combinations of antioxidants may have different effects (or, for example, that antioxidants may have different effects if you exercise regularly, smoke 30 a day, etc.). It didn’t seek to analyse every possible combination of factors, and it wouldn’t have been feasible to do this…

I’m not sure how Holford concludes that the JAMA meta-analysis assumes that Correa et al’s paper is negative: its results would have been included in the meta-analysis, as would the results of all the other trials with low bias risk. Others may have misread Correa et al’s article as showing that taking vitamin E could have been damaging, but I can’t see any evidence that the JAMA meta-analysis has done the same thing.

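For readers who haven’t waded through a meta-analysis before, here is a minimal sketch of the inverse-variance pooling that sits behind an analysis like Bjelakovic et al.’s. The relative risks and standard errors below are invented, not the actual trial data; the point is only that each included trial contributes its own estimate, weighted by its precision, rather than being ‘assumed’ to be positive or negative.

# Fixed-effect, inverse-variance pooling with invented numbers.
import math

# (log relative risk, standard error of the log RR) for three imaginary trials
trials = [(math.log(0.85), 0.20),   # a trial suggesting benefit
          (math.log(1.10), 0.10),   # a larger trial suggesting slight harm
          (math.log(1.00), 0.30)]   # a small, inconclusive trial

weights = [1 / se ** 2 for _, se in trials]
pooled_log_rr = sum(w * lrr for (lrr, _), w in zip(trials, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

lower = math.exp(pooled_log_rr - 1.96 * pooled_se)
upper = math.exp(pooled_log_rr + 1.96 * pooled_se)
print(f"Pooled RR = {math.exp(pooled_log_rr):.2f}, 95% CI {lower:.2f} to {upper:.2f}")

Real meta-analyses typically also fit random-effects models and run sensitivity analyses, but the principle stands: the pooled estimate is driven by the numbers the included trials actually report, not by anyone’s impression of their abstracts.
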
Right, after that blast from the past I’ll allow myself to look at one more point from Holford’s response to Radio 4: the rest can wait till another day. Holford also claims that:

My views on the benefits of fish oils are shared by many doctors and scientists. The recent Associate Parliamentary Food and health [sic] Forum ‘The Links Between Diet and Behaviour’, makes this clear.

No: it really doesn’t. It’s a bit odd that Holford tries to argue from the authority of the Food and Health Forum. What’s odder, though, is that we have read this report [PDF]: one may or may not agree with it, but the report does not entirely concur with Holford’s position on essential fats. For example, whereas Holford argues that “Most people are deficient in both omega 6 and omega 3 fats”, the Food and Health Forum report is more nuanced. It suggests, for example, that “If high omega-6 intake is shown conclusively to inhibit EPA and DHA absorption or synthesis, an alternative to increasing fish and seafood consumption would be to reduce the intake of LA (omega-6) as each would result in more equal tissue ratios of EPA and DHA.” [p. 11] The report also notes that over the past 30 years “intake of Linoleic acid (LA) (omega-6) has risen in many northern European countries” and that this may be problematic [pp. 10-11].

Holford is quite entitled to agree or disagree with the Food and Health Forum. After all, although he made a written submission to the Forum, there is no record of his having been called to speak there. However, to claim that they agree with his position when they actually don’t is both unfortunate and rather confusing.

In summary, then – when offered a chance to respond to some questions from the BBC, Holford has only managed to dig himself an ever deeper hole (some of the BBC’s questions look as if they overlap with our summary of outstanding issues in our Holford Myths page). Given that his responses are clearly a failure – we have barely started on the problems with these responses, and I haven’t yet heard what sounds like it will be a rather critical segment about him – one does wonder what will be left of Holford’s credibility as a scientist after the second part of The Rise of the Lifestyle Nutritionists airs.



Filed under allergy, anaphylaxis, Ben Goldacre, Food for the brain, Food for the brain foundation, food intolerance, omega 3, patrick holford

10 responses to “Holford tries to respond to questions raised by BBC documentary. He fails.”

  1. tifosi246

    So a self-selected survey of 10,222 children actually ends up being a cross-comparison of 32 with another 42! So all the data analysis and pretty (if misleading) bar graphs in the report are based on 74 kids?!! Or, to put it another way, 0.72% of the respondents.

    If I was a parent of one of the 99.28% who have been ignored, I’d feel a bit miffed at having wasted my time.

    Although, I do wonder if PH’s original hypothesis was that the vast majority of kids would be in the v. poor/poor diet category and, when the results showed they were almost all “neutral”, faffing around with a few outliers was all he had left to “prove” his points. Hey, why change your mind when the data doesn’t support you, when you can just change the analysis post hoc instead?

    Ho Hum.

  2. Wulfstan

    So all the data analysis and pretty (if misleading) bar graphs in the report are based on 74 kids?!! Or, to put it another way, 0.72% of the respondents.

    If I was a parent of one of the 99.28% who have been ignored, I’d feel a bit miffed at having wasted my time.

    0.72% in his outliers??? And that’s the basis of comparison? He is talking about p values when he should be discussing statistical noise?!?

    Words fail.

  3. Trev

    I work in PR and those do not look like the questions posed by a Radio 4 producer to a contributor who is refusing to be interviewed.

    If Holford is shown to be misleading his readers about the questions he was asked, and the answers he gave, then that would be a remarkable new discovery.

    This really is very funny. My professional advice to him would be to keep veeery quiet, but of course, his hole-digging is why he is such an instructive joy to watch.

  4. Fascinating behaviour.

    I took a bit of a back seat for this part of the programme and it was all handled by the Executive Producer from the BBC Radio Science Unit, Rami Tzabar.

    I can tell you absolutely for definite that those are NOT the questions which the BBC sent to Patrick Holford asking for a response.

    And furthermore, those are NOT the answers Professor Holford sent back to the BBC.

    This would seem to be further clear evidence of the spectacular unreliability of Patrick Holford’s claims.

    Absolutely astonishing. There are no depths etc.

  5. pj

    “In the survey, “The ‘effect sizes’ were calculated by looking at the prevalence of respondents in each (high or low) consumption band giving a ‘very good’ rating.” This is a very non-standard – to the best of my knowledge, it’s completely unique – way to calculate effect size.”

    To be fair to Holford and co, it isn’t completely mad to look at the proportion giving a categorical rating of “very good” – similar methods have often been used in pain research and other subjective fields (although you’d usually group all ‘good’ and ‘very good’ together).

  6. just to let you know i’ve double-checked, and those are definitely NOT the questions which were sent to patrick holford asking for a response after he refused to be interviewed. i’m going to have a dig and find out what he is responding to there.

    i can clarify that those are also definitely NOT the responses he sent to the BBC, although the responses he did send were equally long and flawed.

  7. Pingback: BBC put Holford’s science to the test. It fails. « Holford Watch: Patrick Holford, nutritionism and bad science

  8. pj – I’m a fan of effect sizes and wish researchers would use them more regularly in straightforward digests of research findings. However, we wanted to check the effect sizes because the reported ones looked wrong – which is why we asked for the SDs.

    We belatedly worked out that they were just using jargon to sound technical but hadn’t actually calculated any effect sizes. Hence our annoyance – if you had wasted a lot of time trying to generate useful SDs that would approximate to anything like what they claimed were effect sizes – you might be less generously disposed. Particularly when you might strongly suspect that they didn’t realise that it is a technical term – despite all their ludicrous posturing.

    This isn’t particularly coherent – I might have a bash some other time.

  9. pj

    dvnutrix – sure, I was just pointing out that this kind of analysis can be used – I note they were sufficiently unfamiliar with statistics to be unable to put confidence intervals (or the like) on their estimates!

  10. Pingback: Ben Goldacre’s Rise of the Lifestyle Nutritionists on Radio 4 Won a Norwich Union Medical Journalism Award « Holford Watch: Patrick Holford, nutritionism and bad science
