Friday, August 21, 2009

Viagra Causation Goes Limp

Imagine this: The only published article in the medical literature reaching a statistically significant result concerning a drug and an outcome turns out not to have been what it seems. Rather, over a third of the subjects in the relevant category turn out to be misclassified. When properly reclassified, the statistical significance between the drug and the outcome goes away.

If that study had been sponsored by the drug company, and involved the benefit of the drug, imagine how the plaintiffs would have reacted. Would they demand that the drug be removed from the market? Would they seek punitive damages? Sanctions? A criminal investigation?

Of course they would. We see that kind of response over results unearthed by new studies, and even over supposed e-discovery violations -- let alone over the publication of a ground-breaking, but false, study.

But what if the study happens to be published by a plaintiff's expert?

Hardly a peep. In fact, expect a belated attempt to make excuses for misstated data.

That's what just happened in the Viagra MDL. The last plaintiff's causation expert standing turns out to have published a study based upon data that was, at best, misclassified.

Kudos to the defendant for persistence in the face of a published, supposedly peer-reviewed article.

It's results like this that demonstrate why it may be worthwhile to seek discovery of the data that underlie even published studies.

So what exactly happened?

The study was suspect from the start, since it consisted solely of telephone interviews with Viagra users who had been diagnosed with non-arteritic anterior ischemic optic neuropathy (“NAION”) and a case-matched control group of the same size. Slip op. at 2.

The defendant subpoenaed the underlying data, but the expert, at the direction of the plaintiff's counsel, didn't produce anything. Slip op. at 4. That's one way to confirm suspicions. The defendant deposed the guy anyway, and after that the expert had to write the scientific journal and tell it about the deficiencies in the article. Slip op. at 4-5.

The biggest problem was that a significant portion of the patient data had been seriously miscoded. Specifically, eleven subjects were erroneously listed as "exposed" to Viagra when their telephone interviews indicated that they began taking Viagra after being diagnosed with NAION. As the court explained:
There are eleven instances where the date of first use on the original telephone survey forms is later than the date of NAION diagnosis on the same form. However, each of those individuals was still coded as exposed in Dr. McGwin’s [the expert's] electronic dataset. Dr. McGwin acknowledged that the statistics in the McGwin Study would have been different had those individuals (11 of 27 patients who reported Viagra or Cialis use) been coded as unexposed rather than as exposed.

Slip op. at 7.

Eleven of 27 is a pretty high error rate for anything. And every one of those errors just happened to cut in the same direction -- biasing the study in favor of a supposed association that otherwise has not been proven to exist.
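To see why misclassification on that scale matters, here is a sketch with purely hypothetical counts -- the opinion does not give the study's full 2x2 table, so every number below is invented for illustration. Moving eleven cases from the "exposed" column to the "unexposed" column collapses the odds ratio toward 1, which is how a reported association can evaporate on reclassification.

```python
# Hypothetical case-control counts -- NOT the actual McGwin Study data.
# Rows: cases (NAION) vs. controls; columns: exposed vs. unexposed.

def odds_ratio(cases_exp, cases_unexp, controls_exp, controls_unexp):
    """Odds ratio for a 2x2 case-control table."""
    return (cases_exp * controls_unexp) / (cases_unexp * controls_exp)

# As (hypothetically) coded: 20 of 76 cases "exposed", 7 of 76 controls.
or_as_coded = odds_ratio(20, 56, 7, 69)

# After reclassifying 11 cases whose first use post-dated diagnosis:
or_corrected = odds_ratio(20 - 11, 56 + 11, 7, 69)

print(round(or_as_coded, 2))   # apparent association
print(round(or_corrected, 2))  # association largely evaporates
```

With these made-up numbers the odds ratio drops from roughly 3.5 to roughly 1.3 -- the same directional effect the court described, where recoding the eleven subjects as unexposed would have changed the study's statistics.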

One would think plaintiffs would be content simply to avoid the sanctions that they surely would have sought had the shoe been on the other foot. But they had chutzpah. They tried to explain away these "discrepancies" by arguing that (1) their own expert's survey forms were hearsay, and (2) some phantom researcher (never identified) supposedly went back, recontacted these persons, and got information differing from what was recorded on the forms. Slip op. at 7-8.

The court was having none of it. It considered the forms to be admissible business records. Slip op. at 8. As for the supposed recontacting of the survey participants, plaintiffs eventually had to concede that they were pushing a fantasy -- there was not a scrap of evidence to prove that any recontacting had ever happened:
Plaintiffs have failed to produce any competent witness or documentary evidence to verify that such a step [recontacting survey participants] was actually taken. Indeed, as Plaintiffs concede, “Dr. McGwin [their expert] was unable to authenticate any of the underlying documents. . . .” Plaintiffs have not cited to any other admissible testimony from [anyone] who is able to verify that patients were recontacted.

Slip op. at 10. With admirable understatement, the court concluded "discrepancies between the dates of first use on the original survey forms and in the electronic dataset raise serious concerns about the reliability of the McGwin Study as originally published." Id.

But wait, there's more.

More than misclassifying over a third of the study participants in a way that biased the results toward the very conclusion the expert was paid to reach? More than making up cock-and-bull excuses about phantom recontacting?

Yes.

The published study also misrepresented the type of statistical analysis that had been conducted. "The McGwin Study said that it used a paired t-test; Dr. McGwin admitted that he in fact used a two sample t-test instead, which he conceded was 'not the most appropriate.'" Slip op. at 10-11. Beyond that, "the code that Dr. McGwin wrote to produce the numbers in the McGwin Study contained errors that would affect the odds ratios and confidence intervals." Id. at 11.

Once again, with notable restraint, the court concluded, "the fact that the methodologies described in the study were not the actual methodologies used undermines the reliability of the McGwin Study as published." Slip op. at 11.
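Why does swapping a paired t-test for a two-sample t-test matter? In a matched case-control design, the pairing soaks up the variability between pairs, so the two tests can return wildly different statistics on the very same data. A minimal sketch, using invented numbers rather than anything from the study:

```python
import math
from statistics import mean, stdev

def paired_t(x, y):
    """Paired t-statistic: tests the mean of within-pair differences."""
    d = [a - b for a, b in zip(x, y)]
    return mean(d) / (stdev(d) / math.sqrt(len(d)))

def two_sample_t(x, y):
    """Two-sample (pooled) t-statistic: ignores the pairing entirely."""
    nx, ny = len(x), len(y)
    sp2 = ((nx - 1) * stdev(x) ** 2 + (ny - 1) * stdev(y) ** 2) / (nx + ny - 2)
    return (mean(x) - mean(y)) / math.sqrt(sp2 * (1 / nx + 1 / ny))

# Invented matched pairs: each case tracks its control closely.
cases    = [5.1, 7.2, 9.0, 11.3, 13.1]
controls = [4.9, 6.8, 8.7, 10.9, 12.6]

print(round(paired_t(cases, controls), 2))      # large: pairing preserved
print(round(two_sample_t(cases, controls), 2))  # small: pairing thrown away
```

On matched data the paired test is ordinarily the right tool; describing one test in print while actually running the other changes both the statistic and the resulting p-value, which is why the court treated the mismatch as going to reliability.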

But wait, there's more!

More than misrepresenting how the study's numbers were crunched?

Yes.

One of the study's "main findings" was "mischaracterized." The study claimed that Viagra users with "personal histories" of heart attack were at significantly greater risk of NAION. But in fact there was no "personal history" data collected. As the court summed up the evidence:
The patients were actually asked whether they had a family history of myocardial infarction; no one was asked about personal history. Dr. McGwin conceded that he mistakenly assumed that the variable “MI” in his electronic dataset referred to a personal history of myocardial infarction.

Slip op. at 11 (emphasis original). This was, as the court held, "yet another layer of unreliability." Id. at 12.

Adding all this together, the published study completely failed Daubert analysis, notwithstanding that it appeared in a scientific journal:

Taken together, the miscodings and errors described above effectively undermine the reliability of the McGwin Study as published. As Plaintiffs concede, there are “acknowledged inaccuracies in the published study” that need to be corrected. In light of those acknowledged inaccuracies, the Court finds good reason to vacate its original Daubert Order permitting Dr. McGwin to testify as a general causation expert based on the McGwin Study as published. Almost every indicia of reliability the Court relied on in its previous Daubert Order regarding the McGwin Study has been shown now to be unreliable.

Slip op. at 12.

Unfortunately, neither plaintiffs nor their expert knew enough to quit while they were behind. The expert ginned up an unpublished "reanalysis" -- submitted after the fact -- that he claimed salvaged the result of the published study, and thus his causation opinion.

Didn't help.

First, the reanalysis wasn't peer-reviewed. Second, it was never published. Third, the letter presenting it was created "post-litigation." Not only that, upon receiving the letter, the journal in question "referred the Letter to the Committee on Publication Ethics." Slip op. at 14. In other words, the expert is being investigated for academic fraud.

The moral of this story (if not of certain of its participants) is clear -- defendants can't abandon the Daubert ship just because the other side comes up with a published, supposedly peer-reviewed article. As here, a full investigation may reveal that the peer reviewers were asleep at the switch, or that the article subverted the entire peer-review process by stating things that were simply false.

In short, there may be a reason beyond statistics why an outlier study is an outlier.

We can only hope that the Committee on Publication Ethics does the right thing.

Comments:

Ron Miller said...

Let's take the premise for a second: plaintiff's expert does a study to underscore the point plaintiffs are trying to make. That would be a bad thing. But, certainly, you see there are, as JFK told us, varying degrees of evil. Drug companies suppressing studies or manufacturing studies -- an incredibly common practice, I think folks on both sides of the v would admit -- is a far greater concern to public safety. Which is why one is a bigger deal than the other.

In this regard, the world makes sense.

Andrew Oh-Willeke said...

The study clearly appears to have problems. But is this really something that a judge should be policing?

This seems to be a classic case of a weak factual case, of the kind that a jury can appropriately police. It isn't that the author is rejecting basic scientific principles, has too small a data set, or doesn't know how to do a p test. The argument is simply that the data relied upon were factually incorrect.

I can see the argument of the slip op. that this falls within FRE 702 and Daubert, and I could also see grounds for simply ruling under FRE 402 in a motion in limine.

But, this motion sounds a great deal like the objection that my clients always want me to make at trial: "Object! He's lying." Or, "Object! That's not what happened." Obviously, those aren't valid objections. The point is to figure out what did happen. Normally, the argument that the other guy is factually wrong is one that you make to a trier of fact.

I don't feel much sympathy for a decision to strike an expert whose opinion is not supported by his data (a standard that also rules out the vast majority of op-ed columns in newspapers and a good share of book length treatments of current affairs issues as well). And, while the constitution protects the right to a federal civil jury, experience shows that judges are pretty good at resolving civil cases, so the harm is modest at most.

But, if one is going to have judges pre-emptively resolve evidence matters that go to the merits on the basis of the factual accuracy of that testimony, one really ought to follow the practice in election cases and serious civil law cases of having multiple judges make a group decision.
