Rational Medicine?

Evidence-based medicine, or EBM, is the current gold standard for clinical practice. To its practitioners, it seems obvious that medicine should be based on the “best” evidence. Furthermore, they claim that evidence-based medicine is scientific and brings an old discipline up to date.

We have looked closely at EBM. Our conclusions are astounding:

  • EBM is NOT scientific: it is concerned with legal evidence, rather than scientific data
  • EBM does not conform to the scientific method
  • EBM selects its evidence and is disturbingly biased
  • EBM breaks the rules of numerous disciplines, including information theory and cybernetics
  • EBM is inadequate for selecting treatments for individual patients
  • Thus, a rational patient (or doctor) should reject EBM as useless!

We could go on. However, our new book Tarnished Gold: The Sickness of Evidence-based Medicine explains the issues fully and suggests an alternative approach.

Comments from early reviewers suggest that some people may find the book’s content disconcerting. It is disorienting to discover that “evidence-based” medicine does not make sense. However, evidence is not the same as science, data, or information. Science-based medicine would have the acronym SBM, and most people probably assume the two are interchangeable. Unfortunately, when it comes to EBM, the stark fact is that the emperor has no clothes.

Tarnished Gold examines the weaknesses of EBM from a variety of intellectually challenging perspectives. We suspect that readers who take the trouble to follow the arguments will conclude, as we did, that EBM is dangerously irrational.

Tarnished Gold Cover

The book is available on Amazon here and shortly on Kindle.

People are not Populations

Ecological Fallacy

Evidence-based medicine uses the statistics of groups and populations as a guide to treating patients. Thus, for example, a supposedly authoritative clinical trial might claim that aspirin reduces the risk of cancer by 25%, although in reality the reduction was only 1 in 1000 (e.g. 4 people in 1000 control subjects got cancer, compared to 3 in 1000 treated subjects). If other clinical trials and maybe a meta-analysis confirm this finding, EBM practitioners consider it scientifically “proven”. People might then be recommended to take aspirin to prevent cancer.
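The gap between the relative and absolute figures is easy to make explicit. Here is a minimal sketch in Python, using the hypothetical trial numbers above (the variable names are ours, for illustration):

```python
# Hypothetical aspirin trial from the text:
# 4 cancers per 1000 in the control group, 3 per 1000 in the treated group.
control_risk = 4 / 1000
treated_risk = 3 / 1000

# Absolute risk reduction: the difference that actually matters to one person.
absolute_risk_reduction = control_risk - treated_risk      # 0.001, i.e. 1 in 1000

# Relative risk reduction: the headline figure quoted by the trial.
relative_risk_reduction = absolute_risk_reduction / control_risk  # 0.25, i.e. 25%

# Number needed to treat: how many people must take the drug
# for one of them to avoid the outcome.
number_needed_to_treat = 1 / absolute_risk_reduction       # 1000

print(f"Absolute risk reduction: {absolute_risk_reduction:.3f}")
print(f"Relative risk reduction: {relative_risk_reduction:.0%}")
print(f"Number needed to treat:  {number_needed_to_treat:.0f}")
```

The same trial can thus be reported as a 25% reduction or as one case averted per thousand people treated; the headline figure depends on which statistic is chosen.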

One problem with this idea is the ecological fallacy, which happens when people try to apply group statistics to individuals. To take an example, the average dress size for women in the United Kingdom is 16. But husbands and boyfriends should beware of buying this size clothing as a birthday present for their partner. They might be lucky. But, more likely, the dress will be either too small, “So you think I should be thinner!” or too large, “So you think I’m fat!” Either way, the result will not be helpful.

EBM applies this logical fallacy to patients when it recommends treatments based on large-scale studies. For this reason, you should never assume the results of a clinical trial or media report apply directly to you. Eating cholesterol-laden eggs may, on average, increase the incidence of heart disease slightly in a large population, but that is irrelevant to any particular person. You are an individual and can disregard aggregate statistics, in the same way that you would be unlikely to buy average-sized clothes or shoes.

Robinson W.S. (1950) Ecological Correlations and the Behavior of Individuals, American Sociological Review, 15(3), 351–357.

The Goldilocks Principle

Every good solution to a problem must model the problem.

Ashby’s Law means that we need enough information to solve a problem or control a system. If we have too little, our solution will not work.

Strangely, too much information can also prevent a solution: we get bogged down in the detail – information overload. Like Goldilocks and her porridge, the solution needs just the right amount of information, detail, or data – and no more!

Ross Ashby and Roger Conant explained that a good solution depends on more than just having the right amount of data: the solution needs to model the problem.

So a clinical trial of a new drug should NOT simply compare two groups of patients using statistical tests. When medicine relies on such trials, it makes a fundamental mistake. The trial needs to model the doctor–patient situation: a single doctor treating an individual patient who has a unique physiology. Compare groups of patients and the result will apply to populations – on average.

If you are a supporter of evidence-based medicine please feel free to comment and explain how the much hyped clinical trials, meta-analyses and the like overcome the Goldilocks Principle. That is, how do the aggregate statistics of clinical trials model the specific interaction between a doctor and the patient?

The Goldilocks Principle is usually described as Conant and Ashby’s good regulator theorem. Links are given below to the original paper and other accounts.

Check out the Good Regulator Project. Link

Daniel L. Scholten (2010) A Primer For Conant & Ashby’s “Good-Regulator Theorem”. Link

Daniel L. Scholten (2009-2010) Every Good Key Must Be A Model Of The Lock It Opens (The Conant & Ashby Theorem Revisited) Link

Here is the original paper for download: GoodRegulator

Ashby’s Law

The First Law of Cybernetics


W. Ross Ashby’s Law of Requisite Variety is so powerful that it is known as the First Law of Cybernetics.

Ashby’s Law implies that the degree of control of a system is proportional to the amount of information available. This means you need an appropriate amount of information to control any system, whatever it is.

This is a simple idea, though it is difficult to explain.


Variety is another way of thinking about information. It describes the number of potential states of a system – any system. If you recognise all the possible states, you have complete knowledge of the behaviour. Uncertainty occurs when you do not know all the possible states. As Ashby put it, variety is a concept inseparable from that of ‘information’.

Requisite Variety?

Requisite means necessary or required. So requisite variety implies that you need a certain amount of information for some purpose.

If you have complete knowledge of a system, it is possible to control it. However, if the system has hidden properties, your information is incomplete and there is uncertainty about its behaviour. To have full control you need full knowledge of the system and its behaviour.

Ashby described it as

“Only variety destroys variety”

There are many similarly obscure descriptions such as

  • “The total number of possible states of a system, or of an element of a system.”
  • “The larger the variety of actions available to a control system, the larger the variety of perturbations it is able to compensate.”
  • “The greater the variety within a system, the greater its ability to reduce variety in its environment through regulation.”
  • “The quantity of regulation that can be achieved is bounded by the quantity of information that can be transmitted in a certain channel.”
  • “Variety absorbs variety”
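
One way to make these formulations concrete is a toy regulation game (our own illustration, not Ashby’s). A disturbance hits a system, a regulator picks a response, and the goal is to hold the outcome steady. If the regulator has fewer distinct responses than there are distinct disturbances, no strategy can succeed – its variety is not requisite:

```python
from itertools import product

# Toy regulation game: a disturbance d arrives, the regulator picks a
# response r, and the outcome is (d + r) % 3.
# Goal: keep the outcome at 0 no matter which disturbance occurs.

disturbances = [0, 1, 2]  # 3 possible disturbances (variety = 3)

def best_strategy(responses):
    """Search every strategy (one response per disturbance) for one
    that holds the outcome at 0 across all disturbances."""
    for strategy in product(responses, repeat=len(disturbances)):
        outcomes = {(d + r) % 3 for d, r in zip(disturbances, strategy)}
        if outcomes == {0}:
            return True
    return False

print(best_strategy([0, 1]))     # regulator variety 2 < 3: False, cannot regulate
print(best_strategy([0, 1, 2]))  # regulator variety 3 >= 3: True, full control
```

With only two responses, some disturbance always slips through; add a third and the regulator can match – and so “absorb” – every disturbance.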

We can measure variety as a simple count of a system’s possible states or, more usefully, logarithmically in bits (like information content). In the logarithmic form, variety represents the minimum number of binary choices (by binary chop) needed to resolve the uncertainty.
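As a sketch of the two measures (our own illustration): a system with 8 possible states has a variety of 8 as a count, or 3 bits in logarithmic form.

```python
import math

# Variety of a system with 8 distinct states, measured two ways.
states = 8

variety_count = states            # variety as a plain count of states: 8
variety_bits = math.log2(states)  # logarithmic form: 3.0 bits

# 3 bits means three yes/no questions (a "binary chop") suffice
# to pin down which of the 8 states the system is in.
print(variety_count, variety_bits)
```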

When people first hear about Ashby’s Law they often don’t get it – surely it’s trivial, obvious, everyone knows that. Perhaps we all need thinking caps.

Ashby W.R. (1956) Introduction to Cybernetics, Chapman & Hall, Available free online.