Measuring time: getting research from bench to bedside

(Image credit: Flickr/Artiom G)

Why do some research findings take so long to affect healthcare? And how can we make sure that the time between the bench and the bedside is the right amount? The MRC evaluation team’s Ellen Charman spoke to Professor Stephen Hanney at Brunel University about his research analysing what speeds up, or slows down, the journey from lab to clinic.

Pretty much everyone would agree that the speedy translation of research into medical advances such as new drugs, devices and healthcare policy is a good thing.

Aside from the health benefits, the shorter the period of translation, or ‘elapsed time’, the higher the economic return on an investment in medical research. For example, in the area of cardiovascular disease, the rate of return on a new intervention doubles if the elapsed time is reduced from 25 to 10 years. Researchers estimate that, on average, it takes 17 years for research to reach clinical practice.

But elapsed time can also be beneficial — no one would argue against taking time to ensure that a new treatment is safe and effective.

So what affects the length of this elapsed time, which parts of it are needed, and can anything be done to speed it up?

Professor Stephen Hanney at Brunel University, together with colleagues at RAND Europe and the Office of Health Economics, has been investigating these very questions.

They produced a matrix — or timeline of progress — representing the life of an intervention. Progress is marked by a series of tracks, or stages, from the initial discovery and clinical trials through to approval and launch onto the market. Within each track is a series of calibration points that clearly mark when events occur, from the initial patents granted during the discovery phase to the NICE guidelines issued as part of the policy phase.

The team then applied this matrix to case studies in the areas of mental health and cardiovascular disease, measuring the time it takes to get from one track to another and looking for similarities between the elapsed time in individual case studies.

“The benefit of this matrix is that it allows for tracks to overlap and happen simultaneously. Also, sometimes progress might revert to a previous track, for example, conducting additional effectiveness research after the treatment has been launched, so the matrix takes that into account,” Stephen explains.

There were instances where the elapsed time was a by-product of the area of research. In the case of treatment for schizophrenia, for example, advances in drug discovery meant that drug research became the focus, which slowed down progress in cognitive behavioural therapy.

The team also found examples of where the process was accelerated.

“In the case of the calcium channel blocker (CCB) amlodipine, a drug for high blood pressure, a post-launch trial showed that this treatment was without doubt more effective than a beta-blocker. This resulted in a more rapid updating of the NICE guideline on high blood pressure recommending CCBs as one of the first line treatments,” says Stephen.

So are there other means of reducing the time it takes to get interventions into standard practice?

Stephen says that increasing resources, improving processes and having researchers work in parallel would all help.

“What might ultimately be a way forward is to use a similar approach to cutting time as the GB Olympic cycling team,” Stephen muses. “By shaving off small amounts of time in many different ways, it might be possible to make a big difference overall.”

The next step will be to look in more detail at how elapsed time should be analysed. One difficulty was going back in time to source the information, says Stephen. But as more emphasis is put on the importance of data — for example, journals requiring more information on research processes, and the use of outcome-gathering mechanisms, such as Researchfish — this will become easier. And information can unveil itself in unexpected ways.

“One of the more surprising sources of information was transcripts of court cases over patents and intellectual property,” says Stephen.

Ellen Charman 

Professor Stephen Hanney was funded through the MRC’s first Economic Impact Call. Three further awards aiming to better understand the link between research and its wider economic and societal impacts have been made in the second round of funding. 
