The 3Rs for better science

The use of animals for medical research is a sensitive and complex topic.  While the majority of people in the UK understand and support the need for animal experiments to advance medical research, many may not be aware of the united front within the research community to ensure that these experiments are carried out to the highest standards.  The National Centre for the Replacement, Refinement and Reduction of Animals in Research (NC3Rs) is leading this movement to help researchers and research funders improve animal experimentation, and ultimately produce better science.

The 3Rs stand for Replacement, Refinement and Reduction, and identify the key ways in which research involving animals can be improved.  Replacement encourages the development or use of non-animal methods, refinement identifies ways in which experimental procedures can be adapted to improve animal welfare, and reduction encourages better study design to reduce the number of animals used in research.  The 3Rs have been adopted by funding bodies such as the medical research charities, the Medical Research Council and the Wellcome Trust, which require adoption of the principles before funding is released.

Making science better

Adhering to and promoting the 3Rs is not just about protecting animals; it’s about making science better, more efficient and more reproducible, so that the animals used in experiments are not wasted.  We should be asking researchers to disclose full details of their animal work, not only so we can scrutinise how well they’ve treated their animals but also so we can see that the work is justified.

There are several things a researcher can do to adhere to the 3Rs.  Careful, well-thought-out experimental design can reduce the number of animals needed.  Asking a statistician for advice when designing protocols can identify where animal numbers can be reduced, through power calculations and accurate prediction of the sample sizes required.  Understanding what other statistical options are available could also reveal more effective ways to conduct experiments.  Plans also need to be in place to minimise experimental bias, to enhance the quality of research and prevent over-use of animals in pursuit of robust results.  This could be as simple as blinding investigators during the analysis of animal tissue, ultimately producing better data sets with fewer animals.
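To make the power-calculation point concrete, here is a minimal sketch of the kind of sum a statistician would run, using the standard normal-approximation formula for comparing two group means.  The function name and default values are illustrative only, not taken from any NC3Rs guidance:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_group(effect_size, alpha=0.05, power=0.8):
    """Normal-approximation sample size for comparing two group means.

    effect_size is Cohen's d: the expected difference in means
    divided by the shared standard deviation.
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # desired statistical power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A large effect (d = 1.0) at 80% power needs about 16 animals per group:
print(sample_size_per_group(1.0))  # 16
# Halving the detectable effect size roughly quadruples the requirement:
print(sample_size_per_group(0.5))  # 63
```

The asymmetry in those two numbers is exactly why this conversation needs to happen at the design stage: an over-optimistic estimate of effect size produces an underpowered study, and an underpowered study wastes every animal in it.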

Reporting for reproducibility

In 2010, the NC3Rs published a paper in PLoS Biology highlighting the need for better reporting of animal research in published studies.  The statistics they report are alarming: only 59% of 271 randomly selected articles stated a research objective and the number and characteristics of the animals used, 87% did not report randomising their studies, and 86% did not report any blinding to reduce bias.  Only 70% properly reported their statistics and presented the results with a measure of precision or variability.

The solution was to develop the ARRIVE guidelines for the reporting of in vivo experiments.  The ARRIVE guidelines are essentially a checklist for researchers to follow when reporting animal research.  Animal experiments can be written up in great detail and submitted to journals as supplementary material should there be content limits set by the editors.  It’s imperative that research funders demand this full disclosure and transparency from their grant holders so that maximal output is achieved from animal experiments, work is reproducible and the need for excessive animal use is avoided.


As discussed above, transparent reporting of animal research is crucial for the accuracy and reproducibility of experiments.  This transparency ensures that any deviation from what the researcher expects of their experiment can be picked up and resolved before unnecessary experiments are carried out.  But there also needs to be transparency in how the animals were developed for experimentation.  For example, if researchers are developing their own mouse models, or maintaining mouse models for an extended period of time, they need to report phenotype and genotype data to pick up any deviation in the animals’ background.  Genetic drift occurs rapidly in mouse breeding programmes, with over 100 SNPs appearing in the mouse genome in one generation.  Without full reporting, the mouse phenotype may not be reproducible due to the emergence of sub-strains over time.


The 3Rs for a better future

Animal testing is currently necessary for new treatments to be developed for diseases such as cancer and dementia.  Legal regulations require that new treatments are shown to be safe and effective in at least two different animal species before they can be approved for clinical trials in humans.  It is not practical to ban animal testing for medical research, as we don’t currently have the alternative methods that would allow these strict but necessary regulations on drug testing to be relaxed.  The work of the NC3Rs ensures that where these experiments have to take place, they are done in a way that maximises the use of each animal, produces data with the biggest impact and maintains the highest level of animal welfare.  On top of this, the NC3Rs promotes and funds work on replacing animal models with non-animal alternatives, so that we can move towards a future without the need for animal experimentation.  It’s important to remember that it’s not solely about a reduction in the number of animals used but a refinement of the techniques, husbandry and reporting, which will ultimately lead to better science.

A strange incident of open access science

I am currently taking part in the Reproducibility Project: Cancer Biology, a study run by the Open Science Framework which aims to investigate the replicability of the top 50 cancer biology studies published between 2010 and 2012.  Reproducibility is a key component of scientific progress, and being able to replicate the findings of other people’s published work should be straightforward and consistent.

This post is not about the project itself but about a strange and disturbing anomaly I noticed in the accessibility of the published paper I was working on.

The paper (‘The microRNA miR-34a inhibits prostate cancer stem cells and metastasis by directly repressing CD44‘) was published in Nature Medicine in 2011 and is freely available, in its entirety, in PubMed.  What’s odd though is that if you look at the PubMed entry for this paper there are two ‘link out’ buttons on the top right which are supposed to take you to the full paper – but only one of these is open access.

If you click the ‘Free in PMC’ link you can access the full paper and all the supplementary material.  If you click the ‘Nature Medicine’ link you are taken to a pay wall asking you to purchase the article for $32.

Worryingly, if you Google the paper title, the top hit is a link to the Nature Medicine paywall and not the free version in PMC.

This seems odd to me.  Is it devious or am I missing something important?  Has anyone else noticed this before?

Gene that makes cells ‘sticky’ could be the key to stopping spread of breast cancer

The clinical statistics for breast cancer paint a remarkably accurate picture of where research has got to and where it is going.  You only need to look at two timelines to see the real-life impact of the disease and the breakthroughs that have shaped it.

The number of new breast cancer cases diagnosed each year has been steadily increasing since the mid-1970s – a stark reflection of an ageing population (cancer is largely a disease of ageing, after all) and a product of routine screening.  There are currently around 50,000 new breast cancer cases diagnosed each year, and this is predicted to rise to 57,000 by 2025.  Conversely, the number of people dying from breast cancer has been steadily decreasing since the 1990s, a direct result of early disease detection and the development of effective treatments.  Deaths from breast cancer have dropped by nearly 39% since the mid-1980s, but can we expect this decline to continue?

An estimated 12,000 women die of breast cancer each year in the UK, and most, if not all, of these deaths can be attributed to the secondary form of the disease.  This is also known as metastatic breast cancer: tumour cells spread to other tissues of the body, set up shop and seed a secondary tumour.  A tumour in the breast tissue is not likely to be lethal, but a tumour in the brain, lungs or liver tells a different story entirely.  In fact, breast cancers confined to the breast have cure rates exceeding 90%, whereas metastasis to the brain can reduce survival rates to below 20%.  With few treatment options and no drugs specific to metastatic breast cancer, death rates cannot continue to fall unless breast cancers can be stopped from progressing.

It is for this reason that cancer charities and researchers are turning their attention to understanding the process of metastasis, so that new treatments and preventative strategies can be developed.  Metastasis is a complex process involving genetic and molecular changes to tumour cells at different stages of progression.  These changes help the cells to migrate away from the primary tumour, recruit blood vessels to aid their spread, invade biological tissue at a secondary site and survive there long enough to develop a new tumour.  Because there may be tumour cells at various stages of metastasis at any one time, these changes also make the disease extremely difficult to treat.

Researchers from the Breakthrough Breast Cancer Research Centre, housed at the Institute of Cancer Research in London, are one team pitching in to find out how and why secondary breast cancer occurs and what can be done to stop it.  In a paper recently published in Cancer Discovery, Professor Clare Isacke and her team describe a new gene that promotes the formation of secondary breast cancer, and reveal a potential new treatment that could block it.

The team first silenced over 1000 genes in mammary tumour cells before implanting them into mice.  They then waited until secondary tumours formed in the lungs of the mice before collecting tumour samples and analysing the cancer genomes.  The theory was that tumour cells with certain genes switched off would set up home in the lungs, and by assessing the genetics of these tumours, the researchers hoped to identify specific genes that facilitate breast cancer metastasis.

One of the genes that they identified was of particular interest because of a known role it plays in enabling metastatic cancer cells to stick to secondary tissues and seed new tumours.  The gene in question, called ‘ST6GalNAc2’, codes for a protein that alters the characteristic features of the cell surface.  The outer surface of a cell is embedded with molecules that dot the landscape like mountains and forests on the Earth’s surface.  The ST6GalNAc2 protein chops off a specific molecule from the surface of the cell and as a result, the cell is no longer able to pick up a second molecule called ‘galectin-3’, which is present in the fluid surrounding our cells.  With the ST6GalNAc2 gene switched off, the cancer cells pick up lots of galectin-3 and become more likely to stick in areas like the lungs.

The researchers suggest that the ST6GalNAc2 gene could be used as a marker to select patients with a particular type of breast cancer who would benefit from treatment with a drug that prevents galectin-3 from making the tumour cells ‘sticky’.  Clinical trials have already shown such drugs to be safe, and they could provide a new treatment option for the prevention of secondary disease.





WDDTY – Leaked article from Nov_2013 issue

DISCLAIMER:  This is a work of fiction – nothing below is true or ever will be true. Do not take as medical advice.

WDDTY Nov 2013 – Embargoed until October 31st

Cancer cells confused by memory molecules

Doctors would have you believe that to treat your cancer you need to poison your body with toxic petrochemicals that do more harm than good.  But now there is significant evidence that homeopathy – a billion year old treatment – could be far better than chemo at delivering a knock-out punch to cancer.

Research carried out by ‘Dr’ Buller S. Hitenberg at the Quasar University And Chopra Knowledge centre has finally demonstrated not only does homeopathy cure cancer – but how it works.  Before now the ‘medical community’ has refused to accept homeopathy as a treatment for ‘cancer’ because there was no ‘evidence’ to ‘prove’ its ‘efficacy’.

This new research proves 100% that homeopathy cures all cancer at least 99% of the time.  Compared to chemo rates (which work only 2% of the time and kill nearly 98% of all hospital admissions) this represents an increase of more than 100,000 times.

Commenting on the research ‘Dr’ B.S Hitenberg said:

“This is a remarkable breakthrough in the way we treat cancer.  We gave 50,000 cancer patients with 27 different cancer types one highly diluted dose of homeopathy – and 99% of them showed complete regression.  We also believe that the 1% not cured didn’t actually have cancer – we have data to prove this.”

The research team also discovered just how homeopathy was curing cancer.  They describe how cancer cells became flustered when presented with water molecules retaining the memory of a carcinogen.

“It seemed to confuse the cells.  They normally stick together and talk to each other but when they see these memory molecules they get disorientated and don’t know what to do.  Eventually they just sort of go back to being normal and stop being so cancerous”.

A spokesperson from the charity Homeopathy for Hominids said:

“I’m so pleased that homeopathy has been proven to cure cancer.  We’ve been telling people forever that homeopathy works and no one would listen.  Big Pharma has always had a monopoly on the market and organisations they fund like Sense about Science, British Pharmaceutical Association, Simon Singh Corporation, The Times, Greggs and J.D Wetherspoons have always tried to silence us.”

This research marks a historic comeback for alternative medicine and it is expected that the NHS will soon be unveiling homeopathy clinics all across the country.

Sense about Science, British Pharmaceutical Association, Simon Singh Corporation, The Times, Greggs and J.D Wetherspoons could not be reached for comment.

WDDTY – They say they have a ‘qualified researcher’; but do they really?

WDDTY’s Facebook page is apparently meant to be a fair and open place for debate.  However, I, like others, have found myself banned from commenting because of supposedly debasing and abusive comments.  In my case this is simply not true.  My ban came in response to a claim made by WDDTY that they have a qualified researcher who checks all their references and statistics before they publish.  I posted on their Facebook page asking why, if they did indeed have a researcher, there were so many things wrong in their Angelina Jolie piece – an article that I found to be riddled with referencing errors.  Their retort was to ban me and delete my comments – an action some people would take as an admission of guilt.


In the midst of all the Facebook patter, I noticed WDDTY made a statement about how ‘prescribed drugs are now one of the biggest killers in the west’.  I’d heard similar claims from WDDTY before and remembered an article they published recently claiming that medicine is one of the biggest killers in the US.  The article appeared in the September 2013 issue and is based solely on the National Vital Statistics Report (2012, vol. 61).  They quote some pretty incredible numbers in the article – but as they have a ‘qualified researcher’ on board, I thought “surely they must be right”.  I mean, anyone publishing health information would want to make sure they get their figures right – especially when they want to claim that medicine kills more than smoking, wouldn’t they?


WRONG.  In fact they have made such a mess of the NVS report I can’t believe for a second that anyone with half a brain even looked at it.  Let me begin:

The first point worth noting is that volume 61 of the NVS report contains 9 sections.  I am going to assume (because WDDTY don’t specify) that the one they got their data from was number 6 – Deaths: Preliminary Data for 2011.  The reason is that they say:

“America’s DOH and Human Services classifies all deaths in the US every year: in 2011 – the most recent year available…”

This is true – great job ‘qualified researcher’.

Next they say there were a total of 2.53 million deaths in the US in 2011.  It’s actually 2.51 million (to 2 d.p.), but I’ll cut them some slack.


They say that the biggest killer was heart disease (596,339 deaths), followed by cancer (575,313 deaths).  Also true.  Wow this person is doing a great job so far…

This is where it gets good (or bad).  They say that:

“Adverse drug reactions account for 106,000 deaths”


This number has been plucked straight out of the air.  Adverse (medical) drug reactions are classified under the NVS codes Y40-Y59.  Within the NVS report for 2011 these codes are grouped under ‘Complications of medical and surgical care’, along with codes Y60-84 and Y88.  Deaths under this category total 2,580 – not 106,000.  In fact, totalling all adverse drug reaction deaths (Y40-59) from 1999 to 2006 accounts for only 2,341 deaths over an eight-year period.

They also claim that 98,000 people are killed by doctors.  This is clearly wrong, because such deaths would also fall within the 2,580 accounted for under ‘complications of medical and surgical care’.

Their conclusion is to add 106,000 and 98,000 to get a total of 204,000 deaths from adverse drug reactions or medical error – making it the third biggest killer in the US (around 8% of total deaths) after heart disease and cancer.  In fact, these categories account for 2,580 deaths, or around 0.1% of total deaths.

So by making up numbers, the ‘researcher’ at WDDTY has overstated the number of deaths by a whopping factor of roughly 80.  Maybe WDDTY will be good enough to explain where these numbers came from, because as far as I can tell they have been manufactured to propagate the idea that medicine and pharma are out to get everyone.
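The arithmetic is easy to check.  A few lines of Python, using only the figures quoted in this post, show how far out the claim is:

```python
# Sanity-checking WDDTY's claim against the NVS figures quoted above.
total_deaths = 2_510_000        # US deaths, 2011 (preliminary; ~2.51 million)
wddty_claim = 106_000 + 98_000  # their 'adverse drug reactions' + 'killed by doctors'
nvs_actual = 2_580              # 'Complications of medical and surgical care'
                                # (codes Y40-84 and Y88) in the 2011 NVS report

print(f"WDDTY total:   {wddty_claim:,} ({wddty_claim / total_deaths:.1%} of deaths)")
print(f"NVS actual:    {nvs_actual:,} ({nvs_actual / total_deaths:.2%} of deaths)")
print(f"Overstated by: {wddty_claim / nvs_actual:.0f}x")
```

Running it gives 8.1% versus 0.10% of total deaths, and an overstatement factor of about 79 – on what is, by their own account, a checked and referenced article.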

Re-wiring immune cells for cancer therapy

Cancer treatment that is personalised for individual patients is a dream shared by researchers and oncologists alike.  We know that cancer is a fiercely complex disease and as a result it is difficult to predict how well any one patient may respond to a given treatment.  Current treatments, be it radiotherapy, chemotherapy or targeted drugs, are administered based on the best available evidence, collected through rigorous scientific testing.

New research published this week in the journal Nature describes a powerful new technique that could revolutionise the way cancer is treated.  The work, carried out by a team at Memorial Sloan-Kettering Cancer Center in New York, builds on progress made in a treatment strategy called ‘adoptive T-cell therapy’.

Adoptive T-cell therapy utilises the patient’s own immune system, generating immune cells, or ‘T-cells’, that are capable of specifically attacking the cancer cells.  These cells are generated from patient-derived cells that are re-programmed in the lab into stem cells known as ‘induced pluripotent stem cells’, before being coaxed into T-cells.  The problems faced by scientists are harvesting enough of these T-cells to be used therapeutically, and getting them to recognise a chosen target – in this case a characteristic protein present on the surface of a cancer cell.

In the present study, the team have managed to engineer T-cells with high specificity towards a protein, called CD19, which is present on some blood cancers.  To do this they obtained human T-cells from a volunteer’s blood and genetically engineered them to revert to a stem cell state.  These stem cells were then genetically modified once again so that they could recognise CD19, before being chemically induced into T-cells.  The authors found that when mice carrying CD19-positive cancers were injected with these T-cells, their tumours completely regressed, and the treatment provided a survival benefit for the animals.

This work takes a huge step towards personalising cancer therapy because it allows therapeutic T-cells to be developed to attack the unique characteristics of an individual’s cancer.  The technology is still at an early stage of development, but promise has already been shown for adoptive T-cell therapy in clinical trials.  The progress made here strengthens the possibility that one day all cancer patients will have their therapy tailor-made ‘in the dish’.



The sweetness of cancer

Recently, I have noticed an increase in the number of headlines that mention ‘cancer’s sweet tooth’, ‘cancer cells’ sugar craving’ and even ‘sugar is cancer’s favourite food’.  I don’t have an issue so much with the analogy, but I do think that it simplifies the reality a little too much.  Metabolic regulation within a cell is extremely complex.  Just looking at this diagram should be enough to convince you.

Scientists know that as a tumour develops there are fundamental changes to the metabolic programme of cancer cells.  Being a cell is a very energetic lifestyle and in order to keep up with the relentless days and nights of manufacturing proteins, breaking down molecules and warding off toxic compounds – cells need a decent supply of energy.

Under normal conditions this is readily achieved by a process called aerobic respiration.  Here’s a quick, school biology catch-up:

Glucose + oxygen → carbon dioxide + water + ENERGY

This ‘energy’ is actually a molecule called ATP or adenosine triphosphate.  It is this molecule that is used to keep the lights on, so to speak.  Just to reiterate how incredibly simplified the above equation is – here is a fuller picture of aerobic respiration.

The problem with tumours is that as they grow they become increasingly cut off from the body’s blood supply.  This creates an environment that is very low in oxygen, and as such the amount of aerobic respiration that can be done is reduced.  When this happens the cell starts producing a protein called HIF-1, which rapidly activates genes that control a process called glycolysis.  This is another metabolic pathway, like aerobic respiration, except that it can create ATP from glucose without the need for oxygen.  Interestingly, cancer cells exposed to this pressure end up permanently switching on their glycolysis programme, so that even when oxygen is available they preferentially manufacture ATP by glycolysis (a phenomenon known as the Warburg Effect).  The problem with this is that glycolysis is massively inefficient compared to aerobic respiration, producing only 2 molecules of ATP per molecule of glucose, compared with up to 38 from aerobic respiration.
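A toy calculation puts a number on that inefficiency, using the textbook yields quoted above (modern estimates put the aerobic figure closer to 30-32 ATP, but the gap is just as stark):

```python
# Textbook net ATP yields per molecule of glucose.
ATP_AEROBIC = 38     # theoretical maximum via full aerobic respiration
ATP_GLYCOLYSIS = 2   # net yield via glycolysis alone

# How much more glucose a purely glycolytic cell must consume
# to match the ATP output of an aerobically respiring cell:
fold_increase = ATP_AEROBIC / ATP_GLYCOLYSIS
print(f"{fold_increase:.0f}x more glucose")  # 19x more glucose
```

That roughly nineteen-fold glucose requirement is the whole basis of the ‘sweet tooth’ headlines.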

It is this that has led to cancer cells being called ‘sugar addicted’.  Not only do they produce very little ATP per glucose molecule, they are also much more energetic than normal cells, so they require a lot more glucose to keep themselves going.

This might sound simple enough and warrants the simple analogy but in reality it is much more complex.  Emerging research has shown that the environment around tumour cells, called the stroma, also plays an important role in cancer metabolism.  Healthy cells within the tumour stroma have been shown to succumb to the Warburg Effect and as a result begin to ‘eat’ themselves to obtain fuel to make ATP.  This is also driven by the lack of oxygen within the tumour stroma and results in energy rich nutrients spilling out into the local environment.  It has been proposed that cancer cells take up these nutrients and use them to produce their own energy.  Interestingly, these nutrients include compounds called ‘ketones’, which are much more efficient at producing ATP when metabolised by cancer cells.

So it is not clear-cut whether cancer cells are ‘addicted’ to sugar.  They certainly require a lot more ATP, and so they certainly need more fuel to produce it.  But the complexity of the varied metabolic systems, and the relatively unknown contribution of the tumour stroma, makes it difficult to establish exactly what is going on.  This needs to be taken into account when establishing how the Warburg Effect can be targeted therapeutically in cancer.