Monthly Archives: February 2012

Happy birthday to Avinash.

Just wishing Avinash Sheshachalam, one of the authors here, a very happy birthday, an excellent year ahead, and many more instances of the same on behalf of the rest of us.

Brain cake!

Cheers mate, and may the articles keep coming.
Ankur.

A few thoughts on Personalised Cancer Medicine.

I was in the clinic, twiddling my thumbs impatiently and waiting for the medic to arrive in the office; these were unique circumstances. This was the first time personalised cancer medicine was available in this little cancer hospital in some obscure place. The enormity of this, I felt, was staggering, and it was hard to put it in perspective without reference to the distant past: an era in which cancer therapy was a quagmire, with all the finesse of intricate patterns being formed with a sledgehammer.

The medic disturbed my journey into the past as he walked in with a knowing smile and told me that the reports I wanted to study were being prepared on the cloud system the hospital had access to, and would be in my hands shortly. Time to head back into the past for a little while longer, I thought, and so it happened.

I recalled that cancer chemotherapy had been essentially crude, lacking even the selectivity that Paul Ehrlich originally postulated as an essential requirement for his magic bullets. It had been a case of bombarding replicating, metabolising cells with DNA-damaging agents or antimetabolites in the hope that cancer cells would be fatally hit while normal cells would be less affected, sparing the patient's life even if the side effects were sometimes horrible. Then Gleevec arrived, some things changed, and the emphasis shifted towards targeted therapies, be they small molecules or antibodies, which often hit “druggable” targets by binding to enzymes in place of their substrates or by blocking receptor dimerisation. Eventually, some clever chemists and nanotechnologists came up with ways to deliver siRNAs, miRNA sponges and HCR constructs to cancer cells effectively, and we were in a position to knock out whatever we wanted with exquisite selectivity.

The medic walked in with a stack of sheets, each containing a table listing, pathway by pathway, the lesions that had been identified in each of the 20 new patients we were processing. Drug regimens had been computationally predicted for each using an algorithm: HCR constructs where fusion genes were involved, the odd miRNA sponge to knock down miRNAs that drove these tumours when overexpressed, and occasionally small-molecule inhibitors or monoclonal antibodies, all in combination with an hTERT or ALT pathway inhibitor. None of these patients had anything particularly novel, and our models had been good, meaning that they could all go home with extremely good outcomes after short cycles of therapy. This was perfect…

And then the alarm rang, waking me from a blissful slumber; I was no longer in the same postcode or time period as in my dream. But I remembered quite a bit of it, and it led me to cogitate about how far we are from seeing that scenario play out regularly at cancer clinics. I think the technological and biological advances being made have left us not very far from realising the ambitions of stratified and personalised cancer medicine.

Schematic of the development of stratified cancer medicine. Image courtesy Cancer Research UK. Go check out their post on efforts being made in the UK to develop this approach further: http://scienceblog.cancerresearchuk.org/2011/06/10/government-committee-puts-6m-into-genetic-technologies-for-cancer/

One of the biggest challenges so far has been understanding the genetic makeup of cancers, what drives them, and how that should affect therapy. For a very long time now, cancers have been classified by tissue of origin and, in rare cases, by surface markers in combination with that. A case study is the ER +ve and ER –ve classification of breast cancer, with HER2 expression status additionally used to recommend the inclusion of Trastuzumab in therapy. (See http://www.nature.com/nrclinonc/journal/v3/n11/full/ncponc0636.html for a reference.)

Another example of rudimentary but therapeutically useful classification is the use of Ras mutational status to decide whether Cetuximab may be used in colorectal cancer. Cetuximab is far less effective, relative to standard chemotherapy, in patients whose tumours overexpress EGFR but carry a downstream Ras mutation than in those with wild-type Ras. (See Karapetis et al, http://www.nejm.org/doi/full/10.1056/NEJMoa0804385 for the paper.)

Ras mutational status has a bearing on the efficacy of Cetuximab. Image courtesy Karapetis et al, New England Journal of Medicine. http://www.nejm.org/doi/full/10.1056/NEJMoa0804385
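
To make the logic of this kind of marker-driven decision concrete, here is a minimal sketch in Python. The field names and the function are hypothetical illustrations of the decision rule described above, not clinical software:

```python
def recommend_anti_egfr(tumour):
    """Toy marker-based decision rule for an anti-EGFR antibody.

    `tumour` is a dict with hypothetical keys 'egfr_overexpressed'
    (bool) and 'ras_status' ('wild-type' or 'mutant'). Illustrative
    of stratification logic only.
    """
    if tumour["egfr_overexpressed"] and tumour["ras_status"] == "wild-type":
        return "consider cetuximab"
    return "anti-EGFR antibody unlikely to add benefit"


print(recommend_anti_egfr({"egfr_overexpressed": True, "ras_status": "mutant"}))
```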

There are two technological platforms that I believe will be critical to the development of stratification and personalisation. One is whole genome sequencing, which will produce extremely high-resolution maps of everything that is genetically and epigenetically different in cancer cells compared to normal tissue. The other is microarray technology, which enables expression profiling to pick out the effects that modified genomes or epigenomes have on the transcriptome and, correspondingly, the proteome. The cost of sequencing a cancer genome has been dropping steadily, and with the announcement of even cheaper, even more powerful sequencing platforms by companies like Oxford Nanopore at the Advances in Genome Biology and Technology conference (see http://www.nature.com/news/nanopore-genome-sequencer-makes-its-debut-1.10051 for a reference), costs, it would appear, will fall further if these new platforms work as expected.

Costs of sequencing have been falling. Image courtesy NHGRI.

Microarray analysis has already led to significant progress in the stratification of therapy for breast cancer. For instance, van’t Veer et al found that differential expression across a panel of 70 genes was predictive of prognosis in node-negative breast cancer (see http://www.nejm.org/doi/full/10.1056/NEJMoa021967 for the paper).

Prognosis comparison of good and bad signatures derived according to the 70-gene expression profile mentioned above. Graph courtesy Rob Weinberg's excellent text 'The Biology of Cancer'

The hypothesis was that patients with a good signature could be treated locoregionally alone, and systemic therapy could be avoided. The array platform was commercialised as MammaPrint, and it is at the centre of a clinical trial in progress called MINDACT (Microarray In Node Negative and 1 to 3 Node Positive Disease May Avoid Chemotherapy). This trial will serve as a test case for stratification; once it becomes possible to avoid giving chemotherapy to those who don’t need it, the savings in terms of quality of life and reduced treatment costs will pay for the initial costs of the array.
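
As a flavour of how a signature like this can be turned into a classifier, here is a minimal sketch of a centroid-correlation approach: correlate a patient's 70-gene profile with the average "good prognosis" profile and apply a threshold. The data, preprocessing and threshold here are illustrative assumptions, not MammaPrint's actual algorithm:

```python
import numpy as np


def classify_signature(patient_profile, good_centroid, threshold=0.4):
    """Correlate a patient's 70-gene expression vector with the average
    'good prognosis' profile; above the threshold counts as a good
    signature. Threshold and preprocessing are illustrative only.
    """
    r = np.corrcoef(patient_profile, good_centroid)[0, 1]
    return "good signature" if r > threshold else "poor signature"


# Hypothetical example, with random numbers standing in for expression
# log-ratios of the 70 genes:
rng = np.random.default_rng(0)
centroid = rng.normal(size=70)
patient = centroid + rng.normal(scale=0.5, size=70)  # resembles the centroid
print(classify_signature(patient, centroid))  # -> "good signature"
```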

One more way that cancer therapy may benefit from improved integrated analyses is in the development of rational drug combinations. One example that comes to mind is dexamethasone resistance in acute lymphoblastic leukaemia: using expression profiling, scientists found that dexamethasone-resistant cells treated with rapamycin, an inhibitor of the mTOR pathway, took on the specific expression signature that characterises dexamethasone-sensitive cells (see Lamb et al, http://www.sciencemag.org/content/313/5795/1929.short for the paper). From this they were able to infer that adding rapamycin would sensitise previously insensitive cells to dexamethasone treatment.

Rapamycin Treatment sensitises previously resistant cells to Dexamethasone. Image courtesy Lamb et al, Science, 2006.
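
The computational core of this kind of signature matching can be sketched very simply: score a disease signature against a database of drug-induced signatures. The toy version below uses rank correlation; the real Connectivity Map of Lamb et al uses a more involved, Kolmogorov-Smirnov-based score, and the data and drug names here are made up:

```python
import numpy as np
from scipy.stats import spearmanr


def connectivity_scores(query_signature, drug_signatures):
    """Rank-correlate a query expression signature (e.g. 'dexamethasone
    sensitivity') against each drug-induced signature. High positive
    scores suggest the drug pushes cells towards the query state.
    A toy stand-in for the Connectivity Map's KS-based statistic.
    """
    return {drug: spearmanr(query_signature, sig)[0]
            for drug, sig in drug_signatures.items()}


rng = np.random.default_rng(1)
query = rng.normal(size=500)  # hypothetical 500-gene signature
drugs = {
    "rapamycin-like": query + rng.normal(scale=0.8, size=500),  # mimics query
    "unrelated drug": rng.normal(size=500),
}
print(connectivity_scores(query, drugs))
```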

Our understanding of cancer is continually being improved by applying some of the technology mentioned above. For instance, an integrated genomic and epigenomic analysis of retinoblastoma, a tumour whose genome is generally extremely stable, revealed that Syk is a novel dysregulated oncogene in this disease, and that knocking it down kills retinoblastoma cells (see Zhang et al, http://www.nature.com/nature/journal/vaop/ncurrent/full/nature10733.html for the paper).

Another example of clinical utility in stratification involves neuroblastoma, an extracranial cancer of nerve tissue. It is a heterogeneous disease: some types spontaneously differentiate into a benign state and regress, while others are extremely hard to treat and do not regress. The question is: what makes a neuroblastoma aggressive? Knowing this has clinical utility, since neuroblastoma is a paediatric tumour; one would not want to treat a tumour that will regress on its own, but would want to deal aggressively with the aggressive forms. One potential signature has been identified using a three-gene panel subject to aberrant epigenetic regulation (see Caren et al, http://www.biomedcentral.com/1471-2407/11/66 for the paper).

In parallel, stratification strategies have been developed for neuroblastoma using a 25-microRNA expression signature, which demonstrates that there may be several concurrent routes to an effective stratification strategy (see http://clincancerres.aacrjournals.org/content/17/24/7684.abstract for the apposite paper).

And we aren’t even talking about emerging therapies, which are diverse enough that each deserves treatment in a full blog post, which I might do in the future. Ultimately, stratification and personalisation will depend on two things: good classification, and therapies that suit said classification. For instance, cancers marked by CpG island methylation of tumour suppressor genes could be treated with combinations of HDAC inhibitors and demethylating agents, together with RNAi or antisense against whatever oncogenes those cells are addicted to, inhibitors of metastasis if disease hasn’t disseminated, and the odd telomerase or ALT inhibitor. Immunotherapy might be thrown in if there is a surface marker that makes tumour cells targetable by immune cell therapy (a prominent case that comes to mind is the recent success of T cells engineered with chimaeric receptors targeting CD19, a marker characteristic of chronic lymphocytic leukaemia cells).

Clinical trials may also evolve to accommodate the new depth of data we will be able to derive from analyses of tumours. One could, for instance, test new candidate drugs only on patients whose molecular signatures indicate that the tumour depends on the protein or gene being targeted. This might reveal groups that benefit from a therapy that would otherwise never have been approved, because it didn’t help a significant proportion of patients in a cohort defined simply by the tissue the cancer arose in. Cancer informatics and cancer therapeutics may well strike up a long and prosperous relationship, and the prospect is exciting. A novel approach in this regard is being taken by the TREAT 1000 project, though it is quite hard to find information about it on the web.
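
A minimal sketch of that enrolment idea, with entirely hypothetical patient records and field names:

```python
def eligible_for_trial(patients, target_gene):
    """Enrol only patients whose molecular profile suggests the tumour
    depends on the gene being targeted. Each patient is a dict with a
    hypothetical 'dependencies' set; illustrative of biomarker-based
    enrolment, not a real trial system.
    """
    return [p["id"] for p in patients if target_gene in p["dependencies"]]


cohort = [
    {"id": "patient-1", "dependencies": {"SYK"}},
    {"id": "patient-2", "dependencies": {"KRAS"}},
]
print(eligible_for_trial(cohort, "SYK"))  # -> ['patient-1']
```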

That is all from me this time round, stay tuned for more posts.

Cheers,
Ankur.

Tools of the trade – An introduction to MeDIP-Chip.

In the previous post on epigenetics, I introduced the topic of promoter CpG island methylation and how it can silence gene expression. In effect, by looking at whether the promoters of genes are methylated or not, one can go some way towards predicting whether those genes are expressed, and building genome-wide methylation profiles is a very good way of trying to figure out what expression changes are taking place. There is a variety of approaches that may be employed to look at methylation profiles, and with the advent of genomics and the publication of the human genome it has become possible to do so on a large scale. The approach described here, MeDIP-chip, is based on Methylated DNA Immunoprecipitation (MeDIP) followed by a subsequent set of processes culminating in array analysis.

In MeDIP, DNA is isolated from the tissue being profiled, denatured, and separated into methylated and unmethylated fractions. To do this, the DNA is first sonicated into small fragments 100-1000 base pairs long. It is then incubated overnight with an antibody that binds only to 5-methylcytosine (the form in which DNA is methylated) and with magnetic beads coated with a secondary antibody that recognises the first antibody. This enables methylated DNA to be collected using a magnetic field, which aggregates the beads, and by extension the methylated DNA, together. Whatever is not attached to the beads is then discarded.

The antibody is then digested away using proteinase K, a protein-degrading enzyme, which frees the DNA into the proteinase K solution, where it remains unharmed.
This DNA is then isolated by ethanol precipitation and purification, after which further processing can take place. Since we’re looking across the whole genome, the DNA must be amplified; this is done using commercially available whole genome amplification kits that work in ways only the manufacturers know. Once the DNA is amplified and purified, it can be stored away for the next phase, which involves the actual data gathering and analysis. MeDIP is cheap but array analysis can be expensive, which is why quality control steps are built in: after the DNA is prepared, it is subjected to quantitative PCR and gel electrophoresis to ensure that the fragments are still in the originally sonicated size range and to confirm that immunoprecipitation has enriched the methylated fraction relative to unmethylated DNA. If a processed sample meets requirements, array analysis (the “chip” step) can go ahead.
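
The qPCR enrichment check comes down to a small piece of arithmetic. Here is a minimal sketch, assuming a 10% input control and ideal doubling per PCR cycle; the Ct values and loci are hypothetical:

```python
import math


def percent_recovery(ct_ip, ct_input, input_fraction=0.1):
    """Estimate the fraction of input DNA recovered by the IP at one
    locus from qPCR Ct values. The input Ct is first adjusted for the
    fraction of material it represents (e.g. a 10% input control).
    Assumes perfect amplification efficiency; illustrative only.
    """
    ct_input_adjusted = ct_input - math.log2(1 / input_fraction)
    return 100 * 2 ** (ct_input_adjusted - ct_ip)


# A methylated control locus should show far higher recovery than an
# unmethylated one (values below are made up):
print(percent_recovery(ct_ip=24.0, ct_input=26.5))  # methylated: ~57%
print(percent_recovery(ct_ip=30.0, ct_input=26.5))  # unmethylated: ~0.9%
```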

A summary of a MeDIP process. Image courtesy Weber et al, Nat. Genet., 39 (2007), pp. 457–466

The immunoprecipitated DNA and the unprocessed input (which is also whole-genome amplified) are then labelled with different fluorescent dyes and hybridised to an array carrying probes for specific genomic regions. A readout of the array then yields an address for every probed region of the genome that is hypermethylated relative to the input control. Doing this across multiple samples enables those samples to be compared: cancerous tissue versus normal tissue, two different types of cancer, or tumours that respond to a certain kind of therapy versus those that do not.

Concept of a methylation array, courtesy Illumina. http://www.illumina.com/technology/infinium_methylation_assay.ilmn
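
The first pass over such a two-colour array is essentially a log-ratio per probe. A minimal sketch follows; the intensities and cutoff are made up, and a real pipeline would add normalisation and significance testing:

```python
import numpy as np


def call_hypermethylated(ip_intensity, input_intensity, log2_cutoff=1.0):
    """Compute per-probe log2(IP/input) ratios from a two-colour MeDIP
    array and flag probes above the cutoff as hypermethylated.
    Normalisation and statistics are omitted for brevity.
    """
    ratios = np.log2(ip_intensity / input_intensity)
    return ratios, ratios > log2_cutoff


# Hypothetical intensities for five probes:
ip = np.array([5200.0, 800.0, 2400.0, 950.0, 4100.0])
inp = np.array([1000.0, 900.0, 1100.0, 1000.0, 900.0])
ratios, calls = call_hypermethylated(ip, inp)
print(np.round(ratios, 2), calls)  # probes 1, 3 and 5 are called
```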

Such analysis is carried out computationally, and the results are used to identify differentially methylated regions (DMRs), which can serve as a molecular signature of a particular tissue state. Again, the possibilities are endless. The data generated are often complex, but further bioinformatics analysis can simplify things a great deal.

A heatmap of differentially methylated regions.
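
To give a flavour of what identifying DMRs computationally involves, here is a minimal sketch: a per-region test between two groups of samples, with a crude multiple-testing correction. The data are random placeholders, and real DMR callers are considerably more sophisticated:

```python
import numpy as np
from scipy.stats import ttest_ind


def find_dmrs(group_a, group_b, alpha=0.05):
    """Two-sample t-test per region on methylation log-ratios.
    `group_a` and `group_b` are (samples x regions) arrays. A Bonferroni
    correction stands in for the FDR control a real pipeline would use.
    Returns the indices of regions called as differentially methylated.
    """
    _, pvals = ttest_ind(group_a, group_b, axis=0)
    return np.where(pvals < alpha / group_a.shape[1])[0]


rng = np.random.default_rng(2)
normal = rng.normal(0.0, 0.3, size=(8, 100))   # 8 samples, 100 regions
tumour = rng.normal(0.0, 0.3, size=(8, 100))
tumour[:, 7] += 2.0  # plant one strongly hypermethylated region
print(find_dmrs(tumour, normal))  # -> [7]
```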

References

Nina Pälmke et al., Comprehensive analysis of DNA-methylation in mammalian tissues using MeDIP-chip, Methods, Volume 53, Issue 2, February 2011, Pages 175-184, doi:10.1016/j.ymeth.2010.07.006.

Weber et al., Chromosome-wide and promoter-specific analyses identify sites of differential DNA methylation in normal and transformed human cells, Nature Genetics, Volume 37, 2005, Pages 853-862.

Stephan Beck, Vardhman K. Rakyan, The methylome: approaches for global DNA methylation profiling, Trends in Genetics, Volume 24, Issue 5, May 2008, Pages 231-237, doi:10.1016/j.tig.2008.01.006. (http://www.sciencedirect.com/science/article/pii/S0168952508000577)

Manel Esteller, Cancer epigenomics: DNA methylomes and histone-modification maps, Nature Reviews Genetics, Volume 8, 2007, Pages 286-298.