I was in the clinic, twiddling my thumbs impatiently while waiting for the medic to arrive; the circumstances were unique. This was the first time personalised cancer medicine was available in this little cancer hospital in an obscure corner of the world. The enormity of it, I felt, was staggering, and hard to put in perspective without reference to the distant past, to an era when cancer therapy was a quagmire and had all the finesse of forming intricate patterns with a sledgehammer.
The medic interrupted my journey into the past as he walked in with a knowing smile and told me that the reports I wanted to study were being prepared on the hospital's cloud system and would be in my hands shortly. Time to head back into the past for a little while longer, I thought, and so it happened.
I recalled that cancer chemotherapy had been essentially crude, lacking even the selectivity that Paul Ehrlich originally postulated as an essential requirement for his magic bullets. It had been a case of bombarding replicating, metabolising cells with DNA-damaging agents or antimetabolites in the hope that cancer cells would be fatally hit while normal cells were less affected, sparing the patient even if the side effects were sometimes horrible. Then Gleevec arrived and some things changed: emphasis shifted towards targeted therapies, be they small molecules or antibodies, which often hit “druggable” targets by binding to enzymes in place of their substrates or by blocking receptor dimerisation. Eventually, some clever chemists and nanotechnologists came up with ways to deliver siRNAs, miRNA sponges and HCR constructs to cancer cells, and we were in a position to knock out whatever we wanted with exquisite selectivity.
The medic walked in with a stack of sheets, each containing a table listing, pathway by pathway, the lesions that had been identified in each of the 20 new patients we were processing. Drug regimens had been computationally predicted for each: HCR constructs where fusion genes were involved, the odd miRNA sponge to knock down miRNAs that drove these tumours when overexpressed, and occasionally small-molecule inhibitors and monoclonal antibodies, all in combination with an hTERT or ALT pathway inhibitor. None of the patients had anything particularly novel, and our models had been good, meaning they could all go home with extremely good outcomes after short cycles of therapy. This was perfect…
And then the alarm rang, waking me from a blissful slumber, and I was no longer in the same postcode or time period as in my dream. But I remembered quite a bit of it, and it led me to cogitate on how far we are from seeing that scenario play out regularly at cancer clinics. I think the technological and biological advances being made have left us not very far from realising the ambitions of stratified and personalised cancer medicine.
One of the biggest challenges so far has been understanding the genetic makeup of cancers, what drives them, and how that should affect therapy. For a very long time, cancers have been classified by tissue of origin and, in rare cases, by surface markers in combination with that. A case in point is the ER-positive/ER-negative classification of breast cancer, with HER2 expression status added to decide whether trastuzumab should be used in therapy (see http://www.nature.com/nrclinonc/journal/v3/n11/full/ncponc0636.html for a reference).
Another example of rudimentary but therapeutically useful classification is the use of KRAS mutational status to decide whether cetuximab may be used: cetuximab is far less effective relative to standard chemotherapy in patients whose tumours overexpress EGFR but carry a downstream KRAS mutation than in those with wild-type KRAS (see Karapetis et al, http://www.nejm.org/doi/full/10.1056/NEJMoa0804385 ).
There are two technological platforms that I believe will be critical to the development of stratification and personalisation. One is whole-genome sequencing, which will produce extremely high-resolution maps of everything that is genetically and epigenetically different in cancer cells compared with normal tissue. The other is microarray technology, which enables expression profiling to pick out what effects modified genomes or epigenomes have on the transcriptome and, correspondingly, the proteome. The cost of sequencing a cancer genome has been dropping steadily, and with the announcement of even cheaper, even more powerful sequencing platforms by companies like Oxford Nanopore at the Advances in Genome Biology and Technology conference (see http://www.nature.com/news/nanopore-genome-sequencer-makes-its-debut-1.10051 for a reference), costs, it would appear, will fall further if these new platforms work as expected.
Microarray analysis has already led to significant progress in stratifying therapy for breast cancer: van 't Veer et al found that differential expression across a panel of 70 genes was predictive of prognosis in node-negative breast cancer (see http://www.nejm.org/doi/full/10.1056/NEJMoa021967 for the paper).
The hypothesis was that patients with a good signature could be treated locoregionally alone, avoiding systemic therapy. The array platform was commercialised as MammaPrint and is at the centre of an ongoing clinical trial called MINDACT (Microarray In Node-negative and 1 to 3 node-positive Disease may Avoid ChemoTherapy). This trial will serve as a marker for stratification; once it becomes possible to avoid giving chemotherapy to those who don't need it, the savings in quality of life and in reduced treatment costs should more than pay for the initial cost of the array.
Cancer therapy may also benefit from improved integrated analyses in the development of rational drug combinations. One example that comes to mind is dexamethasone resistance in acute lymphoblastic leukaemia: using expression profiling, scientists found that the signature characterising dexamethasone-sensitive cells matched that of dexamethasone-resistant cells treated with rapamycin, an inhibitor of the mTOR pathway (see Lamb et al, http://www.sciencemag.org/content/313/5795/1929.short for the paper). From this they were able to infer that adding rapamycin would sensitise previously insensitive cells to dexamethasone treatment.
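The logic of that inference, matching a disease signature against drug-induced signatures to find a drug that reverses the disease state, can be sketched with invented gene sets (none of these gene or drug names are real; the actual Connectivity Map uses ranked genome-wide profiles, not tiny sets):

```python
# Toy sketch of Connectivity Map-style reasoning (gene names and drug
# signatures are invented). We look for a drug whose induced expression
# changes reverse the resistant state: genes up in resistance should go
# down under the drug, and vice versa.

resistant_vs_sensitive = {"up": {"G1", "G2", "G3"}, "down": {"G4", "G5"}}

drug_signatures = {
    "drug_X": {"up": {"G4", "G5"}, "down": {"G1", "G2"}},  # reverses the state
    "drug_Y": {"up": {"G1", "G3"}, "down": {"G5"}},        # reinforces it
}

def reversal_score(disease, drug):
    """Count signature genes the drug moves in the opposite direction,
    minus those it moves in the same direction."""
    opposed = len(disease["up"] & drug["down"]) + len(disease["down"] & drug["up"])
    same = len(disease["up"] & drug["up"]) + len(disease["down"] & drug["down"])
    return opposed - same

best = max(drug_signatures,
           key=lambda d: reversal_score(resistant_vs_sensitive, drug_signatures[d]))
print(best)  # drug_X
```

The drug that best "flips" the resistance signature becomes the candidate to add to the combination, which is essentially how rapamycin fell out of the dexamethasone analysis.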
Our understanding of cancer is continually being improved by applying some of the technology mentioned above. For instance, an integrated genomic and epigenomic analysis of retinoblastoma, a tumour that is generally genomically very stable, revealed that SYK was a novel dysregulated oncogene, and that knocking it down killed retinoblastoma cells (see Zhang et al, http://www.nature.com/nature/journal/vaop/ncurrent/full/nature10733.html for the paper).
Another example of clinically useful stratification involves neuroblastoma, an extracranial tumour of the sympathetic nervous system. It is a heterogeneous disease: some types spontaneously differentiate into a benign state and regress, while others are extremely hard to treat and do not regress. The question is: what makes a neuroblastoma aggressive? Knowing this has clinical utility, since it is a paediatric tumour and one would not want to treat a tumour that will regress on its own, but would want to deal aggressively with the aggressive forms. One potential signature has been confirmed using a three-gene panel subject to aberrant epigenetic regulation (see Carén et al, http://www.biomedcentral.com/1471-2407/11/66 for the paper).
Parallel stratification strategies have been developed for neuroblastoma using a 25-microRNA expression signature, demonstrating that there may be several concurrent routes to effective stratification (see http://clincancerres.aacrjournals.org/content/17/24/7684.abstract for the apposite paper).
And we haven't even talked about emerging therapies, which are so diverse that each deserves a full blog post of its own, something I might do in the future. Ultimately, though, stratification and personalisation will depend on two things: good classification and therapies that suit that classification. For instance, cancers marked by CpG-island methylation of tumour suppressor genes could be treated with combinations involving HDAC inhibitors and demethylating agents, alongside RNAi or antisense against whatever oncogenes those cells are addicted to, inhibitors of metastasis if the disease hasn't disseminated, and the odd telomerase or ALT inhibitor. Immunotherapy might be thrown in if there is a surface marker that makes tumour cells targetable by immune cell therapy (a prominent case that comes to mind is the recent success of T cells engineered with chimaeric receptors targeting CD19, a marker characteristic of chronic lymphoid leukaemia cells).
Clinical trials may also evolve to accommodate the new depth of data we will be able to derive from analyses of tumours. One could, for instance, test new candidate drugs only on patients whose molecular signatures indicate that the tumour depends on the protein or gene being targeted. This might reveal groups that benefit from a therapy that would otherwise never have been approved, because it didn't help enough patients in a cohort defined simply by the cancer's tissue of origin. Cancer informatics and cancer therapeutics may well enjoy a long and prosperous relationship, and the prospect is exciting. A novel approach in this regard is being taken by the TREAT 1000 project, though it is quite hard to find information about it on the web.
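That enrolment logic is simple enough to sketch. The cohort below is entirely hypothetical (though EML4-ALK fusions genuinely occur in a subset of lung cancers): instead of enrolling everyone with the same tissue of origin, the trial admits only patients whose tumours carry the lesion the candidate drug targets.

```python
# Toy sketch of signature-gated trial enrolment (hypothetical patients).
# Rather than enrolling by tissue of origin alone, we select patients
# whose tumours carry the molecular lesion the candidate drug targets.

patients = [
    {"id": 1, "tissue": "lung", "lesions": {"EML4-ALK fusion"}},
    {"id": 2, "tissue": "lung", "lesions": {"KRAS mutation"}},
    {"id": 3, "tissue": "lung", "lesions": {"EML4-ALK fusion", "TP53 mutation"}},
]

def eligible(cohort, required_lesion):
    """Return the ids of patients whose tumour carries the target lesion."""
    return [p["id"] for p in cohort if required_lesion in p["lesions"]]

print(eligible(patients, "EML4-ALK fusion"))  # [1, 3]
```

A drug tested only in patients 1 and 3 could show a dramatic effect that would be diluted to insignificance if patient 2 and others like them were pooled into the same "lung cancer" cohort, which is exactly the approval problem described above.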
That is all from me this time round; stay tuned for more posts.