Toward in silico structure-based ADMET prediction in Drug Discovery

G. Moroy, V.T. Martini, P. Vayer, B.O. Villoutreix and M.A. Miteva, Drug Discov. Today, 2012, 17(1-2), 44-55

For a drug to be successful in treating a condition, it must not only modulate the condition’s underlying mechanism, but also have a suitable absorption, distribution, metabolism, excretion and toxicity (ADMET) profile. Accurate prediction of this profile is a key way of avoiding the costs, animal studies and other resources spent on molecules destined to fall at ADMET hurdles. Accurate prediction is also, unfortunately, somewhat elusive.

Traditionally, computational ADMET prediction relied on 2D and 3D QSAR/QSPR (quantitative structure-activity / structure-property relationships) or knowledge-based / expert systems as its preferred methods of model development. QSAR comes with the problem of requiring high-quality (and usually large) data sets of compounds that have been tested biologically. The authors of this paper note a recent shift away from using QSAR/QSPR on its own, toward systems that consider the 3D structure of the interacting proteins rather than solely a set of ligands.
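To make the QSAR idea concrete, here is a minimal sketch of a 2D-QSAR linear model: molecular descriptors are regressed against a measured property by ordinary least squares. All descriptor names and numeric values below are invented for illustration; real QSAR work uses curated experimental data sets, validated descriptors and far more rigorous model validation.

```python
import numpy as np

# Hypothetical descriptor matrix: one row per compound, columns are
# invented values for logP, molecular weight and H-bond donor count.
X = np.array([
    [1.2, 250.0, 2],
    [2.5, 310.0, 1],
    [0.8, 180.0, 3],
    [3.1, 420.0, 0],
    [1.9, 290.0, 2],
])
# Invented measured property values (e.g. a permeability readout).
y = np.array([0.45, 0.72, 0.30, 0.85, 0.60])

# Add an intercept column and fit the linear model by least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict the property for a new (hypothetical) compound.
new = np.array([1.0, 2.0, 300.0, 1])  # intercept, logP, MW, HBD
pred = new @ coef
```

The limitation the authors highlight is visible even in this toy: the model sees only ligand-side descriptors, with no representation of the target protein at all.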

There has been a marked change of pace in the isolation and production of high-quality crystal structures, and the authors note that crystal structures are now available in the Protein Data Bank (PDB) for a large number of ADMET-involved proteins, which has allowed them to investigate the use of flexible docking for finding potential pitfalls in compound development. A list of common human CYP450s and human sulfotransferases (SULTs), along with their PDB codes, is given, with insights into the state of the art over a range of areas associated with ADMET (for example, plasma-binding proteins, hERG, ABC transporters and so on). They then demonstrate that, for one particular family of proteins, the SULTs, it was possible to develop a flexible docking model to support ADMET prediction.

The authors conclude that whilst flexible docking is amenable to some ADMET-involved protein families, it is not so straightforward: these proteins are inherently promiscuous (some with multiple binding sites), their crystal structures may not capture the full role of water, and they require flexible modelling. They propose that a series of other emerging techniques, such as molecular dynamics / molecular mechanics (MD/MM) simulations and proteochemometrics, may be useful in addition to ligand-based methods.


There is no doubt that in silico ADMET prediction is a big challenge in computational chemistry – the aim here is not simply to replace in vivo and in vitro tests, but to improve the knowledge base and to weed out, as early as possible, compounds that are very likely to be liabilities – preferably before synthesis: fail early, fail fast, fail cheap.

It may be a mistake to attempt to tackle the problem using only one tool in the toolbox (e.g. topological analysis systems) – a mixture of QSAR, toxicophore / pharmacophore models, flexible docking and expert systems together could open up an avenue to much more accurate desktop prediction. This paper goes a long way in explaining not only where the successes lie, but, more importantly, which issues still challenge computational chemists working toward in silico structure-based ADMET prediction in drug discovery.

Fluorescent membrane potential. A good surrogate for E-phys?

The key for every screen is to predict efficacy in disease effectively. In practice, higher-throughput screens tend to sacrifice disease relevance for throughput, and compounds are then filtered through to lower-throughput, more disease-relevant assays. With ion channels it is often too expensive, and the throughput too low, to run large screens via electrophysiology, and fluorescent assays are therefore often used as a higher-throughput and cheaper surrogate. The assumption is that there is a good correlation between the two, but how good this correlation is generally only becomes known after many compounds have been screened, and the doubt is always there as to how many active compounds are discarded as false negatives.
In a recent paper, Ghisdal et al. took a set of ~41 compounds identified as GABAA binders by 3H-flunitrazepam binding to rat brain membranes and compared their functional effects in a validated fluorescence membrane potential (FMP) assay versus automated patch clamp. The correlation between 3H-flunitrazepam binding and the FMP assay was good. However, when the relative efficacy of these compounds in the FMP assay was compared with automated patch clamp, there was no correlation, and some compounds that produced significant potentiation in patch clamp were inactive in the FMP assay. This suggests that FMP may not be a good surrogate for measuring compound efficacy, and calls into question whether FMP should be used for high-throughput screening to identify novel chemistry.
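The assay-to-assay agreement discussed above is typically assessed with a rank correlation across the screened compounds, since readout scales differ between assays. Below is a minimal, self-contained sketch using Spearman’s rho; the per-compound potentiation values are invented for illustration and are not the data from Ghisdal et al.

```python
def ranks(values):
    """Return 1-based average ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    out = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over any run of tied values.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0  # mean of the tied rank positions
        for k in range(i, j + 1):
            out[order[k]] = avg
        i = j + 1
    return out

def spearman(x, y):
    """Spearman rho: the Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical % potentiation for five compounds in each assay.
fmp   = [12.0, 45.0, 3.0, 60.0, 25.0]
patch = [30.0, 80.0, 5.0, 95.0, 50.0]
rho = spearman(fmp, patch)
```

A rho near 1 would support using the cheaper assay as a surrogate; the paper’s finding of no correlation for efficacy corresponds to rho near zero across the compound set.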
A selection of fluorescent dyes is now available for measuring Na+ (Sodium Green, SBFI) and K+ ions (PBFI, FluxOR), and yellow fluorescent protein can serve as a halide sensor. It would be interesting to hear others’ experiences with these dyes and whether they are more suitable surrogates than a fluorescent membrane potential dye.

The Reproducible Irreproducibility of the Scientific Process: NIH Plans for Increased Rigour

In a previous blog the issue of data replication was discussed as exemplified by Bexarotene as a potential treatment for Alzheimer’s disease (see here). This issue is gaining increased attention, as evidenced by recent articles in The Economist entitled “Trouble in the Lab” and “How Science Goes Wrong” (here), and now Francis Collins and Lawrence Tabak, who are respectively Director and Principal Deputy Director of the National Institutes of Health, offer their perspective on the issue (see here). In their article they emphasize that irreproducibility is only rarely due to deliberate fabrication or fiddling of the data. Rather, they criticise poor experimental design and publications that limit the space given to technical details (and which therefore make the reproduction of experiments more difficult), coupled with the pressure for publication, particularly in high-profile journals, that does not encourage scientists to try to replicate or disprove their own data. In addition, methodological descriptions are sometimes kept deliberately vague, or omit a key step or “secret sauce”, to maintain a competitive advantage within the originating lab.
As regards drug discovery, it is remarkable that critical decision-making preclinical efficacy studies often pay scant regard to such basic principles of experimental design as blinding, randomization and power calculations; elements that are essential components of the more regulated – and therefore more rigorous – arena of clinical trial design. This issue was highlighted by researchers at Bayer (see here), who reported being able to reproduce only 20-25% of published data associated with 67 projects, mainly in the oncology area, whereas researchers at Amgen were able to reproduce data in only 6 out of 55 (11%) key publications relating to hematology and cancer targets (see here). Moreover, although much is made of the so-called “Pharma bias” in publication relating to perceived potential conflicts of interest of academic scientists or the preferential publication of positive clinical trials, there is also a considerable academic bias towards publishing positive but not negative findings, which “creates a huge conflict of interest for academics, and a strong bias to write papers that support the hypotheses included in grant applications and prior publications” (see here). Furthermore, as C. Glenn Begley (who led the Amgen study and is now the Chief Scientific Officer of TetraLogic) states in an interview with Reuters: “The real problem is that scientists are reluctant to speak up about studies that won’t replicate because there is so much to lose,” Begley said. “If I criticize you, and you review my next grant application, you might [take revenge]. That’s why people are afraid to say the reason they couldn’t replicate a study is that it was just plain wrong” (see here).
Figure: Consequences of data irreproducibility (from the Global Biological Sciences Institute report on The Case for Standards in Life Science Research – Seizing Opportunities at a Time of Critical Need).
Clearly the issue of the lack of reproducibility not only has a negative impact on scientific, institutional and journal reputations but also causes a huge waste of time, effort and resources, and damages the public perception of scientific research in the life sciences (see Figure). So, what’s to be done about it? Well, the NIH recognize that part of the problem may well be due to a lack of training of scientists, resulting in poor experimental design, and accordingly they are instigating formal training for intramural scientists that could serve as a template for wider dissemination. In addition, the NIH will encourage more rigorous examination of grant applications and may require additional preclinical studies to support clinical trials that are based on only limited preclinical efficacy data. Efforts are also being made to encourage journals to devote more space to methodological details as well as to publish negative findings. However, perhaps the toughest nut to crack is the academic incentive system, which prizes publications in high-profile journals; a system that may well encourage rapid submission before systematic replication is carried out. Moreover, the number of publications in high-impact-factor journals is a convenient metric for university promotion committees, but not necessarily the best means of judging a scientist’s contribution. Still, the fact that there is an open discussion of these multiple, interrelated factors, and that steps are being taken to strengthen the self-correcting mechanisms that underpin scientific progress, are in themselves major steps forward.

2013 FDA drug approvals

This analysis by Asher Mullard, published in Nature Reviews Drug Discovery (2014, 13, 85-89), reports the new drugs approved by the FDA in 2013. From a total of thirty-six applications, twenty-five new small molecules and two new biologics were approved. Overall, the trend of the previous years was maintained, with the exception of 2012 (Figure 1).
A notable achievement was the high proportion (33%) of new molecular entities approved for the treatment of orphan diseases. In addition, 33% of the new approvals had a unique mode of action and were identified as first-in-class agents. The anticancer therapeutic area obtained the majority of approvals (eight, six of which are for orphan indications), followed by metabolic and endocrinology, antiviral and medical imaging (three approvals for each category). Cardiology, neurology, respiratory and women’s health had two agents approved each, with only one new approval each for psychiatry and dermatology.
Ten drugs received priority review status, which is given to drugs that potentially represent significant advances in terms of safety or effectiveness for the treatment, diagnosis or prevention of serious conditions, including: ado-trastuzumab emtansine, radium Ra 223 dichloride, afatinib, obinutuzumab, and ibrutinib (anticancer); dolutegravir, simeprevir, and sofosbuvir (antiviral); gadoterate meglumine (diagnostic); and riociguat (cardiology). Among these, the humanized CD20-specific monoclonal antibody obinutuzumab (chronic lymphocytic leukaemia), the Bruton’s tyrosine kinase inhibitor ibrutinib (mantle cell lymphoma) and the HCV nucleotide analogue NS5B polymerase inhibitor sofosbuvir (chronic HCV infection as part of an antiviral treatment regimen) also received breakthrough designation status. This status is given to drugs used to treat serious or life-threatening diseases or conditions that show a substantial improvement over existing therapies, as evidenced by early clinical data.

Thirteen new potential blockbusters have been forecast for 2018-2019 (Thomson Reuters Cortellis) (Figure 2), including:
– Sofosbuvir (Gilead), a first-in-class HCV nucleotide analogue polymerase inhibitor, with an annual sales potential of US$6.8 billion;
– Dimethyl fumarate (Biogen Idec), approved for the treatment of multiple sclerosis with an unknown mechanism of action, with sales forecasts of US$6 billion;
– Ibrutinib (Pharmacyclics), a first-in-class Bruton’s tyrosine kinase inhibitor, approved for the treatment of mantle cell lymphoma, with an annual potential of US$4.5 billion;
– Ado-trastuzumab emtansine (Genentech), an antibody-drug conjugate approved for the treatment of HER2-positive metastatic breast cancer, with a potential income of US$4.1 billion;
– “Umeclidinium and vilanterol” and “Fluticasone and vilanterol” (GSK), both approved for the treatment of chronic obstructive pulmonary disease, with sales forecasts of US$3.1 billion and US$2.8 billion, respectively.
Figure 2: new potential blockbusters.

In terms of companies, GSK had the highest number of approvals (five, including the HIV integrase inhibitor dolutegravir, developed with ViiV), spanning three different therapeutic areas and including three potential blockbusters.

Despite a 31% drop in approvals compared with the previous year, 2013 saw the approval of innovative new products that will have a significant impact on medical care. Analysts forecast thirteen new potential blockbusters (almost half of all approvals), and six of them may have multibillion-dollar potential.