BAP Certificate in Non-Clinical Psychopharmacology


At the beginning of the month, I was lucky enough to attend a residential course held by the British Association for Psychopharmacology (BAP) in Cambridge. The training, which was held over four days, provided an overview of many of the major techniques used in this area of scientific research, as well as recent advances within the field.

We heard about cutting-edge research from experts in academia, industry and the health care sector. Our first lecture started with the basic concepts of genomics and went on to the difficulties involved in interpreting genome-wide association studies (GWAS), an approach sometimes used to identify candidate genes for genetically complex neurological disorders. Another talk covered techniques including optogenetics and designer receptors exclusively activated by designer drugs (DREADDs). We went over the advantages of both these methods, which are used to precisely control neural activity, but also touched upon some of the limitations that still exist with these technologies. Other talks covered applications of imaging methods and behavioural models.

Lectures were broken up by workshops on statistics and experimental design, as well as a group project. In a workshop focusing on PK/PD calculations, I was introduced to the concept of counter-clockwise hysteresis plots: two different response levels are observed at the same drug concentration, the result of a delayed effect of the drug at the target. During this session, we spoke about the importance of considering these factors when designing a study so as to avoid producing misleading data.
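Out of interest, the shape of such a loop can be reproduced with a toy model. The sketch below is my own illustration (not from the course): a one-compartment oral PK model plus a hypothetical "effect compartment" whose concentration Ce equilibrates slowly with plasma. All rate constants and the dose are invented for illustration.

```python
import math

# Toy PK/PD sketch: plasma concentration Cp from a one-compartment oral model,
# with the effect site Ce lagging behind via dCe/dt = ke0 * (Cp - Ce).
# All parameter values are invented for illustration.
ka, ke, ke0 = 1.5, 0.3, 1.0   # absorption, elimination, equilibration rates (1/h)
dose_over_v = 10.0            # dose / volume of distribution (mg/L)

def cp(t):
    # Bateman equation for plasma concentration after an oral dose
    return dose_over_v * ka / (ka - ke) * (math.exp(-ke * t) - math.exp(-ka * t))

dt, t, ce, samples = 0.01, 0.0, 0.0, {}
while t <= 12.0:
    ce += ke0 * (cp(t) - ce) * dt   # simple Euler step for the effect site
    samples[round(t, 2)] = (cp(t), ce)
    t += dt

# Two time points with near-identical plasma concentration: one on the rising
# limb, one on the falling limb. The effect (tracking Ce) is lower on the way
# up, so plotting effect against Cp traces a counter-clockwise loop.
rising, falling = samples[0.25], samples[4.75]
print(rising, falling)
```

At the matched plasma concentration, the effect-site level is far lower during absorption than during elimination, which is exactly the delayed-effect signature the workshop described.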

For our group project, we were tasked with forming a drug discovery leadership team, choosing the drug target for a neurodegenerative disorder that we deemed the strongest candidate. With this target in mind, we put together a plan outlining why it is a worthy target and how we would go about identifying a molecule to take to the clinic. Our conclusions were pitched later in the week in a “Dragons’ Den”-style session to see if our case was strong enough to secure funding.

As part of the course, we visited Addenbrooke’s Hospital, where we had the opportunity to tour the Wolfson Brain Imaging Centre. Here we were able to see their clinical and pre-clinical imaging facilities, which included positron emission tomography (PET) and magnetic resonance imaging (MRI), and heard about some of the ongoing research in the department.

After the intensive days, we all had the chance to sit down as a group for dinner and talk to the individuals who had presented throughout the day, with after-dinner talks including one from the president of the BAP. On the final evening, we headed down to Queens’ College, where we were presented with our certificates in Non-Clinical Psychopharmacology, which was a perfect way to finish off the course.

As someone who is relatively new to this area of research, I took away a lot from the course. I would definitely recommend this programme and believe that it could be beneficial for individuals at any stage of their career. The programme provided a fantastic platform to network and interact with others from many different areas of psychopharmacology. I am excited to attend the BAP summer meeting 2018 to hear more from world-leading scientists in both clinical and non-clinical psychopharmacology, and to attend the evening disco, which has even been described as legendary!

Blog written by Olivia Simmonds

 


Development of Ultra-Rapid Insulins


The goal of insulin therapy for diabetic patients is to mimic closely the physiologic pattern of insulin release by the pancreas in order to maintain normoglycaemia.

Insulin has been available as a hormone derived from beef and pig pancreas since 1922; the first recombinant human insulin was developed by Genentech and marketed by Eli Lilly in 1982.

Standard 2-zinc insulin (which is hexameric) must be injected ~30 minutes before a meal to allow for disassembly in the subcutaneous depot into dimers and monomers (the active species).

At the turn of the millennium, to facilitate more accurate dosing, the principles of protein engineering were applied to destabilise the dimer and hexamer interfaces and produce rapid-acting insulin analogues (Fig. 1).

Figure 1. Amino acid compositions of the rapid-acting insulin analogues (1).

Lispro/Humalog (Eli Lilly), Aspart/Novolog (Novo Nordisk) and Glulisine/Apidra (Sanofi-Aventis) can all be injected 5 to 15 minutes before a meal (Fig. 2).


Figure 2. 4 hour physiological plasma insulin profiles plotted together with pharmacokinetic profiles for insulin lispro and human insulin in type 1 diabetes (2).

Attention has more recently focused on the development of ultra-rapid insulins for dosing at (or even after) the time of the meal, for the benefit of children, insulin pump users and for highly insulin-resistant type 2 diabetics. Current approaches to speeding up the onset of absorption include modification of excipients and enabling of tissue diffusion.

Afrezza

Powdered insulin inhaled into the lungs is absorbed more rapidly than subcutaneous insulin, and its absorption is of short duration. Exubera, a hexameric inhaled insulin developed by Inhale Therapeutics (later Nektar), was the first to market, launched by Pfizer in 2006. Unfortunately, a 2007 study concluded that Exubera “appears to be as effective, but no better than injected short-acting insulin”. Exubera was dispensed using a bulky device (Fig. 3) with little dosing flexibility, and poor sales led to its withdrawal in 2007. More recently, Afrezza, a monomeric inhaled insulin developed by MannKind, was approved by the FDA in 2014. Afrezza is delivered using a small device about the size of an asthma inhaler (Fig. 3), peaks at ~15-20 minutes and is eliminated from the body within ~2-3 hours.

Figure 3. The Exubera and Afrezza delivery devices.

The rapid absorption and decreased duration of Afrezza closely resembles physiological insulin release (Figs. 2 & 4).


Figure 4. Pharmacokinetic profiles for inhaled Afrezza and SC insulin lispro (in type 1 DM patients) and for inhaled Exubera and SC human insulin (in type 2 DM patients) (4).

Biochaperone Lispro

An alternative to engineering the insulin aggregation interfaces is to introduce biotechnological enhancers. Eli Lilly has complexed Lispro insulin with French biotech company Adocia’s proprietary BioChaperone (BC) to accelerate absorption (licensed in 2014) (Fig. 5).


Figure 5. Adocia’s BioChaperone technology is based on polymers, oligomers and organic compounds. The BC-insulin complex forms spontaneously in water, protecting it from enzymatic degradation and enhancing absorption after injection (5).

BC Lispro promoted a statistically significant 63% increase in metabolic effect over the first hour in comparison with Novolog, having previously been demonstrated to outperform Eli Lilly’s Humalog (Fig. 6).


Figure 6. Comparison of mean blood glucose profiles after subcutaneous injection of lispro and BC lispro (6).

Despite results from six clinical studies indicating that BC Lispro performs better than Humalog, Eli Lilly decided to terminate its collaboration with Adocia in 2017 (possibly because of the costly failure of its Alzheimer’s drug solanezumab). Lilly is now developing its own ultra-rapid lispro in house (LY900014, currently in phase 3), formulated with two new excipients: treprostinil (a vasodilator) and citrate (a vascular permeabilizer). The rights to BC Lispro reverted to Adocia at no cost, and the company is currently seeking a new partner to shoulder the costs of phase 3 clinical trials and the regulatory and marketing hurdles.

Fiasp

Novo’s faster-acting insulin aspart (Fiasp), the first ultra-rapid insulin to be approved and marketed (Fig. 7), is an innovative formulation containing vitamin B3 (niacinamide) to increase the speed of absorption and the naturally occurring amino acid L-arginine for stability.

Fiasp was initially turned down by the FDA in October 2016 but approved in September 2017 following clarification of immunogenicity and clinical pharmacology data.


Figure 7. Fiasp FlexTouch prefilled pens, 100 units/mL (7).

Fiasp can be injected from 2 minutes before to up to 20 minutes into a meal and acts twice as fast as Novolog/Aspart (Fig. 8).


Figure 8. Blood insulin Aspart concentration after subcutaneous injection of Fiasp and Novolog in patients with type 1 diabetes (8).

This was achieved without a significant difference in the overall rate of severe or confirmed hypoglycemia. Clinical trial data showed that FIASP gave a lower post-meal spike and that patients also lowered their A1C levels.

Novo Nordisk is very keen to expand the use of ultra-rapid-acting Fiasp in an artificial pancreas setting, and it is already approved for use in insulin pumps in Europe.

Given that Novolog was the third best selling diabetes medication in 2015 with $3.03 billion in global sales (9), Fiasp is well poised to become the leader in this enormous market segment.


References

  1. https://www.diapedia.org/management/8104096115/short-acting-insulin-analogues
  2. Home, P.D. (2015) Plasma insulin profiles after subcutaneous injection: how close can we get to physiology in people with diabetes? Diabet. Obes. Metab. 17, 1011-1020.
  3. http://pharmamkting.blogspot.co.uk/2015/02/look-ma-no-bong-afrezza-inhaled-insulin.html
  4. Al-Tabakha, M.M. (2015) Future prospect of insulin inhalation for diabetic patients: The case of Afrezza versus Exubera. J. Control. Release 215, 25-38.
  5. https://www.adocia.com/technology/biochaperone-technology-2/
  6. http://www.diyabetimben.com/ultra-hizli-etkili-insulin-biochaperone-lispro/
  7. https://online.pharmacy/product/fiasp-insulin-3/
  8. https://www.fiasppro.com/the-fiasp-story/onset-of-appearance.html
  9. pharmaceutical-technology.com

Blog written by Raj Gill

Reducing attrition in drug discovery: the AstraZeneca 5R-framework


The high attrition rate in drug discovery is responsible for the extremely high cost of developing a new medicine. Several suggestions aimed at reducing attrition have been put forward, such as improving efficacy and safety profiles, reducing toxicity, improving preclinical models, better understanding of mechanism (Nat. Rev. Drug Discov. 2004, 3, 711-716) and shifting attrition to earlier phases (Nat. Rev. Drug Discov. 2010, 9, 203–214).

Based on this, scientists at AstraZeneca established a five-dimensional framework (the 5R framework) (Nat. Rev. Drug Discov. 2014) aimed at improving the low success rates in the process. The framework comprises five determinants identified as key to the drug discovery process: right target, right tissue, right safety, right patient and right commercial potential (summarised in Figure 1).


Figure 1. The 5R framework (from Nat. Rev. Drug Discov. 2018).

In addition to the “5R framework”, AstraZeneca scientists decided to reduce their disease portfolio and to strengthen their capabilities for target selection and validation (better understanding of the biology and mechanism of the disease, stronger target rationale).

Furthermore, they improved their lead generation strategy (expansion of their compound library, including library sharing and integration with other screening approaches), as well as their pharmacokinetic/pharmacodynamic modelling, patient stratification and biomarkers. In a more recent paper published in Nat. Rev. Drug Discov. in 2018, they show how the application of these guidelines has led to an increase in project success rates (Fig. 2) and to a reduction in cycle time and cost (Fig. 3).


Figure 2. Project success rates for the AstraZeneca (AZ) portfolio.

The overall project success rates have increased from 4% (2005-2010) to 19% (2012-2016); the cost to reach clinical proof of concept has decreased by 31% when comparing the two time cohorts, and by 42% compared to the industry average.


 

Figure 3. Metrics for projects costs (a. first good laboratory practice dose; b. clinical proof of concept) and cycle times.

With regard to cycle times, they observed a considerable reduction in the length of phase II (50% shorter than the industry average), although this may return to the average in the future.

To further illustrate how the introduction of the 5R framework has influenced AstraZeneca’s pipeline, Table 1 reports the new molecular entities and new biologics that entered phase III in 2012-2016, highlighting how their progression was influenced by these guidelines. The 5R framework can be applied at any stage of the process, as was done for olaparib (selected as a drug candidate before 2011), resulting in the initiation of novel clinical studies. (Readers are encouraged to check Box 1 in the paper for an example of the 5R framework applied to osimertinib.)


Table 1. Influence of the 5R framework on 15 new molecular entities and new biologics entering phase III.

Although these guidelines have clearly improved productivity, 81% of projects still failed at some stage of the process. Nevertheless, they are clearly moving the process in the right direction, and it will be interesting to see how these guidelines may affect the R&D strategy of other companies.

Blog written by Marco Derudas

Pronucleotides Assemble! A multifunctional catalyst designed to stereoselectively synthesise prodrugs


Nucleosides have emerged as a key chemical class in successful antiviral and anticancer drugs, with nearly half of currently marketed antiviral and anticancer agents possessing these cores.1 However, the biologically active nucleoside phosphates, which are generated in vivo by phosphorylation, are poor drug candidates due to issues of permeability and stability. One pronucleotide strategy developed to address these challenges was described by McGuigan. His ProTide platform introduces a 5′-aryloxy phosphoramidate to the drug candidate, which can result in improved cell permeability and rate of phosphorylation compared with non-phosphoramidate-containing nucleosides.2,3

However, the introduction of the phosphoramidate into the prodrug can have a significant effect on potency, toxicity and rate of metabolism, effects which can be associated with the stereochemistry at the phosphorus centre. Unlike carbon chemistry, where stereocontrol is a sophisticated branch of asymmetric catalysis, P-chiral chemistry is far less developed. Although methods do exist to steer products towards a preferred P-chiral isomer, such as dynamic kinetic asymmetric transformation (DYKAT) using chiral auxiliaries or desymmetrisation of achiral species,3 these approaches have significant drawbacks: the former suffers from poor selectivities and low catalytic turnovers, whilst the latter often requires a complicated multistep synthesis.

In addition to the stereochemical challenges during prodrug assembly, there remains the challenge of chemoselectivity for 5′ versus 3′ phosphoramidation. With these challenges in mind, I wanted to highlight an excellent paper from the process research and development group of Merck & Co., USA, reporting the first good example of a catalyst designed to address both stereo- and chemoselectivity in the synthesis of pronucleotide prodrug candidates. In their paper they focused on a hepatitis C virus RNA polymerase inhibitor currently in late-stage clinical trials (MK-3682, Figure 1).4

Figure 1: General phosphoramidation scheme and the best-in-class catalysts developed to effect the coupling. Yield is the total yield of phosphoramidate isolated; chemoselectivity for 5′ vs 3′ is represented by the ratio 5′:3′; and d.r. is the ratio of P(R) to P(S).

Using mechanistic studies, computational modelling and an understanding of the enzymatic mechanism of P-O bond formation in the phosphorylation of nucleosides, the team successfully developed several small-molecule organic catalysts that mimic the concomitant series of activation modes used by enzymes to effect P-O bond formation. Early studies identified carbamates as a privileged class for controlling stereo- and chemoselectivity, with catalyst (R)-B the best of the first-generation catalysts developed. Using computational modelling, the carbamate was theorised to carry out three roles: leaving-group activation, general base catalysis and oxyanion stabilisation via a pentavalent transition state, giving rise to a 2.3 kcal/mol differentiation between the desired R-stereochemistry and the S-stereochemistry at the phosphorus centre (Figure 2). Catalyst I, the best of the catalysts reported, evolved from a conscious effort to increase the transition-state differentiation by decreasing the entropy of the system via linkage of catalyst (R)-B. As highlighted in Figure 1, the linked catalyst, Catalyst I, achieved excellent yields and high selectivities for both the desired 5′ product (99:1 in favour of the 5′ product) and the desired P(R) isomer (d.r. of 99:1).

Figure 2: Transition state model showing multiple catalyst modes of action. Reproduced from reference 4.

This work demonstrates an excellent step forward in the controlled synthesis of pronucleotide prodrugs by continuing to employ rational design beyond the discovery phase SAR, well into the late stage development of the prodrug. Moreover the published work is an elegant example of the power of using an interdisciplinary approach to solve chemical problems via a rational design cycle.

 

Written By Jason A. Gillespie

 

References

  1. L. P. Jordheim, D. Durantel, F. Zoulim, C. Dumontet, Nat. Rev. Drug Discov. 12, 447–464 (2013).
  2. D. Cahard, C. McGuigan, J. Balzarini, Mini Rev. Med. Chem. 4, 371–381 (2004).
  3. M. J. Sofia et al., J. Med. Chem. 53, 7202–7218 (2010).
  4. D. A. DiRocco et al., Science 356, 426–430 (2017).

Multiple Multiparameter Optimisations, and the Success of Confirmation Bias


This blog article refers to the in-press article by A.K. Ghose et al. on “Technically Extended MultiParameter Optimization”,1 and the somewhat pivotal works of T.T. Wager et al. on “Defining Desirable Central Nervous System Drug Space through the Alignment of Molecular Properties, in Vitro ADME, and Safety Attributes” and “Central Nervous System Multiparameter Optimization Desirability: Application in Drug Discovery”,2 and attempts to explain what an MPO is and to discuss the two systems’ design.

A Multiparameter Optimisation (MPO) tool in any application domain is one where the user selects several important parameters that collectively predict an outcome for a particular endpoint (e.g. oral bioavailability). The user then creates a scoring system which balances a scoring matrix across all of the selected parameters, so as to reduce the data to (usually) a single number or a small collection of numbers (“scores”). This is simply a data reduction, which was briefly discussed in a previous article about man-made metrics for drug discovery (Reducing Data: Ligand Efficiency and Other Fallacies).3 Unlike QSAR/QSPR, where rigorous mathematical and statistical methods determine which factors are important and weight them accordingly, an MPO in drug design often uses criteria picked by senior scientists with many years’ experience observing the particular endpoint in question. It has the added benefit of usually being a bit more human-readable than typical QSAR/QSPR.

An MPO differs from a hard-logic filter (e.g. Lipinski’s Rule of 5) in that it considers optimal and suboptimal values with graduated scores, whereas a hard-logic filter has only two states: pass or fail. Typically anything failing a hard filter is thrown away, whereas a moderately scoring material in an MPO might be tweaked to improve it as part of lead optimisation. You simply show the MPO your structure (or SMILES string), and it will calculate the values for the selected criteria and give you a score – in the case of the Wager MPO, a score between 0 and 6, with a score of 4 or higher representing something that is “probably CNS penetrant”. If you have an abundance of chemical matter, you might throw the lower scorers away, but if you are limited in hit matter, you might redesign your molecule to improve it (it is an MP optimiser, after all).
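As a concrete (if simplified) illustration, here is a toy MPO in Python. The parameter names, cut-offs and desirability shapes are invented for this sketch and are not the actual Wager CNS MPO functions; the point is only the contrast between graduated scoring and a pass/fail filter.

```python
# Illustrative toy MPO with made-up cut-offs (NOT the Wager CNS MPO).
# Each parameter maps to a graduated score in [0, 1]; the sum is the MPO
# score, unlike a hard filter's single pass/fail verdict.

def monotonic(value, good, bad):
    """1.0 at or below `good`, 0.0 at or above `bad`, linear in between."""
    if value <= good:
        return 1.0
    if value >= bad:
        return 0.0
    return (bad - value) / (bad - good)

def hump(value, low, low_pref, high_pref, high):
    """1.0 inside the preferred range, tapering to 0 outside the qualifying range."""
    if low_pref <= value <= high_pref:
        return 1.0
    if value < low_pref:
        return max(0.0, (value - low) / (low_pref - low))
    return max(0.0, (high - value) / (high - high_pref))

def toy_mpo(clogp, tpsa, hbd):
    # hypothetical cut-offs chosen purely for illustration
    return (monotonic(clogp, 3.0, 5.0)     # lower lipophilicity preferred
            + hump(tpsa, 20, 40, 90, 120)  # TPSA scored with a hump function
            + monotonic(hbd, 0.5, 3.5))    # fewer H-bond donors preferred

print(round(toy_mpo(clogp=2.5, tpsa=60, hbd=1), 2))   # → 2.83
print(round(toy_mpo(clogp=6.0, tpsa=150, hbd=5), 2))  # → 0.0
```

A hard filter would discard the second compound outright; the MPO instead shows how far it is from the preferred space, and by how much each parameter would need to move.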

An MPO for determining the likelihood of central nervous system (CNS) penetration was outlined back in 2010, when Travis Wager and colleagues at Pfizer, using internal data, determined a six-criterion system.2 It is fair to say that this work changed the way many medicinal chemists designed their output for neuroscience targets across multiple organisations (including ours). The same authors revisited their work earlier this year with some further pseudo-post-hoc validation data (more on this later). Arup Ghose is a name well known to chemoinformaticians, and I recall reading his works at the turn of the millennium alongside those of Lipinski and Oprea on filters for oral bioavailability and the like (he is also credited with the invention of the AlogP method of LogP prediction). Ghose and colleagues recently published their own ideas on a suitable CNS MPO using a humanised-QSPR-type approach.

The 2010 Wager paper has been cited over 200 times, with the Pfizer MPO being used by various organisations and groups as their primary MPO for neuroscience projects. As a result, Ghose’s suggestions are an interesting variant, especially as Ghose’s system is statistically more rigorous in its design.

Design

For a model to be useful, it needs to be validated. This is normally done by taking a data set and randomly splitting it into a design (training) set and a validation (test) set. You build the model on the training data and see how well it holds up against the test set. This is how the Ghose model was designed and built, and hence it can statistically demonstrate its validity within the data set. In the case of the Wager MPO, the authors fell into the usual non-statistical pitfall of creating a Texas Sharpshooter Fallacy (like Lipinski and many others, below), in that they used the whole data set to build the model and then had no data external to the training set with which to validate it. In the case of Wager’s 2016 paper, they effectively demonstrated confirmation bias in recent development.

A Texas Sharpshooter Fallacy is a fault of reasoning in which a person shoots at a wall with a gun, then draws the target around all of the bullet holes and claims they were all within the target. Without an extra set of bullets to then shoot at the drawn target, you cannot validate how good a shot he is. This amounts to using all of the available data to create a model, leaving none aside to test it with, and then calling the model successful because all of the data matches it (even though it was the same data used to make the model).
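The held-out validation described above can be sketched in a few lines. This is a generic illustration on synthetic data (a noisy straight line), not the Ghose data set: the model is fitted on the training split only and then judged by its error on the untouched test split.

```python
import random

# Generic held-out validation sketch on synthetic data (y = 2x + 1 + noise).
# Fit on the training split only, then measure error on unseen test data.
random.seed(42)
data = [(x, 2.0 * x + 1.0 + random.gauss(0, 0.5)) for x in range(100)]
random.shuffle(data)
train, test = data[:70], data[70:]   # 70/30 split; no shared points

# Ordinary least-squares fit using the training set alone
n = len(train)
mx = sum(x for x, _ in train) / n
my = sum(y for _, y in train) / n
slope = (sum((x - mx) * (y - my) for x, y in train)
         / sum((x - mx) ** 2 for x, _ in train))
intercept = my - slope * mx

# Validation: root-mean-square error on the held-out test set
test_rmse = (sum((y - (slope * x + intercept)) ** 2 for x, y in test)
             / len(test)) ** 0.5
print(round(slope, 2), round(intercept, 2), round(test_rmse, 2))
```

If the test error stayed near the training error, the model generalises; building and "validating" on the full 100 points would leave no such check, which is exactly the sharpshooter's mistake.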


Figure 1: Data on numbers of candidates and drugs with their MPO ranges from Wager et al. (loc. cit.)

As can be seen in Figure 1, these are trends and not steadfast rules; however, MPOs are very useful in reducing the perceived risk of endpoint failure. There may be a problem in assessing the quality of this MPO: many organisations use the method in early development and, as a result, we have a confirmation bias issue, discussed at the end of this article.

Criteria

Table 1 shows the different criteria used in the two optimisation methods. Though it contains more components, Ghose’s system requires less computing to calculate, having only AlogP as a machine-calculable element (the rest could be done by eye); realistically, however, you would have your software do the maths for them all. Ghose and colleagues used a computational data reduction method to reduce the criteria to eight, in a way that tries to eliminate the elements that vary by method (different software will determine pKa, LogP and LogD differently; in fact the same software will give different values depending on the version – recently we saw our LogP and LogD values change overnight as ChemAxon changed the way they natively calculate those criteria). By avoiding these and sticking to human-measurable, functionally discrete criteria, the system becomes less method-dependent. The use of AlogP rather than a more complex ClogP, KlogP or ACDlogP also attempts to minimise errors (or rather make errors more consistent) where novel chemotypes are unlikely to have been in the model set for the LogP calculator (see the previous article on CLogP and other short stories).4

Table 1: Comparison of the criteria in both Wager MPO and Ghose TEMPO


* No method suggested

The way the criteria are scored varies between authors. In the case of Wager et al., the criteria were scored according to Figure 2 (mostly monotonic, with a hump function for tPSA), whereas Ghose used a hump function across all of the criteria (Figure 3 and Table 2).


Figure 2: Criteria plots, each detailing a parameter (“desirability function”) in the Pfizer (Wager) MPO. The six criteria are scored on their value for each compound, with a result being a score of between 0 and 6.


Figure 3: A hump function (albeit upside down), where P is preferred, Q is qualifying, U is upper and L is lower. The penalty is then applied on a sliding scale for materials outside the preferred range.

Table 2: The scoring range for Ghose et al.’s TEMPO


Weighting

A good scoring system should be a mathematical construct based on criteria weighted by their relevance, as described in Figure 4.

ScoreABC… = (criterionA * coeffA) + (criterionB * coeffB) + (criterionC * coeffC)…

Fig. 4: A simple scoring formula, where a criterion (e.g. LogP), is multiplied by its weighting coefficient. All the component products are then summed.

In a system like that of Figure 4, each criterion is multiplied by its weighting, which is derived from its determined importance in its contribution to the endpoint. In classical QSAR it is determined by PCA or another regression technique, but in the case of human data reduction it is often whimsical. In the case of Wager et al., each component of the MPO was given the same weight; that is, each coefficient was 1. In the case of Ghose et al., each criterion was given a weight derived from the data reduction analysis in the model design (Table 2, column 6, “coeff (C)”). You can see in Ghose’s system that the number of basic amines is three times more important than the number of rotatable bonds, for example, whereas in the Wager MPO all features are equally valuable.
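In code, the Figure 4 formula is just a weighted sum. The criterion scores and coefficients below are invented for illustration; equal coefficients give a Wager-style score, unequal ones a Ghose-style TEMPO score.

```python
# Weighted scoring as in Fig. 4: multiply each criterion's (already
# desirability-scaled) score by its coefficient and sum the products.
# All numbers here are invented for illustration.
def weighted_score(criterion_scores, coefficients):
    assert criterion_scores.keys() == coefficients.keys()
    return sum(criterion_scores[k] * coefficients[k] for k in criterion_scores)

scores = {"logP": 0.8, "tPSA": 1.0, "basic_amines": 0.5, "rot_bonds": 0.9}

equal = {k: 1.0 for k in scores}                    # Wager-style: every coefficient is 1
derived = {"logP": 1.2, "tPSA": 0.8,
           "basic_amines": 1.5, "rot_bonds": 0.5}   # hypothetical data-derived weights

print(round(weighted_score(scores, equal), 2))    # → 3.2
print(round(weighted_score(scores, derived), 2))  # → 2.96
```

The same criterion scores yield different totals under the two weighting schemes, which is precisely why the choice of coefficients (unit versus data-derived) matters when comparing the systems.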

Comparison and confirmation bias

Typically we could compare the models here to see which better predicts or correlates with CNS penetration by taking a dataset from our pipeline and seeing how it is predicted; however, we have a problem with confirmation bias. Like the atomic bomb dispersing isotopes and rendering certain archaeological dating techniques impossible for modern samples, the Wager CNS MPO system may have dispersed into product pipelines since (and possibly before, if the system was used internally) their paper. Materials in our compound deck that are CNS penetrant and also score highly in the CNS MPO may do so because materials that did not score highly were never developed in the first place.

As a result, we would need a data set that was evidently not based on or used in the generation of this CNS MPO system (or in fact any CNS or development guide), as such guides will have influenced the materials in the comparison set.

Conclusions and comments from the blogger (whose opinions are his own).

Without a doubt, MPOs are fantastic tools to simplify and somewhat humanise the abundance of data in order to give chemists information about materials to make or avoid. It is also beyond question that the original works by the Old Guard of filters (the Lipinskis, Opreas and Ghoses) have shaped how we design and prioritise materials, and likewise Travis Wager and the Pfizer team have influenced multiple groups around the world by shining a light on how to optimise materials for CNS penetration at the design stage. I believe that the Ghose et al. TEMPO, despite probably being named that way purely for the cool acronym, is a statistically and logically more rigorous piece of data reduction. Wager’s 2010 paper seems more contextual, and thoroughly details the thoughts and trends behind the MPO in a less purely statistical way.

The problem with confirmation bias is actually testament to how widely the Wager CNS MPO system and others have been adopted. It does, however, now make it quite difficult to compare systems (all we can do is try to re-zero across the MPOs to see what each translates to). It is likely we will keep an eye on both systems in our data generation and see how they track side by side.

Confirmation bias is apparent in a number of other areas of drug discovery; a prime example is Ligand Efficiency metrics, which have permeated multiple organisations’ design principles.

In the few organisations I have worked in, I have seen and used multiple CNS and other MPO tools, which means that the compounds at the end are tainted by design and are useless for comparing methods. I wonder how true this is across the larger chemical community.

So my challenge to the reader: next time you look at your enumerations/libraries and line up your synthesis priorities next to your nicely colour-coded MPO score columns (whichever tool you use, and whatever the endpoint), ask what information you are really getting from them, and whether you are perpetuating a cycle of confirmation bias (and limiting chemical space in doing so).

References

1. Arup K. Ghose, Gregory R. Ott and Robert L. Hudkins, ACS Chem. Neurosci., article in press. DOI: 10.1021/acschemneuro.6b00273.

2. (a) Travis T. Wager, Xinjin Hou, Patrick R. Verhoest and Anabella Villalobos, ACS Chem. Neurosci. (2010), 1, 435–449. DOI: 10.1021/cn100008c.

(b) Travis T. Wager, Xinjin Hou, Patrick R. Verhoest and Anabella Villalobos, ACS Chem. Neurosci. (2016), 7, 767–775. DOI: 10.1021/acschemneuro.6b00029.

Blog written by Ben Wahab

Small but mighty: Nanoparticle polymers aid drug delivery


Nanotechnology, the branch of technology that is conducted at the atomic, molecular, or supramolecular scale (or ‘nanoscale’) has captured the public imagination since the concept was first introduced over fifty years ago. Even before the relevant technology caught up and allowed nanotechnology to transition from the abstract to the practical realm, it was a subject that inspired scientists and technologists – as well as doomsday enthusiasts and science-fiction writers. With good reason – it is easy to imagine how harnessing the ability to manipulate matter at absurdly small dimensions could open the door for a litany of potential applications in medicine, electronics, energy production and consumer products. And, as just one recent paper has shown (dx.doi.org/10.1016/j.jconrel.2015.12.022), nanotechnology even has a role to play in enhancing drug delivery.

In this paper, the authors set out to combat the issue of poor drug solubility, which leads to low bioavailability and therapeutic efficacy and can greatly hinder the development of otherwise promising compounds. This has particular relevance for medications that target tumour cells, as these drugs are often inherently hydrophobic hydrocarbons and so require higher dosages to act in aqueous environments. While there have been relatively successful attempts to improve the solubility of pharmaceuticals by ‘nanosizing’ drug formulations to the 10-1000 nm range, it is difficult to prevent the particles aggregating and consequently losing some potency.

To solve this problem, the authors sought to stabilise drug compounds using branched copolymer nanoparticles (BCNs) made from biocompatible polymers, in this case polyethylene glycol and poly(N-isopropylacrylamide) (PEG-PNIPAM). They synthesised these polymers as branched spheres with varying degrees of cross-linking between the carbon chains, resulting in a structure with both hydrophobic and hydrophilic elements. These characteristics make it possible to create an emulsion when the drug compound and PEG-PNIPAM are mixed together, and subsequent freeze-drying of the emulsion spurred formation of the organic nanoparticles directly within the pores of the PEG-PNIPAM polymer. Unlike earlier nanosized drugs, these nanoparticles are protected from aggregation within the unique, highly interconnected scaffold structure, which was exquisitely visualised using scanning electron microscopy and well characterised using dynamic light scattering.

The emulsion-freeze-drying process was then applied to the poorly water-soluble drug indomethacin (IMC) to highlight the versatility of this technique. IMC was dissolved in o-xylene and emulsified with PEG-PNIPAM prior to freeze-drying, which fragmented the emulsion into nanoparticles. The researchers then showed that the IMC nanoparticles could be readily dissolved in water to form an aqueous dispersion, even after eight months of storage. Not only this, but when the procedure was repeated with two more drugs – ketoprofen and ibuprofen – the resulting nanoparticles achieved an impressive 100% yield. Such a simple and elegant approach could realistically be applied to a wide range of pharmaceuticals to overcome drug solubility issues and pave the way for new roles for nanotechnology in medical treatment.

Blog written by Chloe Koulouris

Mass spectrometry as a primary screen


Alternative methods for identifying hit material in the drug discovery process always catch my eye, so when a very recent publication using a mass spectrometry (MS) detection system as a primary assay was released, I took the time to read it through. A link to the publication is shown below:

http://jbx.sagepub.com/content/early/2016/10/07/1087057116673181.full

In this publication the authors set out to identify inhibitors of the obesity target monoacylglycerol acyltransferase, which is responsible for the acylation of monoacylglycerol (MAG) to diacylglycerol (DAG) in certain tissues. Diacylglycerol is further metabolised by diacylglycerol acyltransferases to give triacylglycerol (TAG), which is stored in tissues as an energy source. Interrupting this metabolic pathway could therefore help in diseases such as type 2 diabetes that are influenced by excess storage of triacylglycerol.

Previous assay formats for this target included a scintillation proximity assay and thin-layer chromatography, both of which have specific drawbacks. The authors took a different path and developed a mass spectrometry readout utilising the RapidFire system.

In this assay format, crude human intestinal microsomes were allowed to react with substrate in the presence of test compounds in 384-well plates. The reaction was then quenched and transferred to the RapidFire system (a solid-phase extraction system), and the samples were measured on a triple quadrupole mass spectrometer. The specific products of the reaction were identified and a % inhibition determined for each test compound.
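The final step – converting each well's product signal into a % inhibition figure – is a simple normalisation against the plate controls. A minimal sketch with made-up peak areas (the real analysis uses the RapidFire/MS product signal for each well):

```python
# Sketch of the normalisation step: a well's product signal is
# converted to % inhibition using the plate controls. The peak areas
# below are invented; the real readout is the MS product signal.

def percent_inhibition(sample, high_ctrl, low_ctrl):
    """% inhibition relative to the uninhibited (high) and fully
    inhibited or no-enzyme (low) control signals."""
    return 100.0 * (high_ctrl - sample) / (high_ctrl - low_ctrl)

# A product signal halfway between the controls gives 50% inhibition.
print(percent_inhibition(sample=5500.0, high_ctrl=10000.0, low_ctrl=1000.0))
```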


Enzyme activity monitored on a mass spectrometry system. Adachi, R., Ishii, T., Matsumoto, S., Satou, T., Sakamoto, J., and Kawamoto, T. (2016). Discovery of human intestinal MGAT inhibitors using high-throughput mass spectrometry. Journal of Biomolecular Screening (in press).

One of the advantages of the MS-based detection method is that, because crude human intestinal microsomes were being used, the production of both DAG and TAG could be monitored in a single reaction sample. This makes it possible to identify inhibitors of different enzymes from one screen.

Remarkably, the screen was carried out on 500,000 compounds at a screening concentration of 1 μM. Given the cycle time of 10 seconds per sample, this suggests about two months' work, assuming 100% uptime for the mass spectrometer. This is probably longer than a standard plate-based biochemical assay would take to screen that number of compounds, although I would expect the cycle time to improve with further technical development. The screen had average Z′ values of 0.7 and 0.83 for the two enzymes measured. Hit compounds were further characterised with concentration-response curves against different monoacylglycerol acyltransferase subtypes (MGAT2, MGAT3); the authors were able to publish the structure of one selective compound and highlighted a number of other compounds that were identified.
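The timeline quoted above is easy to verify, and the Z′ (Z-prime) statistic used to judge assay quality follows the standard definition of Zhang et al. (1999). A quick sketch, with illustrative control values rather than numbers from the paper:

```python
# Quick check of the numbers quoted above: total instrument time for
# 500,000 samples at a 10 s cycle time, plus the standard Z' statistic
# (Zhang et al., 1999). Control means/SDs are illustrative values only.

def campaign_days(n_compounds, cycle_time_s):
    """Total instrument time in days, assuming 100% uptime."""
    return n_compounds * cycle_time_s / 86400.0

def z_prime(mu_pos, sd_pos, mu_neg, sd_neg):
    """Z' = 1 - 3 * (sd_pos + sd_neg) / |mu_pos - mu_neg|.
    Values above ~0.5 indicate an excellent assay window."""
    return 1.0 - 3.0 * (sd_pos + sd_neg) / abs(mu_pos - mu_neg)

print(round(campaign_days(500_000, 10), 1))  # 57.9 days, i.e. roughly two months
print(round(z_prime(100.0, 5.0, 0.0, 5.0), 2))
```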

MS-based screening has been highlighted in other publications, although mostly at the hit-confirmation stage of a screening cascade. This is the first time I have personally seen it used as a primary screen with compound numbers of this size. The technique may open up primary screening of large compound collections against targets that could not previously be fully explored because no robust assay format was available.

I do envision more targets using mass spectrometry as a primary screen and I think this publication is a step forward in that direction.

Blog written by Gareth Williams


Gabbing about the GABAA receptor and gamma waves (or the GABA-gamma-cognition triangle)



Not long ago, I came across an abstract about basmisanil (RG1662), the then-promising, now-failed Roche compound aimed at improving cognition in Down syndrome. In 2015, Bolognani et al (ref 1) reported encouraging RG1662 data from qEEG recordings in young Down syndrome (DS) patients: statistically significant (p < 0.001) dose-dependent increases in gamma oscillation power after 10 days of dosing, together with a dose-dependent lowering of the DS index. This played very much to expectations from preclinical animal models, the results backing the tight but still, in many ways, misty connections between the GABAA receptor (the RG1662 target), gamma waves and cognition.

Let’s first meet the gamma waves: rapid electrical oscillations in the brain, cycling at frequencies above 30 Hz. They are rather small voltage fluctuations of about 10-20 µV that can be recorded in cortical and subcortical brain regions using techniques such as electroencephalography, magnetoencephalography and local field potential measurements, and they contribute no more than about 10% of the total local field signal. They are not actively propagated in the brain. Don’t be misled, however: gamma waves have been the subject of a thriving field of research for their mighty correlations with learning, memory, attention and ‘conscious’ experience.

It is well established that prominent spontaneous or induced gamma waves provide a signature of engaged neuronal networks. An example of a distinct and striking ‘‘bump’’ in the power spectrum within the gamma range, recorded during a sensory stimulus-driven state, can be seen in the figure below (reproduced from ref 2).


Figure reproduced from ref 2. Local field potentials (LFPs) for spontaneous and stimulus-driven activity. (Left) Example traces of the LFP during spontaneous activity and visually driven activity in primary visual cortex. (Right) The corresponding power spectra for the two conditions, with the frequency ranges of different rhythms indicated.
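For readers who want to play with the idea, a gamma-range ‘‘bump’’ can be reproduced on a synthetic trace by comparing band powers in a discrete Fourier spectrum. A minimal pure-Python sketch (the signal is simulated, not recorded data):

```python
# Sketch: compare alpha-band vs gamma-band power in a synthetic
# field-potential-like trace (a strong 10 Hz rhythm plus a smaller
# 40 Hz gamma oscillation plus noise). All values are simulated.
import cmath
import math
import random

fs = 1000                      # sampling rate, Hz
n = 500                        # 0.5 s of signal
random.seed(0)
signal = [math.sin(2 * math.pi * 10 * k / fs)
          + 0.3 * math.sin(2 * math.pi * 40 * k / fs)
          + 0.05 * random.gauss(0, 1)
          for k in range(n)]

def band_power(x, fs, f_lo, f_hi):
    """Sum of squared DFT magnitudes over frequency bins in [f_lo, f_hi]."""
    n = len(x)
    total = 0.0
    for k in range(1, n // 2 + 1):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            coef = sum(x[m] * cmath.exp(-2j * math.pi * k * m / n)
                       for m in range(n))
            total += abs(coef) ** 2
    return total

alpha = band_power(signal, fs, 8, 12)    # around the dominant 10 Hz rhythm
gamma = band_power(signal, fs, 30, 80)   # gamma range
print(gamma < alpha)                     # the smaller gamma bump carries less power
```

This mirrors the figure above in miniature: the gamma component appears as a modest but distinct contribution riding on much larger low-frequency power. Real analyses would use recorded LFP/EEG data and an estimator such as Welch's method.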

There is a clear correlation between gamma oscillations and a wide range of primary and high-level cognitive processes such as attention, decision making, learning and working memory. Dysfunctions in gamma activity have been observed in neurological disorders such as schizophrenia, Alzheimer’s disease, Parkinson’s disease and epilepsy. However, it is less clear, and subject to much research (and controversy), whether gamma rhythms are simply a byproduct of network activity or have an important functional role in the above-mentioned cognitive processes.

Let’s now meet GABAA (gamma-aminobutyric acid) receptor-mediated inhibition: a key ingredient of gamma oscillations.

A whole array of modelling, in vitro and in vivo animal studies suggests that gamma properties depend on GABAergic inhibition: the waves are generated by a network of interconnected fast-spiking GABAergic inhibitory interneurons and excitatory pyramidal neurons. The GABA concentration, the GABAA receptor density and positive and negative GABAA receptor modulators have all been shown to influence gamma oscillations in both animals and humans (refs 3 and 4). A consistent correlation emerges between pharmacological modulation of the receptor and the gamma rhythm signature. Christian et al. (2015, ref 4) even showed that the stronger a compound’s in vitro modulatory effect on the GABAA receptor (albeit not the cognition-linked GABAAR subtype), the greater the increase in gamma wave power in rat.


Figure reproduced from ref 4. Quantitative evaluation (in rat) across a set of 10 study compounds of the relationship between mean intrinsic modulatory capacity to enhance GABA signalling and mean spectral EEG power change produced by the compound in the gamma band.
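The compound-by-compound relationship shown in plots like this is typically summarised with a correlation coefficient. A sketch of the Pearson calculation, using ten invented (modulation, gamma-gain) pairs rather than the data of ref 4:

```python
# Sketch: Pearson correlation between a compound's in vitro GABA-A
# modulatory capacity and the gamma-band EEG power change it produces.
# The ten (x, y) pairs below are invented for illustration, not ref 4 data.
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

modulation = [10, 25, 40, 55, 70, 85, 100, 120, 140, 160]  # % GABA current enhancement
gamma_gain = [2, 5, 9, 11, 15, 18, 20, 26, 27, 33]         # % gamma power increase
print(f"r = {pearson_r(modulation, gamma_gain):.3f}")
```

An r close to 1 across a compound set, as in the figure, is what supports using gamma power as a pharmacodynamic readout of receptor engagement.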

There is a lot of literature showing that modulation of the GABAA receptor alpha5 subtype improves cognition in animal models, and a few human studies suggest the same. The results initially published by Roche neatly joined the dots of the expected GABAA – gamma waves – cognition triangle. Why, then, did RG1662 fail? Speculation is rife; eyes and ears are on Roche, who will hopefully share more in the future, as much could be learned from the failed clinical trials.

Addendum: The blog author thinks there is enough data, and enough interest, to warrant further exploration of the potential of gamma waves not only as biomarkers of behavioural states or disease conditions but also as a pharmacodynamic biomarker of drug engagement at GABAA receptors. And… was Down syndrome the wrong indication for RG1662?

References:

  1. Bolognani F, Squassante L, d’Ardhuy XL, Hernandez M-C, Knoflach F, Baldinotti I, Noeldeke J, Wandel C, Nave S and Khwaja O (2015). RG1662, a Selective GABAA α5 Receptor Negative Allosteric Modulator, Increases Gamma Power in Young Adults with Down Syndrome. Neurology vol. 84 no. 14 Supplement P6.273
  2. Jia X and Kohn A (2011). Gamma Rhythms in the Brain, PLoS Biol 9(4): e1001045. doi:10.1371/journal.pbio.1001045
  3. Kujala J, Jung J, Bouvard S, Lecaignard F, Lothe A, Bouet R, Ciumas C, Ryvlin P and Jerbi K (2015). Gamma oscillations in V1 are correlated with GABAA receptor density: A multi-modal MEG and Flumazenil-PET study. Scientific Reports 5:16347. doi: 10.1038/srep16347
  4. Christian EP, Snyder DH, Song W, Gurley DA, Smolka J, Maier DL, Ding M, Gharahdaghi F, Liu XF, Chopra M, Ribadeneira M, Chapdelaine MJ, Dudley A, Arriza JL, Maciag C, Quirk MC, Doherty JJ (2015). EEG-β/γ spectral power elevation in rat: a translatable biomarker elicited by GABA(Aα2/3)-positive allosteric modulators at nonsedating anxiolytic doses. J Neurophysiol. 113(1):116-31. doi: 10.1152/jn.00539.2013

Blog written by Dr Oana Popa


Personalised Analgesia? Why have researchers waited so long?


A recent study in Science Translational Medicine (http://stm.sciencemag.org/content/8/335/335ra56) has received much attention, both in the scientific literature and on other science blogs (http://relief.news/moving-toward-the-dream-of-precision-pain-medicine/). The paper describes a clinical study in a small group of patients with the rare condition inherited erythromelalgia, which is caused by a gain-of-function mutation in the gene SCN9A, encoding the voltage-gated sodium channel Nav1.7. The patients were treated with a selective Nav1.7 inhibitor developed by the Neusentis group. It might have been expected that all individuals would experience a large reversal of the phenotype; in fact, results were highly variable, with some individuals experiencing near-total relief from symptoms while in others the drug barely worked at all.

In parallel with the clinical study, the authors generated neuronal cells from the individuals in the study. Cells isolated from the patients’ blood samples were genetically reprogrammed into induced pluripotent stem cells (iPSCs), which were then differentiated into nerve cells that functioned in a similar way to the patients’ native neurons. The drug used in the clinical study was then tested on these nerve cells. Intriguingly, the compound showed a different response in each cell line tested, and the degree of effect corresponded closely with the response seen in the clinical trial. Even though the number of people treated is small, the correlation between the clinical effects and the results of the electrophysiological studies is striking and has rightly generated a great deal of excitement.

What will become of the results of this study remains to be seen. It would be reasonable to hope and expect that pharma companies will adopt this strategy in future clinical studies and, if successful, in marketing strategies. However, the use of iPSCs in analgesic drug discovery was proposed some years ago (http://www.nature.com/nm/journal/v16/n11/abs/nm.2230.html) but has not until now been truly tested. Indeed, the authors proposed using patient-derived cells throughout the drug discovery process.

This strategy has great potential, both for patients, where it promises better pain relief by matching the right individual with the right medicine, thereby overcoming a major issue with current analgesics, which fail to show efficacy in the majority of patients (http://www.bmj.com/content/346/bmj.f2690), and for future treatments. Virtually all experimental analgesics fail in clinical studies, and while a number of recent phase 2 successes have been highlighted (most notably Nav1.7 blockers and angiotensin II receptor antagonists), it is highly unlikely that any will reach the market using current clinical trial paradigms.

Clearly there is a long way to go: iPSC technology is currently expensive and has been slow to develop, and it may not be suitable for assessing all pain targets. But this recent study is both exciting and promising, in that it provides real hope that new analgesic medicines will reach the market in the coming years. For this to happen, those engaged in analgesic drug discovery need to abandon the existing, broken model and be prepared to embrace these new methodologies.

Blog written by Paul Beswick

Nickel-Catalyzed Cross-Coupling of Redox-Active Esters with Boronic Acids


Whilst looking for sp2-sp3 cross-coupling conditions I came across this interesting paper (Angew. Chem. Int. Ed. 2016, 55, 9676) from the Baran group, titled “Nickel-Catalyzed Cross-Coupling of Redox-Active Esters with Boronic Acids”. It expands the use of the N-hydroxyphthalimide (NHPI) esters that the group had published earlier in the year (J. Am. Chem. Soc. 2016, 138, 2174−2177), in which they coupled arylzinc reagents with alkyl esters of N-hydroxyphthalimide.

The work builds on the Baran laboratory’s discovery that N-hydroxy-tetrachlorophthalimide (TCNHPI) esters can accept an electron from a low-valent metal in a thermal, single-electron-transfer process. At moderate temperatures, thermal decarboxylative radical formation was achieved, and the resulting radical was immediately captured by a transition metal (Ni). This new cross-coupling reaction allows the facile coupling of activated alkyl carboxylic acids with boronic acids, figure 1.

[Figure 1]

Baran gives a snapshot of the optimisation of the reaction conditions but explains that they were arrived at by extensive experimentation and that some of the empirical observations are poorly understood, figure 2.

[Figure 2]

It was found that DMF was necessary as a co-solvent with 1,4-dioxane for the reaction to proceed in reasonable yield. Triethylamine was the best base tested, and an optimal 1:1 metal-to-ligand ratio was described for the NiCl2 : 4,4′-di-tert-butyl-2,2′-bipyridyl (BBBPY) system. Activated TCNHPI esters were used in place of the previously used NHPI esters, which are more electron rich and proved to be incompetent coupling partners under these reaction conditions. All of the reagents for this reaction are commercially available and reasonably priced.

There are over 30 cross-coupling examples in this paper, covering primary and secondary alkyl carboxylic acids and heteroaromatic boronic acids and showing the tolerance for various functional groups. Baran has also shown that this reaction can be telescoped with in situ generation of the activated ester, figure 3.

[Figure 3]

The experimental ease of this reaction was demonstrated by using wet solvents and a flask open to the air, whilst still achieving a 65% isolated yield. The reaction was also performed on a gram scale in 61% yield, figure 4.

[Figure 4]

A mechanism for this reaction has been proposed, based on prior mechanistic investigations of Ni-catalysed reactions of alkyl halides and on Baran’s previous studies using organozinc reagents. Initially, Ni complex I undergoes a base- and water-assisted transmetalation with an arylboronic acid to give complex II. Reduction of the activated TCNHPI ester by complex II gives intermediate III, which fragments to give an alkyl radical and a phthalimide anion. The radical and anion combine with complex IV to yield complex V, and reductive elimination forms the desired product while regenerating the catalytically active species I.

[Figure 5: proposed mechanism]

Although the scope of this reaction is very general, Baran does highlight a few examples where diminished yields are observed. These include when an ortho-methoxy group is present on the boronic acid or when the activated ester is labile to hydrolytic cleavage.

This short communication describes a very simple and mild reaction that uses cheap and readily available reagents. It is tolerant of a range of functional groups and offers an attractive route for rapidly synthesising arrays of compounds via sp2-sp3 bond formation.

Blog written by Lewis Pennicott