How do molecules get into cells…? The continuing debate


Prompted by a number of recent comments in meetings and blogs, it is worth highlighting again the strongly held and opposing opinions on this topic, even though the paper in question came out last year.

If you pick up Drug Discovery Today from 2011 and read the Kell and Dobson article, you would be forgiven for thinking that you had missed a significant development in the understanding of diffusion into cells, and might immediately start disregarding all the passive permeability data you’re generating.  Essentially, the argument in this paper, forcefully made and apparently with no clear challenge or comment during review and revision, is that passive diffusion makes no significant contribution to the entry of molecules into cells.  Instead, all drugs are actively transported by a diverse range of transporters – some known and some yet to be identified – as illustrated by the figure from the paper below:

sw5

This paper is at pains to refute an article in Drug Discovery Reviews from the previous year discussing the co-existence of passive and active transport mechanisms.  The argument is based on the observation that, when comparing the rates of drug transport in natural versus artificial membranes, there are discrepancies of over 100x, with the natural membranes demonstrating higher permeability.  The paper very appropriately points out the pitfalls of blindly using Caco-2 and/or MDCK systems and highlights numerous examples of drugs which don’t fit the widely held views.
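For readers less familiar with where these permeability numbers come from, the apparent permeability reported from a transwell assay such as Caco-2 or MDCK is usually calculated as Papp = (dQ/dt)/(A·C0), and the ratio of basolateral-to-apical versus apical-to-basolateral permeability is one crude but standard way of flagging carrier-mediated transport. A minimal sketch is below; the function names and numbers are purely illustrative and are not taken from either paper.

```python
def apparent_permeability(dq_dt, area, c0):
    """Papp (cm/s) = (dQ/dt) / (A * C0).

    dq_dt : rate of compound appearance in the receiver compartment, nmol/s
    area  : monolayer area, cm^2
    c0    : initial donor concentration, nmol/cm^3 (numerically equal to uM)
    """
    return dq_dt / (area * c0)


def efflux_ratio(papp_b_to_a, papp_a_to_b):
    """A ratio well above ~2 is often read as a hint of carrier-mediated efflux;
    a ratio near 1 is consistent with (but does not prove) passive diffusion."""
    return papp_b_to_a / papp_a_to_b


# Illustrative values only
papp_ab = apparent_permeability(dq_dt=2.0e-4, area=1.12, c0=10.0)  # ~1.8e-5 cm/s
papp_ba = apparent_permeability(dq_dt=6.0e-4, area=1.12, c0=10.0)  # ~5.4e-5 cm/s
print(f"Papp A->B = {papp_ab:.2e} cm/s, efflux ratio = {efflux_ratio(papp_ba, papp_ab):.1f}")
```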

All these points were forcefully (and convincingly) made, and some who have read this paper and its associated references have taken them very much to heart.

However, the co-existers rose up and fought back in the same journal, publishing an article the following year which is essentially a rebuttal of the Kell and Dobson claims and a reassertion of the 2010 DDR paper.  In particular, it strongly re-asserts that passive diffusion is the major mechanism for blood-brain barrier permeation of lipophilic small molecules (Figure below from the article; ref 31 = Tsinman, Pharm Res, 2011, 337).

sw6

These arguments look reasoned and balanced and, importantly, are based on considerable data. The door is still open to the presence of many other active transport processes, as well as to the approximate nature of current permeability screening assays; however, a complete disregard of passive diffusion appears premature at best.

These papers have already been well covered by the excellent critique by Derek Lowe. However, the lack of awareness across parts of our community that this debate even exists, together with the uncompromising language of both papers still circulating today (and the fun of seeing groups slogging it out over the pages of Drug Discovery Today), warranted a second airing…

FDA and New Drugs for Alzheimer’s Disease: Lowering the Bar or Circumventing a Roadblock?


A month or so ago, the Food and Drug Administration (FDA) kicked off a bit of a kerfuffle with, depending upon your viewpoint, its innovative, radical, and/or dangerous proposals to overhaul aspects of the regulatory path to approval for new drugs for Alzheimer’s disease. The recently published draft guidance (see pdf here and Webinar here) invited comment, and that is exactly what it got. On the one hand, the proposals were welcomed by clinicians and patient groups desperate to see new treatments come to market; on the other hand, there was a degree of scepticism from those who regarded them as overly favourable towards pharmaceutical companies. But let us first reflect upon why the FDA felt the need to stir the pot in the first place.

It has been noted that of the more than 100 drugs that have entered development for the treatment of Alzheimer’s disease since 1998, only three have achieved FDA approval. These were the acetylcholinesterase inhibitors rivastigmine from Novartis (Exelon, approved in 2000) and galantamine from Forest/Janssen (Reminyl, 2001), along with the NMDA receptor antagonist memantine from Merz/Forest/Lundbeck (Namenda, 2003). They joined the acetylcholinesterase inhibitor donepezil (Aricept; Eisai/Pfizer), which was approved in 1996, to comprise the quartet of FDA-approved therapies for the symptomatic treatment of Alzheimer’s disease. More recently, disease-modifying rather than symptomatic-relief approaches have attracted most attention, with the amyloid hypothesis predominating, although recent clinical trial failures of amyloid-related drugs have instigated a re-appraisal of this approach (for review see here).

We have previously discussed the state of amyloid-related therapeutics for the treatment of Alzheimer’s disease, with the focus clearly shifting to the treatment of earlier, mild forms of the disease or to prevention in susceptible populations (see here). Most notably, the Lilly antibody solanezumab showed signs of efficacy in early (mild) Alzheimer patients in the failed Expedition 1 and Expedition 2 Phase III studies, and these data have encouraged additional Phase III studies specifically targeting such patients, although it should be noted that diagnostic accuracy in this population could be an issue. Hence, it is not unreasonable to assume that, in the later stages of the disease, neuronal damage may have become too widespread for effective disease-modifying intervention, particularly as regards amyloid-based therapeutics (see here).

It was the recognition that effective treatment would most likely occur in the early stages of the disease that prompted the FDA’s proposals. At the moment, regulatory approval requires an improvement in cognition to be accompanied by a functional improvement in an activity of daily living, such as making a cup of tea. However, in a recently published article in the New England Journal of Medicine which summarises the proposals, the FDA note that, in Alzheimer patients who do not have overt dementia, meaningful functional deficits are currently difficult to measure. Accordingly, they propose reducing or dropping the requirement for a functional improvement in early forms of the disease (see Figure below). Moreover, as the chronology of Alzheimer’s disease pathology becomes better defined by biomarker and imaging studies such as the Alzheimer’s Disease Neuroimaging Initiative (ADNI; for example Jack et al, 2013), early cognitive deficits plus appropriate biomarkers may be used to address the issue of diagnostic accuracy in early Alzheimer’s disease.

ja1

What, therefore, are the implications of these proposals? Well, in a New York Times editorial (18th March, 2013), the FDA’s proposals were described as lowering the bar for Alzheimer’s disease drug approval. The term “lowering the bar” implies a reduction in scientific rigour, but this is not necessarily the case, with the FDA recognising that “innovative approaches to trial design and end-point selection are urgently needed”. Moreover, the phrase implies that the bar could be cleared if only one jumped high enough (i.e. if the drugs were good enough), but as the emphasis moves towards treating early Alzheimer’s disease, the current requirement for cognitive improvement to be coupled with a functional improvement may be seen less as a bar to be cleared and more as an insurmountable roadblock (especially if there is limited, if any, evidence of a functional deficit in early Alzheimer’s disease). The New York Times editorial further elaborated on its glass-is-half-empty viewpoint by warning that the FDA might “end up approving drugs that provide little or no clinical benefit yet cause harmful side effects in people who take the medications for extended periods.”

The viewpoint expressed by the NY Times is disputed by those in the field (see here). For example, an opposing glass-is-half-full opinion is offered by Dr. Eric Siemers, senior medical director at Eli Lilly, who commented in the March 14th New York Times article (the one that triggered the subsequent editorial) that “This is really a huge advance”; and, in an era when failures in the drug discovery process can sometimes all too readily be apportioned to the regulatory authorities, he added the seldom-heard comment “Kudos to the F.D.A.” There is no doubt that the proposed guidelines map out an innovative path to new treatments that, if adopted, could circumvent the current potential regulatory roadblock. Indeed, commenting on a recent article which quantifies the financial costs of dementia in the US (see here), the NY Times itself noted last week that “the number of people with dementia will more than double within 30 years, skyrocketing at a rate that rarely occurs with a chronic disease”. So, as the population ages and a tsunami of dementia-related financial and emotional burden looms large, if ever there was a time to reshape the Alzheimer’s disease drug development paradigm it is surely now.

 

Rotamers - assigned by a simple NMR experiment


Rotamers are conformational isomers in which interconversion by rotation around a single bond is restricted, so that an energy barrier has to be overcome in order to convert one conformer into another.  When this rotational barrier is high enough to allow isolation of the individual conformers, the isomers are termed atropisomers. Rotamers, however, are not separable, and their existence normally complicates 1H NMR interpretation. Variable-temperature (VT) NMR is the generally preferred method for studying the equilibration of rotamers: at low temperatures the spectrum reflects the frozen equilibrium and multiple sets of peaks are observed, while at higher temperatures the spectrum simplifies as the exchanging peaks are averaged out. Other methods for simplifying the spectrum include the introduction of a complexing agent and solvent switching. All of these techniques are inconvenient for the synthetic organic chemist working on a small scale. To overcome this problem, Steve Ley and co-workers have shown that chemical-exchange NMR experiments, such as 1D NOE, can be used to identify resonances corresponding to protons involved in chemical exchange, thereby distinguishing rotamers from other impurities or even stereoisomers in a non-intrusive way.
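As an aside, when VT NMR is used, the rotational barrier is commonly estimated from the coalescence temperature of a pair of exchanging resonances. A minimal sketch of that arithmetic is below, assuming equal rotamer populations and simple two-site exchange; the numbers are illustrative and are not taken from the Ley work.

```python
import math

# Physical constants (SI units)
R  = 8.314462618      # gas constant, J mol^-1 K^-1
KB = 1.380649e-23     # Boltzmann constant, J K^-1
H  = 6.62607015e-34   # Planck constant, J s

def rotation_barrier_kj_mol(delta_nu_hz, t_coalescence_k):
    """Estimate the free-energy barrier to rotation (kJ/mol) from VT NMR.

    Assumes two equally populated rotamer resonances separated by
    delta_nu_hz (Hz) in the slow-exchange limit, coalescing at
    t_coalescence_k (K):
        k_c       = pi * delta_nu / sqrt(2)            (exchange rate at coalescence)
        dG_dagger = R * Tc * ln(kB * Tc / (h * k_c))   (Eyring equation)
    """
    k_c = math.pi * delta_nu_hz / math.sqrt(2)
    dg_j_mol = R * t_coalescence_k * math.log(KB * t_coalescence_k / (H * k_c))
    return dg_j_mol / 1000.0

# Illustrative: two rotamer peaks 90 Hz apart coalescing at 330 K -> ~67 kJ/mol
print(f"Estimated barrier: {rotation_barrier_kj_mol(90.0, 330.0):.1f} kJ/mol")
```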

 

In a 1D gradient NOE experiment, a selected peak is irradiated, giving a negative peak at the site of irradiation, while protons connected to the targeted frequency region through space appear as positive peaks (i.e. in the opposite phase to the irradiated peak). On the other hand, protons undergoing chemical exchange with the irradiated protons appear as negative peaks (i.e. in the same phase as the irradiated peak). Figure 1 illustrates how such a chemical-exchange experiment can be used to distinguish sets of rotamers in the presence of diastereoisomers.

cv1

Figure 1. (a) 1H NMR spectrum of a sample containing 3 and 4. Four NMR resonances (I, II, III, and IV) are observed, corresponding to protons HA and HB in 3, 4, and their respective rotamers. (b) 1D gradient NOE spectrum after selective excitation of the resonance at 4.59 ppm (I) produces a single upfield resonance in the same phase at 4.28 ppm (III), indicating that resonances I and III belong to two rotamers of the same diastereomer (3) and that resonances due to one diastereomer do not transfer spin information via chemical exchange to the other. (c) 1D gradient NOE spectrum after selective excitation of the diastereomeric peak at 4.52 ppm (II) also produces a single peak in the same phase (IV), indicating that resonances II and IV belong to two rotamers of the same diastereomer (4). Only the 5.2−4.1 ppm region is shown for clarity.

Recently this technique has been applied by Proksch et al. to unambiguously determine the presence of four rotamers in two new depsipeptides. In this publication, 2D ROESY and NOESY experiments were also used to reach the same conclusions.

cv2

 

cv3

 

The art of fragment screening


With the cost involved in HTS, fragment screening (a cornerstone of fragment-based lead discovery, FBLD) has become over the last decade a method of choice for identifying novel hit matter. Dan Erlanson of the Practical Fragments blog and Ben Davis have just published a brief review (free of charge from Elsevier) looking at the potential experimental pitfalls. Nothing new there, but it sets out clearly, in one concise article, what they call the ‘unknown known’: issues that, more generally, do not only affect FBLD.

The FBLD approach works, but false-positive hits are a common occurrence. These can be eliminated with a good understanding of the limitations of each technique and of the composition of your fragment file.

Do you know what’s in your fragment file? The authors look at the various issues leading to false positives, from fragment stability and impurities to promiscuous frequent hitters (a minimal illustration of flagging such frequent hitters is sketched after the figure below).

mp5
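By way of illustration, substructure alert sets such as PAINS are one common, if imperfect, proxy for spotting promiscuous frequent hitters in a fragment file. The sketch below assumes RDKit is available; the fragment names and SMILES are hypothetical, and PAINS is not necessarily the filter set the review authors have in mind.

```python
from rdkit import Chem
from rdkit.Chem import FilterCatalog

# Build a PAINS filter catalogue (a common proxy for promiscuous frequent hitters)
params = FilterCatalog.FilterCatalogParams()
params.AddCatalog(FilterCatalog.FilterCatalogParams.FilterCatalogs.PAINS)
catalog = FilterCatalog.FilterCatalog(params)

# Hypothetical fragment file, as name -> SMILES
fragments = {
    "frag-001": "c1ccc2c(c1)C(=O)NC2=O",  # phthalimide-like fragment
    "frag-002": "Oc1ccccc1O",             # catechol, a well-known PAINS class
}

for name, smiles in fragments.items():
    mol = Chem.MolFromSmiles(smiles)
    match = catalog.GetFirstMatch(mol)
    if match is not None:
        print(f"{name}: flagged as a possible frequent hitter ({match.GetDescription()})")
    else:
        print(f"{name}: no structural alert")
```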

The discussion in this review then turns to the limits of each of the popular biophysical screening techniques, from instrumentation artefacts to potential data misinterpretation, both of which can lead to many false positives. ‘Knowing about possible problems can help you recognise them before investing additional resources or embarrassing yourself publicly’.