Invasive Candidiasis

Posted by Carla Rothaus • October 9th, 2015

Despite advances in antifungal therapy, the mortality associated with invasive candidiasis remains as high as 40%. A new review article summarizes recent trends and current strategies, including early treatment and the emergence of resistance to triazoles and echinocandins.

Invasive candidiasis is the most common fungal disease among hospitalized patients in the developed world. Mortality among patients with invasive candidiasis is as high as 40%, even when patients receive antifungal therapy. In addition, the global shift in favor of non-albicans Candida species is troubling, as is the emerging resistance to antifungal drugs.

Clinical Pearls

• What are the major risk factors for invasive candidiasis?

The incidence of candidemia is age-specific, with the maximum rates observed at the extremes of age. The presence of central vascular catheters, recent surgery (particularly abdominal surgery with anastomotic leakages), and the administration of broad-spectrum antibiotic therapy constitute the major risk factors for invasive candidiasis.

Figure 1. Pathogenesis of Invasive Candidiasis.

Table 1. Risk Factors for Invasive Candidiasis.

• Does Candida albicans remain the dominant pathogen?

The species distribution has changed over the past decades. Whereas Candida albicans had previously been the dominant pathogen, this species today accounts for only half the isolates detected in many surveys. C. glabrata has emerged as an important pathogen in northern Europe, the United States, and Canada, whereas C. parapsilosis is more prominent in southern Europe, Asia, and South America. Changes in species distribution may drive treatment recommendations, given the differences in susceptibility to azoles and echinocandins among these species.

Morning Report Questions

Q: What diagnostic tests are available for invasive candidiasis?

A: The armamentarium available for diagnosing invasive candidiasis includes direct detection, in which specimens of blood or tissue from other normally sterile sites are cultured, and indirect detection, in which surrogate markers and polymerase-chain-reaction (PCR) assays are used. No single test is perfect, and it is therefore necessary to perform several diagnostic tests to achieve maximal accuracy. Culture is currently the only diagnostic approach that allows subsequent susceptibility testing. The sensitivity of blood cultures is far from ideal, with a sensitivity of 21 to 71% reported in autopsy studies. Candida mannan antigens, antimannan antibodies, and beta-D-glucan are the primary surrogate markers for invasive candidiasis. The reported performance of assays for these markers varies somewhat according to case mix, the frequency of sampling, and the choice of comparator. Studies that include healthy controls or less severely ill patients may overestimate specificity, since there are many potential sources of contamination that can produce false-positive beta-D-glucan results, and these are found more frequently in patients at high risk for candidiasis. The major diagnostic benefit of beta-D-glucan is its negative predictive value for invasive candidiasis in environments in which the prevalence is low to moderate. A number of in-house PCR tests for the detection of invasive candidiasis have been evaluated, but limited validation and standardization have hindered their acceptance and implementation.
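
To make the prevalence dependence of that negative predictive value concrete, here is a minimal sketch in Python applying Bayes' rule. The 80% sensitivity and specificity are illustrative assumptions for a beta-D-glucan-like assay, not figures taken from the review.

```python
def npv(sensitivity, specificity, prevalence):
    """Negative predictive value, P(no disease | negative test), by Bayes' rule."""
    true_negative = specificity * (1 - prevalence)
    false_negative = (1 - sensitivity) * prevalence
    return true_negative / (true_negative + false_negative)

# Illustrative, assumed characteristics for a beta-D-glucan-like assay;
# these are NOT figures from the review.
sens, spec = 0.80, 0.80

for prev in (0.05, 0.15, 0.30):
    print(f"prevalence {prev:.0%}: NPV = {npv(sens, spec, prev):.1%}")
# prevalence 5%: NPV = 98.7%
# prevalence 15%: NPV = 95.8%
# prevalence 30%: NPV = 90.3%
```

A negative result is most reassuring when the pretest probability is low, which is exactly the review's point about the assay's value as a rule-out test in low-to-moderate-prevalence settings.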

Table 2. Diagnostic Tests for Invasive Candidiasis.

Q: Are echinocandins superior to azoles for the treatment of invasive candidiasis?

A: A pivotal study compared the efficacy of anidulafungin with that of fluconazole. Although the study had been designed to assess the noninferiority of anidulafungin, overall response rates were significantly higher with anidulafungin than with fluconazole (76% vs. 60%; P=0.01). The apparent superiority of anidulafungin over fluconazole was most distinct in patients infected with C. albicans (global response, 81% vs. 62%; P=0.02), even though the C. albicans isolates were almost uniformly susceptible to fluconazole. Inferior outcomes with fluconazole were also observed in patients with low scores (indicating less severe disease) on the Acute Physiology and Chronic Health Evaluation (APACHE II), suggesting that the difference was not related to severity of illness. Post hoc multivariate analyses have not indicated that the differences in outcome between the drugs were related to other confounding factors. Nevertheless, the question of whether a single noninferiority trial can establish the superiority of echinocandins over azoles for the treatment of invasive candidiasis has remained controversial, and opinions among experts in mycology are divided. More recent studies have provided reasonable support, but no formal proof, for the superiority of echinocandins as treatment for the majority of patients with invasive candidiasis. Most notable is the pooled analysis of patient-level data from seven randomized trials that assessed antifungal treatments. With 30-day all-cause mortality used as an unequivocal end point, the most important finding was that randomization to an echinocandin was associated with better survival rates and greater clinical success than treatment with a triazole or amphotericin B. The improved outcomes were most evident among patients infected with C. albicans or C. glabrata. The benefit of echinocandin therapy was observed among patients with APACHE II scores in all but the highest quartile, suggesting that the survival benefit associated with echinocandin treatment is not limited to the sickest patients.

Insomnia Disorder

Posted by Carla Rothaus • October 9th, 2015

Evaluation of insomnia should include a complete medical and psychiatric history and assessment of sleep-related behaviors and symptoms. A new Clinical Practice article covers therapies for persistent insomnia, which include cognitive behavioral therapy (considered the first-line treatment) and hypnotic medications.

Insomnia is the most common sleep disorder, with a reported prevalence of 10 to 15%, depending on the diagnostic criteria used. Reductions in perceived health and quality of life, increases in workplace injuries and absenteeism, and even fatal injuries are all associated with chronic insomnia. Difficulty maintaining sleep is the most common symptom (affecting 61% of persons with insomnia), followed by early-morning awakening (52%) and difficulty falling asleep (38%); nearly half of those with insomnia have two or more of these symptoms.

Clinical Pearls

• What medical conditions have been associated with insomnia?

Roughly 50% of those with insomnia have a psychiatric disorder, most commonly a mood disorder (e.g., major depressive disorder) or an anxiety disorder (e.g., generalized anxiety disorder or post-traumatic stress disorder). Various medical illnesses are also associated with insomnia, particularly those that cause shortness of breath, pain, nocturia, gastrointestinal disturbance, or limitations in mobility. Although roughly 80% of those with major depressive disorder have insomnia, in nearly one half of those cases, the insomnia predated the onset of the mood disorder. A meta-analysis of more than 20 studies concluded that persistent insomnia is associated with a doubling of the risk of incident major depression. Associations have also been reported between insomnia and increased risks of acute myocardial infarction and coronary heart disease, heart failure, hypertension, diabetes, and death, particularly when insomnia is accompanied by short total sleep duration (<6 hours per night).

• Do the diagnostic criteria for insomnia distinguish between insomnia with and without coexisting psychiatric conditions?

Older diagnostic systems attempted to distinguish “primary” from “secondary” insomnia on the basis of the inferred original cause of the sleeplessness. However, because causal relationships between different medical and psychiatric disorders and insomnia are often bidirectional, such conclusions are unreliable. In addition, owing to the poor reliability of insomnia subtyping based on phenotype or pathophysiology, the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders takes a purely descriptive approach that is based on the frequency and duration of symptoms, allowing a diagnosis of insomnia disorder independent of, and in addition to, any coexisting psychiatric or medical disorders. The clinician should monitor whether treatment of such coexisting disorders normalizes sleep, and if not, treat the insomnia disorder independently.

Table 1. Criteria for the Diagnosis of Insomnia Disorder.

Morning Report Questions

Q: How is insomnia evaluated?

A: The evaluation of insomnia requires assessment of nocturnal and daytime sleep-related symptoms, their duration, and their temporal association with psychological or physiological stressors. Because there are many pathways to insomnia, a full evaluation includes a complete medical and psychiatric history as well as assessment for the presence of specific sleep disorders (e.g., sleep apnea or the restless legs syndrome). Questioning the patient regarding thoughts and behaviors in the hours before bedtime, while in bed attempting to sleep, and at any nocturnal awakenings may provide insight into processes interfering with sleep. A daily sleep diary documenting bedtime, any awakenings during the night, and final wake time over a period of 2 to 4 weeks can identify excessive time in bed and irregular, phase-delayed, or phase-advanced sleep patterns. Polysomnography is not indicated in the evaluation of insomnia unless sleep apnea, periodic limb movement disorder, or an injurious parasomnia (e.g., rapid-eye-movement [REM] sleep behavior disorder) is suspected or unless usual treatment approaches fail.

Q: What are some treatment options for chronic insomnia?

A: The choice of treatment for insomnia depends on the specific insomnia symptoms, their severity and expected duration, coexisting disorders, the willingness of the patient to engage in behavioral therapies, and the vulnerability of the patient to the adverse effects of medications. In patients with chronic insomnia, appropriate treatment of coexisting medical, psychiatric, and sleep disorders that contribute to insomnia is essential for improving sleep. Nevertheless, insomnia is often persistent even with proper treatment of these coexisting disorders. Treatment for chronic insomnia includes two complementary approaches: cognitive behavioral therapy (CBT) and pharmacologic treatment. CBT addresses dysfunctional behaviors and beliefs about sleep that contribute to the perpetuation of insomnia, and it is considered the first-line therapy for all patients with insomnia, including those with coexisting conditions. CBT is traditionally delivered in either individual or group settings over six to eight meetings. Several medications, with differing mechanisms of action, are used to treat insomnia. Benzodiazepine-receptor agonists include agents with a benzodiazepine chemical structure and “nonbenzodiazepines” without this structure. There is little convincing evidence from comparative trials that these two subtypes differ from each other in clinical efficacy or side effects. Because benzodiazepine-receptor agonists vary predominantly in their half-lives, the specific choice of drug from this class is usually based on the insomnia symptom (e.g., difficulty initiating sleep vs. difficulty maintaining sleep). Regular reassessment of the benefits and risks of benzodiazepine-receptor agonists is recommended. The use of sedating antidepressants to treat insomnia takes advantage of the antihistaminergic, anticholinergic, and serotonergic and adrenergic antagonistic activity of these agents. At the low doses commonly used for insomnia, most have little antidepressant or anxiolytic effect. The orexin antagonist suvorexant, which was approved by the FDA in 2014 for the treatment of insomnia, decreased the time to sleep onset, decreased the time awake after sleep onset, and increased total sleep time in short-term randomized trials. Ramelteon is a melatonin-receptor agonist that is FDA-approved for the treatment of insomnia. Short-term studies as well as a controlled 6-month trial showed small-to-moderate benefits for time to sleep onset but no significant improvement in total sleep time or time awake after sleep onset. Meta-analyses of trials of melatonin for insomnia (at a wide range of doses and in immediate-release and controlled-release forms) showed small benefits for time to sleep onset and total sleep time. However, the quality control of over-the-counter melatonin products is unclear.

Table 2. Components of Cognitive Behavioral Therapy for Insomnia.

Table 3. Medications Commonly Used for Insomnia.

Remote Ischemic Preconditioning in Cardiac Surgery

Posted by Chana Sacks • October 7th, 2015

In a lab at Duke University in 1986, scientists conducted an experiment comparing two different ways to give a dog a heart attack.

The investigators cut off the blood flow of the circumflex artery for 40 minutes in 12 dogs. For 7 of those dogs, they first initiated a “preconditioning” protocol that consisted of four 5-minute occlusions of the vessel, with 5 minutes of reperfusion in between. Five “control” dogs underwent the circumflex occlusion without any antecedent intervention.

Ischemic preconditioning, they found, “paradoxically limited infarct size to 25% of that seen in the control group (p < .001).” A race was underway to elucidate the mechanism of this effect and to determine applications in humans.

In the years that followed, small experiments suggested some benefit from ischemic preconditioning, which could even be induced at “remote” sites – short bursts of cutting off circulation in an extremity, for example, might protect heart muscle from subsequent ischemic injury. Early studies were small, and most did not examine clinical outcomes. The exact physiological mechanisms remained a mystery.

Nearly three decades after that 12-dog experiment, two trials are now published in this week’s issue of NEJM in which investigators sought to definitively determine whether remote ischemic preconditioning improves clinical outcomes in patients undergoing cardiac surgery. Neither trial offers reason to be hopeful about this approach.

The first study was a sham-controlled trial that enrolled 1600 adults undergoing on-pump coronary artery bypass graft (CABG) surgeries at 30 centers in the United Kingdom. Participants were randomized either to a remote ischemic preconditioning arm or to a control group. In the intervention group, a blood pressure cuff on the arm was inflated to 200 mmHg for 5 minutes, then deflated for 5 minutes, repeating this cycle a total of 4 times. All participants then underwent surgery as usual, with no other part of the anesthesia or operative care standardized.

The findings were clear: there was no significant difference in the primary endpoint of cardiovascular death, myocardial infarction, coronary revascularization, or stroke at 12 months (26.5% in the preconditioning arm as compared with 27.7% in the control group, p=0.58).
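
As a back-of-the-envelope check on that null result, a two-proportion z-test roughly reproduces the reported p-value. The even 800-per-arm split below is my assumption (the post gives only the 1600-patient total), and the trial's own analysis was a time-to-event comparison, so this is an approximation only.

```python
from math import sqrt, erfc

def two_proportion_test(p1, n1, p2, n2):
    """Two-sided z-test for the difference between two independent proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, erfc(abs(z) / sqrt(2))  # erfc gives the two-sided tail area

# 27.7% vs. 26.5% primary-endpoint events; ~800 patients per arm is an assumption.
z, p = two_proportion_test(0.277, 800, 0.265, 800)
print(f"z = {z:.2f}, p = {p:.2f}")  # z = 0.54, p = 0.59 (reported: p = 0.58)
```

A 1.2-percentage-point difference at this sample size is well within chance variation, consistent with the reported result.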

The second trial, which enrolled 1400 adults undergoing elective cardiac surgery at 14 sites in Germany, showed comparable results. Using a similar method of remote ischemic preconditioning, the trial also found no difference in a primary composite endpoint of death, myocardial infarction, stroke, or acute renal failure, with follow-up until hospital discharge or a maximum of two weeks.

In an accompanying editorial, Drs. Zaugg and Lucchinetti of the Department of Anesthesiology at the University of Alberta describe several reasons that these trials might have been negative in the face of positive preliminary data: “It is likely that remote ischemic preconditioning is less effective in infarct-remodeled, diabetic and aged hearts. Also, since cardiopulmonary bypass per se, as well as hypothermia and cardioplegia, are known to be protective, perhaps further protection is impossible to achieve. Most importantly, concomitant medications, specifically anesthetics, may interfere with remote ischemic preconditioning.”

Is there any chance this still works? The editorialists don’t think so: “The conclusions from both trials are definitive,” they write. “[R]emote ischemic preconditioning is ineffective in adult patients undergoing on-pump cardiac surgery.”

Much work still remains to improve outcomes for patients undergoing complex cardiac surgery. Despite the initial hopes, remote ischemic preconditioning doesn’t seem to help. Maybe it’s time to determine new targets – time, that is, to go back to the lab.

The authors of the first study and the editorialist are available through October 16 to answer your questions on the NEJM Group Open Forum. There is also an NEJM Quick Take video summary available.

Posted by Ken Bernard • October 6th, 2015

Over the past few weeks I have experienced many firsts on the front line in the Tuba City emergency department. Among the many firsts are my first Diné words, including yá’át’ééh (yah-tah-hey), which means hello in Diné Bizaad, and, in case you forgot, boozhoo (hello) in Anishinabemowin. So far, simple greetings and polite courtesies are all I can muster, but I do look forward to learning more. It is a beautiful language, one that carries such power and meaning in its rich timbre and tone. At times it can be melodic and soft, at other times abrupt and to the point, the words sounding like the cracking of a hard walnut in your hand.

These new experiences would not be possible without a subtle mix of luck, the support of family and friends, and the scholarship program offered through the Indian Health Service (IHS). Founded in 1955, the IHS is an operational division of the United States Department of Health and Human Services. It was originally formed as a part of the War Department in the late 1800s with an ulterior motive: to stem the spread of infectious diseases to Western expansion populations and U.S. soldiers via vaccinations and quarantined care of Native people. Its structure and mission have thankfully changed significantly since then, and the IHS now provides comprehensive health care services, scholarships, and jobs to roughly 2 million patients from over 560 federally recognized tribes through a network of tribal and federal facilities, health and dental clinics, and urban health programs. There is wide variation in the services each facility is able to provide. Despite improvements in the state of health of many Native communities, numerous challenges still affect the quality of care delivered to Native people. Even with my nascent experience in the ED, I now see these plainly, and they have led to some other firsts for my career.

For example, last week was the first time I treated seven patients from the same family, all together in the same exam room. A close family contact had recently been diagnosed with meningitis, and one member of the family presented with seizures and fever. The other family members were evaluated, were thankfully asymptomatic, and received chemoprophylaxis. Despite some of the most successful vaccination programs in the country, living conditions on the reservation promote the spread of harmful diseases like invasive gram-negative meningitis. On the “rez,” the unemployment rate can be greater than 50%. Unchecked poverty and scant economic opportunity lead to a considerable amount of cohabitation, with multiple generations living under one roof, or “hogan.” Roughly 30% of homes do not have access to adequate plumbing, and the same proportion have earthen floors. Close quarters and living standards below what other Americans have come to expect create a near perfect environment for the spread of communicable disease.

I also transferred my first critically ill patient to another care facility — an act unheard of where I trained in Boston, with five Level 1 trauma centers and the best tertiary and quaternary specialty care in the world. But for Tuba City Regional Health Care on the Navajo Nation reservation in Northern Arizona, access to critical care is about 30 minutes away by helicopter or 2 hours by ambulance and can come at a cost of more than $20,000. And in a rationed health care system like the IHS, the economic impact of these transfers can delay or preclude necessary care for non-critical patients. At times it can feel like we are caught in a zero-sum situation: for some to gain, others must lose. Beyond the practice liability and cost is the burden to patients’ families. Many have already delayed care due to lack of adequate transportation, some traveling over an hour on unpaved roads. And if the patient happens to be the primary wage earner for the household, in an area where unemployment rates can be as high as 50%, the result is even more financial devastation.

Despite these obstacles, most of my initial interactions have been enlightening, educational, inspirational, and, quite frankly, touching. Like the first time I called an elderly woman “shimá” or grandmother and saw a surprised smile come over her face, melting away her apprehensions of being in the ED. Or the first pregnancy diagnosed at bedside with ultrasound, with the tears of an elated mother-to-be who had all but given up hope of having children. And finally, my interactions with my dedicated and experienced co-workers, many from the community they serve and eager to have a new physician to teach and learn from, have been welcoming and comforting to me and my family.

With that said, I have a lot to learn and a lot more experience to gain, not to mention Diné vocabulary. And I cannot wait. Challenges aside, this place has made quite a positive impression on me, and I look forward to the days, months, and years to come. This is a place for great medicine, great friends, and great potential.

Springing a Leak

Posted by Carla Rothaus • October 2nd, 2015

In a new Clinical Problem-Solving article, a 52-year-old man presented to the emergency department with general weakness and swelling in his legs. Symmetric swelling had begun 4 weeks earlier and had progressed to the point that it was difficult for him to wear shoes.

The nephrotic syndrome is most commonly caused by membranous nephropathy, focal segmental glomerulosclerosis, minimal-change disease, or membranoproliferative glomerular disease.

Clinical Pearls

• What are the diagnostic criteria for the nephrotic syndrome?

The diagnostic criteria for the nephrotic syndrome include nephrotic-range proteinuria (>3.5 g in 24 hours), hypoalbuminemia (<3 g of albumin per deciliter), peripheral edema, and hyperlipidemia.

• What are some of the causes of the nephrotic syndrome?

The nephrotic syndrome is most commonly caused by membranous nephropathy, focal segmental glomerulosclerosis, minimal-change disease, or membranoproliferative glomerular disease. All forms of the nephrotic syndrome may be associated with neoplasms, particularly lymphoma, which may cause a paraneoplastic-associated membranous nephropathy or minimal-change disease. Other secondary causes of the nephrotic syndrome include medications (e.g., nonsteroidal antiinflammatory agents); hepatitis or other chronic infections, including human immunodeficiency virus (HIV) infection; systemic lupus erythematosus; and amyloidosis.

Morning Report Questions

Q: Under what circumstances is hypothyroidism a possible complication of the nephrotic syndrome?

A: Hypothyroidism is an unusual complication of the nephrotic syndrome. The majority (99%) of T4 in the serum is bound to thyroid-binding proteins, including thyroid-binding globulin, transthyretin, and albumin. In patients with an intact hypothalamic-pituitary-thyroid axis, the thyroid gland compensates for urinary losses by increasing the production of T4. Consequently, despite heavy loss of proteins associated with the nephrotic syndrome, levels of biologically active free thyroid hormone are typically unaffected, and most patients with the nephrotic syndrome remain clinically euthyroid. However, patients who are reliant on a fixed exogenous source of T4 cannot compensate for the loss of T4, and thus, clinical hypothyroidism develops as a consequence of proteinuria. In patients who are dependent on exogenous thyroxine replacement, nephrotic-range proteinuria should prompt consideration of urinary loss of thyroid hormone.

Q: Describe some of the features of minimal-change disease.

A: Minimal-change disease is characterized clinically by nephrotic-range proteinuria and pathologically by diffuse effacement of epithelial foot processes on electron microscopy in the context of a relatively normal appearance on light microscopy. Although the disease has been reported after stem-cell transplantation, it is less common than membranous nephropathy in these patients. Glucocorticoids are the mainstay of drug therapy. Other immunomodulatory therapies, including calcineurin inhibitors, cytotoxic agents, rituximab, azathioprine, and mycophenolate mofetil, have been used in conjunction with low-dose glucocorticoids in patients who have unacceptable side effects with high-dose glucocorticoids or, more commonly, as glucocorticoid-sparing agents in relapsed disease. Adjunctive therapies for minimal-change disease typically include loop diuretics for edema, inhibitors of the renin-angiotensin-aldosterone system for hypertension and proteinuria, and 3-hydroxy-3-methylglutaryl-coenzyme A reductase inhibitors (i.e., statins) for hyperlipidemia. Angiotensin-converting-enzyme inhibitors and angiotensin II receptor blockers improve the size selectivity of the glomerular barrier and may have long-term renal protective effects. Relapses are common in minimal-change disease despite therapy, and younger patients (<40 years of age) are more likely to have a relapse than older patients. A small case series suggested that up to a quarter of patients have three or more relapses per year.

Figure 1. Renal-Biopsy Specimens.

Benznidazole for Chagas’ Cardiomyopathy

Posted by Carla Rothaus • October 2nd, 2015

The benefits of antitrypanosomal therapy for patients with Chagas’ cardiomyopathy are unclear. In this Original Article, a double-blind, placebo-controlled trial involving 2854 patients with Chagas’ cardiomyopathy found no benefit of 2 to 3 months of benznidazole therapy over the ensuing 5 years of follow-up.

Chronic Chagas’ cardiomyopathy is associated with malignant arrhythmias, conduction disturbances, heart failure, and pulmonary and systemic embolism and is associated with an annual mortality of approximately 4% among patients who are followed in outpatient clinics. The Benznidazole Evaluation for Interrupting Trypanosomiasis (BENEFIT) trial, conducted by Morillo et al., evaluated the efficacy and safety of benznidazole, as compared with placebo, in reducing adverse clinical outcomes among patients with chronic Chagas’ cardiomyopathy.

Clinical Pearls

• How common is Chagas’ cardiomyopathy?

Chagas’ disease is the third most common parasitic disease globally, after malaria and schistosomiasis. Chagas’ cardiomyopathy is the most common form of nonischemic cardiomyopathy and one of the leading causes of complications and death in Latin America. An estimated 6 million to 7 million persons are infected, and 36,800 new cases occur each year. Chagas’ cardiomyopathy develops in approximately 25% of patients infected with Trypanosoma cruzi.

• What is the role of the Trypanosoma cruzi parasite in the development of Chagas’ cardiomyopathy? 

T. cruzi causes an acute disease, which can be cured with trypanocidal treatment. However, in chronic cardiomyopathy, the role of the parasite is debated and the effect of trypanocidal treatment is unclear. In some previous studies, autoimmune mechanisms were implicated as potential causes of late cardiac injury because of the apparent absence of parasites in the cardiac inflammatory lesions on classic histologic analysis and the occurrence of autoimmune responses related to polyclonal activation, molecular self-mimicry by parasite antigens, or cryptic epitopes shared by the host and parasites. However, the identification of T. cruzi antigens in inflamed myocardium with the use of sensitive techniques, such as immunohistochemical analysis and polymerase-chain-reaction (PCR) assay, suggests that parasite persistence may be an important factor that, in conjunction with individual host factors, triggers the inflammatory process.

Morning Report Questions

Q: Does benznidazole treatment reduce clinical progression of cardiac disease among patients with established Chagas’ cardiomyopathy?

A: In the study by Morillo et al., benznidazole treatment significantly reduced the detection of circulating parasites but did not reduce cardiac clinical progression among patients with established Chagas’ cardiomyopathy. The primary study outcome in the time-to-event analysis was the first occurrence of death, resuscitated cardiac arrest, insertion of a pacemaker or an implantable cardioverter-defibrillator, sustained ventricular tachycardia, cardiac transplantation, new heart failure, stroke or transient ischemic attack, or a systemic or pulmonary thromboembolic event. The primary outcome occurred in 394 patients (27.5%) in the benznidazole group and 414 patients (29.1%) in the placebo group (unadjusted hazard ratio, 0.93; 95% confidence interval [CI], 0.81 to 1.07; P=0.31; adjusted hazard ratio, 0.92; 95% CI, 0.81 to 1.06; P=0.26). No significant between-group differences were observed in any component of the primary outcome. There was no significant difference in treatment response on the basis of individual markers of clinical severity, including New York Heart Association (NYHA) class, cardiothoracic ratio of more than 0.5, segmental or global wall-motion abnormalities, low QRS voltage, left ventricular end-diastolic diameter of more than 5.0 cm, or left ventricular ejection fraction of less than 40%, or on the basis of sex or age.
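
As a quick plausibility check, the crude risk ratio can be reconstructed from the event counts quoted above. The denominators below are back-calculated from the reported percentages and are therefore approximate, and the trial's own estimate is a hazard ratio from a Cox time-to-event model, which a simple risk ratio only approximates.

```python
# Denominators inferred from the reported event counts and percentages
# (394 events = 27.5%, 414 events = 29.1%); approximate by construction.
n_benznidazole = round(394 / 0.275)  # ~1433
n_placebo = round(414 / 0.291)       # ~1423

crude_risk_ratio = (394 / n_benznidazole) / (414 / n_placebo)
print(f"crude risk ratio = {crude_risk_ratio:.3f}")
# ~0.945 -- close to, but not the same quantity as, the reported
# unadjusted hazard ratio of 0.93 from the Cox model
```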

Figure 1. Primary Composite Outcome during 7 Years of Follow-up.

Figure 2. Primary Outcome, According to Subgroup.

Table 2. Primary Outcome and Its Components, Hospitalizations, and Deaths.

Q: What is the role of benznidazole, potential or otherwise, in the treatment of chronic Chagas’ infection?

A: According to Morillo et al., it is possible that the benefit of benznidazole may be observed in patients who are at very low risk before the appearance of cardiac damage or that the benefit may accrue with more prolonged therapy (as is the case with therapy for some other chronic infections, such as tuberculosis or leprosy), with repeated pulses of benznidazole, or with treatment at an earlier stage of the disease. These hypotheses are untested. Whether longer follow-up is needed to detect the emergence of a benefit is also a consideration but is speculative, since 60% of patients in the BENEFIT trial were followed for more than 6 years and 25% for more than 7 years, and no obvious signal of possible benefit was observed. The findings of the Morillo trial do not challenge current guidelines that recommend treatment with trypanocidal therapy in the early stages of chronic Chagas’ infection (which are based on several studies, including one that showed the benefit in preventing congenital transmission) and should not detract from the pursuit of general goals for exploring more effective or earlier treatments with new drugs or drug combinations.

How to “Nudge” Smokers to Reduce Tobacco Use?

Posted by James Yeh, M.D. M.P.H. • September 30th, 2015

Health problems due to smoking account for 6 million deaths annually, making tobacco use the leading preventable cause of death worldwide. Despite the dramatic reduction in smoking rates in the U.S. over the past 50 years, nearly 18% of adults are current smokers. Each day, 2100 youth and young adults become regular daily smokers. Because nicotine sustains tobacco use, several interventions are used to reduce nicotine dependence. Clinicians can prescribe nicotine-replacement treatments or other pharmacologic interventions for patients who are interested in quitting. On a regulatory level, the taxation of tobacco products has helped reduce smoking uptake and increase smoking cessation worldwide.

What about other forms of intervention? There is some evidence that low-nicotine-content cigarettes, containing less than 0.7 mg of nicotine per gram of tobacco (different from the so-called “light cigarettes”), can reduce nicotine dependence and increase smoking cessation.

In this week’s issue of NEJM, Donny and colleagues conducted a 6-week, multisite, double-blind clinical trial that randomized over 800 adult daily smokers (self-reported >5 cigarettes daily) with laboratory evidence of active smoking who were not interested in quitting to either placebo (their usual brand of cigarettes) or one of 6 investigational cigarettes that contained varying amounts of nicotine or tar. The primary outcome was the average number of self-reported cigarettes smoked per day during week 6 of the intervention. Additional outcomes included laboratory markers of nicotine product use and subjective assessment of withdrawal.

In the study, those assigned to the low-nicotine-content cigarettes containing 0.4, 1.3, and 2.4 mg of nicotine per gram of tobacco smoked fewer cigarettes than those assigned to the placebo or the 15.8-mg/g control cigarettes (14.9, 16.3, and 16.5 versus 22 and 21.3 cigarettes daily; p<0.001). Individuals assigned cigarettes containing 5.2 mg or less of nicotine per gram of tobacco had lower urinary nicotine equivalents than those assigned the 15.8-mg/g control cigarettes (p<0.01). There were no differences in self-reported withdrawal symptoms between those assigned the low-nicotine-content cigarettes and those assigned the placebo or control cigarettes during week 1 or week 6 of the intervention. Individuals assigned the 0.4-mg/g cigarettes also had lower nicotine-dependence scores (p=0.001).

From this 6-week study we learned that cigarettes containing 2.4 mg/g or less of nicotine helped individuals smoke up to 40% fewer cigarettes per day, and that the lowest-nicotine-content cigarettes (0.4 mg/g) reduced nicotine dependence. This suggests that if nicotine content is reduced sufficiently, smokers may smoke less and become less dependent. The findings will need to be validated over a longer follow-up period.

In a NEJM Perspective, Fiore and Baker of the Center for Tobacco Research and Intervention at the University of Wisconsin School of Medicine write that “these data support exploration of a national nicotine-reduction policy, and we recommend that additional attention be paid to low-nicotine cigarettes as a potential clinical smoking-cessation resource.” However, the success of such a policy would depend on applying and enforcing the rules across all nicotine-containing tobacco products.

What is your view on the use of such policies to “nudge” people towards healthier lifestyles choices?

How effective are smoking cessation interventions that you implement for your patients?

The authors of these studies are available through October 9th to answer your questions on the NEJM Group Open Forum.

A Man with Cardiogenic Shock

Posted by Carla Rothaus • September 25th, 2015

In a new Case Record of the Massachusetts General Hospital, a 50-year-old man with a history of cardiomyopathy and progressive muscle weakness was admitted with cardiogenic shock. Electroencephalography showed total suppression of cerebral activity; ventilator support was withdrawn, and he died. An autopsy was performed.

Myotonic dystrophy is the most common muscular dystrophy in adults and the most prevalent muscular dystrophy overall, affecting approximately 1 per 7000 to 8000 persons. It is associated with an autosomal dominant inheritance pattern.

Clinical Pearls

• What is the cause of myotonic dystrophy type 1?

Myotonic dystrophy type 1 is caused by an expansion of CTG repeats in the dystrophia myotonica protein kinase (DMPK) gene. The myotonic discharges in myotonic dystrophy are caused by dysfunction of a chloride channel protein that results from dysregulated alternative splicing of the chloride channel messenger RNA (mRNA). The expansion of CUG repeats in mutant DMPK gene mRNA causes missplicing of pre-mRNA from at least several dozen other genes besides the chloride channel. The mechanism involves the sequestration of splicing regulator proteins in the muscleblind-like (MBNL) family through the expansion of CUG repeats, resulting in formation of nuclear inclusions and loss of MBNL activity. However, except for the effects of the dysfunctional chloride channel protein, the downstream effects of most of the splice variants associated with myotonic dystrophy type 1 are unknown. A clinical phenomenon that is more characteristic of myotonic dystrophy type 1 than of myotonic dystrophy type 2 is anticipation, which is defined as the lowering of age of onset, worsening of disease severity, or both in successive generations. The biologic basis for anticipation is the intergenerational expansion of the unstable CTG repeats, which is more likely to occur in maternal transmissions than in paternal transmissions.

• What is the correlation between the number of repeats and disease severity in myotonic dystrophy type 1?

There are normally 5 to 34 CTG repeats at this locus, but patients with myotonic dystrophy type 1 have hundreds to thousands of CTG repeats. Persons with 35 to 49 CTG repeats are asymptomatic, but the number of repeats may be expanded in the next generation and cause a disease phenotype. Persons with 50 to 150 CTG repeats often have mild disease (characterized by myotonia and cataracts), an age of onset of 20 to 70 years, and a normal or moderately reduced life expectancy. Those with 150 to 1000 CTG repeats typically have moderate or severe disease (characterized by weakness, myotonia, cataracts, and multisystemic involvement), an age of onset of 12 to 30 years, and a reduced life expectancy. Those with more than 1000 CTG repeats have the most severe disease, which is congenital and characterized by hypotonia, contractures, feeding difficulties, respiratory insufficiency, and delayed motor and mental development. However, it is important to note that there is considerable variation in the correlation between the number of repeats and disease severity, and thus attempts to predict disease severity on the basis of the number of repeats may not be particularly helpful.

Morning Report Questions

Q: What are some of the clinical manifestations of myotonic dystrophy type 1?

A: Cardiac involvement is common in myotonic dystrophy type 1 and can result in asymptomatic electrocardiographic abnormalities, progressive heart block, atrial tachyarrhythmias, ventricular tachyarrhythmias, nonischemic cardiomyopathy, and sudden death. Hearing loss is an occasional complication of myotonic dystrophy type 1 that resembles presbycusis. Insulin resistance, which is common in patients with myotonic dystrophy type 1, may be related to inappropriate splicing of the insulin-receptor transcript in muscle. Muscle wasting may result from one of many splice variants acting alone or in combination with several others, or it could occur independently of splicing.

Q: Are there any effective interventions for myotonic dystrophy?

A: Although there is no current therapy for affected patients or their affected children that would alter the disease course, establishing the diagnosis is key to help anticipate, identify, and treat coexisting conditions. For example, yearly electrocardiographic examinations are recommended to identify those who may benefit from pacemaker and implantable cardioverter-defibrillator (ICD) placement. That said, it is unclear whether placement of an ICD reduces mortality since the most common cause of death associated with this disorder is respiratory failure due to weakness of the muscles of respiration. Therapeutic approaches designed to reduce the toxic effects of mutant DMPK RNA are in development.

Figure 3. Muscle Specimens Obtained at Autopsy.

The Asthma-COPD Overlap Syndrome

Posted by Carla Rothaus • September 25th, 2015

Although in textbooks asthma and chronic obstructive pulmonary disease (COPD) are viewed as distinct disorders, there is increasing awareness that many patients have features of both. A new review article covers the asthma–COPD overlap syndrome.

Approximately 1 in 12 people worldwide are affected by asthma or chronic obstructive pulmonary disease (COPD); once regarded as two distinct disease entities, these two conditions are now recognized as heterogeneous and often overlapping. The term “asthma-COPD overlap syndrome” (ACOS) has been applied when a person has clinical features of both asthma and COPD.

Clinical Pearls

• Does ACOS have a specific definition?

Even though overlaps between asthma and COPD are a clinical reality, Global Initiative for Asthma (GINA) and Global Initiative for Chronic Obstructive Lung Disease (GOLD) documents have not given a specific definition of ACOS and have stated that more evidence on “clinical phenotypes and underlying mechanisms” is needed.

Table 1. Four Examples of Patients with Obstructive Airway Disease.

• How prevalent is the overlap syndrome, and how is it treated?

According to a case definition of ACOS that has been widely promulgated, the syndrome is estimated to be present in 15 to 45% of the population with obstructive airway disease, and the prevalence increases with age. However, despite this presumed high prevalence, no double-blind, prospective studies have been conducted to provide information on how to treat these types of patients. Indeed, studies of COPD have excluded nonsmokers and patients with some bronchodilator reversibility, whereas studies of asthma have excluded smokers and patients without substantial bronchodilator reversibility. Thus, the most effective treatment of patients with ACOS remains unknown.

Morning Report Questions

Q: What are some of the features that have traditionally been associated with either asthma or COPD, but can, in fact, be present in both conditions?

A: Reversibility of airway obstruction after inhalation of a bronchodilator drug such as albuterol is a hallmark in early asthma and has long been regarded as a criterion to distinguish asthma from COPD. Reversibility of airway obstruction is frequently present in COPD as well; in two studies, reversibility was observed in up to 44% and 50% of patients with COPD. There is broad consensus that asthma typically has an eosinophilic and a Th2-driven cytokine pattern of inflammation, whereas neutrophilic inflammation dominates in COPD. Bronchial-biopsy studies, sputum studies, and exhaled-breath studies have provided evidence of substantial heterogeneity in mucosal inflammation. Patients with asthma who have severe or late-onset disease or chronic infections or who smoke may also exhibit neutrophilic inflammation and CD8 cells in the airways, both of which were once believed to be hallmarks of COPD. A Th2 inflammatory signature can also be present in COPD. Eosinophils are present in 15 to 40% of patients with stable COPD — in sputum, bronchoalveolar lavage, and lung tissue — even after careful exclusion of patients with reversibility of airway obstruction, bronchial hyperresponsiveness, atopy, or a childhood history of asthma; eosinophil activation is associated with disease severity. Eosinophil levels can also be increased in the sputum of patients with COPD exacerbations.

Figure 2. Risk Factors for Asthma and COPD and the Influence of Environment and Aging.

Q: Should ACOS be designated a disease entity?

A: According to the authors of this review article, it is premature to recommend the designation of ACOS as a disease entity in primary and specialist care. More research is needed to better characterize patients and to obtain a standardized definition of ACOS that is based on markers that best predict treatment response in individual patients. The danger of seeing ACOS as a disease entity is that the lines may be blurred between asthma and COPD, because studies addressing the patient population with ACOS specifically are lacking, which could lead to overtreatment, particularly with inhaled glucocorticoids.

Efficacy and Longer-Term Safety of the Dengue Vaccine in Endemic Regions

Posted by Rupa Kanapathipillai • September 23rd, 2015

You are sitting on the beach in Colombia, taking a weekend break from your hospital and outpatient duties, where you have seen a broad spectrum of illness resulting from dengue – from mild fever to dengue hemorrhagic fever and shock syndrome. You think to yourself: “if only there were a vaccine that could reduce the morbidity and mortality associated with dengue.” You marvel at the time and effort it takes to create a safe, efficacious vaccine, and reflect on the complex path to product licensure, mentally noting the need for a vaccine that would ideally provide long-lasting immunity to all serotypes, both in those with and without prior dengue infection. Immunity that is not long-lasting enough could potentially result in an increased risk of infection, particularly in a dengue-endemic setting.

CYD-TDV is a live attenuated chimeric vaccine built on a yellow fever virus backbone that carries the structural proteins of the dengue virus. The schedule was 3 doses administered over a 12-month period, with encouraging short-term safety and efficacy. Work by Villar et al., previously published in NEJM, showed a 67-80% reduction in dengue hospitalizations among vaccinees. Efficacy was higher in vaccinees with previous dengue infection than in those who were seronegative at baseline, and efficacy varied across the 4 dengue serotypes.

The recently published article by Hadinegoro et al. in NEJM summarizes the efficacy and longer-term safety data for the CYD-TDV dengue vaccine. Some 35,000 children between 2 and 16 years of age were enrolled in 3 studies – two phase III trials, CYD14 and CYD15, and a phase IIb trial, CYD23/57. A safety endpoint, the incidence of hospitalization with new dengue infection in years 3-6 after vaccination, was assessed in the 3 studies. During year 3 in the CYD14, CYD15, and CYD57 trials, the risk of hospitalization for dengue was lower in the vaccine group than among controls for participants ≥9 years of age but higher for those <9 years of age. Pooled relative risks of hospitalization for dengue were 0.84 among all participants, 1.58 among those <9 years, and 0.50 among those ≥9 years.

As described in the accompanying editorial by Dr. Cameron Simmons, vaccination appears to be associated with an elevated risk of hospitalization in vaccinated children <9 years of age, most markedly in those 2-5 years of age, likely following natural infection in the 3rd year after vaccination. It is hypothesized that vaccination of some young children may elicit only transient immunity, which subsequently wanes and predisposes them to more serious infections that result in hospitalization. Because surveillance was hospital-based only, it remains unknown whether the <9 years subgroup also experiences a higher incidence of symptomatic infection outside the hospital setting. Notably, vaccination did not increase the frequency of severe, life-threatening complications.

Vaccine efficacy was assessed using pooled data from the first 25 months of CYD14 and CYD15. Children between 9 and 16 years of age continue to benefit, likely owing to vaccine-induced pan-serotype immunity in the many recipients who had previously been naturally infected; the benefits are far less clear in children <9 years of age. Pooled efficacy against symptomatic dengue during the first 25 months was 60.3% for all participants, 44.6% for those <9 years, and 65.6% for those ≥9 years. Longer-term efficacy will continue to be assessed, informing the need for booster doses.
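
Vaccine efficacy is conventionally reported as one minus a relative risk (or rate ratio), so the efficacy figures here interconvert directly with the hospitalization relative risks quoted earlier. A minimal sketch with the pooled numbers from this post:

```python
def ve_from_rr(rr):
    """Vaccine efficacy under the usual convention VE = 1 - relative risk."""
    return 1 - rr

# Year-3 hospitalization relative risks quoted above.
for group, rr in [("all participants", 0.84), ("<9 years", 1.58), (">=9 years", 0.50)]:
    print(f"{group}: RR {rr:.2f} -> VE {ve_from_rr(rr):+.0%}")
# all participants: RR 0.84 -> VE +16%
# <9 years: RR 1.58 -> VE -58%  (a negative VE means excess risk)
# >=9 years: RR 0.50 -> VE +50%

# Conversely, the pooled 60.3% efficacy against symptomatic dengue
# corresponds to a relative risk of about 1 - 0.603 = 0.397 in vaccinees.
```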

Deputy Editor Lindsey Baden, MD, comments: “There is a tremendous need for a vaccine to prevent dengue-associated illness, especially severe disease. A major challenge to developing such a vaccine has been the concern for disease enhancement associated with serotype-specific partial immunity. Longer-term studies of dengue vaccine safety, such as this one, are critical for our understanding of vaccine safety and for the development of a successful vaccine.”

You slap another Aedes mosquito away from your leg, resigned to the idea that while certainly closer, programmatic vaccination with CYD-TDV may not be available by the end of your weekend holiday, particularly for those aged <9 years.