Remote Ischemic Preconditioning in Cardiac Surgery

Posted by Chana Sacks • October 7th, 2015

In a lab at Duke University in 1986, scientists conducted an experiment comparing two different ways to give a dog a heart attack.

The investigators cut off the blood flow of the circumflex artery for 40 minutes in 12 dogs. For 7 of those dogs, they first initiated a “preconditioning” protocol that consisted of four 5-minute occlusions of the vessel, with 5 minutes of reperfusion in between. Five “control” dogs underwent the circumflex occlusion without any antecedent intervention.

Ischemic preconditioning, they found, “paradoxically limited infarct size to 25% of that seen in the control group (p < .001).” A race was underway to elucidate the mechanism of this effect and to determine applications in humans.

In the years that followed, small experiments suggested some benefit from ischemic preconditioning, which could even be induced at “remote” sites: brief interruptions of blood flow in an extremity, for example, might protect heart muscle from subsequent ischemic injury. Early studies were small, and most did not examine clinical outcomes. The exact physiological mechanisms remained a mystery.

Nearly three decades after that 12-dog experiment, two trials published in this week’s issue of NEJM sought to determine definitively whether remote ischemic preconditioning improves clinical outcomes in patients undergoing cardiac surgery. Neither trial offers reason to be hopeful about this approach.

The first study was a sham-controlled trial that enrolled 1600 adults undergoing on-pump coronary artery bypass graft (CABG) surgeries at 30 centers in the United Kingdom. Participants were randomized either to a remote ischemic preconditioning arm or to a control group. In the intervention group, a blood pressure cuff on the arm was inflated to 200 mmHg for 5 minutes, then deflated for 5 minutes, repeating this cycle a total of 4 times. All participants then underwent surgery as usual, with no other part of the anesthesia or operative care standardized.
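To make the intervention concrete, the cuff protocol can be written out as a simple schedule. This is a minimal sketch that encodes only the timing and pressure described above; the names and structure are illustrative, not taken from the study protocol:

```python
# Illustrative sketch of the preconditioning protocol described above:
# 4 cycles of 5-minute cuff inflation to 200 mmHg, each followed by
# 5 minutes of deflation (40 minutes in total). Only the timing and
# pressure come from the post; all names here are illustrative.

CYCLES = 4
INFLATION_MMHG = 200
PHASE_MINUTES = 5

def preconditioning_schedule():
    """Yield (elapsed_minutes, phase_label, cuff_pressure_mmhg) steps."""
    elapsed = 0
    for cycle in range(1, CYCLES + 1):
        yield elapsed, f"cycle {cycle}: inflate", INFLATION_MMHG
        elapsed += PHASE_MINUTES
        yield elapsed, f"cycle {cycle}: deflate", 0
        elapsed += PHASE_MINUTES

for minute, phase, pressure in preconditioning_schedule():
    print(f"t = {minute:2d} min | {phase:17} | {pressure:3d} mmHg")
```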

The findings were clear: there was no difference in the primary endpoint of cardiovascular death, myocardial infarction, coronary revascularization, or stroke at 12 months (26.5% in the preconditioning arm as compared with 27.7% in the control group, p=0.58).
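The published p-value comes from the trial’s own analysis, but a rough two-proportion check shows why a 26.5% versus 27.7% difference is nowhere near statistical significance. The sketch below assumes roughly 800 patients per arm (the post reports only the 1600 total) and is an illustration, not the trial’s actual statistical method:

```python
import math

def two_proportion_z_test(p1, n1, p2, n2):
    """Two-sided two-proportion z-test using the normal approximation."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF,
    # Phi(x) = (1 + erf(x / sqrt(2))) / 2
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Event rates as reported in the post; ~800 patients per arm is an assumption.
z, p = two_proportion_z_test(0.265, 800, 0.277, 800)
print(f"z = {z:.2f}, two-sided p = {p:.2f}")  # z = -0.56, p = 0.57
```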

The second trial, which enrolled 1400 adults undergoing elective cardiac surgery at 14 sites in Germany, showed comparable results. Using a similar method of remote ischemic preconditioning, the investigators found no difference in a primary composite endpoint of death, myocardial infarction, stroke, or acute renal failure, with follow-up until hospital discharge or a maximum of two weeks.

In an accompanying editorial, Drs. Zaugg and Lucchinetti of the Department of Anesthesiology at the University of Alberta describe several reasons that these trials might have been negative in the face of positive preliminary data: “It is likely that remote ischemic preconditioning is less effective in infarct-remodeled, diabetic and aged hearts. Also, since cardiopulmonary bypass per se, as well as hypothermia and cardioplegia, are known to be protective, perhaps further protection is impossible to achieve. Most importantly, concomitant medications, specifically anesthetics, may interfere with remote ischemic preconditioning.”

Is there any chance this still works? The editorialists don’t think so: “The conclusions from both trials are definitive,” they write. “[R]emote ischemic preconditioning is ineffective in adult patients undergoing on-pump cardiac surgery.”

Much work still remains to improve outcomes for patients undergoing complex cardiac surgery. Despite the initial hopes, remote ischemic preconditioning doesn’t seem to help. Maybe it’s time to determine new targets – time, that is, to go back to the lab.

The authors of the first study and the editorialists are available through October 16 to answer your questions on the NEJM Group Open Forum. There is also an NEJM Quick Take video summary available.

Posted by Ken Bernard • October 6th, 2015


Credit: Indian Health Service

Over the past few weeks I have experienced many firsts on the frontline in the Tuba City emergency department, among them my first words in Diné Bizaad, including yá’át’ééh (yah-tah-hey), which means hello, and, in case you forgot, boozhoo (hello) in Anishinabemowin. So far simple greetings and polite courtesies are all I can muster, but I look forward to learning more. It is a beautiful language, one that carries power and meaning in its rich timbre and tone. At times it can be melodic and soft, at other times abrupt and to the point, the words sounding like the cracking of a hard walnut in your hand.

These new experiences would not be possible without a subtle mix of luck, the support of family and friends, and the scholarship program offered through the Indian Health Service (IHS). Founded in 1955, the IHS is an operational division of the United States Department of Health and Human Services. It was originally formed as part of the War Department in the late 1800s with an ulterior motive: to stem the spread of infectious diseases to U.S. soldiers and westward-expanding settlers via vaccinations and the quarantined care of Native people. Its structure and mission have thankfully changed significantly since then, and the IHS now provides comprehensive health care services, scholarships, and jobs to roughly 2 million patients from over 560 federally recognized tribes through a network of tribal and federal facilities, health and dental clinics, and urban health programs. There is wide variation in the services each facility is able to provide. Despite improvements in the state of health of many Native communities, many challenges still affect the quality of care delivered to Native people. Even with my nascent experience in the ED, I now see these challenges plainly, and they have led to some other firsts for my career.

For example, last week was the first time I treated seven patients from the same family, all together in the same exam room. A close family contact had recently been diagnosed with meningitis, and one member of the family now presented with seizures and fever. The other family members were evaluated, were thankfully asymptomatic, and received chemoprophylaxis. Despite some of the most successful vaccination programs in the country, living conditions on the reservation promote the spread of harmful diseases like invasive, gram-negative meningitis. On the “rez,” the unemployment rate can be greater than 50%. Unchecked poverty and scant economic opportunity lead to a considerable amount of cohabitation, with multiple generations living under one roof or “hogan.” Roughly 30% of homes do not have access to adequate plumbing, and the same proportion have earthen floors. Close quarters and living standards below what other Americans have come to expect create a near perfect environment for the spread of communicable disease.

I also transferred my first critically ill patient to another care facility — an act unheard of where I trained in Boston, with five Level 1 trauma centers and the best tertiary and quaternary specialty care in the world. But for Tuba City Regional Health Care, on the Navajo Nation reservation in Northern Arizona, the nearest critical care is about 30 minutes away by helicopter or 2 hours by ambulance, and a transfer can cost more than $20,000. And in a rationed health care system like the IHS, the economic impact of these transfers can delay or preclude necessary care for non-critical patients. At times it can feel like we are caught in a zero-sum situation: for some to gain, others must lose. Beyond the practice liability and cost is the burden to patients’ families. Many have already delayed care for lack of adequate transportation, some traveling over an hour on unpaved roads. And if the patient happens to be the primary wage earner for the household, in an area where unemployment can reach 50%, the result is even more financial devastation.

Despite these obstacles, most of my initial interactions have been enlightening, educational, inspirational, and, quite frankly, touching. Like the first time I called an elderly woman “shimá” or grandmother and saw a surprised smile come over her face, melting away her apprehensions about being in the ED. Or the first pregnancy diagnosed at the bedside with ultrasound, with the tears of an elated mother-to-be who had all but given up hope of having children. And finally, my interactions with my dedicated and experienced co-workers, many of whom come from the community they serve and are eager to have a new physician to teach and learn from, have been welcoming and comforting to me and my family.

With that said, I have a lot to learn and a lot more experience to gain, not to mention Diné vocabulary. And I cannot wait. Challenges aside, this place has made quite a positive impression on me and I look forward to the days, months and years to come.  This is a place for great medicine, great friends, and great potential.

Springing a Leak

Posted by Carla Rothaus • October 2nd, 2015

In a new Clinical Problem-Solving article, a 52-year-old man presented to the emergency department with general weakness and swelling in his legs. Symmetric swelling had begun 4 weeks earlier and had progressed to the point that it was difficult for him to wear shoes.

The nephrotic syndrome is most commonly caused by membranous nephropathy, focal segmental glomerulosclerosis, minimal-change disease, or membranoproliferative glomerular disease.

Clinical Pearls

• What are the diagnostic criteria for the nephrotic syndrome?

The diagnostic criteria for the nephrotic syndrome include nephrotic-range proteinuria (>3.5 g in 24 hours), hypoalbuminemia (<3 g of albumin per deciliter), peripheral edema, and hyperlipidemia.
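Condensed into a checklist, the criteria look like this. A minimal illustrative sketch, with thresholds taken from the text; the function and parameter names are my own:

```python
def meets_nephrotic_criteria(proteinuria_g_per_24h, serum_albumin_g_per_dl,
                             peripheral_edema, hyperlipidemia):
    """Return True if all four criteria quoted above are met.

    Thresholds are from the text: proteinuria >3.5 g per 24 hours and
    serum albumin <3 g per deciliter, plus peripheral edema and
    hyperlipidemia. Names are illustrative.
    """
    return (proteinuria_g_per_24h > 3.5
            and serum_albumin_g_per_dl < 3.0
            and peripheral_edema
            and hyperlipidemia)

# Example: heavy proteinuria, hypoalbuminemia, edema, and hyperlipidemia
print(meets_nephrotic_criteria(6.2, 2.1, True, True))  # True
```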

• What are some of the causes of the nephrotic syndrome?

The nephrotic syndrome is most commonly caused by membranous nephropathy, focal segmental glomerulosclerosis, minimal-change disease, or membranoproliferative glomerular disease. All forms of the nephrotic syndrome may be associated with neoplasms, particularly lymphoma, which may cause a paraneoplastic-associated membranous nephropathy or minimal-change disease. Other secondary causes of the nephrotic syndrome include medications (e.g., nonsteroidal antiinflammatory agents); hepatitis or other chronic infections, including human immunodeficiency virus (HIV) infection; systemic lupus erythematosus; and amyloidosis.

Morning Report Questions

Q: Under what circumstances is hypothyroidism a possible complication of the nephrotic syndrome?

A: Hypothyroidism is an unusual complication of the nephrotic syndrome. The majority (99%) of thyroxine (T4) in the serum is bound to thyroid-binding proteins, including thyroid-binding globulin, transthyretin, and albumin. In patients with an intact hypothalamic-pituitary-thyroid axis, the thyroid gland compensates for urinary losses by increasing the production of T4. Consequently, despite heavy loss of proteins associated with the nephrotic syndrome, levels of biologically active free thyroid hormone are typically unaffected, and most patients with the nephrotic syndrome remain clinically euthyroid. However, patients who rely on a fixed exogenous source of T4 cannot compensate for the loss of T4, and thus clinical hypothyroidism develops as a consequence of proteinuria. In patients who are dependent on exogenous thyroxine replacement, nephrotic-range proteinuria should prompt consideration of urinary loss of thyroid hormone.

Q: Describe some of the features of minimal-change disease.

A: Minimal-change disease is characterized clinically by nephrotic-range proteinuria and pathologically by diffuse effacement of epithelial foot processes on electron microscopy in the context of a relatively normal appearance on light microscopy. Although the disease has been reported after stem-cell transplantation, it is less common than membranous nephropathy in these patients. Glucocorticoids are the mainstay of drug therapy. Other immunomodulatory therapies, including calcineurin inhibitors, cytotoxic agents, rituximab, azathioprine, and mycophenolate mofetil, have been used in conjunction with low-dose glucocorticoids in patients who have unacceptable side effects with high-dose glucocorticoids or, more commonly, as glucocorticoid-sparing agents in relapsed disease. Adjunctive therapies for minimal-change disease typically include loop diuretics for edema, inhibitors of the renin-angiotensin-aldosterone system for hypertension and proteinuria, and 3-hydroxy-3-methylglutaryl-coenzyme A reductase inhibitors (i.e., statins) for hyperlipidemia. Angiotensin-converting-enzyme inhibitors and angiotensin II receptor blockers improve the size selectivity of the glomerular barrier and may have long-term renal protective effects. Relapses are common in minimal-change disease despite therapy, and younger patients (<40 years of age) are more likely to have a relapse than older patients. A small case series suggested that up to a quarter of patients have three or more relapses per year.

Figure 1. Renal-Biopsy Specimens.

Benznidazole for Chagas’ Cardiomyopathy

Posted by Carla Rothaus • October 2nd, 2015

The benefits of antitrypanosomal therapy for patients with Chagas’ cardiomyopathy are unclear. In this Original Article, a double-blind, placebo-controlled trial involving 2854 patients with Chagas’ cardiomyopathy showed no benefit from 2-3 months of benznidazole therapy over 5 years of follow-up.

Chronic Chagas’ cardiomyopathy is associated with malignant arrhythmias, conduction disturbances, heart failure, and pulmonary and systemic embolism and is associated with an annual mortality of approximately 4% among patients who are followed in outpatient clinics. The Benznidazole Evaluation for Interrupting Trypanosomiasis (BENEFIT) trial, conducted by Morillo et al., evaluated the efficacy and safety of benznidazole, as compared with placebo, in reducing adverse clinical outcomes among patients with chronic Chagas’ cardiomyopathy.

Clinical Pearls

• How common is Chagas’ cardiomyopathy?

Chagas’ disease is the third most common parasitic disease globally, after malaria and schistosomiasis. Chagas’ cardiomyopathy is the most common form of nonischemic cardiomyopathy and one of the leading causes of complications and death in Latin America. An estimated 6 million to 7 million persons are infected, and 36,800 new cases occur each year. Chagas’ cardiomyopathy develops in approximately 25% of patients infected with Trypanosoma cruzi.

• What is the role of the Trypanosoma cruzi parasite in the development of Chagas’ cardiomyopathy? 

T. cruzi causes an acute disease, which can be cured with trypanocidal treatment. However, in chronic cardiomyopathy, the role of the parasite is debated and the effect of trypanocidal treatment is unclear. In some previous studies, autoimmune mechanisms were implicated as potential causes of late cardiac injury because of the apparent absence of parasites in the cardiac inflammatory lesions on classic histologic analysis and the occurrence of autoimmune responses related to polyclonal activation, molecular self-mimicry by parasite antigens, or cryptic epitopes shared by the host and parasites. However, the identification of T. cruzi antigens in inflamed myocardium with the use of sensitive techniques, such as immunohistochemical analysis and polymerase-chain-reaction (PCR) assay, suggests that parasite persistence may be an important factor that, in conjunction with individual host factors, triggers the inflammatory process.

Morning Report Questions

Q: Does benznidazole treatment reduce clinical progression of cardiac disease among patients with established Chagas’ cardiomyopathy?

A: In the study by Morillo et al., benznidazole treatment significantly reduced the detection of circulating parasites but did not reduce cardiac clinical progression among patients with established Chagas’ cardiomyopathy. The primary study outcome in the time-to-event analysis was the first occurrence of death, resuscitated cardiac arrest, insertion of a pacemaker or an implantable cardioverter-defibrillator, sustained ventricular tachycardia, cardiac transplantation, new heart failure, stroke or transient ischemic attack, or a systemic or pulmonary thromboembolic event. The primary outcome occurred in 394 patients (27.5%) in the benznidazole group and 414 patients (29.1%) in the placebo group (unadjusted hazard ratio, 0.93; 95% confidence interval [CI], 0.81 to 1.07; P=0.31; adjusted hazard ratio, 0.92; 95% CI, 0.81 to 1.06; P=0.26). No significant between-group differences were observed in any component of the primary outcome. There was no significant difference in treatment response on the basis of individual markers of clinical severity, including New York Heart Association (NYHA) class, cardiothoracic ratio of more than 0.5, segmental or global wall-motion abnormalities, low QRS voltage, left ventricular end-diastolic diameter of more than 5.0 cm, or left ventricular ejection fraction of less than 40%, or on the basis of sex or age.
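The published estimate is a hazard ratio from a time-to-event analysis, but a reader can approximate it from the quoted counts as a simple unadjusted relative risk. In the sketch below, the denominators (roughly 1431 and 1423 per group) are back-calculated from the reported percentages and are an assumption:

```python
import math

def relative_risk_ci(events_1, n_1, events_2, n_2, z_crit=1.96):
    """Unadjusted relative risk with a 95% CI via the log-RR method."""
    rr = (events_1 / n_1) / (events_2 / n_2)
    se_log_rr = math.sqrt(1/events_1 - 1/n_1 + 1/events_2 - 1/n_2)
    lower = math.exp(math.log(rr) - z_crit * se_log_rr)
    upper = math.exp(math.log(rr) + z_crit * se_log_rr)
    return rr, lower, upper

# 394 events (27.5%) vs. 414 events (29.1%); denominators are back-calculated
# from those percentages and therefore assumed, not taken from the post.
rr, lo, hi = relative_risk_ci(394, 1431, 414, 1423)
print(f"RR = {rr:.2f} (95% CI, {lo:.2f} to {hi:.2f})")
# RR = 0.95 (95% CI, 0.84 to 1.06), close to the reported hazard ratio
```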

Figure 1. Primary Composite Outcome during 7 Years of Follow-up.

Figure 2. Primary Outcome, According to Subgroup.

Table 2. Primary Outcome and Its Components, Hospitalizations, and Deaths.

Q: What is the role of benznidazole, potential or otherwise, in the treatment of chronic Chagas’ infection?

A: According to Morillo et al., it is possible that the benefit of benznidazole may be observed in patients who are at very low risk before the appearance of cardiac damage or that the benefit may accrue with more prolonged therapy (as is the case with therapy for some other chronic infections, such as tuberculosis or leprosy), with repeated pulses of benznidazole, or with treatment at an earlier stage of the disease. These hypotheses are untested. Whether longer follow-up is needed to detect the emergence of a benefit is also a consideration but is speculative, since 60% of patients in the BENEFIT trial were followed for more than 6 years and 25% for more than 7 years, and no obvious signal of possible benefit was observed. The findings of the Morillo trial do not challenge current guidelines that recommend treatment with trypanocidal therapy in the early stages of chronic Chagas’ infection (which are based on several studies, including one that showed the benefit in preventing congenital transmission) and should not detract from the pursuit of general goals for exploring more effective or earlier treatments with new drugs or drug combinations.

How to “Nudge” Smokers to Reduce Tobacco Use?

Posted by James Yeh, M.D., M.P.H. • September 30th, 2015

Health problems due to smoking account for 6 million deaths annually, making tobacco use a leading cause of preventable death worldwide. Despite the dramatic reduction in smoking rates over the past 50 years in the U.S., nearly 18% of adults are current smokers, and each day 2100 youth and young adults become regular daily smokers. Because nicotine sustains tobacco use, several interventions are used to reduce nicotine dependence. Clinicians can prescribe nicotine replacement treatments or other pharmacologic interventions for patients who are interested in quitting. On a regulatory level, the taxing of tobacco products has helped reduce smoking uptake and increase smoking cessation worldwide.

What about other forms of intervention? There is some evidence that low-nicotine-content cigarettes, containing less than 0.7 mg of nicotine per gram of tobacco (different from the so-called “light” cigarettes), can reduce nicotine dependence and increase smoking cessation.

In this week’s issue of NEJM, Donny and colleagues report a 6-week, multisite, double-blind clinical trial that randomized over 800 adult daily smokers (self-reported >5 cigarettes daily, with laboratory evidence of active smoking) who were not interested in quitting either to their usual brand of cigarettes or to one of 6 investigational cigarettes containing varying amounts of nicotine or tar. The primary outcome was the average number of self-reported cigarettes smoked per day during week 6 of the intervention. Additional outcomes included laboratory markers of nicotine product use and subjective assessment of withdrawal.

In the study, participants assigned to low-nicotine-content cigarettes containing 0.4, 1.3, or 2.4 mg of nicotine per gram of tobacco smoked fewer cigarettes per day than those assigned their usual brand or the 15.8-mg/g control cigarettes (14.9, 16.3, and 16.5 versus 22 and 21.3 cigarettes daily; p<0.001). Participants assigned cigarettes containing 5.2 mg or less of nicotine per gram of tobacco had lower urinary nicotine equivalents than those assigned the 15.8-mg/g control cigarettes (p<0.01). There were no differences in self-reported withdrawal symptoms between those assigned the low-nicotine-content cigarettes and those assigned their usual brand or the control cigarettes during week 1 or week 6 of the intervention. Participants assigned the 0.4-mg/g cigarettes also had lower nicotine-dependence scores (p=0.001).

From this 6-week study we learned that cigarettes containing 2.4 mg/g or less of nicotine led individuals to smoke roughly 25 to 32% fewer cigarettes per day (16.5 to 14.9 versus 22 in the usual-brand group), and the lowest-nicotine-content cigarettes (0.4 mg/g) also reduced nicotine dependence. This suggests that if nicotine content is reduced adequately, smokers may smoke less and be less dependent. These findings would need to be validated over a longer follow-up period.

In a NEJM Perspective, Fiore and Baker of the Center for Tobacco Research and Intervention at the University of Wisconsin School of Medicine write that “these data support exploration of a national nicotine-reduction policy, and we recommend that additional attention be paid to low-nicotine cigarettes as a potential clinical smoking-cessation resource.” However, the success of such a policy would depend on applying and enforcing the rules across all nicotine-containing tobacco products.

What is your view on the use of such policies to “nudge” people toward healthier lifestyle choices?

How effective are smoking cessation interventions that you implement for your patients?

The authors of these studies are available through October 9th to answer your questions on the NEJM Group Open Forum.

A Man with Cardiogenic Shock

Posted by Carla Rothaus • September 25th, 2015

In a new Case Record of the Massachusetts General Hospital, a 50-year-old man with a history of cardiomyopathy and progressive muscle weakness was admitted with cardiogenic shock. Electroencephalography showed total suppression of cerebral activity; ventilator support was withdrawn, and he died. An autopsy was performed.

Myotonic dystrophy is the most common muscular dystrophy in adults and the most prevalent overall, with a prevalence of approximately 1 per 7000 to 8000 persons. It is associated with an autosomal dominant inheritance pattern.

Clinical Pearls

• What is the cause of myotonic dystrophy type 1?

Myotonic dystrophy type 1 is caused by an expansion of CTG repeats in the dystrophia myotonica protein kinase (DMPK) gene. The myotonic discharges in myotonic dystrophy are caused by dysfunction of a chloride channel protein that results from dysregulated alternative splicing of the chloride channel messenger RNA (mRNA). The expansion of CUG repeats in mutant DMPK gene mRNA causes missplicing of pre-mRNA from at least several dozen other genes besides the chloride channel. The mechanism involves the sequestration of splicing regulator proteins in the muscleblind-like (MBNL) family through the expansion of CUG repeats, resulting in formation of nuclear inclusions and loss of MBNL activity. However, except for the effects of the dysfunctional chloride channel protein, the downstream effects of most of the splice variants associated with myotonic dystrophy type 1 are unknown. A clinical phenomenon that is more characteristic of myotonic dystrophy type 1 than of myotonic dystrophy type 2 is anticipation, which is defined as the lowering of age of onset, worsening of disease severity, or both in successive generations. The biologic basis for anticipation is the intergenerational expansion of the unstable CTG repeats, which is more likely to occur in maternal transmissions than in paternal transmissions.

• What is the correlation between the number of repeats and disease severity in myotonic dystrophy type 1?

There are normally 5 to 34 CTG repeats at this locus, but patients with myotonic dystrophy type 1 have hundreds to thousands of CTG repeats. Persons with 35 to 49 CTG repeats are asymptomatic, but the number of repeats may be expanded in the next generation and cause a disease phenotype. Persons with 50 to 150 CTG repeats often have mild disease (characterized by myotonia and cataracts), an age of onset of 20 to 70 years, and a normal or moderately reduced life expectancy. Those with 150 to 1000 CTG repeats typically have moderate or severe disease (characterized by weakness, myotonia, cataracts, and multisystemic involvement), an age of onset of 12 to 30 years, and a reduced life expectancy. Those with more than 1000 CTG repeats have the most severe disease, which is congenital and characterized by hypotonia, contractures, feeding difficulties, respiratory insufficiency, and delayed motor and mental development. However, it is important to note that there is considerable variation in the correlation between the number of repeats and disease severity, and thus attempts to predict disease severity on the basis of the number of repeats may not be particularly helpful.
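For reference, the repeat-count ranges above can be condensed into a simple lookup. This sketch reflects only the categories quoted in the text; the ranges overlap at their boundaries, and, as noted, they predict severity imperfectly:

```python
def dm1_phenotype(ctg_repeats):
    """Map a CTG repeat count to the categories described in the text.

    The source ranges overlap at 150 repeats; this sketch assigns boundary
    values to the milder category. Repeat counts correlate loosely with
    severity, so this is illustrative only.
    """
    if ctg_repeats <= 34:
        return "normal allele"
    if ctg_repeats <= 49:
        return "asymptomatic; repeats may expand in the next generation"
    if ctg_repeats <= 150:
        return "mild disease (myotonia, cataracts); onset 20 to 70 years"
    if ctg_repeats <= 1000:
        return "moderate or severe multisystemic disease; onset 12 to 30 years"
    return "congenital disease (hypotonia, contractures, developmental delay)"

print(dm1_phenotype(800))  # moderate or severe multisystemic disease; onset 12 to 30 years
```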

Morning Report Questions

Q: What are some of the clinical manifestations of myotonic dystrophy type 1?

A: Cardiac involvement is common in myotonic dystrophy type 1 and can result in asymptomatic electrocardiographic abnormalities, progressive heart block, atrial tachyarrhythmias, ventricular tachyarrhythmias, nonischemic cardiomyopathy, and sudden death. Hearing loss is an occasional complication of myotonic dystrophy type 1 that resembles presbycusis. Insulin resistance, which is common in patients with myotonic dystrophy type 1, may be related to inappropriate splicing of the insulin-receptor transcript in muscle. Muscle wasting may result from one of many splice variants acting alone or in combination with several others, or it could occur independently of splicing.

Q: Are there any effective interventions for myotonic dystrophy?

A: Although there is no current therapy for affected patients or their affected children that would alter the disease course, establishing the diagnosis is key to help anticipate, identify, and treat coexisting conditions. For example, yearly electrocardiographic examinations are recommended to identify those who may benefit from pacemaker and implantable cardioverter-defibrillator (ICD) placement. That said, it is unclear whether placement of an ICD reduces mortality since the most common cause of death associated with this disorder is respiratory failure due to weakness of the muscles of respiration. Therapeutic approaches designed to reduce the toxic effects of mutant DMPK RNA are in development.

Figure 3. Muscle Specimens Obtained at Autopsy.

The Asthma-COPD Overlap Syndrome

Posted by Carla Rothaus • September 25th, 2015

Although in textbooks asthma and chronic obstructive pulmonary disease (COPD) are viewed as distinct disorders, there is increasing awareness that many patients have features of both. A new review article covers the asthma–COPD overlap syndrome.

Approximately 1 in 12 people worldwide are affected by asthma or chronic obstructive pulmonary disease (COPD); once regarded as two distinct disease entities, these two conditions are now recognized as heterogeneous and often overlapping. The term “asthma-COPD overlap syndrome” (ACOS) has been applied when a person has clinical features of both asthma and COPD.

Clinical Pearls

• Does ACOS have a specific definition?

Even though overlaps between asthma and COPD are a clinical reality, Global Initiative for Asthma (GINA) and Global Initiative for Chronic Obstructive Lung Disease (GOLD) documents have not given a specific definition of ACOS and have stated that more evidence on “clinical phenotypes and underlying mechanisms” is needed.

Table 1. Four Examples of Patients with Obstructive Airway Disease.

• How prevalent is the overlap syndrome, and how is it treated?

According to a case definition of ACOS that has been widely promulgated, the syndrome is estimated to be present in 15 to 45% of the population with obstructive airway disease, and the prevalence increases with age. However, despite this presumed high prevalence, no double-blind, prospective studies have been conducted to provide information on how to treat these types of patients. Indeed, studies of COPD have excluded nonsmokers and patients with some bronchodilator reversibility, whereas studies of asthma have excluded smokers and patients without substantial bronchodilator reversibility. Thus, the most effective treatment of patients with ACOS remains unknown.

Morning Report Questions

Q: What are some of the features that have traditionally been associated with either asthma or COPD, but can, in fact, be present in both conditions?

A: Reversibility of airway obstruction after inhalation of a bronchodilator drug such as albuterol is a hallmark in early asthma and has long been regarded as a criterion to distinguish asthma from COPD. Reversibility of airway obstruction is frequently present in COPD as well; in two studies, reversibility was observed in up to 44% and 50% of patients with COPD. There is broad consensus that asthma typically has an eosinophilic and a Th2-driven cytokine pattern of inflammation, whereas neutrophilic inflammation dominates in COPD. Bronchial-biopsy studies, sputum studies, and exhaled-breath studies have provided evidence of substantial heterogeneity in mucosal inflammation. Patients with asthma who have severe or late-onset disease or chronic infections or who smoke may also exhibit neutrophilic inflammation and CD8 cells in the airways, both of which were once believed to be hallmarks of COPD. A Th2 inflammatory signature can also be present in COPD. Eosinophils are present in 15 to 40% of patients with stable COPD — in sputum, bronchoalveolar lavage, and lung tissue — even after careful exclusion of patients with reversibility of airway obstruction, bronchial hyperresponsiveness, atopy, or a childhood history of asthma; eosinophil activation is associated with disease severity. Eosinophil levels can also be increased in the sputum of patients with COPD exacerbations.

Figure 2. Risk Factors for Asthma and COPD and the Influence of Environment and Aging.

Q: Should ACOS be designated a disease entity?

A: According to the authors of this review article, it is premature to recommend the designation of ACOS as a disease entity in primary and specialist care. More research is needed to better characterize patients and to obtain a standardized definition of ACOS that is based on markers that best predict treatment response in individual patients. The danger of seeing ACOS as a disease entity is that the lines may be blurred between asthma and COPD, because studies addressing the patient population with ACOS specifically are lacking, which could lead to overtreatment, particularly with inhaled glucocorticoids.

Efficacy and Longer-Term Safety of the Dengue Vaccine in Endemic Regions

Posted by Rupa Kanapathipillai • September 23rd, 2015

You are sitting on the beach in Colombia, taking a weekend break from your hospital and outpatient duties, where you have seen a broad spectrum of illness resulting from dengue, from mild fever to dengue hemorrhagic fever and shock syndrome. You think to yourself: “If only there were a vaccine that could reduce the morbidity and mortality associated with dengue.” You marvel at the time and effort it takes to create a safe, efficacious vaccine and reflect on the complex path to product licensure, mentally noting the need for a vaccine that would ideally provide long-lasting immunity against all serotypes, both in those with and in those without prior dengue infection. Immunity that is not long-lasting enough could potentially result in an increased risk of infection, particularly in a dengue-endemic setting.

CYD-TDV is a live attenuated chimeric vaccine built on a yellow fever virus backbone that carries the structural proteins of dengue virus. The schedule studied was 3 doses administered over a 12-month period, with encouraging short-term safety and efficacy. Previously published work by Villar et al. in NEJM showed a 67-80% reduction in dengue hospitalizations among vaccinees. Efficacy was higher in vaccinees with previous dengue infection than in those who were seronegative at baseline, and efficacy varied across the 4 dengue serotypes.

The recently published article by Hadinegoro et al. in NEJM summarizes the efficacy and longer-term safety data for the CYD-TDV dengue vaccine. About 35,000 children between 2 and 16 years of age were enrolled in 3 studies: two phase III trials, CYD14 and CYD15, and a phase IIb trial, CYD23/57. The safety endpoint was the incidence of hospitalization for new dengue infection in years 3-6 after vaccination in the 3 studies. During year 3 in the CYD14, CYD15, and CYD57 trials, the risk of hospitalization for dengue was lower in the vaccine group than among controls for participants ≥9 years of age but higher for those <9 years. Pooled relative risks of hospitalization for dengue were 0.84 among all participants, 1.58 in those <9 years, and 0.50 in those ≥9 years.

As described in the accompanying editorial by Dr. Cameron Simmons, vaccination appears to be associated with an elevated risk of hospitalization in vaccinated children <9 years of age, most markedly in those 2-5 years of age, likely following natural infection in the third year after vaccination. It is hypothesized that vaccination of some young children may elicit transient immunity that subsequently wanes and predisposes them to more serious infections that result in hospitalization. Surveillance was hospital-based only; it remains unknown whether the <9 years subgroup also experiences a higher incidence of symptomatic infection outside the hospital setting. Notably, vaccination did not increase the frequency of severe, life-threatening complications.

Vaccine efficacy was assessed using pooled data from the first 25 months of CYD14 and CYD15. Children between 9 and 16 years of age continued to benefit, likely owing to vaccine-induced pan-serotype immunity in the many recipients who had previously been naturally infected; the benefits are far less clear in children <9 years. Pooled efficacy against symptomatic dengue during the first 25 months was 60.3% for all participants, 44.6% for those <9 years of age, and 65.6% for those ≥9 years. Longer-term efficacy will continue to be assessed, informing the need for booster doses.
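A note on how such figures are defined: vaccine efficacy is conventionally calculated as 1 minus the relative risk of disease among vaccinees versus controls. A minimal sketch, with hypothetical attack rates chosen only to reproduce the pooled 60.3% estimate (the post does not report the underlying case counts):

```python
def vaccine_efficacy_percent(attack_rate_vaccinated, attack_rate_control):
    """Vaccine efficacy = 1 - relative risk, expressed as a percentage."""
    return 100 * (1 - attack_rate_vaccinated / attack_rate_control)

# Hypothetical attack rates (cases per 100 person-years), chosen only to
# reproduce the pooled 60.3% figure; the post does not report case counts.
print(f"{vaccine_efficacy_percent(2.0, 5.04):.1f}%")  # 60.3%
```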

Deputy Editor Lindsey Baden, MD comments “There is a tremendous need for a vaccine to prevent dengue associated illness, especially severe disease. A major challenge to developing such a vaccine has been the concern for disease enhancement associated with serotype-specific partial immunity. Longer term studies of dengue vaccine safety, such as this one, are critical for our understanding vaccine safety and for the development of a successful vaccine.”

You slap another Aedes mosquito away from your leg, resigned to the idea that while certainly closer, programmatic vaccination with CYD-TDV may not be available by the end of your weekend holiday, particularly for those aged <9 years.

A Pregnant Woman with Headache

Posted by Carla Rothaus • September 18th, 2015

In a new Case Record of the Massachusetts General Hospital, a 38-year-old pregnant woman presented with headache and visual symptoms. MRI of the brain revealed multiple small infarcts, with no thrombosis. Anemia and eosinophilia were present, as was abnormal endocardial material on echocardiography. Diagnostic tests were performed.

Geophagia (the habit of eating clay or earth) is practiced in Guatemala and is even encouraged during pregnancy as a desired nutritional supplement. Pregnancy-related geophagia is a potential source of helminthic infection.

Clinical Pearls

• What are some of the causes of pregnancy-associated ischemic stroke?

Ischemic stroke is uncommon during pregnancy, although there is a high relative increase in risk after delivery (relative risk, 8.7). The rate of both ischemic and hemorrhagic strokes during pregnancy may be increasing, perhaps because of increased rates of hypertension and heart disease. Normal pregnancy leads to changes in hemodynamic function, vascular-wall structure and integrity, endothelial function, and the levels and function of procoagulant and anticoagulant proteins, all of which are factors associated with an increased risk of stroke. The most common causes of pregnancy-associated ischemic stroke are cardioembolism, preeclampsia and eclampsia, and cerebral venous sinus thrombosis. Acquired or inherited underlying hypercoagulability may exacerbate the physiologic prothrombotic state that occurs during pregnancy. Atherosclerosis may also contribute to the risk of stroke in patients with risk factors for early-onset disease.

• What are some of the causes and consequences of hypereosinophilia?

Eosinophilia can be caused by the primary hypereosinophilic syndromes or by secondary causes, including parasitic and viral infections, drug-induced and other allergic causes, and lymphoma and other tumors. Current criteria for a diagnosis of the primary hypereosinophilic syndrome include eosinophilia (>1500 cells per cubic millimeter) that has persisted for 6 months, the absence of secondary causes of eosinophilia, and end-organ involvement. Damage to the heart due to either primary or secondary hypereosinophilia can be manifested by eosinophilic endocarditis (Loffler’s endocarditis) or endomyocardial fibrosis (Davies’ disease).

Morning Report Questions

Q: How is hookworm infection acquired?

A: Hookworm, which is present in New England, is usually acquired through penetration of the skin by larvae from contaminated soil but can also be acquired through ingestion. Adult worms cause a chronic intestinal infection that is accompanied by low-level eosinophilia (500 to 700 cells per microliter). Adult worms damage intestinal capillaries and consume 0.03 to 0.20 ml of blood per day, causing iron deficiency anemia. Diagnosis is determined by means of stool examination for ova and parasites, with a single examination reportedly having approximately 80% sensitivity.

Q: What symptoms are commonly associated with toxocara infection, and how is it treated?

A: Toxocara canis and T. cati infect dogs and cats throughout the United States. Toxocara species are acquired through the ingestion of contaminated soil or infected meat. Larvae hatch in the intestine and migrate through the bloodstream to multiple organs, including the liver, spleen, lungs, and eye. Symptoms commonly include wheezing, fever, hepatomegaly, and eosinophilia and occasionally include cough, urticaria, pruritus, pneumonia, and anemia. The treatment of choice for both toxocara and hookworm infection is albendazole. Treatment of toxocara infection may also include glucocorticoids.

Table 2. Potential Parasitic Causes of Hypereosinophilia in this Patient.

Acute Myeloid Leukemia

Posted by Carla Rothaus • September 18th, 2015

Many recent biologic insights have shed light on the nosology of acute myeloid leukemia. Although this new knowledge has not yet had a major influence on the treatment of the disease, strategies under investigation may improve outcomes. A new review article summarizes these developments.

Acute myeloid leukemia (AML) is a form of cancer that is characterized by infiltration of the bone marrow, blood, and other tissues by proliferative, clonal, abnormally differentiated, and occasionally poorly differentiated cells of the hematopoietic system. Although the cytogenetic heterogeneity of AML has been recognized for more than 30 years, the enormous molecular heterogeneity of the disease has become increasingly apparent over the past 15 years. The prognostic importance of this biologic heterogeneity is well accepted, but translation of this new information into improved therapy is just beginning.

Clinical Pearls

• What cure rates can be expected in adults with AML?

Although it was incurable 50 years ago, AML is now cured in 35 to 40% of adult patients who are 60 years of age or younger and in 5 to 15% of patients who are older than 60 years of age. The outcome in older patients who are unable to receive intensive chemotherapy without unacceptable side effects remains dismal, with a median survival of only 5 to 10 months.

• What has been learned about AML with the use of new genomic techniques?

Emerging data gleaned with the use of new genomic techniques — in particular, next-generation sequencing — are providing an unprecedented view of the spectrum and frequency of mutations, their distinct patterns of cooperativity and mutual exclusivity, their subclonal architecture, the clonal evolution during the disease course, and the epigenetic landscape of the disease. The Cancer Genome Atlas Research Network analyzed the genomes of 200 patients with AML (50 with the use of whole-genome sequencing and 150 with the use of whole-exome sequencing, along with RNA and microRNA sequencing and DNA-methylation analysis). Genes that were significantly mutated in AML were organized into several functional categories. Data are lacking from studies involving larger patient cohorts to elucidate the complex interplay of these genetic lesions in individual patients with AML. Studies have shown that most cases of AML are characterized by clonal heterogeneity at the time of diagnosis, with the presence of both a founding clone and at least one subclone. Various patterns of dynamic clonal evolution that occur at relapse probably contribute to resistance to therapy. Other important findings revealed by next-generation sequencing studies relate to the pattern of mutation acquisition and the existence of preleukemic stem cells. The evaluation of molecular genetic lesions as prognostic and predictive markers is an active research area.

Figure 1. Eight Functional Categories of Genes That Are Commonly Mutated in Acute Myeloid Leukemia.

Table 1. Frequency and Clinical Significance of Recurrent Gene Mutations in Adults with AML.

Morning Report Questions

Q: What is the current general approach to treatment of AML?

A: The general therapy strategy in patients with AML has not changed substantially in more than 30 years. Initial assessment determines whether a patient is eligible for intensive induction chemotherapy. If complete remission is achieved after intensive therapy, appropriate postremission therapy is essential. Continuous-infusion cytarabine with an anthracycline remains the mainstay of induction therapy. A complete response is achieved in 60 to 85% of adults who are 60 years of age or younger. In patients who are older than 60 years of age, complete response rates are inferior (40 to 60%). No other induction regimen has been shown convincingly to be superior, with one possible exception: the addition of gemtuzumab ozogamicin, a humanized anti-CD33 monoclonal antibody conjugated with the cytotoxic agent calicheamicin. A recent meta-analysis of five randomized trials showed that although adding gemtuzumab ozogamicin to induction therapy did not increase response rates, it reduced the risk of relapse and improved survival among younger and older adults with favorable-risk and intermediate-risk (but not adverse-risk) cytogenetic findings. Standard postremission strategies include conventional chemotherapy as well as hematopoietic-cell transplantation. Whether allogeneic transplantation is recommended depends mainly on the leukemic genetic-risk profile, scores on established scales that predict the risk of treatment-related death, and specific transplantation-associated factors in the patient.

Q: What treatment options are available for patients who are ineligible for intensive therapy?

A: The treatment of older or frail patients with AML includes best supportive care (including hydroxyurea), low-dose cytarabine, and, more recently, the hypomethylating agents decitabine and azacitidine. Currently, no widely accepted algorithm provides treatment guidelines for older patients who cannot receive intensive chemotherapy. In clinical practice, the patient’s age, general health, and specific coexisting conditions, as well as the disease features, the patient’s wishes (and those of the patient’s relatives), and the physician’s attitude and interest all influence decision making. The hypomethylating agents may have promise. Both decitabine and azacitidine have been studied in phase 3 trials. In an unplanned survival analysis, the use of decitabine, as compared with an agent chosen by the patient and physician (usually low-dose cytarabine), was associated with a survival advantage (median, 7.7 months vs. 5.0 months). On the basis of this increase in survival, the European Medicines Agency, but not the U.S. Food and Drug Administration, granted approval for the use of decitabine for the treatment of older patients with AML.

Table 3. Current Conventional Care of Patients with AML, Including Indications for Allogeneic Hematopoietic-Cell Transplantation.

Table 4. Indications for Allogeneic Hematopoietic-Cell Transplantation and Factors Influencing the Outcome.

Table 5. Selected Newer Agents in Clinical Development for the Treatment of AML.