New Clinical Decisions: Thromboprophylaxis after Knee Arthroscopy

Posted by • February 13th, 2017

In this week’s Clinical Decision, a 60-year-old woman comes to your office to discuss her upcoming arthroscopic knee surgery. She recently received a diagnosis of a right lateral meniscus tear and is scheduled to undergo arthroscopic meniscectomy. She has heard that knee surgery increases the risk of blood clots, and she wants to know whether she should take an anticoagulant after her surgery to reduce the risk. She asks your advice on how she can prevent blood clots after her procedure. What do you recommend?

Clinical Decisions are a great way to help you evaluate treatment options and gain insight from colleagues. The articles include a case vignette, plus clinically acceptable management options, each supported in a short commentary by a respected clinician. You are invited to vote for, and comment on, the options; a diverse range of thinking is presented. Browse more Clinical Decisions articles here!

New Interactive Medical Case: Making the Connection

Posted by • January 30th, 2017

In the case, “Making the Connection,” a 41-year-old man with hyperlipidemia and a history of morbid obesity since childhood was referred to a surgeon for a weight-management consultation and consideration of bariatric surgery. The patient had tried dieting on his own and had also tried several commercial weight-loss programs with little long-term success. Can you accurately assess this patient?

Interactive Medical Cases are online simulations based on a real patient’s experience of illness. You follow interactive steps through the patient’s evolving history, diagnosis, and management, from presentation to outcome.

Browse the list of previous Interactive Medical Cases. Try one or all 44!

New Quick Take: Mass Intoxication with Synthetic Cannabinoids

Posted by • January 24th, 2017

This week’s Quick Take video describes how medical professionals and law enforcement agents identified the cause of a July 2016 drug-induced intoxication and subsequent hospitalization of an unusually high number of people in Brooklyn. Emergency room workers knew only that the patients had been exposed to some type of “herbal incense” before beginning to exhibit “zombielike” behavior. When a drug that is rarely seen or tested for causes such events, how do clinicians use their experience and resources to pinpoint exactly what it is? Take less than three minutes to see how it happened in Brooklyn.

Quick Take videos are a great way to quickly become familiar with the key findings of select Original Articles. This collection of brief video summaries, updated weekly, offers a succinct, innovative way to understand important article findings that have an impact on medical practice and patient care. Look for a new video each week in the featured article, or browse the collection from the Articles & Multimedia tab.

An 18-Year-Old Woman with Acute Liver Failure

Posted by • January 19th, 2017

In this article by Olson et al., we learn that Wilson’s disease, also known as hepatolenticular degeneration, is an autosomal recessive disease characterized by impaired copper metabolism due to a defective ATPase. Patients with Wilson’s disease may present with chronic liver disease, acute liver failure, hemolysis, and psychiatric or neurologic manifestations. It has been previously noted that viral infection or drug toxicity may serve as a trigger for fulminant Wilson’s disease.


Clinical Pearl

• Does acute liver failure in the pediatric population always present with hepatic encephalopathy?

Acute liver failure in adults is characterized by a sudden loss of hepatic function without evidence of preexisting liver disease. Criteria for the diagnosis include the presence of coagulopathy (international normalized ratio [INR], >1.5), hepatic encephalopathy, and an illness of less than 24 weeks’ duration. However, in the pediatric population (which can be considered to include patients who are up to 21 years of age), up to 50% of patients with acute liver failure do not present with encephalopathy. Modified criteria for the diagnosis of acute liver failure in children include evidence of acute liver injury and severe coagulopathy (INR, >2.0) in the absence of encephalopathy.
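The differing adult and pediatric thresholds can be condensed into a short sketch. This is a minimal illustration of the criteria above; the function name and structure are ours, not the article's:

```python
def meets_alf_criteria(inr, encephalopathy, pediatric):
    """Sketch of the diagnostic thresholds described above.

    Adults: coagulopathy (INR > 1.5) plus hepatic encephalopathy.
    Children: severe coagulopathy (INR > 2.0) suffices even in
    the absence of encephalopathy.
    """
    if pediatric:
        # Modified pediatric criteria: encephalopathy is not required.
        return inr > 2.0 or (inr > 1.5 and encephalopathy)
    return inr > 1.5 and encephalopathy

# A child with an INR of 2.3 and no encephalopathy meets the modified
# criteria; an adult with the same findings does not.
print(meets_alf_criteria(2.3, False, pediatric=True))   # True
print(meets_alf_criteria(2.3, False, pediatric=False))  # False
```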


Clinical Pearl

• What are the rapid diagnostic screening test criteria for Wilson’s disease?

Rapid diagnostic criteria for Wilson’s disease can be used in patients who present with acute liver failure. A screen that shows a ratio of alkaline phosphatase (IU per liter) to total bilirubin (mg per deciliter) of lower than 4.0 and then subsequently shows a ratio of aspartate aminotransferase (IU per liter) to alanine aminotransferase (IU per liter) of higher than 2.2 has been described as 100% sensitive and specific for the diagnosis of Wilson’s disease.
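As a minimal sketch, the two ratio criteria can be checked as follows; the function name and the example laboratory values are illustrative, not taken from the article:

```python
def wilson_rapid_screen(alp_iu_l, total_bili_mg_dl, ast_iu_l, alt_iu_l):
    """Return True when both ratio criteria described above are met:
    ALP (IU/L) / total bilirubin (mg/dL) < 4.0, and AST/ALT > 2.2.
    """
    return (alp_iu_l / total_bili_mg_dl < 4.0
            and ast_iu_l / alt_iu_l > 2.2)

# Illustrative labs: ALP 40 IU/L with bilirubin 20 mg/dL gives a ratio
# of 2.0 (< 4.0); AST 1100 with ALT 400 gives 2.75 (> 2.2), so the
# screen is positive.
print(wilson_rapid_screen(40, 20, 1100, 400))  # True
```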


Morning Report Questions

Q: Is copper staining of liver tissue a reliable test for the pathological diagnosis of Wilson’s disease?

A: The pathological diagnosis of Wilson’s disease is generally based on the presence of compatible histomorphologic features and results of staining for copper, including a rhodanine stain. However, staining for copper in tissue is unreliable, since the presence of copper in the cytoplasm of hepatocytes might not be detected on a rhodanine stain. Therefore, in patients with suspected Wilson’s disease, copper quantification performed on either a dedicated core-biopsy specimen or a paraffin-embedded tissue sample is considered to be the best available diagnostic test.

Q: What is the prognosis of patients with acute liver failure due to Wilson’s disease?

A: In Wilson’s disease, acute liver failure develops in the setting of subclinical chronic liver disease. If liver transplantation is not performed, acute liver failure due to Wilson’s disease is fatal. In the United States, the highest priority (United Network for Organ Sharing status 1A) is reserved for patients with liver failure who have a life expectancy of less than 7 days if they do not undergo transplantation. Wilson’s disease is the only cause of acute liver failure that allows a patient with preexisting liver disease to be listed as status 1A. The outcomes associated with liver transplantation for acute liver failure induced by Wilson’s disease are excellent if transplantation is performed before neurologic deterioration.

Screening for Colorectal Neoplasia

Posted by • January 6th, 2017


The percentage of U.S. residents with up-to-date screening for colorectal cancer has not increased appreciably since 2010 and remains at approximately 60%. The National Colorectal Cancer Roundtable has established a goal of 80% adherence to colorectal cancer screening by the year 2018. To achieve the highest level of adherence to colorectal cancer screening, it may be best to offer participants a choice, because the “best” strategy is the one that they will adhere to consistently.


Clinical Pearl

• What interventions may increase patient participation in colorectal cancer screening?

Various interventions used in randomized, controlled trials have been shown to increase patient participation in screening; such interventions include sending patients invitations from their primary care provider, sending reminder letters and making telephone calls, and mailing fecal occult blood test kits to patients’ homes. The most successful programs use patient navigators to reduce logistic barriers, address cultural issues, and encourage participants to undergo screening; the use of patient navigators is especially important in underserved populations.


Clinical Pearl

• How can the benefit of colorectal cancer screening be maximized?

Maximizing the benefit of colorectal cancer screening requires a programmatic approach to implementing screening strategies. The quality of a screening program should be measured by its ability to identify patients who are due for screening, provide access to screening, assess adherence to the screening test and to follow-up colonoscopy if a noncolonoscopy screening test is positive, document test outcomes and disseminate accurate follow-up recommendations, identify patients with a negative test to follow them for repeat screening at the appropriate intervals, and provide timely surgery for cancers.


Morning Report Questions

Q: Are there individual patient factors other than a personal or family history of colonic neoplasia that influence colorectal cancer screening recommendations? 

A: Additional factors that might influence colorectal cancer screening strategies include race, lifestyle factors, and aspirin use. For example, among black men and women, the rates of death from colorectal cancer are 28.4 and 18.9 per 100,000 population, respectively; among white men and women, the corresponding rates are 18.7 and 13.2 per 100,000 population. Obesity, tobacco smoking, low physical activity, high intake of alcohol, high intake of red or processed meat, and low intake of fruits and vegetables are associated with increased risk of colorectal cancer, and regular use of aspirin has been associated with reduced risk. However, none of these factors are currently used to differentiate screening strategy, age of screening initiation, or surveillance intervals.


Q: At what patient age should colorectal cancer screening cease?

A: Although the risk of colorectal cancer increases with age, the competing risk of death from other diseases and the risk of serious complications from colonoscopy also increase with age. Several national organizations recommend that screening for patients between 76 and 85 years of age should be tailored on the basis of the presence of coexisting illnesses and that screening should be stopped after patients reach 85 years of age. A microsimulation model suggested that the intensity of prior screening and the individual risk of colorectal cancer should also be considered in determining the age at which to stop screening. Patients without a notable coexisting illness who are at average or higher risk for colorectal cancer and have had no prior screening would be expected to benefit from screening into their 80s.

A Woman with Progressive Loss of Language

Posted by • January 6th, 2017


The anterior temporal lobe is an area of the brain that is critically involved in object naming and word comprehension. Multiple lines of evidence suggest that the left anterior temporal lobe is specialized for word comprehension (recognition), whereas the right anterior temporal lobe may serve a similar function for objects and faces.


Clinical Pearl

• What underlying pathologic processes are associated with primary progressive aphasia?

In patients with primary progressive aphasia, the underlying pathologic process is usually caused by frontotemporal lobar degeneration or Alzheimer’s disease. Approximately 60% of cases of primary progressive aphasia are associated with frontotemporal lobar degeneration, and 40% are associated with Alzheimer’s disease.


Clinical Pearl

• How does primary progressive aphasia differ from typical dementia?

In contrast to typical dementias that occur in late life, primary progressive aphasia most commonly starts before 65 years of age and is not associated with memory loss. There are three variants of primary progressive aphasia: agrammatic, logopenic, and semantic.

Morning Report Questions

Q: Describe some of the features of the three variants of primary progressive aphasia.

A: The agrammatic variant is characterized by the construction of grammatically incorrect sentences and a loss of fluency in the setting of preserved word comprehension. The logopenic variant is characterized by impairment of word finding, poor language repetition, and fluctuating fluency in the setting of preserved grammar and word comprehension. The semantic variant is characterized by impairment of object naming and word comprehension in the setting of preserved fluency, repetition, and grammar. Pauses for word finding and impaired object naming can occur in each of the variants. Each variant of primary progressive aphasia is associated with a different anatomical site of peak atrophy in the left-hemisphere language network: the inferior frontal gyrus (Broca’s area) in the agrammatic variant, the temporoparietal junction (Wernicke’s area) in the logopenic variant, and the anterior temporal lobe in the semantic variant.

Q: What type of neuronal deposits are associated with the semantic variant of primary progressive aphasia caused by frontotemporal lobar degeneration?

A: There is a strong correlation between the semantic variant of primary progressive aphasia and a type of frontotemporal lobar degeneration that is linked to the presence of abnormal deposits of TAR DNA-binding protein 43 (TDP-43), an RNA-binding protein with a wide range of targets. TDP-associated frontotemporal lobar degeneration is further classified into subgroups defined according to the pattern of inclusions: type A is associated with the presence of many neuronal cytoplasmic inclusions and short dystrophic neurites, type B with the presence of some neuronal cytoplasmic inclusions and rare dystrophic neurites, and type C with the presence of rare neuronal cytoplasmic inclusions and long dystrophic neurites. Although the clinicopathological correlations are not exact, type C lesions often occur with the semantic variant of primary progressive aphasia, semantic dementia, or the behavioral variant of frontotemporal lobar degeneration.

Zika Virus Infection in Pregnant Women in Rio de Janeiro

Posted by • December 15th, 2016



Zika virus (ZIKV) is a flavivirus that was recently introduced into Brazil. In a new Original Article, Brasil et al. enrolled pregnant women in whom a rash had developed within the previous 5 days and tested blood and urine specimens for ZIKV by reverse-transcriptase–polymerase-chain-reaction assays. The authors followed the women prospectively to obtain data on pregnancy and infant outcomes. This final report updates preliminary data on Zika virus infection among pregnant women in Rio de Janeiro; ZIKV infection during pregnancy was associated with fetal death, fetal growth restriction, and central nervous system abnormalities.

Clinical Pearl

• Are there any clinical features that may be more common among ZIKV-positive women than ZIKV-negative women?

In the study by Brasil et al., a descending macular or maculopapular rash was the most common type of exanthem noted in ZIKV-positive women. The maculopapular rash was seen far more frequently in ZIKV-positive women than in ZIKV-negative women (P=0.02). The other prevalent finding was pruritus, which was seen in 90% of ZIKV-positive women in the study. Conjunctival injection was present in 58% of ZIKV-positive women and in a smaller percentage (40%) of ZIKV-negative women (P=0.03), which suggests that this symptom is a more specific clinical feature of ZIKV infection.

Clinical Pearl

• Does the time window for adverse outcomes in utero due to Zika virus infection occur only in early pregnancy?

With rubella, the time window for adverse outcomes in utero occurs in the first 16 weeks of pregnancy. In contrast, with ZIKV, the time window appears to be throughout pregnancy. ZIKV pathogenicity was evident in the Brasil study cohort even in the presence of a “control” group that was affected by chikungunya virus, which is also linked to adverse pregnancy outcomes, particularly fetal loss. Adverse outcomes after ZIKV infection occurred regardless of the timing of maternal infection; adverse outcomes occurred in 55% of pregnancies in which the mother was infected in the first trimester (11 of 20 ZIKV-infected pregnancies), in 52% of those in which the mother was infected in the second trimester (37 of 71 ZIKV-infected pregnancies), and in 29% of those in which the mother was infected in the last trimester of pregnancy (10 of 34 ZIKV-infected pregnancies).
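The trimester-specific percentages follow directly from the reported counts; a quick arithmetic check:

```python
# Adverse-outcome rates by trimester of maternal ZIKV infection,
# computed from the counts reported above
# (adverse outcomes / ZIKV-infected pregnancies).
counts = {
    "first": (11, 20),
    "second": (37, 71),
    "third": (10, 34),
}
for trimester, (adverse, total) in counts.items():
    print(f"{trimester} trimester: {adverse}/{total} "
          f"= {100 * adverse / total:.0f}%")
```

This reproduces the 55%, 52%, and 29% figures quoted above.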



Morning Report Questions

Q: How did pregnancy outcomes compare between ZIKV-positive and ZIKV-negative women in the study by Brasil et al.?

A: Despite the high rate of adverse outcomes in the control group of pregnant women with other infectious illnesses, the findings in the ZIKV-positive group were far more striking. Among 125 pregnancies in ZIKV-positive women, 58 adverse pregnancy outcomes were noted (46.4%); in contrast, 7 of the 61 pregnancies (11.5%) in the ZIKV-negative cohort resulted in adverse outcomes (P<0.001). Among 117 live births in the ZIKV-positive cohort, 49 infants (42%) were found to have abnormalities on clinical examination, imaging, or both; in contrast, among 57 live births in the ZIKV-negative cohort, 3 infants (5%) had such abnormalities (P<0.001). ZIKV-positive women were nearly 10 times as likely as ZIKV-negative women to have emergency cesarean sections performed owing to fetal distress (23.5% vs. 2.5%, P=0.003). Infants born to ZIKV-positive mothers were also nearly four times as likely to need critical care assistance immediately after birth (a finding that is reflective of fetal distress) as infants who had not been exposed to ZIKV (21% vs. 6%, P=0.01). There was no significant difference in the rate of fetal loss between ZIKV-positive mothers and ZIKV-negative mothers (7.2% and 6.6%, respectively; P=1.0).
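The “nearly 10 times” and “nearly four times” comparisons can be verified from the percentages quoted above:

```python
# Ratios of reported event rates, ZIKV-positive vs. ZIKV-negative.
cesarean_ratio = 23.5 / 2.5     # emergency cesarean for fetal distress
critical_care_ratio = 21 / 6    # neonatal critical care after birth

print(round(cesarean_ratio, 1))       # 9.4 -> "nearly 10 times"
print(round(critical_care_ratio, 1))  # 3.5 -> "nearly four times"
```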

Q: Was microcephaly the most common abnormality observed after ZIKV infection in the study by Brasil et al.?

A: Four infants in the ZIKV-positive group (3.4%) were noted to have microcephaly at birth; two were small-for-gestational-age infants with proportionate microcephaly (i.e., the head size is small but is proportional to the weight and length of the infant), and two had disproportionate microcephaly (i.e., the head size is small relative to the weight and length of the infant). None of the infants in the control group had microcephaly. Although microcephaly has been widely discussed in relation to ZIKV infection, it is important to note that other findings such as cerebral calcifications and fetal growth restriction were present more frequently.

Dupilumab versus Placebo in Atopic Dermatitis

Posted by • December 15th, 2016



For patients with moderate-to-severe atopic dermatitis, topical therapies have limited efficacy, and systemic treatments are associated with substantial toxic effects. Thus, there is an unmet need for effective and safe long-term medications for these patients. Simpson et al. reported the results of two phase 3 trials of dupilumab monotherapy (SOLO 1 and SOLO 2) in adults with moderate-to-severe atopic dermatitis whose disease was inadequately controlled by topical treatment or for whom topical treatment was medically inadvisable. In these placebo-controlled trials, dupilumab, a human monoclonal antibody against interleukin-4 receptor alpha, was effective in controlling the signs and symptoms of atopic dermatitis. A new Original Article explains.

Clinical Pearl

• What are some of the features of atopic dermatitis?

Atopic dermatitis is a chronic, relapsing inflammatory skin disease that is characterized by the up-regulation of type 2 immune responses (including those involving type 2 helper T cells), an impaired skin barrier, and increased Staphylococcus aureus colonization. In patients with moderate-to-severe atopic dermatitis, skin lesions can encompass a large body-surface area and are frequently accompanied by intense, persistent pruritus, which leads to sleep deprivation, symptoms of anxiety or depression, and a poor quality of life.

Clinical Pearl

• Why might dupilumab be an effective therapy for atopic dermatitis?

Dupilumab is a fully human monoclonal antibody that binds specifically to the shared alpha chain subunit of the interleukin-4 and interleukin-13 receptors, thereby inhibiting the signaling of interleukin-4 and interleukin-13, which are type 2 inflammatory cytokines that may be important drivers of atopic or allergic diseases such as atopic dermatitis and asthma. In support of this premise, early-phase trials of dupilumab showed efficacy in patients with atopic dermatitis, those with asthma, and those with chronic sinusitis with nasal polyposis — all of which are conditions that have type 2 immunologic signatures.

Morning Report Questions

Q: Does dupilumab ameliorate the signs and symptoms of atopic dermatitis as compared to placebo?

A: In the SOLO trials, patients were randomly assigned in a 1:1:1 ratio to receive, for 16 weeks, weekly subcutaneous injections of dupilumab (300 mg) or placebo or the same dose of dupilumab every other week alternating with placebo. In SOLO 1 and SOLO 2, both dose regimens of dupilumab produced better results than placebo over 16 weeks of treatment across multiple outcome measures that reflected objective signs of atopic dermatitis, subjective symptoms (e.g., pruritus), important aspects of mental health (i.e., anxiety and depression), and quality of life. The mean efficacy results were similar for both dupilumab regimens. SOLO 1 and SOLO 2 were designed to provide replication of results, and patient populations and results were highly consistent in the two trials.

Q: What were some of the adverse events noted during the 16-week treatment period in the SOLO trials?

A: The most common adverse events in the two trials were exacerbations of atopic dermatitis, injection-site reactions, and nasopharyngitis. The incidence of nasopharyngitis was generally balanced across dupilumab and placebo groups. Exacerbations of atopic dermatitis and most types of skin infections were more common in the placebo groups. Injection-site reactions and conjunctivitis were more frequent in patients receiving dupilumab than in those receiving placebo.

In Coronary Artery Disease, Can Nurture Override Nature?

Posted by • December 14th, 2016



Mr. Locke, a 48-year-old man with prehypertension, comes to your office for a routine visit. He has a strong family history of coronary artery disease (CAD); his father and two brothers had myocardial infarctions in their 50s. His BMI is 31 kg/m², and although he is a nonsmoker, he does not exercise routinely. You suggest that exercising and focusing on a healthy diet might reduce his risk for CAD, but he wants to know by how much. Is his fate sealed by his family history, or could a healthier lifestyle help him reduce the risk of CAD?

Previous observational studies have identified factors associated with an increased risk of CAD, including family history, smoking, obesity, unhealthy diet, and lack of physical activity. But how much of family history is genetic versus behavioral? How much can lifestyle choices modulate genetic risk? These gene–environment interactions have previously been difficult to quantify, but increasingly the collection and study of genetic data have contributed new insights.

In a study entitled “Genetic Risk, Adherence to a Healthy Lifestyle, and Coronary Disease” published in this week’s issue of NEJM, investigators examined gene-environment interactions in three large prospective cohorts — the Atherosclerosis Risk in Communities (ARIC) study, the Women’s Genome Health Study (WGHS), and the Malmö Diet and Cancer Study (MDCS) — as well as the Bio-Image cross-sectional study. To determine genetic risk, the authors derived a polygenic risk score using single-nucleotide polymorphisms (SNPs) previously associated with CAD in genome-wide association studies, and divided the patients into quintiles from lowest to highest risk. They also collected data on four healthy lifestyle factors (no current smoking, no obesity, physical activity at least once weekly, and a healthy diet pattern) and divided patients into three lifestyle risk categories: favorable (3-4 factors), intermediate (2 factors), and unfavorable (0 or 1 factor).
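The lifestyle grouping reduces to counting healthy factors; a minimal sketch of that assignment (the function name is ours, not the study's):

```python
def lifestyle_category(no_smoking, no_obesity, weekly_exercise,
                       healthy_diet):
    """Map the four healthy lifestyle factors used in the study onto
    its three categories: favorable (3-4 factors), intermediate (2),
    and unfavorable (0 or 1)."""
    n_factors = sum([no_smoking, no_obesity, weekly_exercise,
                     healthy_diet])
    if n_factors >= 3:
        return "favorable"
    if n_factors == 2:
        return "intermediate"
    return "unfavorable"

# Mr. Locke from the vignette: a nonsmoker, but obese and sedentary.
# If he also ate a healthy diet (hypothetical), he would have
# 2 factors and fall into the intermediate category.
print(lifestyle_category(True, False, False, True))  # intermediate
```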

As expected, in more than 50,000 patients from the three prospective cohorts, the risk of CAD increased from the lowest to the highest quintile of genetic risk (hazard ratio, 1.91 among participants with high vs. low genetic risk). A family history of CAD was a surrogate, although an imperfect one, for genetic risk. Likewise, the risk of coronary events was higher in those with an unfavorable lifestyle than in those with a favorable one.

In this study, the investigators set out to understand not only the roles of genes and environment in CAD risk but also the interaction between genetic and lifestyle risk. To this end, they assessed the effect of lifestyle on the risk of CAD within each category of genetic risk and found that adherence to a favorable lifestyle was beneficial across all genetic risk groups: the risk of coronary events associated with a favorable lifestyle (versus an unfavorable lifestyle) was 45% lower in patients with low genetic risk, 47% lower in those at intermediate genetic risk, and 46% lower in those at high genetic risk. Similarly, they found that the benefit of having a low genetic risk of CAD was offset by an unfavorable lifestyle. The full results are represented visually in the paper. Similar results were found in the cross-sectional Bio-Image study using coronary artery calcification, rather than clinical outcomes, as a marker of subclinical coronary burden.

This study represents the most precise estimates to date of the relative contribution of genetics and lifestyle on risk of CAD. In a large cohort and using robust genetic data, genetic risk and lifestyle behaviors were each independently associated with CAD risk, but favorable or unfavorable lifestyle appeared to modulate genetic risk positively or negatively.

We do not routinely perform genetic testing on patients to calculate genetic risk scores. Therefore, our ability to use these specific risk assessments for counseling is limited. However, this study reinforces the beneficial effect of lifestyle modification on CAD risk, regardless of one’s genetic load. Public health efforts and individual patient counseling should continue to stress the importance of a healthy lifestyle, including smoking cessation, weight reduction, physical activity, and healthy diet, on reducing the risk of CAD.

Stents or Bypass Surgery for Left Main Coronary Artery Disease

Posted by • December 8th, 2016



EXCEL (Evaluation of XIENCE versus Coronary Artery Bypass Surgery for Effectiveness of Left Main Revascularization) was an international, open-label, multicenter randomized trial that compared everolimus-eluting stents with coronary-artery bypass grafting (CABG) in patients with left main coronary artery disease. A new Original Article explains how, at 3 years, percutaneous coronary intervention (PCI) was noninferior to CABG with respect to the rate of death, stroke, or myocardial infarction.

Clinical Pearl

• How are patients with obstructive left main coronary artery disease usually treated?

Left main coronary artery disease is associated with high morbidity and mortality owing to the large amount of myocardium at risk. European and U.S. guidelines recommend that most patients with left main coronary artery disease undergo CABG.

Clinical Pearl

• In what subgroup of patients with left main coronary artery disease might percutaneous coronary intervention (PCI) be an acceptable alternative to CABG?

Randomized trials have suggested that PCI with drug-eluting stents might be an acceptable alternative for selected patients with left main coronary disease. Specifically, in the subgroup of patients with left main coronary disease in the Synergy between PCI with Taxus and Cardiac Surgery (SYNTAX) trial, the rate of a composite of death, stroke, myocardial infarction, or unplanned revascularization at 5 years was similar among patients treated with paclitaxel-eluting stents and those treated with CABG. However, the outcomes of PCI were acceptable only in the patients with coronary artery disease of low or intermediate anatomical complexity, a hypothesis-generating subgroup observation that motivated the EXCEL trial.

Morning Report Questions

Q: Is PCI noninferior to CABG for left main coronary artery disease of low or intermediate anatomical complexity?

A: In the EXCEL trial involving patients with left main coronary artery disease and low or intermediate SYNTAX scores, PCI with everolimus-eluting stents was noninferior to CABG with respect to the primary composite end point of death, stroke, or myocardial infarction at 3 years. The primary composite end-point event of death, stroke, or myocardial infarction at 3 years occurred in 15.4% of the patients in the PCI group and in 14.7% of the patients in the CABG group (difference, 0.7 percentage points; upper 97.5% confidence limit, 4.0 percentage points; P=0.02 for noninferiority; hazard ratio, 1.00; 95% confidence interval [CI], 0.79 to 1.26; P=0.98 for superiority). The relative treatment effect for the primary end point was consistent across prespecified subgroups, including the subgroup defined according to the presence versus absence of diabetes.
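The noninferiority claim rests on the upper confidence limit of the risk difference falling below a prespecified margin. A sketch of that comparison follows; the 4.2-percentage-point margin used here is our assumption for illustration and is not stated in the text above:

```python
# Noninferiority check: PCI is declared noninferior when the upper
# confidence limit of the event-rate difference (PCI minus CABG)
# stays below the prespecified margin.
pci_rate, cabg_rate = 15.4, 14.7  # percent, from the trial
upper_limit = 4.0                 # upper 97.5% confidence limit
assumed_margin = 4.2              # ASSUMPTION, for illustration only

difference = pci_rate - cabg_rate  # 0.7 percentage points
print(f"difference = {difference:.1f} points; "
      f"noninferior: {upper_limit < assumed_margin}")
```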



Q: What changes in practice since the time of the SYNTAX trial might improve outcomes with PCI?

A: Since the time that the SYNTAX trial was conducted, changes in practice have occurred that would be expected to improve outcomes with PCI. In the EXCEL trial, the authors used everolimus-eluting stents almost exclusively; these stents are associated with a low rate of stent thrombosis. In addition, intravascular ultrasonographic imaging guidance was used in nearly 80% of the patients in the PCI group in the EXCEL trial, a practice that has been associated with higher event-free survival after left main coronary-artery stenting.