Influenza Vaccination

Posted by • September 29th, 2016

influenza-header

The effects of influenza traditionally have been assessed by comparing hospitalizations and deaths during an influenza season with a baseline model. These calculations suggest that seasonal influenza epidemics in the United States are responsible for between 55,000 and 431,000 hospitalizations due to pneumonia and influenza each year and as many as 49,000 deaths. The highest levels of influenza-attributable hospitalizations and deaths tend to occur in years in which H3N2 viruses predominate. Influenza vaccines confer considerable but incomplete protection and are recommended for everyone. As discussed in a new Clinical Practice article, the Advisory Committee on Immunization Practices does not endorse a specific vaccine but recommends against use of the live attenuated vaccine in the United States during the 2016–2017 season.

Clinical Pearl

• At what time of the year are decisions made regarding the composition of each annual influenza vaccine?

The specific influenza viruses that will be included in the vaccine each year are determined by worldwide surveillance and antigenic characterization of human viral isolates by the Global Influenza Surveillance and Response System of the World Health Organization. Currently, the production process requires that these decisions be made in February to allow for the production of vaccines to be distributed in the Northern Hemisphere in the following fall.

Clinical Pearl

• Is the inactivated quadrivalent formulation of the influenza vaccine more effective than the inactivated trivalent formulation?

Since 1977, inactivated vaccines have contained three components — a recent H1N1 virus, an H3N2 virus, and an influenza B virus — in a so-called trivalent formulation (IIV3). Since approximately 1980, two antigenically distinct lineages of influenza B virus have cocirculated, and many inactivated vaccines now include both B lineages in a quadrivalent formulation (IIV4). Studies have shown that the addition of the fourth component does not interfere with the immune response to the other three components, but direct evidence of enhanced protection from IIV4, as compared with IIV3 formulations, is lacking.

influenza-table-1

Morning Report Questions

Q: Is the high-dose influenza vaccine more effective than the standard-dose vaccine?

A: Antibodies against the viral attachment protein hemagglutinin (HA) prevent entry of the virus into cells, neutralize virus in vitro, and are associated with protection in clinical studies. The serum HA-inhibition (HAI) assay is the primary means of assessing serum antibody responses to standard influenza vaccines. Higher levels of HAI antibodies are associated with increased protection against influenza, but no absolute value of antibodies uniformly predicts protection. Although the dose–response curve for IIVs is rather flat, administration of increased doses of HA protein does result in postvaccination serum HAI antibody levels that are higher than those achieved with lower doses. In one very large, randomized, comparative trial, a vaccine containing approximately four times the standard dose of HA was shown to provide significantly greater protection than the standard-dose vaccine against laboratory-confirmed influenza in persons 65 years of age or older (incidence of influenza, 1.9% in the standard-dose group vs. 1.4% in the high-dose group). The enhanced protective effect was primarily against H3N2 viruses, the subtype with the greatest effect on older adults. Some, but not all, postmarketing studies of this high-dose vaccine (IIV3-HD) have confirmed its enhanced effectiveness in older persons. Serious adverse events have not been more frequent with the high-dose vaccine than with the standard-dose vaccine, but pain at the injection site has been reported more often (36% vs. 24%).
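To put the 1.9% vs. 1.4% incidence figures in perspective, they can be converted into an absolute risk reduction and a number needed to vaccinate. A back-of-the-envelope sketch, assuming the trial's rounded published rates (the helper names are ours, not the article's; rounding gives ballpark figures, not the trial's exact estimates):

```python
def absolute_risk_reduction(risk_control: float, risk_treated: float) -> float:
    """Difference in event proportions between control and treated groups."""
    return risk_control - risk_treated

# Laboratory-confirmed influenza: 1.9% with standard dose vs. 1.4% with high dose.
arr = absolute_risk_reduction(0.019, 0.014)
nnv = 1.0 / arr                    # number needed to vaccinate with high dose
relative_reduction = arr / 0.019   # relative risk reduction vs. standard dose

print(f"ARR = {arr:.1%}, NNV = {round(nnv)}, relative reduction = {relative_reduction:.0%}")
```

On these rounded inputs, roughly 200 older adults would need to receive the high-dose rather than the standard-dose vaccine to avert one additional case of laboratory-confirmed influenza.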

Q: Have any recommendations been issued to date regarding vaccination for the 2016–2017 influenza season?

A: One live attenuated influenza vaccine (LAIV4) is licensed in the United States. LAIV is administered intranasally, and the limited replication of the vaccine viruses in the upper respiratory tract induces immunity against influenza. Observational studies have recently called into question the effectiveness of LAIV. Analysis of data collected from 2010 through 2014 showed similar levels of effectiveness of LAIV and IIV against H3N2 and B viruses but decreased effectiveness of LAIV against H1N1 viruses, especially in the 2010–2011 and 2013–2014 seasons. Preliminary data for 2015–2016 have also suggested minimal effectiveness of LAIV, and the Advisory Committee on Immunization Practices (ACIP) has recommended that LAIV not be used in the 2016–2017 vaccination season.

Azithromycin Prophylaxis for Cesarean Delivery

Posted by • September 29th, 2016

azithromycin-header

Cesarean delivery is the most common major surgical procedure and is associated with a rate of surgical-site infection (including endometritis and wound infection) that is 5 to 10 times the rate for vaginal delivery. Tita et al. assessed whether the addition of azithromycin to standard antibiotic prophylaxis before skin incision would reduce the incidence of infection after cesarean section among women who were undergoing nonelective cesarean delivery during labor or after membrane rupture. In this new Original Article involving women who received standard antibiotic prophylaxis for nonelective cesarean section, the risk of infection after surgery was lower with the addition of azithromycin than with placebo.

Clinical Pearl

• How does pregnancy-associated infection rank as a cause of maternal death in the United States?

Globally, pregnancy-associated infection is a major cause of maternal death and is the fourth most common cause in the United States.

Clinical Pearl

• How often do postoperative infections occur after nonelective cesarean delivery?

Despite routine use of antibiotic prophylaxis (commonly, a cephalosporin given before skin incision), infection after cesarean section remains an important concern, particularly among women who undergo nonelective procedures (i.e., unscheduled cesarean section during labor, after membrane rupture, or for maternal or fetal emergencies). Some 60 to 70% of all cesarean deliveries are nonelective, and postoperative infections occur in up to 12% of women undergoing nonelective cesarean delivery despite standard preincision prophylaxis.

Morning Report Questions

Q: Does the addition of azithromycin to standard antibiotic prophylaxis reduce the frequency of infection after nonelective cesarean section?

A: In the study by Tita et al., the authors found that the addition of azithromycin to standard antibiotic prophylaxis significantly reduced the frequency of infection after nonelective cesarean section. The primary outcome was a composite of endometritis, wound infection, or other infections (abdominopelvic abscess, maternal sepsis, pelvic septic thrombophlebitis, pyelonephritis, pneumonia, or meningitis) occurring up to 6 weeks after surgery. The primary composite outcome occurred in 62 women (6.1%) who received azithromycin and in 119 (12.0%) who received placebo (relative risk, 0.51; 95% confidence interval [CI], 0.38 to 0.68; P<0.001). The use of azithromycin was associated with significantly lower rates of endometritis (3.8% vs. 6.1%; relative risk, 0.62; 95% CI, 0.42 to 0.92; P=0.02) and wound infections (2.4% vs. 6.6%; relative risk, 0.35; 95% CI, 0.22 to 0.56; P<0.001). The risks of other infections were low and did not differ significantly between groups.
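The reported relative risk and its confidence interval can be approximately reproduced from the event counts. A sketch using the standard Katz log-normal approximation; the group denominators (1019 azithromycin, 994 placebo) are our reading of the trial's enrollment and should be checked against the article:

```python
import math

def relative_risk_ci(events_t, n_t, events_c, n_c, z=1.96):
    """Relative risk with a Katz log-normal 95% confidence interval."""
    rr = (events_t / n_t) / (events_c / n_c)
    se = math.sqrt(1/events_t - 1/n_t + 1/events_c - 1/n_c)  # SE of log(RR)
    lo, hi = (rr * math.exp(s * z * se) for s in (-1, 1))
    return rr, lo, hi

# Primary composite outcome: 62 events with azithromycin, 119 with placebo.
rr, lo, hi = relative_risk_ci(62, 1019, 119, 994)
print(f"RR = {rr:.2f} (95% CI, {lo:.2f} to {hi:.2f})")
```

With these inputs the result matches the published relative risk of 0.51 (95% CI, 0.38 to 0.68).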

azithromycin-table-3

Q: Does the addition of azithromycin to standard antibiotic prophylaxis for nonelective cesarean delivery increase the risk of serious neonatal complications? 

A: In the study by Tita et al., there was no significant between-group difference in a secondary neonatal composite outcome that included neonatal death and serious neonatal complications (14.3% vs. 13.6%, P=0.63).

Drug-Eluting or Bare-Metal Stents for Coronary Artery Disease

Posted by • September 26th, 2016

heart-stents

Mr. Patrick is a 59-year-old man admitted to the hospital with a non-ST elevation myocardial infarction (NSTEMI). You start him on aspirin, a P2Y12 inhibitor, a statin, and heparin, and discuss the need for cardiac catheterization and possible stent placement. Mr. Patrick has heard that there are different types of stents available and wants to know which one is best. How do you answer Mr. Patrick’s question?

Percutaneous coronary intervention (PCI) began with balloon angioplasty and has since evolved with the development of bare-metal stents, followed by first-generation drug-eluting stents, and now second-generation drug-eluting stents. Second-generation drug-eluting stents have been shown to be associated with a lower risk of stent restenosis than first-generation drug-eluting stents. However, the newest drug-eluting stents and bare-metal stents have been compared only in small studies with limited generalizability and in meta-analyses. Additionally, bare-metal stent technology has improved since its introduction three decades ago, with changes in metal composition and thinner struts. A large randomized trial that compares these updated bare-metal stents with contemporary drug-eluting stents in the era of medical therapy with antiplatelet agents and statins is needed.

In the Norwegian Coronary Stent Trial (NORSTENT), published in this week’s NEJM, investigators screened all patients at all eight PCI centers in Norway from 2008 to 2011, and enrolled those with lesions in native coronary arteries or coronary-artery grafts. After excluding patients with prior stent placement, limited life expectancy, and contraindications to long-term dual anti-platelet therapy, 9013 patients were randomized to receive either drug-eluting stents or bare-metal stents. Patients in both groups received aspirin (75 mg daily) indefinitely and clopidogrel (75 mg daily for 9 months) after PCI.

At 6 years of follow-up, the rate of the primary composite outcome of death from any cause and nonfatal spontaneous myocardial infarction did not differ between the drug-eluting stent group and the bare-metal stent group (16.6% vs. 17.1%, P=0.66). The rate of any revascularization, a secondary endpoint, was lower in the drug-eluting stent group (16.5% vs. 19.8%, P<0.001). Rates of definite stent thrombosis were 0.8% and 1.2%, respectively (P=0.0498). Measures of quality of life and disease-specific health status on the Seattle Angina Questionnaire did not differ between the groups.
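The revascularization difference can also be framed in absolute terms. A rough sketch using the published rates (the function name is ours, not the trial's):

```python
import math

def risk_difference(risk_a: float, risk_b: float) -> float:
    """Absolute risk difference between two event proportions."""
    return risk_a - risk_b

# NORSTENT secondary endpoint: any revascularization at 6 years,
# 19.8% with bare-metal stents vs. 16.5% with drug-eluting stents.
ard = risk_difference(0.198, 0.165)
nnt = math.ceil(1.0 / ard)  # patients treated with DES to avoid one repeat revascularization

print(f"absolute risk difference: {ard:.1%}, NNT = {nnt}")
```

On these rounded rates, roughly 31 patients would need a drug-eluting rather than a bare-metal stent to prevent one repeat revascularization over 6 years.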

In an accompanying editorial, Dr. Eric Bates from the Division of Cardiovascular Diseases at University of Michigan Medical Center writes, “The outcomes with second-generation drug-eluting stents make them preferred in most clinical situations, and recent recommendations for shorter-duration dual-antiplatelet therapy make that choice even more attractive.” But he adds, “Nevertheless, the use of bare-metal stents remains an important option for PCI in some patients,” citing previous studies showing low restenosis rates in patients with large-vessel diameters, and in those who cannot take dual anti-platelet therapy for an extended period of time due to bleeding, cost, anticipated surgery, or need for anticoagulation for another indication.

The NORSTENT study answers important questions about outcomes for patients with modern coronary artery stents. The results represent a large number of patients from across an entire country, allowing for greater generalizability than results of prior trials. John Jarcho, deputy editor at NEJM, noted that, “NORSTENT confirmed that the principal advantage of drug-eluting stents compared to bare-metal stents is the lower rate of subsequent revascularization. It also found that rates of stent thrombosis, a major concern with first-generation drug-eluting stents, are low with both newer-generation drug-eluting stents and bare-metal stents.”

Reversal of Factor Xa Inhibitor-Associated Acute Major Bleeding

Posted by • September 23rd, 2016

factor-x

Use of anticoagulants

It is estimated that slightly more than 1 in 7 strokes is due to atrial fibrillation. The use of anticoagulants reduces this risk of thromboembolism. In recent years, a number of direct oral anticoagulants (such as apixaban, rivaroxaban, and edoxaban) that inhibit activated coagulation factor X (factor Xa) have been approved for stroke prevention in patients with atrial fibrillation, for prevention and treatment of venous thromboembolism, and for management of acute coronary syndrome. These drugs have more predictable pharmacokinetics than warfarin and obviate the need for prothrombin-time monitoring and multiple dose adjustments. However, many patients and clinicians have been hesitant to adopt these drugs because no reversal agents have been available, despite data from clinical trials showing that factor Xa inhibitors are at least as efficacious as warfarin for prevention and treatment of thromboembolic complications, with a lower risk of intracranial bleeding (see N Engl J Med 2011;365:981, 2010;363:2487, 2013;369:799, 2011;365:883, 2010;363:2499, 2008;358:2765, 2013;369:2093, and 2013;369:1406).

What is the ANNEXA-4 study about?

In a multicenter, open-label, single-group study, investigators evaluated 67 patients who presented with acute major bleeding within 18 hours after the administration of a factor Xa inhibitor. The patients were treated with andexanet, a recombinant modified human factor Xa decoy protein that has been shown to reverse the inhibition of factor Xa in healthy volunteers. After receiving a bolus of andexanet followed by a 2-hour infusion of the drug, the patients were evaluated for pharmacodynamic outcomes (changes in the reduction of anti-factor Xa activity over time) and clinical outcomes (hemostasis efficacy and safety).

Who participated in the study?

The mean age of the patients was 77 years, and the predominant indication for anticoagulation was atrial fibrillation (70%). Most of the 67 patients had received rivaroxaban (n=32) or apixaban (n=31). The primary sites of acute major bleeding were the gastrointestinal tract (49%) and intracranial (42%). The time from presentation to andexanet bolus was about 5 hours.

What were the results?

After administration of the andexanet bolus, the median anti-factor Xa activity was reduced by 89% among patients receiving rivaroxaban (95% CI, 58 to 94) and by 93% among those receiving apixaban (95% CI, 87 to 94). The reduction of anti-factor Xa activity was stable during the 2-hour infusion. Four hours after the conclusion of the infusion, the median anti-factor Xa activity was reduced by 39% (rivaroxaban) and 30% (apixaban), as compared with baseline measurements.

Twelve hours after the andexanet infusion, clinical hemostasis effectiveness was judged to be excellent or good in about 80% of patients. However, thrombotic events occurred in about 18% of patients at 30-day follow-up.

So, should I worry about safety? Does this potentially change my practice?

In an accompanying editorial, Beverley Hunt (St Thomas’ Hospital, UK) and Marcel Levi (University of Amsterdam, Netherlands) state, “It is impossible to know whether andexanet had an intrinsic prothrombotic effect or whether the high rate of thrombosis was related to the absence of an antithrombotic agent in a high-risk situation, since the presence of major bleeding alone is associated with an increased subsequent rate of venous thromboembolism.”

The editorialists believe that the actual need for an antidote is likely to be small: “Because the half-lives of direct oral anticoagulants are shorter than that of warfarin, the effects of the drugs wear off quickly, and unlike the case with warfarin, stopping the drug may be all that is required in most scenarios.”

What is my takeaway?

In patients with acute major bleeding associated with factor Xa inhibitor use, an initial bolus and 2-hour infusion of andexanet rapidly reduced anti-factor Xa activity, with effective hemostasis in the vast majority of patients. It is reassuring to have an effective antidote to the factor Xa inhibitors. Additional work will be necessary to sort out the optimal duration of reversal in patients with an underlying increased clotting risk.

Primary Sclerosing Cholangitis

Posted by • September 22nd, 2016

cholangitis-header

Primary sclerosing cholangitis is an idiopathic, heterogeneous, cholestatic liver disease that is characterized by persistent, progressive biliary inflammation and fibrosis. There is no effective medical therapy for this condition. End-stage liver disease necessitating liver transplantation may ultimately develop in affected patients. This new Review Article summarizes the pathogenesis and management of this condition.

Clinical Pearl

• How does primary sclerosing cholangitis typically present, and how is it diagnosed?

There are several subtypes of primary sclerosing cholangitis. The classic subtype, which involves the entire biliary tree, is present in approximately 90% of patients with primary sclerosing cholangitis. Primary sclerosing cholangitis is insidious; about half the patients with this condition do not have symptoms but receive a diagnosis after liver-function tests are found to be abnormal. When symptoms are present, abdominal pain (in 20% of patients), pruritus (in 10%), jaundice (in 6%), and fatigue (in 6%) predominate. Diagnostic criteria include an increased serum alkaline phosphatase level that persists for more than 6 months, cholangiographic findings of bile-duct strictures detected by means of either magnetic resonance cholangiopancreatography (MRCP) or endoscopic retrograde cholangiopancreatography (ERCP), and exclusion of causes of secondary sclerosing cholangitis. A liver biopsy is not necessary for diagnosis unless small-duct primary sclerosing cholangitis or an overlap with autoimmune hepatitis is suspected.

cholangitis-table-2

cholangitis-table-1

Clinical Pearl

• What conditions are associated with primary sclerosing cholangitis?

A variety of coexisting conditions are associated with primary sclerosing cholangitis. Because inflammatory bowel disease (ulcerative colitis more often than Crohn’s disease) occurs in most patients with primary sclerosing cholangitis, colonoscopy is warranted in all patients who have received a new diagnosis. The risk of colon cancer among patients with primary sclerosing cholangitis and concomitant inflammatory bowel disease is four times as high as the risk among patients with inflammatory bowel disease alone and 10 times as high as the risk in the general population. Gallbladder disease (stones, polyps, and cancer) is common in patients with primary sclerosing cholangitis. In developed countries, primary sclerosing cholangitis is the most common risk factor for cholangiocarcinoma. Indeed, the risk of cholangiocarcinoma among patients with primary sclerosing cholangitis is 400 times as high as the risk in the general population.

Morning Report Questions

Q: Do treatment guidelines recommend ursodeoxycholic acid for primary sclerosing cholangitis?

A: Ursodeoxycholic acid has been widely studied as a therapy for primary sclerosing cholangitis. In one randomized, double-blind, placebo-controlled trial, patients who received ursodeoxycholic acid had decreased levels of serum liver enzymes, but they did not have higher rates of survival than the rates among patients who received placebo. In another randomized, double-blind, placebo-controlled trial, the risk of the primary end point (death, liver transplantation, minimal listing criteria for liver transplantation, cirrhosis, esophageal or gastric varices, and cholangiocarcinoma) was 2.3 times as high among patients who received high-dose ursodeoxycholic acid (at a dose of 25 mg per kilogram of body weight) as among those who received placebo (P<0.01). Thus, treatment guidelines for primary sclerosing cholangitis are conflicting: the American Association for the Study of Liver Diseases and the American College of Gastroenterology do not support the use of ursodeoxycholic acid, whereas the European Association for the Study of the Liver endorses the use of moderate doses (13 to 15 mg per kilogram). Several new treatments are being assessed in ongoing clinical trials.

Q: What percentage of patients with primary sclerosing cholangitis will eventually require liver transplantation?

A: Because of the progressive nature of primary sclerosing cholangitis, approximately 40% of patients with this disease will ultimately require liver transplantation. In fact, primary sclerosing cholangitis was the indication for approximately 6% of liver transplantations performed in the United States from 1988 through 2015. After liver transplantation for primary sclerosing cholangitis, the 1-year survival rate is approximately 85% and the 5-year survival rate is approximately 72%. Nevertheless, the disorder may recur in approximately 25% of patients after transplantation.

Craniectomy for Traumatic Intracranial Hypertension

Posted by • September 22nd, 2016

craniectomy-header

Intracranial hypertension after traumatic brain injury (TBI) is associated with an increased risk of death in most case series. The monitoring of intracranial pressure and the administration of interventions to lower intracranial pressure are routinely used in patients with TBI, despite the lack of level 1 evidence. Hutchinson et al. conducted the Randomised Evaluation of Surgery with Craniectomy for Uncontrollable Elevation of Intracranial Pressure (RESCUEicp) trial to assess the effectiveness of craniectomy as a last-tier intervention in patients with TBI and refractory intracranial hypertension. Evidence from the new Original Article shows that decompressive craniectomy resulted in lower mortality but higher rates of vegetative state and severe disability than medical management.

Clinical Pearl

• What different types of craniectomy have been used to treat traumatic intracranial hypertension?

Decompressive craniectomy is a surgical procedure in which a large section of the skull is removed and the underlying dura mater is opened. Primary decompressive craniectomy refers to leaving a large bone flap out after the evacuation of an intracranial hematoma in the early phase after a TBI. A secondary decompressive craniectomy is used as part of tiered therapeutic protocols that are frequently used in intensive care units in order to control raised intracranial pressure and to ensure adequate cerebral perfusion pressure after TBI.

Clinical Pearl

• What has been learned from randomized trials that assessed craniectomy as an early intervention for traumatic intracranial hypertension?

In the Decompressive Craniectomy (DECRA) trial, patients who had an intracranial pressure of more than 20 mm Hg for more than 15 minutes (continuously or intermittently) within a 1-hour period, despite optimized first-tier interventions, were randomly assigned to early bifrontal decompressive craniectomy and standard care or to standard care alone. The authors found that decompressive craniectomy was associated with more unfavorable outcomes than standard care alone.

Morning Report Questions

Q: What are the eight outcome categories of the Extended Glasgow Outcome Scale (GOS-E)?

A: In the study by Hutchinson et al., the primary-outcome measure was assessed with the use of the Extended Glasgow Outcome Scale (GOS-E) at 6 months after randomization. The GOS-E is a global outcome scale assessing functional independence, work, social and leisure activities, and personal relationships. Its eight outcome categories are as follows: death, vegetative state (unable to obey commands), lower severe disability (dependent on others for care), upper severe disability (independent at home), lower moderate disability (independent at home and outside the home but with some physical or mental disability), upper moderate disability (independent at home and outside the home but with some physical or mental disability, with less disruption than lower moderate disability), lower good recovery (able to resume normal activities with some injury-related problems), and upper good recovery (no problems).
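The eight categories map naturally onto an ordinal 1-to-8 scale, which is how the GOS-E is conventionally coded. A minimal sketch in Python (the enum and function names are illustrative, not from the article; the dichotomization shown is one common convention, treating upper severe disability or better as favorable):

```python
from enum import IntEnum

class GOSE(IntEnum):
    """Extended Glasgow Outcome Scale, conventionally coded 1 (worst) to 8 (best)."""
    DEATH = 1
    VEGETATIVE_STATE = 2            # unable to obey commands
    LOWER_SEVERE_DISABILITY = 3     # dependent on others for care
    UPPER_SEVERE_DISABILITY = 4     # independent at home
    LOWER_MODERATE_DISABILITY = 5
    UPPER_MODERATE_DISABILITY = 6
    LOWER_GOOD_RECOVERY = 7
    UPPER_GOOD_RECOVERY = 8

def is_favorable(score: GOSE) -> bool:
    """One common dichotomization: upper severe disability or better."""
    return score >= GOSE.UPPER_SEVERE_DISABILITY
```

Because `IntEnum` members compare as integers, outcome distributions from a trial can be sorted, dichotomized, or analyzed ordinally without a separate lookup table.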

Q: What clinical outcomes are associated with craniectomy performed as a last-tier intervention for refractory traumatic intracranial hypertension? 

A: In the RESCUEicp trial, the authors found that the rate of death at 6 months was 26.9% in the surgical group and 48.9% in the medical group. The rate of vegetative state was 8.5% versus 2.1%; the rate of lower severe disability (dependent on others for care), 21.9% versus 14.4%; the rate of upper severe disability (independent at home), 15.4% versus 8.0%; the rate of moderate disability, 23.4% versus 19.7%; and the rate of good recovery, 4.0% versus 6.9%. At 12 months after randomization, 30.4% of the patients in the surgical group had died, as compared with 52.0% in the medical group. The rate of vegetative state was 6.2% in the surgical group versus 1.7% in the medical group; the rate of lower severe disability, 18.0% versus 14.0%; the rate of upper severe disability, 13.4% versus 3.9%; the rate of moderate disability, 22.2% versus 20.1%, and the rate of good recovery, 9.8% versus 8.4%.

craniectomy-fig-2

craniectomy-fig-1

craniectomy-table-1

A 31-Year-Old Woman with Infertility

Posted by • September 15th, 2016

infertility header

Tuberculous endometrial granulomas take time to caseate; in women of reproductive age, who regularly shed their endometrial lining, the granulomas may be shed before caseation has had the opportunity to develop. In older women, who have longer cycles or no cycles at all, caseating granulomas are more likely to develop in the endometrium. A 31-year-old Nepalese woman presented with primary infertility. Two cycles of in vitro fertilization had been unsuccessful. A hysterosalpingogram showed abnormal narrowing and outpouching of the distal fallopian tubes. Additional diagnostic procedures were performed in a new Case Record.

Clinical Pearl

• When granulomas are found in the endometrium, what is the most likely diagnosis?

When granulomas are found in the endometrium, tuberculosis (most commonly due to Mycobacterium tuberculosis) must be considered to be the most likely cause.

infertility table 2

infertility fig 2

Clinical Pearl

• The endometrium is affected in what percentage of patients with genital tuberculosis?

The endometrium is affected in 50 to 75% of patients with genital tuberculosis. The infection is thought to spread through the blood, or possibly through the lymphatics, from the site of primary infection to the fallopian tubes, and from there it seeds the endometrium through direct drainage.

Morning Report Questions

Q: What changes may be seen on hysterosalpingography in women with genital tuberculosis? 

A: Salpingitis isthmica nodosa–like changes and tubal occlusion have been found on hysterosalpingography in women with genital tuberculosis. The involvement of the genital tract can be protean. Loss of tubal epithelial architecture due to infection (which results in a “pipe stem” appearance) and the presence of intraluminal filling defects possibly due to granulomas (which result in a “leopard skin” pattern) are suggestive of genital tuberculosis.

infertility fig 1

Q: How is genital tuberculosis diagnosed and treated?

A: Genitourinary tuberculosis is most often paucibacillary, and culture and other diagnostic tests (including nucleic acid testing) provide higher sensitivity than do special stains for acid-fast organisms. Short-course chemotherapy for at least 6 to 9 months has been associated with a risk of recurrent disease of less than 10%. Despite the administration of appropriate treatment, overall rates of live birth are still lower among women who have had genitourinary tuberculosis than in the general population.

Herpes Zoster Subunit Vaccine in Adults 70 Years of Age or Older

Posted by • September 15th, 2016

herpes header

An investigational herpes zoster subunit vaccine (HZ/su) is being evaluated for the prevention of herpes zoster and postherpetic neuralgia in adults 50 years of age or older. A previous trial (Zoster Efficacy Study in Adults 50 Years of Age or Older [ZOE-50]) showed that HZ/su had a vaccine efficacy against herpes zoster of 97.2%, which was consistent across all age groups. Although 24% of the participants in ZOE-50 were 70 years of age or older, the trial was not intended to definitively assess vaccine efficacy against herpes zoster or postherpetic neuralgia in this age group. Therefore, in parallel with ZOE-50, Cunningham et al. conducted a second trial involving only adults who were 70 years of age or older (Zoster Efficacy Study in Adults 70 Years of Age or Older [ZOE-70]) to assess vaccine efficacy against herpes zoster in this population; the authors also estimated vaccine efficacy against postherpetic neuralgia in the combined population (i.e., from ZOE-50 plus ZOE-70) of adults 70 years of age or older and adults 50 years of age or older. The results of this trial can be found in a new Original Article.

Clinical Pearl

• How is the HZ/su vaccine administered?

In the ZOE-70 trial, vaccine or placebo (0.9% saline solution) was administered as a 0.5-ml injection into the deltoid muscle at month 0 and month 2.

Clinical Pearl

• What is the efficacy of the HZ/su vaccine among adults 70 years of age and older?

In the ZOE-70 total vaccinated cohort, 432 suspected episodes of herpes zoster were reported, 270 of which were confirmed as herpes zoster. Of the 270 confirmed cases, 246 occurred in the modified vaccinated cohort (the primary cohort for the efficacy analysis): 23 in HZ/su recipients and 223 in placebo recipients, after a mean follow-up period of 3.7 years. The incidence of herpes zoster per 1000 person-years was 0.9 in the HZ/su group and 9.2 in the placebo group, for an overall vaccine efficacy of 89.8% (95% confidence interval [CI], 84.2 to 93.7; P<0.001). In the pooled analysis of participants 70 years of age or older from ZOE-70 and ZOE-50, a total of 25 confirmed cases of herpes zoster occurred in HZ/su recipients, as compared with 284 cases in placebo recipients, which resulted in a vaccine efficacy of 91.3% against herpes zoster (95% CI, 86.8 to 94.5).
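Vaccine efficacy here is one minus the ratio of incidence rates in the vaccinated and placebo groups. A sketch using the rounded published rates (0.9 vs. 9.2 cases per 1000 person-years); note that rounding of the inputs gives roughly 90.2%, slightly off the trial's exact 89.8%, which was computed from the underlying person-time data:

```python
def vaccine_efficacy(rate_vaccinated: float, rate_placebo: float) -> float:
    """VE = 1 - incidence-rate ratio, expressed as a fraction."""
    return 1.0 - rate_vaccinated / rate_placebo

# ZOE-70: herpes zoster incidence per 1000 person-years in each group.
ve = vaccine_efficacy(0.9, 9.2)
print(f"vaccine efficacy from rounded rates: {ve:.1%}")
```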

herpes table 1

herpes table 2

Morning Report Questions

Q: Is the efficacy of HZ/su different for those between 70 and 79 years of age as compared to those 80 years of age or older?

A: Vaccine efficacy against herpes zoster in the pooled analysis was very similar in the two age groups studied (91.3% among participants 70 to 79 years of age and 91.4% among participants ≥80 years of age), which indicated that there was no decline in efficacy with age. This finding is consistent with the results of ZOE-50, in which vaccine efficacy against herpes zoster was found to be similar in all age groups (50 to 59, 60 to 69, and ≥70 years of age), but it contrasts with the efficacy of the approved live attenuated vaccine (Zostavax), which was found to decline with increasing age: 70% in adults 50 to 59 years of age, 64% in adults 60 to 69 years of age, 41% in adults 70 to 79 years of age, and 18% in adults 80 years of age or older.

Q: How does the HZ/su vaccine affect rates of postherpetic neuralgia?

A: In the analyses by Cunningham et al., the incidence of postherpetic neuralgia among HZ/su recipients with breakthrough herpes zoster did not differ significantly from that among placebo recipients (12.5% and 9.6%, respectively; P=0.54). Protection against postherpetic neuralgia appeared to be driven by the lower incidence of herpes zoster (91.3% vaccine efficacy against herpes zoster vs. 88.8% vaccine efficacy against postherpetic neuralgia in the pooled population of adults ≥70 years of age); there is no evidence for additional efficacy against postherpetic neuralgia among HZ/su recipients with breakthrough herpes zoster.

Adalimumab in Patients with Active Non-Infectious Uveitis

Posted by • September 8th, 2016

adalimumab

Click to enlarge

A 44-year-old woman with type 2 diabetes mellitus and sarcoidosis presents with a 3-week history of severe visual disturbance in both eyes. After referral to an ophthalmologist, she receives a diagnosis of idiopathic posterior uveitis. Her symptoms improve during a 5-month course of oral steroids. After completing steroid treatment, however, her glycemic control is compromised, and she experiences symptoms suggestive of peripheral myopathy. Soon after this first episode of uveitis and the cessation of steroids, she returns to see you with a recurrence of the same visual symptoms. She is concerned both about the risks associated with continued steroid exposure and about achieving resolution of her visual symptoms. How would you advise her?

Uveitis can threaten vision and is an established cause of blindness. It is characterized by intraocular inflammation, which can be due to infectious causes or isolated ocular syndromes or can be associated with systemic conditions such as Behçet's disease. Oral steroids are beneficial for patients with uveitis; however, as in this case, a prolonged course can lead to systemic adverse effects. Tumor necrosis factor alpha may play a role in inflammation of the uvea; therefore, adalimumab, an inhibitor of tumor necrosis factor alpha, is a therapeutic candidate. Adalimumab has been approved for the treatment of numerous conditions, including rheumatoid arthritis, inflammatory bowel disease, and psoriatic arthritis.

In this week’s issue of NEJM, Jaffe and colleagues compared the efficacy of adalimumab with that of placebo in 223 patients with active, vision-threatening, non-infectious uveitis in a phase 3 randomized, controlled, double-blind study conducted in 18 countries. Patients in the adalimumab group received an 80-mg loading dose, followed by 40 mg every other week by subcutaneous injection for the duration of the study. At study entry, all patients received a burst of steroids that was tapered over time and stopped after 15 weeks. The primary end point was the time to treatment failure at or after week 6. Treatment failure was defined as a new inflammatory lesion in the eye relative to baseline, worsening of measures of intraocular inflammation (e.g., anterior chamber cell grade or vitreous haze grade), or worsening of best corrected visual acuity.

Among the 217 patients included in the final analysis (110 patients in the adalimumab group and 107 in the placebo group), the median time to treatment failure was significantly longer in the adalimumab group than in the placebo group (24 vs. 13 weeks; hazard ratio, 0.50; 95% CI, 0.36 to 0.70; P<0.001). Treatment failures in the adalimumab group occurred after steroid cessation.

However, adalimumab was associated with more adverse events and serious adverse events than placebo, and more adalimumab recipients than placebo recipients discontinued treatment (18 vs. 7). Adverse events such as fatigue, blurred vision, and suicidal ideation were the main reasons for leaving the trial prematurely among those treated with adalimumab. Serious adverse events in the adalimumab group included two cases of malignancy (glioblastoma multiforme and carcinoid of gastrointestinal origin) and one report of tuberculosis.

In this study, treatment with adalimumab led to early and sustained disease control after the cessation of corticosteroids in patients with active non-infectious intermediate uveitis, posterior uveitis, or panuveitis. The results suggest that adalimumab is a therapeutic option for patients who are concerned about prolonged treatment with corticosteroids; however, the risk of adverse events needs to be reviewed carefully with patients.

Acute Sinusitis in Adults

Posted by • September 8th, 2016

sinusitis fig 1

Click to enlarge

Acute bacterial sinusitis — involving purulent nasal discharge and nasal obstruction; facial pain, pressure, or fullness; or both — persists for 10 days or more with no improvement or worsens within 10 days after improvement. However, the natural history of acute sinusitis in adults is very favorable; approximately 85% of persons have a reduction or resolution of symptoms within 7 to 15 days without antibiotic therapy. Nonetheless, antibiotics are prescribed for 84 to 91% of patients with acute sinusitis that is diagnosed in emergency departments and outpatient settings. Watchful waiting and antibiotic therapy are described in a new Clinical Practice article.

Clinical Pearl

• How do temporal patterns of illness help distinguish between viral and bacterial sinusitis in adults?

The temporal pattern of a typical upper respiratory tract infection can be used as a proxy for acute viral sinusitis because nearly 90% of patients with colds have inflammation that extends to the mucous membranes in the paranasal sinuses. Viral upper respiratory symptoms generally peak rapidly, decline by the third day of illness, and end after 1 week, although in 25% of patients the symptoms last longer than a week while continuing to decrease. In contrast, acute bacterial sinusitis persists for 10 days or longer without improvement or, less often, manifests with worsening of symptoms in the first 10 days after initial improvement, in a double-worsening pattern.

Clinical Pearl

• What is the role of imaging in adults with acute sinusitis?

Purulent nasal discharge is associated with an increased likelihood of bacteria in the maxillary sinus and of radiographic evidence of acute sinusitis. However, neither this finding nor other individual signs or symptoms (e.g., fever or facial pain) can be used to accurately distinguish between bacterial and viral infection. Similarly, findings on plain radiographs and computed tomography cannot be used to distinguish between these two types of infection. Imaging studies are reserved for patients with suspected orbital or intracranial complications.

Morning Report Questions

Q: How does one choose between watchful waiting and antibiotic therapy for the management of acute bacterial sinusitis?

A: Guidelines differ regarding watchful waiting in patients with acute bacterial sinusitis. Because some randomized trials included patients who had been ill for less than 10 days and who were therefore likely to have viral sinusitis, substantial uncertainty remains about which patients might benefit most from initial antibiotic therapy rather than watchful waiting. This uncertainty is compounded by restrictive inclusion criteria in many trials, which excluded patients who were pregnant and those with diabetes or other coexisting conditions. There is also uncertainty about the course and relative incidence of suppurative complications among patients with acute bacterial sinusitis who do not receive antibiotic therapy, as compared with those who do, since many trials included patients with viral sinusitis and excluded patients with severe illness, prolonged symptoms, or disease beyond the maxillary sinuses.

Systematic reviews of placebo-controlled trials generally show a significantly higher rate of clinical improvement at 7 to 15 days (the primary outcome in most trials) with antibiotic therapy than with placebo, but the differences between groups are small: success rates range from 77 to 88% with antibiotic therapy and from 73 to 85% with placebo. Accordingly, the numbers needed to treat with antibiotics (vs. placebo) for 1 patient to have clinical improvement are high (7 to 18). The potential benefits of antibiotic therapy must be balanced against adverse effects, which may include allergic reactions and the emergence of drug-resistant bacteria. The numbers needed to harm (i.e., the number of patients who would have to receive antibiotics for one adverse effect to occur) range from 8 to 12, indicating that adverse effects from antibiotics are as likely as, or more likely than, benefits.
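The number-needed-to-treat figures quoted above follow from the absolute risk reduction between the antibiotic and placebo groups. A minimal sketch, using illustrative rates chosen from within the quoted ranges rather than data from any single trial:

```python
# Sketch: number needed to treat (NNT) is the reciprocal of the absolute
# risk reduction (ARR). The rates below are illustrative values picked
# from within the ranges quoted above, not figures from any one trial.
import math

def nnt(rate_treatment: float, rate_control: float) -> int:
    """Patients who must be treated for one additional favorable outcome."""
    arr = rate_treatment - rate_control  # absolute risk reduction
    return math.ceil(1.0 / arr)

# e.g., 83% improvement with antibiotics vs. 77% with placebo
print(nnt(0.83, 0.77))  # -> 17, within the reported range of 7 to 18
```

The same reciprocal applies to the number needed to harm, computed from the absolute difference in adverse-effect rates; when the NNH (8 to 12) is smaller than the NNT (7 to 18), harm is at least as likely as benefit.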

Q: What antibiotics are used to treat acute bacterial sinusitis?

A: Although up to 90% of patients with viral upper respiratory tract infections have concurrent acute viral sinusitis, only 0.5 to 2.0% have sinusitis that progresses to acute bacterial sinusitis. The most common pathogens in adults with acute bacterial sinusitis are Streptococcus pneumoniae, Haemophilus influenzae, Moraxella catarrhalis, and Staphylococcus aureus. Amoxicillin is the antibiotic most commonly assessed in placebo-controlled trials. Trials of the comparative efficacy of antibiotics have evaluated cefuroxime axetil, amoxicillin–clavulanate, levofloxacin, moxifloxacin, and clarithromycin; no differences in efficacy among these agents have been reported, probably because of the high rate of spontaneous improvement and the noninferiority design of most trials. Comparative trials of amoxicillin versus amoxicillin–clavulanate are lacking; the argument for the use of amoxicillin–clavulanate is based on patterns of bacterial resistance. Current guidelines caution against the use of clarithromycin or azithromycin because of macrolide-resistant S. pneumoniae. In May 2016, a Food and Drug Administration advisory recommended that fluoroquinolone antibiotics (levofloxacin and moxifloxacin) be reserved for patients who have no alternative treatment options, because their potentially serious side effects can involve the tendons, muscles, joints, nerves, and central nervous system.

Pregnant women may have nasal vascular engorgement (rhinitis of pregnancy) that can mimic acute sinusitis, which makes accurate diagnosis important. Acceptable antibiotics for the treatment of sinusitis in pregnant women include amoxicillin, amoxicillin–clavulanate, and, in patients who are allergic to penicillin (if the hypersensitivity is not immediate [type I]), clindamycin plus cefixime or cefpodoxime. Patients with diabetes or other conditions that compromise the immune system are more likely than other patients to harbor resistant bacteria, and they should receive amoxicillin–clavulanate.

sinusitis table 1

Click to enlarge

sinusitis table 2

Click to enlarge