A Woman with a Skin Ulcer

Posted by Carla Rothaus • February 12th, 2016

Ulceroglandular tularemia is one of several clinical presentations of Francisella tularensis infection. In patients with ulceroglandular tularemia, an ulcer develops at the site of inoculation and is followed by fever, systemic symptoms, and regional lymphadenopathy. Although tularemia can occur year-round, it predominantly occurs during the summer months. Tularemia is quite rare; in the United States, approximately 200 cases per year are reported to the Centers for Disease Control and Prevention.

A 58-year-old woman presented during the summer with fever, neck swelling, and an ulcerated lesion on the forehead. Imaging of the neck showed enlarged, centrally hypodense lymph nodes with infiltration of the surrounding fat. Tests were performed, and a diagnosis was made. A new Case Record summarizes.

Clinical Pearl

• What is included in the differential diagnosis for a patient with an ulcerated skin lesion and lymphadenopathy?

Bubonic plague, caused by Yersinia pestis, is endemic in the southwestern United States. The incubation period for this pathogen is 2 to 8 days, and untreated infection leads to rapid death from septic shock. Patients with rickettsialpox, caused by Rickettsia akari, typically present with a poxlike lesion that contains a central eschar, as well as a characteristic papulovesicular rash on the trunk, arms, or legs that occurs within a week after symptom onset. In patients with cat scratch disease, caused by Bartonella henselae infection, the inoculation site (usually a bite or scratch by an infected cat) rarely has ulceration. The presentation of cat scratch disease is dominated by lymphadenopathy and fever. Primary cutaneous Mycobacterium tuberculosis infection is acquired through direct entry of the pathogen through the skin and causes an ulcerated skin lesion with lymphadenopathy that develops slowly, over 3 to 4 weeks. Given the association of Bacillus anthracis with bioterrorism, it is important to remain familiar with the clinical features of cutaneous anthrax and to at least consider the diagnosis in patients who present with an ulcerated skin lesion, possible eschar, and lymphadenopathy. There has been no reported bioterrorism activity involving anthrax in the United States since 2001. Outside of a bioterrorism event, the diagnosis of cutaneous anthrax is unlikely in the absence of a history of international travel or exposure to animal products from areas where the disease is endemic, such as Africa, Pakistan, and Iraq.

Clinical Pearl

• Tularemia is most common in what region of the United States?

Tularemia most commonly occurs in the south central United States. However, rabbits from the south central United States were imported to the island of Martha’s Vineyard for hunting in 1937. Since then, the 5 to 10 cases of tularemia reported each year in Massachusetts have occurred almost exclusively on Martha’s Vineyard, where tularemia is endemic in the rabbit and tick populations.

Morning Report Questions

Q: How is tularemia diagnosed?

A: F. tularensis is a fastidious microorganism, and because infection can occur after exposure to a low dose and through inhalation, special laboratory safety precautions are required; for these reasons, most diagnoses are made serologically. A definitive serologic diagnosis of tularemia requires either a fourfold increase in the specific antibody titer between acute- and convalescent-phase samples obtained at least 14 days apart, or a single titer higher than 1:80 in the absence of a known previous exposure. In cases of ulceroglandular tularemia in which the test for antibodies during the acute phase is nondiagnostic, a culture of the ulcer specimen can provide the most rapid diagnosis; however, this technique is not sensitive. Rapid inoculation of medium and extended incubation may increase the chances of recovering the microorganism in culture. Some public health and reference laboratories also offer polymerase-chain-reaction (PCR) testing for the rapid detection of F. tularensis DNA in clinical specimens, but such testing is not widely available, and the clinical sensitivity of PCR testing for the diagnosis of tularemia has not been well characterized.
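
For readers who like to see criteria stated precisely, the two serologic rules above can be captured in a few lines of code. The following Python sketch is purely illustrative (the function name, the reciprocal-titer convention, and the example dates are our own; it is not a clinical tool):

from datetime import date

def tularemia_serology_positive(acute_titer, convalescent_titer,
                                acute_date, convalescent_date,
                                known_prior_exposure=False):
    """Restate the two serologic criteria described above.
    Titers are reciprocal dilutions (1:80 -> 80)."""
    days_apart = (convalescent_date - acute_date).days
    # Criterion 1: fourfold rise between samples drawn >= 14 days apart.
    fourfold_rise = days_apart >= 14 and convalescent_titer >= 4 * acute_titer
    # Criterion 2: a single titer above 1:80 without known prior exposure.
    single_high_titer = convalescent_titer > 80 and not known_prior_exposure
    return fourfold_rise or single_high_titer

# Example: a titer rising from 1:40 to 1:320 over three weeks meets criterion 1.
print(tularemia_serology_positive(40, 320, date(2016, 6, 1), date(2016, 6, 22)))  # True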

Q: What is the recommended treatment for tularemia?

A: The treatment of choice for moderate-to-severe cases of tularemia is an antibiotic agent in the aminoglycoside class, whereas milder cases can be treated with oral agents, including fluoroquinolones (typically ciprofloxacin) or tetracyclines (typically doxycycline), often in combination. These recommendations are supported by a number of recent case series. Among the aminoglycosides, intramuscular streptomycin is preferred; intravenous gentamicin is considered to be an acceptable alternative if streptomycin is unavailable. Some instances of treatment failure with gentamicin have been reported, and rescue therapy with streptomycin has been successful. As patients with moderate-to-severe tularemia improve with the administration of an aminoglycoside, transition to oral therapy with either a fluoroquinolone or a tetracycline is common.

Figure 2. Clinical Photographs.

Urinary Tract Infections in Older Men

Posted by Carla Rothaus • February 12th, 2016

Urinary tract infection in men without indwelling catheters is uncommon among men younger than 60 years of age, but the incidence increases substantially among men 60 years of age or older.

Effective treatment of urinary tract infection in men requires determining whether the infection site is the kidney, bladder, or prostate; the duration and choice of therapy vary with presentation. Chronic bacterial prostatitis requires prolonged antimicrobial therapy. A new Clinical Practice article summarizes.

Clinical Pearl

• What organisms are commonly responsible for urinary tract infections in older men?

A gram-negative organism is isolated from 60 to 80% of samples from older men living in the community who have urinary tract infections. Escherichia coli is the most common organism; other Enterobacteriaceae such as Klebsiella pneumoniae and Proteus mirabilis are isolated less frequently. Enterococcus species are the most common gram-positive organisms. In men without indwelling urinary catheters who live in an institution and who have bacteriuria, E. coli is also the most common pathogen isolated, but P. mirabilis, Pseudomonas aeruginosa, and multidrug-resistant strains are increasingly frequent.

Clinical Pearl

• Does a urinary tract infection in an older man require a workup of the upper and lower urinary tracts?

For patients with a first urinary tract infection, evaluation of the upper and lower urinary tract is recommended, given the high prevalence of urologic abnormalities among men who present with urinary tract infection. Residual urine volume should be assessed by means of noninvasive ultrasonography. Patients with fever should have immediate assessment of the upper urinary tract by means of computed tomography (CT) with the use of contrast material or by means of renal ultrasonography to rule out obstruction or other abnormalities requiring source control.

Morning Report Questions

Q: What antibiotics are recommended for the treatment of urinary tract infection?

A: Culture of a urine specimen is essential for the management of suspected urinary tract infection. To limit the overtreatment of asymptomatic bacteriuria, urine specimens should be obtained only from men who have symptoms or signs that are potentially attributable to urinary tract infection. Antimicrobial treatment is selected on the basis of the clinical presentation, known or suspected infecting organism and susceptibilities, the side effect profile of the medication, and renal function. Agents with high levels of urinary excretion should be used. For cystitis, first-line therapies include nitrofurantoin, trimethoprim–sulfamethoxazole, and ciprofloxacin or levofloxacin, typically administered for 7 days. Nitrofurantoin is effective for the treatment of cystitis but has limited tissue penetration and is not effective for the treatment of pyelonephritis or bacterial prostatitis. Initial therapy of acute pyelonephritis is usually with ciprofloxacin or levofloxacin, ceftriaxone, or gentamicin. The duration of treatment is generally 7 to 14 days. A follow-up urine culture is not recommended unless symptoms persist or recur after therapy.
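
Because the recommendations above amount to a small decision table, here is one way to restate them in code. This is merely a reading aid transcribed from the paragraph above (the dictionary name and structure are our own), not clinical guidance:

# First-line regimens as described above; durations in days.
FIRST_LINE_THERAPY = {
    "cystitis": {
        "agents": ["nitrofurantoin", "trimethoprim-sulfamethoxazole",
                   "ciprofloxacin or levofloxacin"],
        "duration_days": (7, 7),
    },
    # Nitrofurantoin is excluded here: it penetrates tissue poorly and is
    # not effective for pyelonephritis or bacterial prostatitis.
    "acute pyelonephritis": {
        "agents": ["ciprofloxacin or levofloxacin", "ceftriaxone", "gentamicin"],
        "duration_days": (7, 14),
    },
}

for syndrome, plan in FIRST_LINE_THERAPY.items():
    low, high = plan["duration_days"]
    print(f"{syndrome}: {', '.join(plan['agents'])} for {low}-{high} days")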

Q: What is the standard of care for bacterial prostatitis?

A: Acute bacterial prostatitis should be treated empirically with broad-spectrum parenteral antibiotics such as extended-spectrum penicillins, ceftriaxone with or without the addition of an aminoglycoside, or a fluoroquinolone. Inappropriate therapy can lead to rapid progression and even death. Approximately one quarter of patients with acute bacterial prostatitis have bacteremia, and 5 to 10% may have associated abscesses in the prostate. Routine ultrasonography of the prostate to identify a potential prostatic abscess is not recommended for patients who have a prompt response to antimicrobial therapy. Difficulty urinating is frequently present, and alpha-blocker therapy may be considered; some patients temporarily require catheterization. Therapy should be tailored to the specific organism that has been isolated and should be continued so that a 4-week course of therapy (including both the parenteral and oral phases) is completed. Chronic bacterial prostatitis develops after acute infection in approximately 5% of men. Chronic bacterial prostatitis is usually treated with a fluoroquinolone or trimethoprim–sulfamethoxazole for 30 days. A fluoroquinolone is usually first-line therapy; levofloxacin and ciprofloxacin are equally effective.

Table 1. Antimicrobial Therapy for the Treatment of Urinary Tract Infection and Prostatitis in Men.

Adjunctive Steroids and Harm in Cryptococcal Meningitis

Posted by Joshua Allen-Dicker • February 10th, 2016

In the recent best-selling book and award-nominated movie, The Martian, astronaut and botanist Mark Watney is stranded alone on Mars.  The story follows his attempts to defy certain death and, through creativity and scientific experimentation, use his limited resources to generate oxygen, grow food, and make it home to Earth.  Recently the medical community has found itself in a similar, albeit less Oscar-worthy, situation: there are almost 1 million cases of HIV-associated cryptococcal meningitis per year, with high rates of associated death and morbidity.  With limited treatment options and years until new therapies may be available, front-line clinicians have been challenged to think creatively.

In this week’s NEJM, Beardsley et al. report on a double-blind, randomized, placebo-controlled trial that attempts to do just that.  Inspired by prior studies demonstrating a clinical benefit of adjunctive steroids for acute bacterial meningitis and tuberculous meningitis, Beardsley et al. hypothesized that dexamethasone might benefit patients undergoing treatment for cryptococcal meningitis.  Unfortunately, the study was stopped early when preliminary analysis raised concerns that dexamethasone use was associated with clinical harm.

Beginning in February 2013, study sites in Indonesia, Laos, Thailand, Vietnam, Malawi, and Uganda enrolled adults with HIV, clinical concern for cryptococcal meningitis, and either (1) a positive CSF India ink stain, (2) a positive CSF or blood culture for Cryptococcus, or (3) a positive CSF cryptococcal antigen.

Two hundred twenty-four patients were randomized to the dexamethasone group and 227 to the placebo group. In addition to 2 weeks of directly observed intravenous amphotericin and fluconazole followed by 8 weeks of oral consolidation therapy, study participants received adjunctive dexamethasone or placebo for 6 weeks.

Following early study cessation at 22 months, subsequent data analysis did not reveal a significant difference in mortality between the groups at 10 weeks or 6 months, but did demonstrate several concerning findings among the secondary and safety outcomes.  There was a significant association between dexamethasone therapy and decreased rates of CSF cryptococcal clearance (p<0.001) and increased rates of a composite of death or disability at 10 weeks and 6 months (p<0.001, p<0.002).  Additionally, the dexamethasone group had significantly increased rates of clinical adverse events (p=0.01), including rates of AIDS-defining illnesses, infections, and gastrointestinal, renal/urinary and cardiac disorders.

The findings reported by Beardsley et al. provide evidence against the universal use of adjunctive dexamethasone in the treatment of HIV-associated cryptococcal meningitis.  However, this is not the end of the story for those clinicians on a mission to find better ways to treat this serious infection.  A brief review of clinicaltrials.gov revealed several ongoing studies of new approaches to therapy that utilize existing medications.  Additionally, there is ongoing work to improve access to current evidence-based antifungals.  Rather than discourage us, the results of Beardsley et al. reinforce the importance of pragmatic creativity in medicine. As protagonist Mark Watney says, “I guess you could call it a ‘failure,’ but I prefer the term ‘learning experience.’”

Amoxicillin for Severe Acute Malnutrition

Posted by Carla Rothaus • February 5th, 2016

Severe acute malnutrition affects approximately 19 million children under 5 years of age worldwide and contributes substantially to mortality and the disease burden among children. Only one previous randomized trial has examined the routine use of antibiotics in the community-based treatment of severe acute malnutrition. Isanaka et al. conducted a randomized, double-blind, placebo-controlled trial in Niger that assessed the effect of routine amoxicillin use on nutritional recovery in children with severe acute malnutrition.

The role of routine antibiotic use in the treatment of severe acute malnutrition is unclear. In this randomized, placebo-controlled trial in Niger, amoxicillin did not significantly improve nutritional recovery in children with severe acute malnutrition. A new Original Article summarizes.

Clinical Pearl

• What is the historical reason for the routine use of antibiotics when treating children with severe acute malnutrition?

Bacterial infection can complicate advanced cases of severe acute malnutrition, and the risk of nosocomial infection in inpatient settings can be high. Therefore, in 1999, when all children with severe acute malnutrition were treated as inpatients, the World Health Organization (WHO) recommended routine use of broad-spectrum antibiotics for the management of severe acute malnutrition, irrespective of clinical indications.

Clinical Pearl

• Are children with severe acute malnutrition still routinely treated as inpatients?

In 2007, the WHO and the United Nations endorsed a community-based model for the management of malnutrition, in which children with uncomplicated severe acute malnutrition are treated at home with ready-to-use therapeutic food (RUTF). Community-based treatment emphasizes community mobilization and the finding of active cases, with the goal of reaching greater numbers of malnourished children before clinical complications arise.

Morning Report Questions

Q: Is the routine use of amoxicillin superior to placebo for nutritional recovery in children with uncomplicated severe acute malnutrition?

A: In the study by Isanaka et al., overall, 64% of the children enrolled in the study (1542 of 2399) recovered from severe acute malnutrition. There was no significant between-group difference in the likelihood of nutritional recovery (risk ratio with amoxicillin vs. placebo, 1.05; 95% confidence interval [CI], 0.99 to 1.12). Among children who recovered, the time to recovery was significantly shorter with amoxicillin than with placebo, with a mean treatment duration of 28 days versus 30 days. Amoxicillin had no significant effect among children with a confirmed bacterial infection at admission to the nutritional program, and the effect did not vary significantly according to age or sex.
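
As an aside on how such a risk ratio and confidence interval are computed: the point estimate is the ratio of recovery proportions, and the 95% CI is usually obtained on the log scale. A minimal sketch follows; the per-group counts below are made up for illustration (only the overall figure of 1542 of 2399 is given above) and are chosen merely to land near the reported estimate:

import math

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    # Recovery proportions in each group.
    risk_a, risk_b = events_a / n_a, events_b / n_b
    rr = risk_a / risk_b
    # Large-sample standard error of log(RR).
    se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Hypothetical per-group counts (amoxicillin vs. placebo), illustration only.
rr, lo, hi = risk_ratio_ci(790, 1200, 752, 1199)
print(f"risk ratio {rr:.2f} (95% CI, {lo:.2f} to {hi:.2f})")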

Table 2. Treatment Outcomes According to Study Group.

Q: How do the results of the study by Isanaka et al. compare with those of the only other randomized trial?

A: One other randomized study, from Malawi, evaluated the effect of routine antibiotic therapy for uncomplicated severe acute malnutrition. In that study, amoxicillin significantly reduced the risk of treatment failure (by 24%) and death (by 36%), as compared with placebo. The authors concluded that antibiotics should continue to be used routinely in areas where kwashiorkor and HIV infection are prevalent. Children with HIV infection, however, were not assessed separately, and it was not possible to confirm a benefit among children without HIV infection. In the Isanaka study in Niger, malnutrition was predominantly due to marasmus, and the prevalence of HIV infection was low. Differences in study findings may therefore be due to differences in study populations, as well as in the level of ancillary care and in the frequency of follow-up.

Hereditary Breast and Ovarian Cancer

Posted by Carla Rothaus • February 5th, 2016

Hereditary breast and ovarian cancer is a syndrome that involves an increased predisposition to breast cancer, ovarian cancer, or both and an autosomal dominant pattern of transmission. Risk-reducing mastectomy and risk-reducing salpingo-oophorectomy are options for the primary prevention of breast and ovarian cancers, and multiple studies have shown them to be effective.

The risk of breast and ovarian cancer among women with mutations in genes such as BRCA1 and BRCA2 can be mitigated by operations to remove the organs at greatest risk. Data are presented to assist in deciding which operation should be performed and when. A new Review Article summarizes.

Clinical Pearl

• What are the estimated cancer risks among carriers of the BRCA1 and BRCA2 mutations?

Among female BRCA1 carriers, the average cumulative risk of breast cancer by 80 years of age is 67% and the average cumulative risk of ovarian cancer is 45%. Among BRCA2 carriers, these risks are 66% and 12%, respectively. By 70 years of age, the cumulative risk of breast cancer is approximately 1% among men with BRCA1 mutations and approximately 7% among men with BRCA2 mutations. The lifetime risk in the general male population is 0.1%.

Figure 1. Cumulative Risk of Breast and Ovarian Cancer.

Clinical Pearl

• How effective are prophylactic mastectomy and salpingo-oophorectomy in reducing the cancer risk in BRCA1/2 carriers?

From 1999 through 2004, the results of four retrospective and prospective observational studies were published. These studies compared breast-cancer outcomes in women who underwent prophylactic mastectomy with outcomes in women at similar risk who did not undergo surgery. All four showed a reduction of 90% or more in the risk of subsequent breast cancer among women who underwent prophylactic mastectomy. Updated reports and additional studies have confirmed these initial results; only one small study did not show a significant reduction in the risk of subsequent breast cancer after bilateral mastectomy. Seven efficacy studies of risk-reducing salpingo-oophorectomy for prevention of ovarian cancer and one meta-analysis showed a significant risk reduction of approximately 80% among BRCA1 and BRCA2 carriers. Follow-up times were relatively short, averaging approximately 4 years. Current guidelines recommend risk-reducing salpingo-oophorectomy for both BRCA1 and BRCA2 carriers between the ages of 35 and 40 years who have completed their childbearing.

Morning Report Questions

Q: Is salpingectomy with delayed oophorectomy an effective risk-reducing procedure for women with BRCA1/2 mutations?

A: The discovery that many pelvic serous cancers originate in the fallopian tubes raises the question of whether bilateral salpingectomy with delayed oophorectomy may be an option for premenopausal women who want to delay surgical menopause. Anecdotal reports indicate that this option is being used occasionally. However, data regarding the efficacy of this investigational approach are lacking.

Q: Can tamoxifen be used as an alternative to prophylactic mastectomy in BRCA1/2 carriers?

A: Currently, data on the use of tamoxifen for primary prevention of breast cancer in BRCA1 and BRCA2 carriers are very limited. To the authors’ knowledge, the only prospective data derive from the National Surgical Adjuvant Breast and Bowel Project P1 trial, in which mutation status was determined in the 288 women in whom breast cancer developed. The hazard ratios for the development of breast cancer among women who received tamoxifen were 1.67 (95% CI, 0.32 to 10.7) among BRCA1 carriers and 0.38 (95% CI, 0.06 to 1.56) among BRCA2 carriers. Although these results are limited by small sample sizes, they are consistent with an effect in BRCA2 carriers; approximately 77% of breast cancers in BRCA2 carriers are ER-positive. These results are uninformative for BRCA1 carriers. The major question is whether tamoxifen can provide primary prevention of breast cancer in BRCA1 carriers, in whom 75 to 80% of breast cancers are ER-negative. Currently, the authors think that the data are inadequate to support the use of tamoxifen for primary prevention of breast cancer in BRCA1 carriers. However, given the predominance of ER-positive disease that develops in BRCA2 carriers, tamoxifen is an option for this group.

Table 3. Suggested Approaches to Care of Patients with Hereditary Breast and Ovarian Cancer Syndrome.

Resident Work Hours and the FIRST Trial Results

Posted by Lisa Rosenbaum • February 4th, 2016

Resident duty hours mean something different for everyone. Listen to residents, program directors, investigators, and ethicists discuss the results of the newly published FIRST trial, “National Cluster-Randomized Trial of Duty-Hour Flexibility in Surgical Training,” and its implications for the future of resident education. You’ll hear from Karl Bilimoria, the principal investigator of FIRST; David Asch, the principal investigator of the iCOMPARE trial; and the surgical and medical residents in the trenches.

If you would like to engage in a discussion of the results of the FIRST trial with the authors, please join us on the NEJM Group Open Forum, now through February 17th.

This podcast is the first in a new series that reflects medicine’s most pressing issues through the eyes of residents. “The House” provides residents with a forum to share their stories from the bedside, where they are learning far more than the lessons of clinical medicine.  Lisa Rosenbaum is a cardiologist at Brigham and Women’s Hospital in Boston, and National Correspondent for the New England Journal of Medicine.  Dan Weisberg is an Internal Medicine Resident, also at Brigham and Women’s Hospital.


Residency Duty Hours: FIRST, do no harm

Posted by Andrea Merrill • February 2nd, 2016

When I started general surgery residency in 2011, my training program was on probation for violating the 80-hour workweek mandated by the Accreditation Council for Graduate Medical Education (ACGME).  In addition, new regulations introduced that year limited the maximum number of hours an intern and a resident could work (16 and 24 hours, respectively) and increased the required time off between 24-hour shifts.  Our program worked hard to get off probation by adopting a night-float system and strictly recording duty hours.  Even our attending surgeons, who trained when 120 hours a week was the norm, prodded us to comply with the new restrictions.  This was truly a new era for surgical residency.

In my third month of residency, I had the opportunity to rotate on one of the most coveted services: a busy private general surgery service run by one surgeon whose operations ran the gamut from run-of-the-mill hernias to Whipples (pancreaticoduodenectomies).  Toward the end of my rotation, I saw that he had scheduled an Ivor Lewis esophagectomy, a complex and challenging case, as the last case of the day.  I had never seen one, so I desperately wanted to scrub in.  It had been a long day in the OR, and the case didn’t start until close to 6 PM, the usual time when interns were supposed to start wrapping things up so they could sign out to the night-float team.  Regardless, I scrubbed in with my senior resident, and just as we were about to start, the attending surgeon turned to me and said, “It’s 6 PM, shouldn’t you be signing out?”  For general surgeons who trained in an era prior to mine, this would have represented heresy!  While I won’t reveal whether or not I stayed and violated the duty hours, I will admit that I have not seen another Ivor Lewis esophagectomy in the 4 years since.

Our probation was eventually lifted, but the new ACGME rules remained.  Many surgeons worried that these new rules, and the 80-hour workweek, would negatively affect our general surgery training, limiting our exposure to operating and our acquisition of the experience needed to operate independently after training.  There was also concern for patient safety under the new rules, the same concern that had sparked the first set of ACGME regulations in 2003, when the worry was that overworked and overtired residents would make errors that harmed patients.  The stricter rules in 2011 tried to further mitigate resident fatigue; however, there was a flip side to reducing maximum shift lengths: loss of continuity of care.  Because we had to work fewer hours, there were more frequent “hand-offs” or “sign-outs” to other interns.  Often you would sign out a patient to an intern or moonlighter who had never met any of the patients.  And in the effort to get out of the hospital on time, the likelihood of forgetting something important increased, which could also lead to patient error.  With so many questions and concerns regarding adequate residency training and patient safety, it was only a matter of time before someone decided to study it.

Enter the Flexibility in Duty Hour Requirements for Surgical Trainees (FIRST) Trial, published Online First this week in NEJM by Karl Bilimoria et al.  This study is the first of its kind to randomize residency training programs to different duty-hour regulations, something many thought could never be done.  Eligible general surgery residency programs and affiliated hospitals were enrolled in the FIRST trial and then stratified into tertiles based on a composite measure of death or serious morbidity.  Within each stratum, programs and their hospital affiliates were cluster-randomized to the “Standard Policy” group, required to follow current ACGME rules, or the “Flexible Policy” group, in which maximum intern and resident shifts could be extended and time off between shifts reduced.  Both groups had to adhere to the ACGME-mandated 80-hour workweek, number of days off, and frequency of call regardless of group assignment.  Residents were not blinded to their program’s assignment.

The trial was designed as a noninferiority trial examining both patient and resident outcomes.  The primary patient outcome was 30-day postoperative death or serious morbidity, obtained using the American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP®).  Secondary outcomes included other measures of death, morbidity, and postoperative complications.  The primary resident outcomes were resident-reported satisfaction with the overall quality of resident education and with overall well-being.  Secondary resident outcomes included residents’ perceptions of, and satisfaction with, the effect of their current duty-hour regulations on various aspects of patient care and safety and on resident quality of life and well-being.  Separate adherence analyses were performed by surveying program directors at the end of the study.  No data were collected on call schedules or resident duty-hour logs.

Included in the final analysis were 115 programs (58 Standard Policy, 57 Flexible Policy) and 148 hospitals (70 Standard Policy, 78 Flexible Policy).  There was no difference in the primary patient outcome, the rate of death or serious morbidity, between the two groups (9.00% Standard Policy vs. 9.06% Flexible Policy, p=0.921).  In adjusted analysis, the Flexible Policy group was noninferior to the Standard Policy group for all secondary patient outcomes except postoperative failure-to-rescue, renal failure, and 30-day postoperative pneumonia; for these, the between-group differences were nonsignificant, but the noninferiority criterion was not met.
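
To make the noninferiority logic concrete: a flexible policy is typically declared noninferior when the upper bound of the confidence interval for the risk difference (Flexible minus Standard) falls below a prespecified margin. The Python sketch below illustrates that calculation with invented event counts and an arbitrary 2-percentage-point margin; the FIRST trial’s actual margin and denominators are not given above:

import math

def noninferior(events_flex, n_flex, events_std, n_std, margin=0.02, z=1.96):
    # Event proportions (death or serious morbidity) in each group.
    p_f, p_s = events_flex / n_flex, events_std / n_std
    diff = p_f - p_s  # flexible minus standard
    # Large-sample standard error of the risk difference.
    se = math.sqrt(p_f * (1 - p_f) / n_flex + p_s * (1 - p_s) / n_std)
    upper = diff + z * se  # upper bound of the two-sided 95% CI
    return diff, upper, upper < margin

# Invented counts giving roughly the 9.06% vs. 9.00% rates reported above.
diff, upper, ok = noninferior(1360, 15000, 1350, 15000)
print(f"risk difference {diff:.4f}, upper bound {upper:.4f}, noninferior: {ok}")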

With regard to the primary resident outcomes, there was no difference between the two groups in rates of dissatisfaction with overall education quality (10.7% Standard Policy vs. 11% Flexible Policy) or overall well-being (12.1% Standard Policy vs. 14.9% Flexible Policy).  There were some notable differences in the secondary resident outcomes.  Residents in the Flexible Policy group were significantly less likely to be dissatisfied with continuity of care (OR 0.44, p<0.001) and with the quality of hand-offs and transitions in care (OR 0.69, p=0.011), but were more dissatisfied with time for rest (OR 1.41, p=0.020).  Flexible Policy residents were also less likely to perceive a negative effect of duty-hour policies on patient safety, clinical and operative skills acquisition, OR time, and learning and teaching activities (all ORs <1.00, all p<0.001).  However, Flexible Policy residents did perceive greater negative effects of the duty hours on several measures of personal time away from the hospital (all ORs <1, all p<0.001).  There was no significant difference, though, in the perceived effects of duty hours on job satisfaction or morale.

So what do the results of the FIRST trial mean for duty hours for surgical (and non-surgical) residents going forward?  In this study, flexible duty-hour policies were noninferior to standard duty-hour policies with regard to patient safety and overall resident well-being.  Should this outcome change current ACGME regulations?  The authors of the FIRST trial believe so, stating, “These results merit consideration in future debate and revision of duty hour policies.”  Others interpret the results in a different light.  In the accompanying editorial, Dr. John Birkmeyer respectfully applauds the authors of the FIRST trial for their “very ambitious, scientifically robust study.”  However, he contends that the results prove that “surgeons should stop fighting the ACGME duty hour rules and move on.  The FIRST trial effectively debunks concerns that patients will suffer as a result of increased hand-offs and breaks in continuity of care.”  He instead argues for improving our health systems to reduce dependence on “overworked resident physicians.”  He concludes by saying, “Although few surgical residents would ever acknowledge this publicly, I’m sure many love to hear, ‘We can take care of this case without you, go home, see your family, and come in fresh tomorrow.’”

While this is likely not the end of the debate on resident duty hours, it certainly adds more data to the discussion.  The FIRST trial will spark interesting debate going forward, especially as internal medicine residencies undertake their own RCT, the iCOMPARE trial, to similarly study effects of a more flexible duty hour policy.

Watch the NEJM Quick Take Video: A Trial of Flexible Duty Hours in Surgical Training

A Girl with Abdominal Pain

Posted by Carla Rothaus • January 29th, 2016

Torsion of an accessory spleen is most commonly reported in children but can also occur in adults. Patients with torsion may present with acute abdominal pain, but intermittent or chronic pain has also been described, and infarction or rupture leading to acute abdominal hemorrhage can occur. A new Case Record of the Massachusetts General Hospital summarizes.

A 9-year-old girl with chronic constipation was seen in the gastroenterology clinic because of increasingly frequent episodes of abdominal pain with associated nonbilious vomiting. A diagnosis was made.

Clinical Pearl

• Is heartburn the most common manifestation of gastroesophageal reflux disease in children?

Gastroesophageal reflux disease is typically thought to cause heartburn, but vomiting and abdominal pain may be more prevalent among children with this condition.

Clinical Pearl

• Is Helicobacter pylori commonly implicated among children who have chronic abdominal pain?

Disease associated with H. pylori is often considered in the differential diagnosis of abdominal pain, because infection with this bacterium can cause gastritis or peptic ulcer disease. However, a causative role for H. pylori in pediatric chronic abdominal pain has been difficult to prove. Current guidelines for screening pediatric patients for H. pylori–associated disease focus on patients with recalcitrant iron-deficiency anemia, a family history of gastric cancer, or the presence of a peptic ulcer on endoscopic examination.

Morning Report Questions

Q: What are the characteristic clinical manifestations of eosinophilic gastroenteritis?

A: Eosinophilic gastroenteritis is an allergic condition that causes abdominal pain, along with characteristic clinical manifestations that depend on the layer of the gastrointestinal tract (mucosal, serosal, or muscular) that is involved. Mucosal eosinophilic gastroenteritis often causes diarrhea and can be identified by means of endoscopic biopsy. Serosal eosinophilic gastroenteritis is usually manifested by eosinophilic ascites and is diagnosed through a combination of imaging studies and paracentesis. Muscular eosinophilic gastroenteritis often results in obstructive symptoms.

Q: Describe some of the features associated with accessory spleens.

A: Accessory spleens, which can be multiple, are a common finding. A recent single-institution analysis of consecutive abdominal CT scans showed a prevalence of accessory spleens of approximately 11%, whereas autopsy studies suggest that they are present in as many as 30% of persons. Most accessory spleens are smaller than 2 cm in greatest dimension. Accessory spleens are most commonly located around the splenic hilum but may be present along the splenorenal ligament, the gastrosplenic ligament, the tail of the pancreas, or the omentum. The differential diagnosis for ectopic splenic tissue also includes splenosis, which generally occurs after trauma to the spleen, and polysplenia, a rare syndrome in which patients have multiple small spleens and often also have congenital cardiac abnormalities. Most persons with accessory spleens are asymptomatic, but abdominal pain can occur with torsion.

Figure 2. Contrast-Enhanced Abdominal CT Scan.

Brain Disease Model of Addiction

Posted by Carla Rothaus • January 29th, 2016

Advances in neurobiology have begun to clarify the mechanisms underlying the profound disruptions in decision-making ability and emotional balance displayed by persons with drug addiction.

The neurobiology of addiction is pointing the way to potential methods of disrupting the neurocircuitry with both pharmaceutical and behavioral tools. Altering the reward and emotional circuits may prevent and treat the problem. A new Review Article summarizes.

Clinical Pearl

• What are some of the criticisms of the concept of addiction as a brain disease?

Although the brain disease model of addiction has yielded effective preventive measures, treatment interventions, and public health policies to address substance-use disorders, the underlying concept of substance abuse as a brain disease continues to be questioned, perhaps because the aberrant, impulsive, and compulsive behaviors that are characteristic of addiction have not been clearly tied to neurobiology. Additional criticisms of the concept of addiction as a brain disease include the failure of this model to identify genetic aberrations or brain abnormalities that consistently apply to persons with addiction and the failure to explain the many instances in which recovery occurs without treatment.

Clinical Pearl

• The onset of addiction is thought to occur predominantly during what period of life?

The findings from neurobiologic research show that addiction is a disease that emerges gradually and that has its onset predominantly during a particular risk period: adolescence. Adolescence is a time when the still-developing brain is particularly sensitive to the effects of drugs. An important factor contributing to the greater vulnerability to drug experimentation and addiction during adolescence is that this is a period of enhanced neuroplasticity during which the underdeveloped neural networks necessary for adult-level judgment (in the prefrontal cortical regions) cannot yet properly regulate emotion.

Morning Report Questions

Q: What is the neurobiologic explanation for the diminished motivation for engaging in everyday activities that accompanies drug addiction?

A: For many years it was believed that over time persons with addiction would become more sensitive to the rewarding effects of drugs and that this increased sensitivity would be reflected in higher levels of dopamine in the circuits of their brains that process reward (including the nucleus accumbens and the dorsal striatum) than the levels in persons who never had a drug addiction. Although this theory seemed to make sense, research has shown that it is incorrect. In fact, clinical and preclinical studies have shown that drug consumption triggers much smaller increases in dopamine levels in the presence of addiction (in both animals and humans) than in its absence (i.e., in persons who have never used drugs). This attenuated release of dopamine renders the brain’s reward system much less sensitive to stimulation by both drug-related and non–drug-related rewards. As a result, persons with addiction no longer experience the same degree of euphoria from a drug as they did when they first started using it. It is for this same reason that persons with addiction often become less motivated by everyday stimuli (e.g., relationships and activities) that they had previously found to be motivating and rewarding.

Q: How does the brain model of addiction explain the trouble an addicted person may have following through on a decision to stop taking drugs? 

A: The changes that occur in the reward and emotional circuits of the brain are accompanied by changes in the function of the prefrontal cortical regions, which are involved in executive processes. Specifically, the down-regulation of dopamine signaling that dulls the reward circuits’ sensitivity to pleasure also occurs in prefrontal brain regions and their associated circuits, seriously impairing executive processes, among which are the capacities for self-regulation, decision making, flexibility in the selection and initiation of action, attribution of salience (the assignment of relative value), and monitoring of error. The modulation of the reward and emotional circuits of prefrontal regions is further disrupted by neuroplastic changes in glutamatergic signaling. In persons with addiction, the impaired signaling of dopamine and glutamate in the prefrontal regions of the brain weakens their ability to resist strong urges or to follow through on decisions to stop taking the drug. These effects explain why persons with addiction can be sincere in their desire and intention to stop using a drug and yet simultaneously impulsive and unable to follow through on their resolve.

Figure 1. Stages of the Addiction Cycle.

Belatacept and Long-Term Outcome in Kidney Transplantation

Posted by Andrea Merrill • January 27th, 2016

Medicine is a constant balance of risks and benefits.  The importance of maintaining this balance is especially evident in kidney transplantation.  While methods of immunosuppression in kidney transplantation have improved substantially since the use of total body irradiation to induce tolerance in the 1950s, current immunosuppressive agents such as calcineurin inhibitors still come with clinically significant side effects and potential for harm.  One of the most worrisome side effects of calcineurin inhibitors, ironically, is nephrotoxicity, which can lead prematurely to graft dysfunction and even graft loss.

Thus, research efforts have focused on finding alternative immunosuppressive agents that are not nephrotoxic.  One such study, the BENEFIT trial, previously reported 3-year outcomes comparing death and graft loss in patients receiving belatacept, a selective costimulation blocker, with those outcomes in patients receiving cyclosporine A, a calcineurin inhibitor.  Initial trial results showed equivalent rates of death and graft loss between the two groups but a higher glomerular filtration rate (GFR) and a higher incidence of acute rejection in the belatacept group.  The authors now report 7-year outcomes of belatacept versus cyclosporine A in this week’s issue of NEJM.

The BENEFIT trial randomized 666 patients who received a kidney from a living donor or a standard-criteria deceased donor in a 1:1:1 ratio to receive more-intensive belatacept-based therapy, less-intensive belatacept-based therapy, or cyclosporine A-based therapy for 36 months.  All patients received basiliximab induction and adjunctive maintenance therapy with mycophenolate mofetil and glucocorticoids.  About 60 to 70% of the patients in each treatment arm remained on the assigned treatment through 7 years of follow-up.

While there was no statistically significant difference in the composite primary outcome of death or graft loss after 36 months of follow-up, differences emerged with prolonged follow-up.  At 84 months (7 years), Kaplan–Meier estimated rates of death or graft loss with more-intensive belatacept, less-intensive belatacept, and cyclosporine A were about 13%, 13%, and 22%, respectively.  This corresponds to a hazard ratio of 0.57 both for the comparison of more-intensive belatacept with cyclosporine A (P=0.0225) and for the comparison of less-intensive belatacept with cyclosporine A (P=0.0210).  The two components of the endpoint, death and graft failure, contributed equally.
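
For those curious how the Kaplan–Meier estimated rates quoted above arise: the estimator is a running product over event times of (1 − events/number at risk), with censored patients simply leaving the risk set, and the quoted rate of death or graft loss is 1 minus the resulting survival estimate. Here is a minimal Python sketch with a tiny invented cohort (the trial’s patient-level data are, of course, not reproduced here):

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival curve.
    times:  follow-up time for each patient (e.g., months)
    events: 1 if death or graft loss occurred, 0 if censored
    """
    at_risk = len(times)
    survival, steps = 1.0, []
    # Process events before censorings at tied times (the usual convention).
    for t, e in sorted(zip(times, events), key=lambda te: (te[0], -te[1])):
        if e:  # endpoint occurred: multiply in (1 - 1/number at risk)
            survival *= 1 - 1 / at_risk
            steps.append((t, survival))
        at_risk -= 1  # the patient leaves the risk set either way
    return steps

# Tiny invented cohort; follow-up in months, 84 months = 7 years.
times  = [12, 30, 45, 60, 84, 84, 84, 84]
events = [ 1,  0,  1,  1,  0,  0,  0,  0]
for t, s in kaplan_meier(times, events):
    print(f"month {t}: event-free survival {s:.3f}, event rate {1 - s:.3f}")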

Both belatacept regimens also appeared to have an advantage over the cyclosporine A regimen with respect to GFR over time.  After 7 years of follow-up there was a significant difference in GFR, which increased over time in both belatacept groups and decreased over time in the cyclosporine A group.  Additionally, fewer patients on belatacept developed donor-specific antibodies.  Acute rejection was, however, more common with belatacept, with rates of about 25%, 18%, and 11% for the more-intensive belatacept, less-intensive belatacept, and cyclosporine A regimens, respectively.  Frequencies of all other adverse events, including infection, malignancy, and post-transplant lymphoproliferative disorder, were similar across the treatment groups.

The study indicates that belatacept, a relatively new immunosuppressive agent used in kidney transplantation, has long-term efficacy despite the higher rates of acute rejection with the agent.  Belatacept may offer a promising alternative to cyclosporine, given the possibility of longer preservation of graft function.  Belatacept may also increase adherence, as it is administered as a monthly infusion rather than a twice daily pill and does not require frequent venipunctures for drug level monitoring.  However, the monthly infusion may be seen as a barrier by some patients, and belatacept is more costly than cyclosporine.

Limitations of the study are addressed in the accompanying editorial by Drs. Eliot Heher and James Markmann.  They state, “Looking forward, several lines of inquiry beckon. First, we must define how best to combine belatacept with other available immunosuppressive agents to lower the risk of rejection mediated by memory T cells resistant to costimulatory blockade.  Second, head-to-head comparisons of belatacept to tacrolimus are needed, as tacrolimus itself is associated with better transplant survival compared to cyclosporine.”  Despite such limitations, the editorialists are optimistic about belatacept’s future, concluding, “as is typical with clinical research, we’re at the end of the beginning for belatacept, but with a lucky seven years of experience to guide us in answering the next set of critical questions.”