Amoxicillin for Severe Acute Malnutrition

Posted by Carla Rothaus • February 5th, 2016

Severe acute malnutrition affects approximately 19 million children under 5 years of age worldwide and contributes substantially to mortality and the disease burden among children. Only one previous randomized trial has examined the routine use of antibiotics in the community-based treatment of severe acute malnutrition. Isanaka et al. conducted a randomized, double-blind, placebo-controlled trial in Niger that assessed the effect of routine amoxicillin use on nutritional recovery in children with severe acute malnutrition.

The role of routine antibiotic use in the treatment of severe acute malnutrition is unclear. In this randomized, placebo-controlled trial in Niger, amoxicillin did not significantly improve nutritional recovery in children with severe acute malnutrition. A new Original Article summarizes.

Clinical Pearl

• What is the historical reason for the routine use of antibiotics when treating children with severe acute malnutrition?

Bacterial infection can complicate advanced cases of severe acute malnutrition, and the risk of nosocomial infection in inpatient settings can be high. Therefore, in 1999, when all children with severe acute malnutrition were treated as inpatients, the World Health Organization (WHO) recommended routine use of broad-spectrum antibiotics for the management of severe acute malnutrition, irrespective of clinical indications.

Clinical Pearl

• Are children with severe acute malnutrition still routinely treated as inpatients?

In 2007, the WHO and the United Nations endorsed a community-based model for the management of malnutrition, in which children with uncomplicated severe acute malnutrition are treated at home with ready-to-use therapeutic food (RUTF). Community-based treatment emphasizes community mobilization and active case finding, with the goal of reaching greater numbers of malnourished children before clinical complications arise.

Morning Report Questions

Q: Is the routine use of amoxicillin superior to placebo for nutritional recovery in children with uncomplicated severe acute malnutrition?

A: In the study by Isanaka et al., 64% of the children enrolled (1542 of 2399) recovered from severe acute malnutrition. There was no significant between-group difference in the likelihood of nutritional recovery (risk ratio with amoxicillin vs. placebo, 1.05; 95% confidence interval [CI], 0.99 to 1.12). Among children who recovered, the time to recovery was significantly shorter with amoxicillin than with placebo, with a mean treatment duration of 28 days versus 30 days. Amoxicillin had no significant effect among children with a confirmed bacterial infection at admission to the nutritional program, and the effect did not vary significantly according to age or sex.
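For readers curious about the arithmetic behind the headline statistic, a risk ratio is simply the recovery proportion in the amoxicillin group divided by the proportion in the placebo group, with the confidence interval computed on the log scale. Below is a minimal Python sketch of the standard Katz log method; the per-group counts are hypothetical, since this summary reports only the overall recovery figure.

```python
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio with a 95% CI computed on the log scale (Katz method)."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of log(RR) from the four cell counts
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Hypothetical per-group counts (illustrative only; the trial's actual
# group-level numbers are in the published article)
rr, lo, hi = risk_ratio_ci(events_a=790, n_a=1200, events_b=752, n_b=1199)
print(f"Risk ratio {rr:.2f} (95% CI, {lo:.2f} to {hi:.2f})")
```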

Table 2. Treatment Outcomes According to Study Group.

Q: How do the results of the study by Isanaka et al. compare with those of the only other randomized trial?

A: One other randomized study, from Malawi, evaluated the effect of routine antibiotic therapy for uncomplicated severe acute malnutrition. In that study, amoxicillin significantly reduced the risk of treatment failure (by 24%) and death (by 36%), as compared with placebo. The authors concluded that antibiotics should continue to be used routinely in areas where kwashiorkor and HIV infection are prevalent. Children with HIV infection, however, were not assessed separately, and it was not possible to confirm a benefit among children without HIV infection. In the Isanaka study in Niger, malnutrition was predominantly due to marasmus, and the prevalence of HIV infection was low. Differences in study findings may therefore be due to differences in study populations, as well as in the level of ancillary care and in the frequency of follow-up.

Hereditary Breast and Ovarian Cancer

Posted by Carla Rothaus • February 5th, 2016

Hereditary breast and ovarian cancer is a syndrome that involves an increased predisposition to breast cancer, ovarian cancer, or both and an autosomal dominant pattern of transmission. Risk-reducing mastectomy and risk-reducing salpingo-oophorectomy are options for the primary prevention of breast and ovarian cancers, and multiple studies have shown them to be effective.

The risk of breast and ovarian cancer among women with BRCA1 and BRCA2 mutations can be mitigated by operations to remove the organs at greatest risk. Data are presented to assist in deciding which operation should be performed and when it should occur. A new Review Article summarizes.

Clinical Pearl

• What are the estimated cancer risks among carriers of the BRCA1 and BRCA2 mutations?

Among female BRCA1 carriers, the average cumulative risk of breast cancer by 80 years of age is 67% and the average cumulative risk of ovarian cancer is 45%. Among BRCA2 carriers, these risks are 66% and 12%, respectively. By 70 years of age, the cumulative risk of breast cancer is approximately 1% among men with BRCA1 mutations and approximately 7% among men with BRCA2 mutations. The lifetime risk in the general male population is 0.1%.

Figure 1. Cumulative Risk of Breast and Ovarian Cancer.

Clinical Pearl

• How effective are prophylactic mastectomy and salpingo-oophorectomy in reducing the cancer risk in BRCA1/2 carriers?

From 1999 through 2004, the results of four retrospective and prospective observational studies were published. These studies compared breast-cancer outcomes in women who underwent prophylactic mastectomy with outcomes in women at similar risk who did not undergo surgery. All four showed a reduction of 90% or more in the risk of subsequent breast cancer among women who underwent prophylactic mastectomy. Updated reports and additional studies have confirmed these initial results; only one small study did not show a significant reduction in the risk of subsequent breast cancer after bilateral mastectomy. Seven efficacy studies of risk-reducing salpingo-oophorectomy for prevention of ovarian cancer and one meta-analysis showed a significant risk reduction of approximately 80% among BRCA1 and BRCA2 carriers. Follow-up times were relatively short, averaging approximately 4 years. Current guidelines recommend risk-reducing salpingo-oophorectomy for both BRCA1 and BRCA2 carriers between the ages of 35 and 40 years who have completed their childbearing.

Morning Report Questions

Q: Is salpingectomy with delayed oophorectomy an effective risk-reducing procedure for women with BRCA1/2 mutations?

A: The discovery that many pelvic serous cancers originate in the fallopian tubes raises the question of whether bilateral salpingectomy with delayed oophorectomy may be an option for premenopausal women who want to delay surgical menopause. Anecdotal reports indicate that this option is being used occasionally. However, data regarding the efficacy of this investigational approach are lacking.

Q: Can tamoxifen be used as an alternative to prophylactic mastectomy in BRCA1/2 carriers?

A: Currently, data on the use of tamoxifen for primary prevention of breast cancer in BRCA1 and BRCA2 carriers are very limited. To the authors’ knowledge, the only prospective data derive from the National Surgical Adjuvant Breast and Bowel Project P1 trial, in which mutation status was determined in the 288 women in whom breast cancer developed. The hazard ratios for the development of breast cancer among women who received tamoxifen were 1.67 (95% CI, 0.32 to 10.7) among BRCA1 carriers and 0.38 (95% CI, 0.06 to 1.56) among BRCA2 carriers. Although these results are limited by small sample sizes, they are consistent with an effect in BRCA2 carriers; approximately 77% of breast cancers in BRCA2 carriers are ER-positive. These results are uninformative for BRCA1 carriers. The major question is whether tamoxifen can provide primary prevention of breast cancer in BRCA1 carriers, in whom 75 to 80% of breast cancers are ER-negative. Currently, the authors think that the data are inadequate to support the use of tamoxifen for primary prevention of breast cancer in BRCA1 carriers. However, given the predominance of ER-positive disease that develops in BRCA2 carriers, tamoxifen is an option for this group.

Table 3. Suggested Approaches to Care of Patients with Hereditary Breast and Ovarian Cancer Syndrome.

Resident Work Hours and the FIRST Trial Results

Posted by Lisa Rosenbaum • February 4th, 2016

Resident duty hours mean something different for everyone. Listen to residents, program directors, investigators, and ethicists discuss the results of the newly published FIRST trial, “National Cluster-Randomized Trial of Duty-Hour Flexibility in Surgical Training,” and its implications for the future of resident education. You’ll hear from Karl Bilimoria, the principal investigator of FIRST; David Asch, the principal investigator of the iCOMPARE trial; and surgical and medical residents in the trenches.

This podcast is the first in a new series that reflects medicine’s most pressing issues through the eyes of residents. “The House” provides residents with a forum to share their stories from the bedside, where they are learning far more than the lessons of clinical medicine.

Lisa Rosenbaum is a cardiologist at Brigham and Women’s Hospital in Boston, and National Correspondent for the New England Journal of Medicine.  Dan Weisberg is an Internal Medicine Resident, also at Brigham and Women’s Hospital.


Residency Duty Hours: FIRST, do no harm

Posted by Andrea Merrill • February 2nd, 2016

When I started general surgery residency in 2011, my training program was on probation for violating the 80 hour work week mandated by the Accreditation Council for Graduate Medical Education (ACGME).  In addition, new regulations introduced that year limited the maximum number of hours an intern and a resident could work (16 and 24 hours, respectively) and increased the time off between 24 hour shifts for residents.  Our program worked hard to get off probation with the adoption of a night float system and strict recording of duty hours.  Even our attending surgeons, who trained when 120 hours a week was the norm, prodded us to comply with these new restrictions.  This was truly a new era for surgical residency.

My third month of residency, I had the opportunity to rotate on one of the most coveted services: a busy private general surgery service run by one surgeon who performed operations that ran the gamut from run-of-the-mill hernias to Whipples (pancreaticoduodenectomies).  Towards the end of my rotation, I saw that he had scheduled an Ivor-Lewis esophagectomy, a complex and challenging case, as the last case of the day.  I had never seen one, so I desperately wanted to scrub in.  It had been a long day in the OR, and the case didn’t start until close to 6PM, the usual time when interns were supposed to start wrapping things up so they could sign out to the night float team.  Regardless, I scrubbed in with my senior resident, and just as we were about to start, the attending surgeon turned to me and said, “It’s 6PM, shouldn’t you be signing out?”  For general surgeons who trained in an era prior to mine, this would have represented heresy!  While I won’t reveal whether or not I stayed and violated the duty hours, I will admit that I have not seen another Ivor-Lewis esophagectomy in the 4 years since.

Our probation was eventually lifted, but the new ACGME rules remained.  Many surgeons worried that these new rules, and the 80 hour work week, would negatively affect our general surgery training, limiting our exposure to operating and our acquisition of the experience required to operate independently after training.  There was also concern for patient safety under the new rules; patient safety is what had sparked the first set of ACGME regulations in 2003, when the worry was that overworked and overtired residents would make errors that harmed patients.  The stricter rules in 2011 tried to further mitigate resident fatigue; however, reducing maximum shift lengths had a flip side: loss of continuity of care.  Because we had to work fewer hours, there were more frequent “hand-offs” or “sign outs” to other interns.  Often you would sign out a patient to an intern or moonlighter who had never met any of the patients.  And in the effort to get out of the hospital on time, the likelihood of forgetting something important increased, which could also lead to patient error.  With so many questions and concerns regarding adequate residency training and patient safety, it was only a matter of time before someone decided to study it.

Enter the Flexibility in Duty Hour Requirements for Surgical Trainees (FIRST) Trial, published Online First this week in NEJM by Karl Bilimoria et al.  This study is the first of its kind to randomize residency training programs to different duty hour regulations, something many thought could never be done.  Eligible general surgery residency programs and affiliated hospitals were enrolled in the FIRST trial and then stratified into tertiles based on a composite measure of death or serious morbidity.  Within each stratum, programs and their hospital affiliates were cluster-randomized to the “Standard Policy” group, required to follow current ACGME rules, or the “Flexible Policy” group, which allowed maximum intern and resident shifts to be extended and time off between shifts to be reduced.  Both groups had to adhere to the ACGME-mandated 80 hour work week, number of days off, and frequency of call regardless of group assignment.  Residents were not blinded to their program’s assignment.

The trial was designed as a noninferiority trial examining both patient and resident outcomes.  The primary patient outcome was 30-day postoperative death or serious morbidity, obtained using the American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP®).  Secondary outcomes included other measures of death, morbidity, and postoperative complications.  The primary resident outcomes were resident-reported satisfaction with the overall quality of resident education and with overall well-being.  Secondary resident outcomes included residents’ perceptions of, and satisfaction with, the effect of their current duty hour regulations on various aspects of patient care and safety and on resident quality of life and personal well-being.  Separate adherence analyses were performed by surveying program directors at the end of the study.  No data were collected on call schedules or resident duty hour logs.

Included in the final analysis were 115 programs (58 Standard Policy, 57 Flexible Policy) and 148 hospitals (70 Standard Policy, 78 Flexible Policy).  There was no difference in the primary patient outcome, the rate of death or serious morbidity, between the two groups (9.00% Standard Policy vs 9.06% Flexible Policy, p=0.921).  In adjusted analysis, the Flexible Policy group was noninferior to the Standard Policy group for all secondary patient outcomes except postoperative failure-to-rescue, renal failure, and 30-day postoperative pneumonia, all of which showed nonsignificant differences but did not meet the criterion for noninferiority.
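A brief note on how such conclusions are drawn: a noninferiority comparison does not ask whether two rates differ, but whether the upper confidence bound of the difference (Flexible minus Standard) stays below a prespecified margin. That is how an outcome can show a nonsignificant difference and yet fail to establish noninferiority. The sketch below illustrates the decision rule with hypothetical numbers; the trial’s actual margin and confidence intervals are in the published article.

```python
def noninferior(diff_upper_bound: float, margin: float) -> bool:
    """Declare noninferiority only when the upper confidence bound of the
    risk difference (Flexible minus Standard) is below the margin."""
    return diff_upper_bound < margin

# Hypothetical margin and confidence bounds, in percentage points
MARGIN = 1.25
print(noninferior(0.90, MARGIN))  # True: noninferiority shown
print(noninferior(1.60, MARGIN))  # False: difference may be nonsignificant,
                                  # but noninferiority is not established
```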

With regard to the primary resident outcomes, there was no difference between the two groups in rates of dissatisfaction with overall education quality (10.7% Standard Policy vs 11% Flexible Policy) or overall well-being (12.1% Standard Policy vs 14.9% Flexible Policy).  There were some notable differences in secondary resident outcomes.  Residents in the Flexible Policy group were significantly less likely to be dissatisfied with continuity of care (OR 0.44, p<0.001) and with the quality of handoffs/transitions in care (OR 0.69, p=0.011) but were more dissatisfied with time for rest (OR 1.41, p=0.020).  Flexible Policy residents were also less likely to perceive a negative effect of duty hour policies on patient safety, clinical and operative skills acquisition, OR time, and learning/teaching activities (ORs all <1.00, all p<0.001).  However, Flexible Policy residents did feel more effects of the duty hours on personal time away from the hospital, with greater perception of negative effects on several measures (ORs all <1, all p<0.001).  There was no significant difference, though, in perceived effects of duty hours on job satisfaction or morale.

So what do the results of the FIRST trial mean for duty hours for surgical (and non-surgical) residents going forward?  In this study, flexible duty hour policies were noninferior to standard duty hour policies with regard to patient safety and overall resident well-being.  Should this outcome change current ACGME regulations?  The authors of the FIRST trial believe so, stating, “These results merit consideration in future debate and revision of duty hour policies.”  Others interpret the results in a different light.  In the accompanying editorial, Dr. John Birkmeyer respectfully applauds the authors of the FIRST trial for their “very ambitious, scientifically robust study.”  However, he contends that the results prove that “surgeons should stop fighting the ACGME duty hour rules and move on.  The FIRST trial effectively debunks concerns that patients will suffer as a result of increased hand-offs and breaks in continuity of care.”  He instead argues for improving our health systems to reduce dependence on “overworked resident physicians.”  He concludes by saying, “Although few surgical residents would ever acknowledge this publicly, I’m sure many love to hear, ‘We can take care of this case without you, go home, see your family, and come in fresh tomorrow.’”

While this is likely not the end of the debate on resident duty hours, it certainly adds more data to the discussion.  The FIRST trial will spark interesting debate going forward, especially as internal medicine residencies undertake their own RCT, the iCOMPARE trial, to similarly study effects of a more flexible duty hour policy.

Watch the NEJM Quick Take Video: A Trial of Flexible Duty Hours in Surgical Training

A Girl with Abdominal Pain

Posted by Carla Rothaus • January 29th, 2016

This Case Record of the Massachusetts General Hospital notes that torsion of an accessory spleen is most commonly reported in children but can also occur in adults. Patients with torsion may present with acute abdominal pain, but intermittent or chronic pain has also been described, and infarction or rupture leading to acute abdominal hemorrhage can occur.

A 9-year-old girl with chronic constipation was seen in the gastroenterology clinic because of increasingly frequent episodes of abdominal pain with associated nonbilious vomiting. A diagnosis was made.

Clinical Pearl

• Is heartburn the most common manifestation of gastroesophageal reflux disease in children?

Gastroesophageal reflux disease is typically thought to cause heartburn, but vomiting and abdominal pain may be more prevalent among children with this condition.

Clinical Pearl

• Is Helicobacter pylori commonly implicated among children who have chronic abdominal pain?

Disease associated with H. pylori is often considered in the differential diagnosis of abdominal pain, because infection with this bacterium can cause gastritis or peptic ulcer disease. However, a causative role for H. pylori in pediatric chronic abdominal pain has been difficult to prove. Current guidelines for screening pediatric patients for H. pylori–associated disease focus on patients with recalcitrant iron-deficiency anemia, a family history of gastric cancer, or the presence of a peptic ulcer on endoscopic examination.

Morning Report Questions

Q: What are the characteristic clinical manifestations of eosinophilic gastroenteritis?

A: Eosinophilic gastroenteritis is an allergic condition that causes abdominal pain, along with characteristic clinical manifestations that depend on the layer of the gastrointestinal tract (mucosal, serosal, or muscular) that is involved. Mucosal eosinophilic gastroenteritis often causes diarrhea and can be identified by means of endoscopic biopsy. Serosal eosinophilic gastroenteritis is usually manifested by eosinophilic ascites and is diagnosed through a combination of imaging studies and paracentesis. Muscular eosinophilic gastroenteritis often results in obstructive symptoms.

Q: Describe some of the features associated with accessory spleens.

A: Accessory spleens, which can be multiple, are a common finding. A recent single-institution analysis of consecutive abdominal CT scans showed a prevalence of accessory spleens of approximately 11%, whereas autopsy studies suggest that they are present in as many as 30% of persons. Most accessory spleens are smaller than 2 cm in greatest dimension. Accessory spleens are most commonly located around the splenic hilum but may be present along the splenorenal ligament, the gastrosplenic ligament, the tail of the pancreas, or the omentum. The differential diagnosis for ectopic splenic tissue also includes splenosis, which generally occurs after trauma to the spleen, and polysplenia, a rare syndrome in which patients have multiple small spleens and often also have congenital cardiac abnormalities. Most persons with accessory spleens are asymptomatic, but abdominal pain can occur with torsion.

Figure 2. Contrast-Enhanced Abdominal CT Scan.

Brain Disease Model of Addiction

Posted by Carla Rothaus • January 29th, 2016

Advances in neurobiology have begun to clarify the mechanisms underlying the profound disruptions in decision-making ability and emotional balance displayed by persons with drug addiction.

The neurobiology of addiction is pointing the way to potential methods of disrupting the neurocircuitry with both pharmaceutical and behavioral tools. Interventions that alter the reward and emotional circuits may help prevent and treat the problem. A new Review Article summarizes.

Clinical Pearl

• What are some of the criticisms of the concept of addiction as a brain disease?

Although the brain disease model of addiction has yielded effective preventive measures, treatment interventions, and public health policies to address substance-use disorders, the underlying concept of substance abuse as a brain disease continues to be questioned, perhaps because the aberrant, impulsive, and compulsive behaviors that are characteristic of addiction have not been clearly tied to neurobiology. Additional criticisms of the concept of addiction as a brain disease include the failure of this model to identify genetic aberrations or brain abnormalities that consistently apply to persons with addiction and the failure to explain the many instances in which recovery occurs without treatment.

Clinical Pearl

• The onset of addiction is thought to occur predominantly during what period of life?

The findings from neurobiologic research show that addiction is a disease that emerges gradually and that has its onset predominantly during a particular risk period: adolescence. Adolescence is a time when the still-developing brain is particularly sensitive to the effects of drugs. It is a period of enhanced neuroplasticity during which the underdeveloped neural networks necessary for adult-level judgment (in the prefrontal cortical regions) cannot yet properly regulate emotion, a combination that contributes to adolescents’ greater vulnerability to drug experimentation and addiction.

Morning Report Questions

Q: What is the neurobiologic explanation for the diminished motivation for engaging in everyday activities that accompanies drug addiction?

A: For many years it was believed that over time persons with addiction would become more sensitive to the rewarding effects of drugs and that this increased sensitivity would be reflected in higher levels of dopamine in the circuits of their brains that process reward (including the nucleus accumbens and the dorsal striatum) than the levels in persons who never had a drug addiction. Although this theory seemed to make sense, research has shown that it is incorrect. In fact, clinical and preclinical studies have shown that drug consumption triggers much smaller increases in dopamine levels in the presence of addiction (in both animals and humans) than in its absence (i.e., in persons who have never used drugs). This attenuated release of dopamine renders the brain’s reward system much less sensitive to stimulation by both drug-related and non–drug-related rewards. As a result, persons with addiction no longer experience the same degree of euphoria from a drug as they did when they first started using it. It is for this same reason that persons with addiction often become less motivated by everyday stimuli (e.g., relationships and activities) that they had previously found to be motivating and rewarding.

Q: How does the brain disease model of addiction explain the trouble an addicted person may have following through on a decision to stop taking drugs?

A: The changes that occur in the reward and emotional circuits of the brain are accompanied by changes in the function of the prefrontal cortical regions, which are involved in executive processes. Specifically, the down-regulation of dopamine signaling that dulls the reward circuits’ sensitivity to pleasure also occurs in prefrontal brain regions and their associated circuits, seriously impairing executive processes, among which are the capacities for self-regulation, decision making, flexibility in the selection and initiation of action, attribution of salience (the assignment of relative value), and monitoring of error. The modulation of the reward and emotional circuits by the prefrontal regions is further disrupted by neuroplastic changes in glutamatergic signaling. In persons with addiction, the impaired signaling of dopamine and glutamate in the prefrontal regions of the brain weakens their ability to resist strong urges or to follow through on decisions to stop taking the drug. These effects explain why persons with addiction can be sincere in their desire and intention to stop using a drug and yet simultaneously impulsive and unable to follow through on their resolve.

Figure 1. Stages of the Addiction Cycle.

Belatacept and Long-Term Outcome in Kidney Transplantation

Posted by Andrea Merrill • January 27th, 2016

Medicine is a constant balance of risks and benefits.  The importance of maintaining this balance is especially evident in kidney transplantation.  While methods of immunosuppression in kidney transplantation have improved substantially since the use of total body irradiation to induce tolerance in the 1950s, current immunosuppressive agents such as calcineurin inhibitors still come with clinically significant side effects and potential for harm.  One of the most worrisome side effects of calcineurin inhibitors, ironically, is nephrotoxicity, which can lead to premature graft dysfunction and even graft loss.

Thus, research efforts have focused on finding alternative immunosuppressive agents that do not confer nephrotoxicity.  One such study, the BENEFIT trial, previously reported 3-year outcomes comparing death and graft loss in patients receiving belatacept, a selective costimulation blocker, with those outcomes in patients receiving cyclosporine A, a calcineurin inhibitor.  Initial trial results showed equivalent rates of death and graft loss in the belatacept and cyclosporine A groups but a higher glomerular filtration rate (GFR) and a higher incidence of acute rejection in the belatacept groups.  The authors now report 7-year outcomes of belatacept versus cyclosporine A in this week’s issue of NEJM.

The BENEFIT trial randomized 666 patients who received a kidney from a living donor or a standard-criteria deceased donor in a 1:1:1 ratio to receive high intensity belatacept-based therapy, low intensity belatacept-based therapy, or cyclosporine A-based therapy for 36 months.  All patients received basiliximab induction and adjunctive maintenance therapy with mycophenolate mofetil and glucocorticoids.  About 60-70% of the patients in each treatment arm remained on the assigned treatment through 7 years of follow-up.

While there was no statistically significant difference in the composite primary outcome of death or graft loss after 36 months of follow-up, differences emerged after prolonged follow-up.  At 84 months (7 years) of follow-up, Kaplan–Meier estimated rates of death or graft loss with more intense belatacept, less intense belatacept, and cyclosporine A were about 13%, 13%, and 22%, respectively.  This corresponds to a hazard ratio of 0.57 both for the more intense belatacept regimen versus cyclosporine A (P=0.0225) and for the less intense belatacept regimen versus cyclosporine A (P=0.0210).  The two end points, death and graft failure, contributed equally.
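For readers unfamiliar with how such cumulative rates are estimated, the Kaplan–Meier method multiplies together, across successive event times, the fraction of at-risk patients who do not have the event, so that patients censored before 84 months still contribute to the earlier intervals. Here is a minimal, self-contained Python sketch of the product-limit estimator using toy data, not trial data.

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.
    times: follow-up in months; events: 1 = death or graft loss, 0 = censored.
    Returns (time, estimated survival) at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    for t, group in groupby(data, key=lambda pair: pair[0]):
        group = list(group)
        n_events = sum(e for _, e in group)
        if n_events:
            survival *= 1 - n_events / n_at_risk
            curve.append((t, survival))
        n_at_risk -= len(group)  # events and censorings both leave the risk set
    return curve

# Toy data: 1 - S(84) approximates the cumulative rate of death or graft loss
print(kaplan_meier(times=[12, 30, 45, 60, 84, 84], events=[1, 0, 1, 0, 0, 0]))
# [(12, 0.833...), (45, 0.625)]
```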

Both belatacept regimens also appeared to have an advantage over the cyclosporine A regimen with respect to GFR over time.  After 7 years of follow-up there was a significant difference in GFR, which increased over time in both belatacept groups and decreased over time in the cyclosporine A group.  Additionally, fewer patients on belatacept developed donor-specific antibodies.  Acute rejection was, however, more common with belatacept, with rates of about 25%, 18%, and 11% for the high belatacept, low belatacept, and cyclosporine A regimens, respectively.  Frequencies of all other adverse events, including infection, malignancy, and post-transplant lymphoproliferative disorder, were similar across all treatment groups.

The study indicates that belatacept, a relatively new immunosuppressive agent used in kidney transplantation, has long-term efficacy despite the higher rates of acute rejection with the agent.  Belatacept may offer a promising alternative to cyclosporine, given the possibility of longer preservation of graft function.  Belatacept may also increase adherence, as it is administered as a monthly infusion rather than a twice-daily pill and does not require frequent venipuncture for drug-level monitoring.  However, the monthly infusion may be seen as a barrier by some patients, and belatacept is more costly than cyclosporine.

Limitations of the study are addressed in the accompanying editorial by Drs. Eliot Heher and James Markmann.  They state, “Looking forward, several lines of inquiry beckon. First, we must define how best to combine belatacept with other available immunosuppressive agents to lower the risk of rejection mediated by memory T cells resistant to costimulatory blockade.  Second, head-to-head comparisons of belatacept to tacrolimus are needed, as tacrolimus itself is associated with better transplant survival compared to cyclosporine.”  Despite such limitations, the editorialists are optimistic about belatacept’s future, concluding, “as is typical with clinical research, we’re at the end of the beginning for belatacept, but with a lucky seven years of experience to guide us in answering the next set of critical questions.”

Postmenopausal Osteoporosis

Posted by Carla Rothaus • January 22nd, 2016

Osteoporosis results in 1.5 million fractures per year in the United States, with the vast majority occurring in postmenopausal women. Treatment is generally recommended in postmenopausal women who have a bone mineral density T score of −2.5 or less, a history of spine or hip fracture, or a Fracture Risk Assessment Tool (FRAX) score indicating increased fracture risk.

Management of postmenopausal osteoporosis includes nonpharmacologic treatment (e.g., weight-bearing exercise and fall-prevention strategies) and pharmacologic treatment. Bisphosphonates are considered first-line treatment in most women. In a new Clinical Practice article, benefits and rare potential risks are discussed.

Clinical Pearl

• What is the Fracture Risk Assessment Tool (FRAX)?

The overriding goal in managing postmenopausal osteoporosis is the prevention of future fractures. Therefore, identifying women at the highest risk is a clinical priority. Low bone mineral density (BMD), particularly at the hip, is a strong risk factor for fracture: for each 1-SD decrement in BMD, the risk of fracture increases by a factor of 2 or 3. However, a more comprehensive assessment of clinical risk factors is helpful to define absolute risk for an individual and to select patients for treatment. The Fracture Risk Assessment Tool (FRAX), which was developed by the World Health Organization on the basis of data from several international cohorts, incorporates established risk factors and BMD at the femoral neck to predict individual 10-year risk of hip or major osteoporotic fracture; its use is endorsed by several professional organizations.
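To make the gradient-of-risk arithmetic concrete: because each 1-SD decrement in BMD multiplies the risk, the effect compounds geometrically rather than additively. A short illustration, assuming a gradient of 2.5 per SD (the text gives a range of 2 to 3):

```python
# Assumed gradient of risk: each 1-SD decrement in femoral-neck BMD
# multiplies fracture risk by roughly 2 to 3 (2.5 is used here for illustration)
GRADIENT = 2.5
for sd_below_mean in (1, 2, 3):
    relative_risk = GRADIENT ** sd_below_mean
    print(f"{sd_below_mean} SD below mean BMD: ~{relative_risk:.1f}x the fracture risk")
```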

Table 1. Guidelines from Professional Organizations for the Treatment of Osteoporosis.

Clinical Pearl

• What are some of the nonpharmacologic therapies for patients with postmenopausal osteoporosis?

Resistance and weight-bearing exercise can increase muscle mass and can transiently increase BMD. Exercise and balance programs (e.g., yoga and tai chi) may result in improved balance and an increase in muscle tone and may secondarily reduce the risk of falls among some elderly persons. Besides exercise, assessment of the home for hazards, withdrawal of psychotropic medications (when possible), and the use of a multidisciplinary program to assess risk factors are prudent strategies for potentially reducing the risk of falls. Other measures should include counseling about cigarette smoking (which is linked to reduced BMD) and about excess alcohol intake (which can increase the risk of falls).

Morning Report Questions

Q: What class of drug is prescribed most often for the treatment of postmenopausal osteoporosis?

A: The bisphosphonates as a class represent the vast majority of prescriptions for osteoporosis treatment, and all are now available in generic form. Bisphosphonates inhibit bone remodeling. Several oral and intravenous bisphosphonates have been shown in randomized trials to reduce the risk of fractures. FDA-approved oral bisphosphonates include alendronate (the first one approved), risedronate, and ibandronate. Although data from randomized trials and clinical experience indicate that they are generally safe, mild hypocalcemia and muscle pain occur infrequently. Two rare but more serious adverse effects have also been observed. These are atypical femoral fractures (i.e., fractures in the subtrochanteric region that have a transverse orientation and noncomminuted morphologic features, show focal lateral cortical thickening, occur with minimal trauma, and may be bilateral) and osteonecrosis of the jaw, which is defined as exposed bone in the maxillofacial region that does not heal within 8 weeks. Use of bisphosphonates should be limited to persons who have an estimated creatinine clearance greater than 35 ml per minute and normal serum vitamin D levels; symptomatic hypocalcemia can develop in patients with low levels of 25-hydroxyvitamin D who receive concomitant treatment with bisphosphonates. Oral bisphosphonates should not be prescribed for patients with clinically significant esophageal disease (e.g., achalasia). Adherence to oral bisphosphonates is low, and it is estimated that less than 40% of persons who are prescribed oral medications are still taking them after 1 year. Intravenous bisphosphonates (ibandronate and zoledronic acid) are alternatives that do not depend on frequent self-administration.
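As an aside, the article states a creatinine-clearance threshold (greater than 35 ml per minute) without naming an estimating equation; the Cockcroft-Gault formula is one commonly used bedside estimate. A minimal sketch with illustrative patient values:

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Cockcroft-Gault estimate of creatinine clearance, in ml per minute."""
    crcl = ((140 - age_years) * weight_kg) / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

# Illustrative values only, checked against the >35 ml/min threshold above
crcl = cockcroft_gault(age_years=74, weight_kg=62, serum_creatinine_mg_dl=1.1, female=True)
print(f"Estimated CrCl: {crcl:.0f} ml/min; above threshold: {crcl > 35}")
```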

Table 2. Drugs Approved by the Food and Drug Administration for the Treatment and Prevention of Osteoporosis.

Q: What is the risk-benefit ratio associated with bisphosphonate treatment for postmenopausal osteoporosis?

A: The relative importance of the two rare adverse effects (atypical fractures and osteonecrosis of the jaw) versus the benefits of antiresorptive therapy is uncertain and remains controversial. The concerns of many women regarding these potential adverse effects have increasingly become a substantial barrier to initiation of antiosteoporosis therapy and to treatment adherence. Case–control and cohort studies and a few randomized trials have assessed the risk of atypical femoral fractures; in all the studies, the incidence of these fractures is low, ranging from approximately 1 in 100,000 to 5 in 10,000 among bisphosphonate users. Calculations based on recent reviews and meta-analyses suggest a highly favorable benefit-to-risk ratio associated with treatment for up to 5 years in women with osteoporosis, with fewer than 1 event caused per 100 fractures prevented. The incidence of osteonecrosis of the jaw is similarly very low (estimated at <1 case per 10,000 bisphosphonate users). Given concerns about an increased risk of atypical femoral fractures with long-term treatment, the possibility of a drug holiday (temporary discontinuation for up to 5 years) has been suggested, although the preferred timing and duration of drug holidays with bisphosphonate therapy are uncertain. Randomized trials have indicated that with discontinuation of alendronate after 5 years of use or of zoledronic acid after 3 years of use, benefits (as determined primarily by assessment of BMD loss and changes in biochemical markers of bone turnover as compared with those with placebo) are generally retained for up to 5 years. The value of monitoring therapy after discontinuation with the use of biochemical markers of bone turnover or BMD to aid in clinical decision making about restarting bisphosphonates is controversial. These recommendations regarding drug holidays do not apply to risedronate or ibandronate, because these agents have not been systematically evaluated, or to other osteoporosis therapies, whose benefits are quickly lost after cessation.

Eluxadoline for Irritable Bowel Syndrome

Posted by Carla Rothaus • January 22nd, 2016

The irritable bowel syndrome (IBS) with diarrhea is a common functional gastrointestinal disorder that is characterized by recurring abdominal pain, bloating, and loose, frequent stools in the absence of structural, inflammatory, or biochemical abnormalities. Lembo et al. conducted two phase 3 trials to evaluate the clinical response of patients with IBS with diarrhea to eluxadoline, as compared with placebo, through 26 weeks and to evaluate the safety of eluxadoline up to 52 weeks. The authors randomly assigned 2427 adults who had IBS with diarrhea to eluxadoline (at a dose of 75 mg or 100 mg) or placebo twice daily for 26 weeks (IBS-3002 trial) or 52 weeks (IBS-3001 trial). The primary end point was the proportion of patients who had a composite response of decrease in abdominal pain and improvement in stool consistency on the same day for at least 50% of the days from weeks 1 through 12 and from weeks 1 through 26.

In these two randomized trials, eluxadoline was more effective than placebo in reducing abdominal pain and improving stool consistency in patients who had irritable bowel syndrome with diarrhea. Pancreatitis developed in 5 of 1666 patients (0.3%) who received eluxadoline. A new Original Article summarizes.

Clinical Pearl

• What treatment options are currently available for IBS with diarrhea?

Current treatment options for IBS with diarrhea are limited. Initial therapies include dietary and lifestyle modifications along with antidiarrheal agents; these therapies are frequently unsuccessful. A subgroup of patients with IBS with diarrhea may have a response to either rifaximin or alosetron. Alosetron has been approved by the Food and Drug Administration only for women with severe IBS with diarrhea who have not had a response to conventional therapy, although subsequent data suggest efficacy in men.

Clinical Pearl

• What is eluxadoline?

Opioid receptors (including μ-opioid receptors, δ-opioid receptors, and κ-opioid receptors) in the enteric circuitry of the gastrointestinal tract play a role in regulating gastrointestinal motility, secretion, and visceral sensation. Eluxadoline is a peripherally acting mixed μ-opioid receptor agonist–δ-opioid receptor antagonist and κ-opioid receptor agonist with minimal oral bioavailability. Nonclinical studies have shown that, unlike selective μ-opioid receptor agonists, eluxadoline reduces visceral hypersensitivity without completely disrupting intestinal motility. These data suggest that peripheral δ-opioid receptor antagonism may reduce μ-opioid receptor–mediated constipation and enhance μ-opioid receptor–mediated peripheral analgesia.

Morning Report Questions

Q: How effective is eluxadoline for the treatment of IBS with diarrhea?

A: In the studies by Lembo et al., eluxadoline was effective in simultaneously relieving the symptoms of abdominal pain and diarrhea. The primary outcome measure required simultaneous improvement in the daily scores for the worst abdominal pain and stool consistency on the same day for at least 50% of the days assessed; this end point is currently one of those recommended by the regulatory agencies in the United States and Europe to show treatment effect in trials involving patients with IBS and diarrhea. More patients who received eluxadoline than placebo reported significant improvement in the primary outcome measure over both intervals assessed (absolute differences for the two doses across the two studies ranged from 7 to 13 percentage points for weeks 1 through 12, and from 4 to 13 percentage points for weeks 1 through 26). Eluxadoline also resulted in significant improvements in global assessment scores (on measures of adequate relief of IBS symptoms, global symptoms, and quality of life), particularly at the 100-mg twice-daily dose, with improvements in measures of adequate relief of IBS symptoms that were similar to those reported with alosetron and rifaximin. Treatment with eluxadoline did not result in significantly higher rates than placebo of the prespecified secondary outcome of 30% improvement in the average score for the most severe abdominal pain.

Figure 1. Primary Efficacy End Point in the Eluxadoline and Placebo Groups in Each Trial and in the Pooled Trials.

Figure 2. Percentage of Patients Who Met the Daily Composite Response Criteria over Time.

Table 2. Secondary Efficacy End Points (Weeks 1–12).

Q: Are there any subgroups of patients for whom the use of eluxadoline carries specific risks? 

A: Five cases of pancreatitis (0.3%) and 8 cases of abdominal pain with elevated levels of hepatic enzymes (0.5%) occurred in the study by Lembo et al. Nine of these 13 cases were determined by the adjudication committee to be associated with spasm of the sphincter of Oddi. All the patients with pancreatitis were determined by the adjudication committee to have no organ failure or local or systemic complications. The pancreatitis, which occurred in patients with either biliary disorders (spasm of the sphincter of Oddi and biliary sludge) or alcohol use (3 of 5 cases), resolved within the first week. The presence of only mild cases does not preclude the risk of severe cases in the future, nor do these associations preclude other at-risk populations. Data are lacking from studies to assess whether the risk of pancreatitis can be reduced if treatment is restricted to patients with intact gallbladders or to those who abstain from excessive alcohol use. Identifying patients with IBS with diarrhea who are at risk for acute pancreatitis because of the absence of a gallbladder or excessive alcohol consumption is important before initiating therapy with eluxadoline. Any benefit will need to be considered in the context of side effects and risks.

Table 3. Common Adverse Events.

CDX2 as a Prognostic Biomarker in Colon Cancer

Posted by Chana Sacks • January 20th, 2016

A few weeks apart, Mr. Green and Mrs. Brown presented to their primary care doctors with intermittent rectal bleeding.  Both were referred for colonoscopies, and each was found to have a colonic mass.  With great trepidation, they awaited the results of the pathology and the CT scans that followed.  Ultimately, both were diagnosed with stage II colon cancers, confined to the wall of the colon without evidence of systemic spread.  Filled with anxiety, they awaited their physicians’ recommendations for treatment.

Determining treatment plans for these two patients with stage II colon cancer is increasingly complicated.  Unlike patients with stage I cancers, for whom surgery is sufficient, or those with stage III cancers, which have spread to regional lymph nodes and clearly benefit from systemic chemotherapy, patients with stage II cancers face uncertainty:  some seem to do well with surgery alone, but a subset develops recurrent disease that might have benefited from adjuvant chemotherapy.  So far, we have lacked tools to differentiate these subtypes, and researchers are increasingly asking: can the genetics of individual tumors predict prognosis and guide treatment?

Now published in NEJM, a study by Dalerba and colleagues tackles this very question.  The investigators describe a novel approach used to identify a tumor marker called CDX2 that might predict prognosis and determine which patients with stage II colon cancer might benefit from adjuvant chemotherapy, rather than surgery alone.

The investigators reasoned that tumor cells that closely resemble mature colonic epithelial tissue would be less aggressive; by contrast, tumors with characteristics of more immature, undifferentiated colonic stem cells might behave more aggressively and result in worse outcomes.  So they set out to find markers of immature colonic tissue present in tumors. Using a novel bioinformatics approach, the researchers mined a database of more than 2,000 human colon gene-expression array experiments and identified a number of candidate genes, ultimately selecting the transcription factor CDX2, which is expressed in more mature cells and is not expressed in more primitive colon precursor cells.  With this candidate biomarker identified, the investigators conducted a retrospective analysis using about 2,200 colon cancer tissue microarrays from different databases to determine whether there was an association between CDX2 expression and either survival outcomes or response to chemotherapy.
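To give a flavor of the underlying reasoning (not the study’s actual pipeline, which mined more than 2,000 public arrays with far more sophisticated statistics): a marker of colonic maturity should be highly expressed in differentiated epithelium and nearly silent in stem-like precursors, so candidate genes can be ranked by that expression gap. A toy Python sketch with made-up values; the genes other than CDX2 appear only for illustration.

```python
import statistics

# Made-up expression values: gene -> (stem-like samples, differentiated samples)
expression = {
    "CDX2": ([0.2, 0.1, 0.3], [2.9, 3.1, 2.8]),  # maturity marker: large gap
    "LGR5": ([3.0, 2.7, 3.2], [0.4, 0.5, 0.3]),  # stem-cell marker: negative gap
    "ACTB": ([2.0, 2.1, 1.9], [2.0, 2.2, 2.1]),  # housekeeping gene: no gap
}

def maturity_score(stem_vals, diff_vals):
    """Higher score = expressed in mature cells, silent in precursors."""
    return statistics.mean(diff_vals) - statistics.mean(stem_vals)

ranked = sorted(expression, key=lambda g: maturity_score(*expression[g]), reverse=True)
print(ranked)  # a differentiation marker like CDX2 should rank first
```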

The results:  the data demonstrate that the absence of CDX2 portends worse outcomes.  In the discovery dataset, for patients with stage II colon cancer with CDX2-negative tumors, the 5-year disease-free survival was 49%, as compared with 87% for those with CDX2-positive tumors (p=0.003).  The validation dataset yielded similar results.  A separate analysis of those with CDX2-negative tumors suggests that treatment with adjuvant chemotherapy was associated with improved disease-free survival.

In an accompanying editorial, Drs. Boland and Goel of Baylor University remind us of the limitations of small, retrospective analyses, and they call for prospective trials to confirm these results.  Still, they describe the fundamental importance of this study:  “This work provides an opportunity for oncologists to move beyond what has been an inadequate method of selecting Stage II colon cancer patients for adjuvant chemotherapy.”

NEJM Deputy Editor Dr. Dan Longo agrees: “While these data might prove important for the treatment of patients with Stage II colon cancer, the implications of this study are broader – as they suggest novel ways that we might harness our improving understanding of tumor biology to personalize treatment and cure disease.”

So, for Mr. Green and Mrs. Brown, further testing to determine whether their tumors express CDX2 might help guide treatment options and might result in very different treatment plans for what was thought to be the same disease.  To be sure, more data are needed, but this study points to the importance of this rapidly developing field of inquiry.  As Boland and Goel conclude, “it is likely that a combination of genetic and epigenetic panels will soon help refine and advance the field of predictive oncology.”