Simvastatin in the Acute Respiratory Distress Syndrome

Posted by Sara Fazio • October 31st, 2014

In a recent study, patients with acute respiratory distress syndrome who were not receiving statins were assigned to receive simvastatin or placebo. At 28 days, there were no significant between-group differences in survival or in the number of ventilator-free days.

The acute respiratory distress syndrome (ARDS) is a common, devastating clinical syndrome characterized by life-threatening respiratory failure requiring mechanical ventilation and by multiple organ failure. In ARDS, an uncontrolled inflammatory response causes alveolar damage, with exudation of protein-rich pulmonary-edema fluid into the alveolar space, resulting in respiratory failure.

Clinical Pearls

What is the basis of interest in statins as a possible treatment for ARDS?

The inhibition of 3-hydroxy-3-methylglutaryl coenzyme A (HMG-CoA) reductase with statins has been shown to modify a number of the underlying mechanisms implicated in the development of ARDS. Statins decrease inflammation and histologic evidence of lung injury in murine models of ARDS. Simvastatin reduced pulmonary and systemic inflammatory responses in a human model of ARDS induced by lipopolysaccharide inhalation. In addition, in a small, single-center, randomized, placebo-controlled study, simvastatin ameliorated nonpulmonary organ dysfunction and was safe.

What were the outcomes of this study, which compared simvastatin to placebo for the treatment of ARDS?

The primary outcome, the number of ventilator-free days, did not differ significantly between the two study groups (12.6+/-9.9 days with simvastatin and 11.5+/-10.4 days with placebo; mean difference, 1.1 days [95% CI, −0.6 to 2.8]; P=0.21). The change from baseline to day 28 in the oxygenation index did not differ significantly between the two groups, nor did the Sequential Organ Failure Assessment (SOFA) score. There were no significant differences in the number of days free of nonpulmonary organ failure or in mortality at 28 days. Mortality at ICU discharge or hospital discharge was also not significantly different between the two groups.

Table 2. Main Clinical Outcomes.

Figure 2. Probabilities of Survival and Breathing without Assistance from Randomization to Day 28, According to Whether Patients Received Simvastatin or Placebo.

Morning Report Questions

Q: What were the study results with respect to simvastatin’s safety in this clinical setting?

A: Overall, adverse events related to the study drug were significantly more common in the simvastatin group than in the placebo group. The majority of the adverse events were related to elevated creatine kinase and hepatic aminotransferase levels. The numbers of serious adverse events (other than those reported as trial outcomes, such as death) were similar in the two groups. There was no significant between-group difference in the proportion of patients with nonpulmonary organ dysfunction, as measured by a SOFA score of less than 2 for each organ.

Q: How do study results compare with those of the recent Statins for Acutely Injured Lungs from Sepsis (SAILS) study?

A: The recent SAILS study, which involved patients with sepsis-associated ARDS, showed that rosuvastatin did not improve clinical outcomes, as compared with placebo, and was associated with fewer days free of renal and hepatic failure. The authors note that the data from the current study and the SAILS trial show that neither a lipophilic statin (simvastatin) nor a hydrophilic statin (rosuvastatin) is effective in the treatment of ARDS.

Focal Seizures and Progressive Weakness

Posted by Sara Fazio • October 31st, 2014

In the latest Case Record of the Massachusetts General Hospital, a 7-year-old boy was evaluated because of focal seizures, twitching of the right arm and the right side of the face, and progressive weakness. Imaging revealed progressive left cortical atrophy and a focal lesion in the left parietal cortex. A diagnostic procedure was performed.

Rasmussen’s encephalitis, which was first described in 1958 by Dr. Theodore Rasmussen, is a progressive neurologic disease of unknown cause.

Clinical Pearls

What are the manifestations of epilepsia partialis continua?

Epilepsia partialis continua is defined as almost continuous regular or irregular muscular clonic twitching affecting a limited part of the body. Consciousness is typically preserved, and the twitching most commonly involves the face, arms, or both. According to the definition of epilepsia partialis continua, the twitching must last at least 1 hour (but may persist for hours to years) and the twitches must occur at least once every 10 seconds, either singly or in clusters at a frequency of 1 to 2 Hz.

What is the differential diagnosis of epilepsia partialis continua?

The differential diagnosis for epilepsia partialis continua is divided into nonprogressive and progressive causes. Nonprogressive causes include vascular causes, metabolic causes, neoplasm, infectious or immunologic causes, cortical dysplasia, mitochondrial causes, perinatal central nervous system injury, and cryptogenic causes. A progressive cause is Rasmussen’s encephalitis. Much of the literature on epilepsia partialis continua focuses on adults, but a recent study involving 51 children with epilepsia partialis continua showed that Rasmussen’s encephalitis was the most common cause, with other common causes including immune or inflammatory processes (e.g., acute and subacute encephalitis and subacute sclerosing panencephalitis), metabolic disorders (e.g., mitochondrial disease and neuronal ceroid lipofuscinosis), cortical malformations, and vascular causes.

Table 1. Differential Diagnosis of Epilepsia Partialis Continua.

Morning Report Questions

Q: What is Rasmussen’s encephalitis?

A: Rasmussen’s encephalitis is a progressive neurologic disease of unknown cause. Patients typically present in childhood with focal-onset seizures, which then progress over a period of months to refractory epilepsy with progressive hemiparesis. Epilepsia partialis continua develops in approximately 50 to 90% of persons with Rasmussen’s encephalitis, and fixed hemiparesis typically occurs within 2 to 3 years after the onset of seizures. Seizures are typically refractory to medication, but glucocorticoids and intravenous immune globulin can be effective in controlling seizures. Various other immunosuppressive medications have been tried, but the most effective therapy remains hemispherectomy. Hemispheric procedures are associated with a high rate of success in stopping seizures and also halting progression of the disease.

Table 2. Diagnostic Criteria for Rasmussen’s Encephalitis.

Q: What techniques are available for disconnection of the affected hemispheres in Rasmussen’s encephalitis, and what are the functional outcomes?

A: Over time, a number of techniques have been developed, ranging from complete anatomical hemispherectomy to newer, less extensive procedures that disconnect the affected hemisphere from the opposite hemisphere and from the remainder of the nervous system. With these disconnection procedures, the affected hemisphere may continue to generate seizure activity, but the discharges cannot propagate and do not produce symptoms. The less extensive procedures are associated with less blood loss and fewer long-term complications but with a slightly lower likelihood of seizure control. Hemispherectomy and hemispheric disconnection can each result in varying degrees of contralateral weakness, language and other cognitive dysfunction, and hemianopsia, depending on the degree to which such functions already have been or can be subserved by the contralateral hemisphere. After a patient has undergone anatomical hemispherectomy or hemispheric disconnection, the long-term outcome typically includes spastic hemiplegia with return of ambulation (patients can walk but have a spastic-type limp), development of a “helper” arm without fine motor manipulative abilities (patients can use the arm to help lift objects but cannot perform fine motor functions with the hand), permanent hemianopsia, and various degrees of language and cognitive function (depending on preoperative status and age at surgery).

Feed Me: Early Nutritional Support in Intensive Care

Posted by Rena Xu • October 29th, 2014

What is the best way to feed a critically ill patient?  Nutrition can be delivered either parenterally — directly into the veins — or enterally, e.g., via a tube that runs from the nose to the stomach.  Both routes have well-reported potential adverse consequences, along with potential benefits.  It’s commonly believed that, if given the option, you should go with enteral feeding — in addition to being less invasive and more physiologically intuitive, it’s been associated with lower rates of infection and other complications.

But what if the adverse consequences that have hampered parenteral nutrition in the past aren’t a reflection of faulty strategy per se, but rather of poor execution?  Rowan and her colleagues in the UK have hypothesized that, with advancements in feeding technology and better management of vascular access, parenteral nutrition may now be superior to enteral feeding, as it’s more likely to ensure delivery of the intended nutrition.

To test this theory, they conducted the CALORIES trial, enrolling 2400 adult patients with unexpected admissions to 33 intensive care units across England. The patients were randomized to receive nutritional support either parenterally (via a central venous catheter) or enterally (via a nasogastric or nasojejunal tube), initiated within 36 hours of admission and used exclusively for five days or until complete transition to oral feeding, discharge from the ICU, or death.

The results, published recently in NEJM, suggest there may not be a clear winner. All-cause mortality at 30 days, the primary outcome, was similar between the two groups: roughly a third of patients died (33.1% in the parenteral group, and 34.2% in the enteral group; relative risk 0.97, P=0.57).  Patients in the parenteral group were less likely to become hypoglycemic than patients in the enteral group (3.7% vs. 6.2% of patients; absolute risk reduction 2.5%; P=0.006).  They also had lower rates of vomiting (8.4% vs. 16.2%; absolute risk reduction 7.8%; P<0.001).  For all other secondary outcomes — including the rate of infection, length of ICU and overall hospital stays, and 90-day survival – no significant difference was found between the two groups.  The rate of adverse events was similar as well (4.9% in the parenteral group, and 4.8% in the enteral group; P=1.00).

“The reported increase in infectious complications that have been associated with the parenteral route was not observed,” the authors underscored.  This could in part reflect improvements in the formulation, delivery, and monitoring of parenteral nutrition — in other words, better execution as compared to older studies.

But implementation issues may still be undermining the effectiveness of parenteral nutrition.  The majority of patients in both study groups failed to achieve their targeted caloric intake (25 kcal per kilogram per day).  This finding was consistent with the results of previous studies, but still somewhat surprising: parenteral nutrition is supposed to be more reliable at guaranteeing delivery.  The authors enumerated various logistical constraints that may have contributed to the shortfall, concluding, “There are substantial practical and organizational impediments for both routes of delivery, at least during an initial 5-day period.”

The CALORIES trial didn’t find parenteral nutrition to be superior to enteral nutrition as the investigators had hypothesized.  Even by demonstrating comparable outcomes across the two routes, however, the study invites debate.  The most important takeaway lesson from the study may be that for the critically ill patient, getting adequate nutrition early on — by any route — is hard.  As for whether route matters, and to what extent, it’s likely still too soon to tell.

What is your approach to nutritional supplementation for patients who require intensive care?  When enteral and parenteral nutrition are both available, what influences your decision to use one versus the other? How will the results of the CALORIES trial affect your practice?


60-Year-Old Man with Bone Pain

Posted by Sara Fazio • October 24th, 2014

In the latest Case Record of the Massachusetts General Hospital, a 60-year-old man was seen in the outpatient cancer center because of bone pain that had lasted for 2 months and the presence of lytic bone lesions on imaging studies. Biopsy specimens of bone marrow and bone lesions showed increased mast cells. A diagnostic procedure was performed.

In general, bone lesions can be divided into two major types according to their radiologic appearance. Lytic lesions have a characteristic “moth eaten” appearance on imaging studies, which is caused by the juxtaposition of degraded bone and unaffected, calcified bone. The process of bone degradation is mediated by osteoclasts. In contrast, blastic bone lesions reflect increased bone formation, a process mediated by increased osteoblastic activity.

Clinical Pearls

What tumors most commonly metastasize to bone, and what are their radiologic characteristics?

The presence of multiple widespread bone lesions suggests a metastatic tumor; most common are lung or prostate cancer, renal-cell carcinoma, and melanoma. Tumors that metastasize to the bone often have characteristic biologic and radiologic characteristics; prostate cancer, carcinoid tumors, small-cell lung cancer, Hodgkin’s lymphoma, and medulloblastoma often cause osteoblastic lesions, whereas renal-cell carcinoma, non-small-cell lung cancer, thyroid cancer, melanoma, and lymphomas predominantly cause osteolytic lesions. Many metastatic tumors, particularly sarcomas and cancers of breast and gastrointestinal origin, may cause both lytic and blastic lesions. In general, the lesions can manifest in various ways.

Table 1. Partial Differential Diagnosis of Bone Lesions.

What are the clinical features of plasma-cell (multiple) myeloma?

Plasma-cell myeloma is characterized by an increase in clonal plasma cells in the bone marrow and the presence of a monoclonal paraprotein in the serum, as well as an associated abnormal calcium level and associated hematologic, renal, and bone abnormalities, including lytic bone lesions. A rare variant, nonsecretory myeloma, may occur, in which lytic bone lesions are present but a monoclonal paraprotein is not detected.

Morning Report Questions

Q: What are the characteristics of an epithelioid hemangioendothelioma, and how may it be treated?

A: Epithelioid hemangioendothelioma is a rare malignant tumor that affects fewer than 300 patients per year in the United States and accounts for approximately 1% of all vascular neoplasms. Clinically, it is treated as a low-to-intermediate-grade angiosarcoma because, as compared with high-grade angiosarcomas, metastasis is less likely to develop, disease progression or time to relapse is slower, and survival is longer, even in cases of advanced disease. Most epithelioid hemangioendotheliomas follow an indolent clinical course, but it is estimated that approximately 15% of patients with epithelioid hemangioendothelioma die of the disease. Because epithelioid hemangioendothelioma is a neoplasm of vascular origin, therapies that target angiogenesis in this disease were thought to be promising. Epithelioid hemangioendothelioma is known to express a wide variety of ligands and receptors for vascular endothelial growth factor (VEGF) isoforms. Several published case reports have shown a clinical benefit of the VEGF inhibitor bevacizumab. Other antiangiogenic agents have also been reported to show evidence of activity when they are administered as single agents. These agents include sunitinib, thalidomide, lenalidomide, and interferon-α.

Q: How are painful bone metastases best managed?

A: In the management of painful spine metastases, the most important initial step is correctly identifying the cause of pain. Common causes of back pain associated with bone metastases include mechanical instability, tumor-related inflammation, nerve-root involvement, or a combination of these. Before initiating therapy for pain relief, cord compression requiring urgent surgical intervention must be ruled out. Augmentation, such as vertebroplasty and kyphoplasty, is minimally invasive and involves the percutaneous injection of acrylic cement (such as methyl methacrylate) with the use of imaging guidance. Randomized trials of augmentation involving patients with osteoporotic fractures have not shown any benefit.  However, in selected cases of cancer-related fractures, augmentation has been associated with rapid and clinically significant pain relief. Pain that results from tumor-related inflammation (so-called biologic pain) is often unrelenting, subacute or chronic in onset, and responsive to glucocorticoids. Systemic therapy may offer pain relief in particularly responsive types of cancer. Radiation therapy is a common palliative strategy that offers partial pain relief in 50 to 80% of patients with tumor-related inflammation and complete pain relief in 20 to 40% of patients. Pain relief often does not occur until several weeks after completion of radiation therapy.

Community-Acquired Pneumonia

Posted by Sara Fazio • October 24th, 2014

Community-acquired pneumonia is a commonly diagnosed illness in which no causative organism is identified in half the cases. Application of molecular diagnostic techniques has the potential to lead to more targeted therapy in the face of increasing antibiotic resistance. A new review article looks at this topic.

Community-acquired pneumonia (CAP) is a syndrome in which acute infection of the lungs develops in persons who have not been hospitalized recently and have not had regular exposure to the health care system.

Clinical Pearls

What are the most common causes of CAP?

Although pneumococcus remains the most commonly identified cause of CAP, the frequency with which it is implicated has declined, and it is now detected in only about 10 to 15% of inpatient cases in the United States. Other bacteria that cause CAP include Haemophilus influenzae, Staphylococcus aureus, Moraxella catarrhalis, Pseudomonas aeruginosa, and other gram-negative bacilli. Patients with chronic obstructive pulmonary disease (COPD) are at increased risk for CAP caused by H. influenzae and Mor. catarrhalis. P. aeruginosa and other gram-negative bacilli also cause CAP in persons who have COPD or bronchiectasis, especially in those taking glucocorticoids. There is a wide variation in the reported incidence of CAP caused by Mycoplasma pneumoniae and Chlamydophila pneumoniae (so-called atypical bacterial causes of CAP), depending in part on the diagnostic techniques that are used. During influenza outbreaks, the circulating influenza virus becomes the principal cause of CAP that is serious enough to require hospitalization, with secondary bacterial infection as a major contributor.

Table 1. Infectious and Noninfectious Causes of a Syndrome Consistent with Community-Acquired Pneumonia (CAP) Leading to Hospital Admission.

What evaluation do the authors recommend to determine the cause of community-acquired pneumonia in a hospitalized patient?

In hospitalized patients with CAP, the authors favor obtaining Gram’s staining and culture of sputum, blood cultures, testing for legionella and pneumococcal urinary antigens, and multiplex PCR assays for Myc. pneumoniae, Chl. pneumoniae, and respiratory viruses, as well as other testing as indicated in patients with specific risk factors or exposures. A low serum procalcitonin concentration (<0.1 μg per liter) can help to support a decision to withhold or discontinue antibiotics. Results on Gram’s staining and culture of sputum are positive in more than 80% of cases of pneumococcal pneumonia when a good-quality specimen (>10 inflammatory cells per epithelial cell) can be obtained before, or within 6 to 12 hours after, the initiation of antibiotics. Blood cultures are positive in about 20 to 25% of inpatients with pneumococcal pneumonia but in fewer cases of pneumonia caused by H. influenzae or P. aeruginosa and only rarely in cases caused by Mor. catarrhalis.

Morning Report Questions

Q: What are the guidelines for treating community-acquired pneumonia in outpatients and inpatients?

A: For outpatients without coexisting illnesses or recent use of antimicrobial agents, IDSA/ATS [Infectious Diseases Society of America and the American Thoracic Society] guidelines recommend the administration of a macrolide (provided that <25% of pneumococci in the community have high-level macrolide resistance) or doxycycline. For outpatients with coexisting illnesses or recent use of antimicrobial agents, the guidelines recommend the use of levofloxacin or moxifloxacin alone or a beta-lactam (e.g., amoxicillin-clavulanate) plus a macrolide. The authors argue, however, that a beta-lactam may be favored as empirical therapy for CAP in outpatients, since most clinicians do not know the level of pneumococcal resistance in their communities, and Str. pneumoniae is more susceptible to penicillins than to macrolides or doxycycline. Even though the prevalence of Str. pneumoniae as a cause of CAP has decreased, they raise concern about treating a patient with a macrolide or doxycycline to which 15 to 30% of strains of Str. pneumoniae are resistant. For patients with CAP who require hospitalization and in whom no cause of infection is immediately apparent, IDSA/ATS guidelines recommend empirical therapy with either a beta-lactam plus a macrolide or a quinolone alone.

Q: What is the appropriate duration of antibiotic therapy for community-acquired pneumonia?

A: Early in the antibiotic era, pneumonia was treated for about 5 days; the standard duration of treatment later evolved to 5 to 7 days. A meta-analysis of studies comparing treatment durations of 7 days or less with durations of 8 days or more showed no differences in outcomes, and prospective studies have shown that 5 days of therapy are as effective as 10 days and 3 days are as effective as 8. Nevertheless, practitioners have gradually increased the duration of treatment for CAP to 10 to 14 days. The authors argue that a responsible approach to balancing antibiotic stewardship with concern about insufficient antibiotic therapy would be to limit treatment to 5 to 7 days, especially in outpatients or in inpatients who have a prompt response to therapy. Pneumonia that is caused by Staph. aureus or gram-negative bacilli tends to be destructive, and concern that small abscesses may be present has led clinicians to use more prolonged therapy, depending on the presence or absence of coexisting illnesses and the response to therapy.

Less is not more for TB treatment

Posted by Rachel Wolfson • October 22nd, 2014

Shorter regimens fail to be non-inferior to the standard tuberculosis treatment plans

One third of the world’s population is currently infected with tuberculosis (TB), and, in 2012, there were 1.3 million TB-related deaths (Centers for Disease Control and Prevention). Moreover, in 2012, 450,000 people worldwide developed multi-drug resistant TB (MDR TB), which is resistant to at least rifampin and isoniazid, two of the first-line antimicrobials for TB (World Health Organization). Although many cases of MDR TB are acquired directly from patients harboring resistant organisms, drug resistance can also develop when patients receive an incorrect regimen or do not complete a full antibiotic course. Inadequate compliance is particularly challenging in TB, because the standard treatment course lasts at least 6 months. Finding new treatment approaches that shorten the duration could help decrease the development of drug resistance and lower costs.

In this week’s NEJM, three groups published phase III trials investigating the efficacy of using fluoroquinolones in combination with other anti-TB drugs to treat patients with a four-month regimen, two months shorter than the standard of care. Merle et al. performed a randomized, open-label, controlled trial in which they compared the standard six-month treatment regimen (isoniazid, rifampin, pyrazinamide, and ethambutol) to a four-month regimen that replaced ethambutol with gatifloxacin, a fourth-generation fluoroquinolone. The trial enrolled just over 1800 patients across five African countries, and, despite the positive results from phase II trials and mouse studies, the four-month treatment regimen failed to demonstrate non-inferiority, with a higher rate of TB recurrence than the standard treatment (14.6% vs. 7.1%). Gillespie et al. found similar results when they randomized just over 1900 patients across nine countries to either one of two four-month regimens or the standard of care. In one of these shorter regimens, they used a combination of rifampin, isoniazid, pyrazinamide, and moxifloxacin, a fluoroquinolone, while, in the other, they used ethambutol in place of isoniazid. Both moxifloxacin-containing regimens failed to show non-inferiority compared to the control arm.

Finally, Jindani et al. performed a randomized controlled trial in which they enrolled just over 800 patients across four African countries. Compared to the control six-month regimen, they tested two treatment plans for non-inferiority: one four-month and one six-month regimen in which isoniazid was replaced by moxifloxacin. Similar to the other two trials, the four-month treatment plan also failed to show non-inferiority. The six-month moxifloxacin-containing treatment plan, in which treatment in the final four months was administered weekly, was as effective as the standard daily regimen. While this regimen does not decrease the overall treatment duration, it does decrease the frequency at which patients need to take medications. This advance may increase treatment adherence and is a step towards helping to decrease the development of MDR TB.

While these trials are steps in the right direction, the failure of these three trials to show non-inferiority highlights a major challenge in the field. Even though fluoroquinolones were effective at decreasing the treatment time in mouse models infected with TB, the discrepancy in human trials demonstrates again that differences in biology between mice and humans are major hurdles in the drug discovery pipeline. As Digby Warner, PhD, and Valerie Mizrahi, PhD, commented in an accompanying editorial, more effort will need to be focused on how to more effectively develop and test new therapies so that fewer drugs and regimens fail in phase III and IV trials.

Postherpetic Neuralgia

Posted by Sara Fazio • October 17th, 2014

Postherpetic neuralgia is more common with older age. Recommended treatments include topical agents (lidocaine or capsaicin) and systemic agents (in particular, gabapentin, pregabalin, or tricyclic antidepressants), but their efficacy tends to be suboptimal.  The latest Clinical Practice article is on this topic, and comes from University of Bristol’s Dr. Robert Johnson and Imperial College London’s Dr. Andrew Rice.

Postherpetic neuralgia is the most frequent chronic complication of herpes zoster and the most common neuropathic pain resulting from infection.

Clinical Pearls

What are the epidemiology of and risk factors for postherpetic neuralgia?

Postherpetic neuralgia is conventionally defined as dermatomal pain persisting at least 90 days after the appearance of the acute herpes zoster rash. The incidence and prevalence of postherpetic neuralgia vary depending on the definition used, but approximately a fifth of patients with herpes zoster report some pain at 3 months after the onset of symptoms, and 15% report pain at 2 years. Analysis of data from the United Kingdom General Practice Research Database showed that the incidence of postherpetic neuralgia (as defined by pain at 3 months) rose from 8% at 50 to 54 years of age to 21% at 80 to 84 years of age. Risk factors for postherpetic neuralgia include older age and greater severity of the prodrome, rash, and pain during the acute phase. The incidence is also increased among persons with chronic diseases such as respiratory disease and diabetes, and it may be increased among immunocompromised patients, although the evidence is sparse and inconsistent.

What is the typical clinical presentation and appropriate evaluation of a patient with postherpetic neuralgia?

Although a history of herpes zoster often cannot be confirmed with absolute certainty, the disorder has a characteristic clinical presentation, and thus postherpetic neuralgia rarely presents a diagnostic challenge. Clinical assessment of the patient with postherpetic neuralgia should follow the general principles of assessment of patients with peripheral neuropathic pain. Features of pain and associated sensory perturbations (e.g., numbness, itching, and paresthesias) should be assessed. Pain associated with postherpetic neuralgia occurs in three broad categories: spontaneous pain that is ongoing (e.g., continuous burning pain), paroxysmal shooting or electric shock-like pains, and evoked sensations that are pathologic amplifications of responses to light touch and other innocuous stimuli (mechanical allodynia) or to noxious stimuli (mechanical hyperalgesia). The physical examination should include a comparison of sensory function in the affected dermatome with that on the contralateral side. Loss of sensory function in response to both mechanical and thermal stimuli is common in patients with postherpetic neuralgia, as are pathologic sensory amplifications (e.g., allodynia and hyperalgesia). In most cases, no additional evaluation is needed beyond the history taking (with concomitant disease and medications noted) and physical examination.

Morning Report Questions

Q: What is the appropriate treatment for postherpetic neuralgia?

A: Topical therapy alone is reasonable to consider as first-line treatment for mild pain. It is sometimes used in combination with systemic drugs when pain is moderate or severe, although data are lacking from randomized trials comparing combination topical and systemic therapy with either therapy alone. Patches containing 5% lidocaine are approved for the treatment of postherpetic neuralgia in Europe and the United States. However, evidence in support of their efficacy is limited. There is evidence to support the use of tricyclic antidepressants (off-label use) and the antiepileptic drugs gabapentin and pregabalin (Food and Drug Administration-approved) for the treatment of postherpetic neuralgia. Opioids, including tramadol, should generally be considered as third-line drugs for postherpetic neuralgia after consultation with a specialist and should be prescribed only with appropriate goals and close monitoring.

Q: What is the evidence for the effectiveness of preventive therapy for postherpetic neuralgia?

A: Placebo-controlled trials of antiviral drugs for acute herpes zoster have shown that they reduce the severity of acute pain and rash, hasten rash resolution, and reduce the duration of pain. These trials were not designed to assess the subsequent incidence of postherpetic neuralgia. Two randomized trials have shown that the addition of systemic glucocorticoids to antiviral drugs during the acute phase of herpes zoster does not reduce the incidence of postherpetic neuralgia. In one placebo-controlled trial, low-dose amitriptyline, started soon after the diagnosis of herpes zoster and continued for 90 days, significantly reduced the incidence of pain at 6 months. Further studies are required to confirm this finding. The only well-documented means of preventing postherpetic neuralgia is the prevention of herpes zoster. A live attenuated VZV [varicella-zoster virus] vaccine has been available since 2006; it was initially licensed for immunocompetent persons 60 years of age or older but now is approved for persons 50 years of age or older. In a randomized trial in the older age group, its use reduced the incidence of herpes zoster by 51% and the incidence of postherpetic neuralgia by 66%. In patients 70 years of age or older as compared with those 60 to 69 years of age, the vaccine was less effective in reducing the risk of herpes zoster (38% reduction) but conferred similar protection against postherpetic neuralgia (67% reduction).

Chronic Sore Throat and a Tonsillar Mass

Posted by Sara Fazio • October 17th, 2014

In the latest Case Record of the Massachusetts General Hospital, a 78-year-old woman with rheumatoid arthritis was admitted to the Massachusetts Eye and Ear Infirmary because of a chronic sore throat, odynophagia, and a tonsillar mass. A diagnostic procedure was performed.

Histoplasma capsulatum is commonly found in soil in certain regions of the United States and South America, and 50 to 80% of people living in regions where the fungus is endemic have evidence of prior exposure.

Clinical Pearls

What is the differential diagnosis of acute vesicular pharyngitis?

Acute vesicular pharyngitis may be caused by coxsackievirus (and occasionally other enteroviruses), herpes simplex virus (HSV), and varicella-zoster virus (VZV). Coxsackievirus causes a bilateral pharyngitis that primarily affects young children and resolves in a few days. Primary HSV-associated stomatitis may be severe in immunocompromised patients and is bilateral. Recurrent HSV may produce atypical lesions in immunocompromised patients and may involve the pharynx and larynx; the lesions are usually bilateral. Patients with rheumatoid arthritis are at increased risk for herpes zoster, and patients with a zoster rash affecting the second or third divisions of cranial nerve V may have an accompanying ipsilateral pharyngitis or laryngitis, although this is rare. VZV pharyngitis or laryngitis may also develop without a concurrent facial zoster rash, but in these cases, one or more cranial neuropathies are almost always present.

Who is at greatest risk for histoplasmosis and disseminated histoplasmosis?

In the United States, histoplasmosis infections occur mainly in the regions of the Mississippi River Valley and Ohio River Valley, although a study of the geographic distribution of infection among older adults showed that 12% of cases occur in areas where the fungus is not endemic, including New England. Histoplasmosis is initially asymptomatic or produces a mild acute respiratory illness that resolves. Disseminated disease develops in 0.05% of patients, most of whom are immunocompromised. Dissemination occurs either soon after the primary infection or reinfection or after reactivation of previously unrecognized latent disease.

Morning Report Questions

Q: How does anti-tumor necrosis factor-alpha (TNF-alpha) therapy alter the risk of histoplasmosis?

A: Anti-TNF-alpha therapy increases the risk of histoplasmosis, and histoplasmosis is the most common invasive fungal infection in patients receiving these medications. In patients who are receiving TNF-alpha inhibitors, histoplasmosis is three times more common than tuberculosis, according to one report, and is associated with a mortality of 20%, with deaths often due to a delay in diagnosis and treatment. The risk of disease is higher among patients who are receiving infliximab than among those who are receiving etanercept.

Q: What are the features of oropharyngeal histoplasmosis?

A: Disseminated histoplasmosis may involve the throat, with the most common sites of involvement being the buccal mucosa, tongue, and palate; the larynx may also be involved. The lesions are often painful, ulcerated, and indurated, with heaped-up borders, and may mimic cancer. Oral and laryngeal lesions may be present simultaneously. Oropharyngeal or laryngeal lesions may be the only signs of disseminated disease, and fever occurs in only one third of patients with such disease.

Malpractice Reform and Emergency Department Care

Posted by Chana Sacks • October 15th, 2014

A 67-year-old woman presents to your Emergency Department (ED) with a headache for the last 48 hours.  She describes herself as a “headachy” person since her late teens, but this one is particularly bad, throbbing, associated with nausea and photophobia.  She is afebrile without neck stiffness. Your thorough neurologic exam reveals no focal deficits.

You form your differential diagnosis and debate your next step: do you send the patient home with a diagnosis of a migraine, a prescription, and a plan for close follow-up with her primary care physician? Do you pursue imaging with a CT scan or an MRI to rule out a more insidious cause of her symptoms?

You run the case – and your nagging uncertainty – by one of your colleagues.  “Just get the scan,” she advises.  She then tells you that one time a doctor she knows didn’t get an MRI on a patient with a headache who turned out to have a brain tumor.  He is still embroiled in that lawsuit, she says, her voice trailing off as she walks away.

You have heard many physicians give voice to this line of thinking: yes, we may be ordering some unnecessary tests, but we practice medicine in an exceptionally litigious U.S. society. We have to protect ourselves.

In a Special Article published in the NEJM this week, Waxman and colleagues examine whether this fear of malpractice lawsuits truly motivates physicians’ practices, resulting in extra tests and added costs.   Perhaps surprisingly, they conclude that it does not.

The authors used the real-life experiment provided by Georgia, Texas, and South Carolina to investigate this question. Between 2003 and 2005, these states each passed legislation changing the malpractice standard for emergency care to gross negligence, which the authors note is “widely considered to be a very high bar for plaintiffs.”

The investigators assessed the effect of this legislation that largely eliminated the threat of lawsuits by examining three outcomes: use of CT or MRI, admission to the hospital, and total ED charges per visit. In prior survey studies, emergency physicians had identified ordering CT/MRI imaging and deciding to admit a patient to the hospital as common, costly “defensive maneuvers” often motivated by fear of malpractice lawsuits.  Using a random sample of Medicare claims from 1997-2011, they examined the outcomes in the three states that passed the malpractice reforms – before and after the change in legislation – and in ten control states.

Their findings: in none of the three states was malpractice reform associated with a reduction in CT/MRI ordering or in the rates of hospital admission. In Texas and South Carolina, there was also no reduction in per-visit ED charges.  In Georgia, malpractice reform was associated with a 3.6% reduction in charges [95% CI, −6.2% to −0.9%; P=0.010].  The authors conclude, “these strongly protective laws caused little (if any) change in practice intensity among physicians caring for Medicare patients in emergency departments.”

NEJM Deputy Editor Mary Beth Hamel commented, “The authors’ rigorous analyses suggest that legislation designed to reduce the risk of malpractice did not change emergency room clinicians’ decisions to order tests and admit patients to the hospital. It is not clear if the negative findings reflect the misperception that ‘defensive medicine’ is a substantial driver of health care costs or the intractability of the problem.”

To your patient and her headache – do you order an imaging test? Maybe. This study offers no guidance about when and whether ordering an MRI or choosing to admit a patient to the hospital is medically appropriate. However, if the authors’ conclusions are correct, that feeling driving you to pursue the additional test is probably not your fear of a lawsuit.

What is that force, then, that pushes physicians to order expensive tests? Perhaps it’s the uncomfortable uncertainty inherent in medicine. Maybe insecurity about the sensitivity of the physical exam. Or the fear of missing a rare or life-threatening diagnosis. Almost certainly, it is an amalgam of factors.  However, this study suggests that it is a force that malpractice reform – at least as enacted in these states – is unlikely to ameliorate.

Acid-Base Disturbances

Posted by Sara Fazio • October 10th, 2014

Acid–base homeostasis is fundamental for maintaining life. The first article in the new Disorders of Fluids and Electrolytes series reviews a stepwise method for the physiological approach to evaluation of acid–base status.

Internal acid-base homeostasis is fundamental for maintaining life. Accurate and timely interpretation of an acid-base disorder can be lifesaving, but establishment of a correct diagnosis may be challenging.

Clinical Pearls

What are some uses and limitations of the anion gap?

Lactic acidosis accounts for about half of high anion-gap cases, and is often due to shock or tissue hypoxia. The anion gap, however, is a relatively insensitive reflection of lactic acidosis — roughly half the patients with serum lactate levels between 3.0 and 5.0 mmol per liter have an anion gap within the reference range. With a sensitivity and specificity below 80% in identifying elevated lactate levels, the anion gap cannot replace a serum lactate measurement. Nevertheless, lactate levels are not routinely drawn or always rapidly available, and a high anion gap can alert the physician that further evaluation is necessary. In addition, the anion gap should always be adjusted for the albumin concentration, because this weak acid may account for up to 75% of the anion gap. Without correction for hypoalbuminemia, the anion gap can fail to detect the presence of a clinically significant increase in anions (>5 mmol per liter) in more than 50% of cases. For every 1 g per deciliter decrement in serum albumin concentration, the calculated anion gap should be raised by approximately 2.3 to 2.5 mmol per liter.
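
The albumin correction described above is simple arithmetic. The following Python sketch (a minimal illustration, not taken from the article) assumes a reference albumin of 4.0 g per deciliter and uses 2.5 mmol per liter per 1 g per deciliter decrement, the upper end of the cited range; the example values are hypothetical.

def anion_gap(na, cl, hco3):
    # Serum anion gap in mmol per liter: Na+ - (Cl- + HCO3-)
    return na - (cl + hco3)

def corrected_anion_gap(na, cl, hco3, albumin_g_per_dl, factor=2.5):
    # Raise the calculated gap by roughly 2.3 to 2.5 mmol per liter for each
    # 1 g per deciliter fall in albumin below an assumed normal of 4.0 g per deciliter.
    return anion_gap(na, cl, hco3) + factor * (4.0 - albumin_g_per_dl)

# Hypothetical example: Na 138, Cl 105, HCO3 16, albumin 2.0 g per deciliter
# uncorrected gap = 17; corrected gap = 17 + 2.5 * (4.0 - 2.0) = 22 mmol per liter
print(corrected_anion_gap(138, 105, 16, 2.0))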

Table 1. Primary Acid-Base Disturbances with a Secondary (“Compensatory”) Response.

What are the characteristics of a normal anion-gap (hyperchloremic) acidosis?

Chloride plays a central role in intracellular and extracellular acid-base regulation. A normal anion-gap acidosis will be found when the decrease in bicarbonate ions corresponds with an increase in chloride ions to retain electroneutrality, also called a hyperchloremic metabolic acidosis. This type of acidosis occurs from gastrointestinal loss of bicarbonate (e.g., because of diarrhea or ureteral diversion), from renal loss of bicarbonate that may occur in defective urinary acidification by the renal tubules (renal tubular acidosis), or in early renal failure when acid excretion is impaired. Hospital-acquired hyperchloremic acidosis is usually caused by the infusion of large volumes of normal saline (0.9%). Hyperchloremic acidosis should lead to increased renal excretion of ammonium, and measurement of urinary ammonium can therefore be used to differentiate between renal and extrarenal causes of normal anion-gap acidosis. However, since urinary ammonium is seldom measured, the urinary anion gap and urinary osmolal gap are often used as surrogate measures of excretion of urinary ammonium. The urine anion gap ([Na+] + [K+] − [Cl−]) is usually negative in normal anion-gap acidosis, but it will become positive when excretion of urinary ammonium (NH4+) (as ammonium chloride [NH4Cl]) is impaired, as in renal failure, distal renal tubular acidosis, or hypoaldosteronism.
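
As a worked illustration of the surrogate measure described above, the sketch below (not from the article; the spot-urine values are hypothetical) computes the urine anion gap: a negative value suggests intact ammonium excretion and an extrarenal cause, whereas a positive value suggests impaired ammonium excretion.

def urine_anion_gap(u_na, u_k, u_cl):
    # Urine anion gap in mmol per liter: [Na+] + [K+] - [Cl-]
    return (u_na + u_k) - u_cl

# Hypothetical spot-urine values, in mmol per liter
print(urine_anion_gap(40, 30, 90))   # -20: consistent with gastrointestinal bicarbonate loss
print(urine_anion_gap(40, 30, 50))   # +20: consistent with impaired ammonium excretion (e.g., distal RTA)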

Morning Report Questions

Q: What is a useful approach to the analysis and treatment of a metabolic alkalosis?

A: The normal kidney is highly efficient at excreting large amounts of bicarbonate, and accordingly, the generation of metabolic alkalosis requires both an increase in alkali and impairment in renal excretion of bicarbonate. Gastric fluid loss and diuretic use account for the majority of metabolic alkalosis cases. By measuring chloride in urine, one can distinguish between chloride-responsive and chloride-resistant metabolic alkalosis. If the kidneys perceive a reduced “effective circulating volume,” they avidly reabsorb filtered sodium, bicarbonate, and chloride, largely through activation of the renin-angiotensin-aldosterone system, thus reducing the concentration of urinary chloride. A (spot sample) urinary chloride concentration of less than 25 mmol per liter is reflective of chloride-responsive metabolic alkalosis. Administration of fluids with sodium chloride (usually with potassium chloride) restores effective arterial volume, replenishes potassium ions, or both, with correction of the metabolic alkalosis. Metabolic alkalosis with a urinary chloride concentration of more than 40 mmol per liter is mainly caused by inappropriate renal excretion of sodium chloride, often reflecting mineralocorticoid excess or severe hypokalemia (potassium concentration <2 mmol per liter). The administration of sodium chloride does not correct this type of metabolic alkalosis, which, for that reason, is called “chloride-resistant.” Diuretic-induced metabolic alkalosis is an exception because the concentration of chloride in urine may increase initially, until the diuretic effect wanes, after which the concentration of chloride in the urine will fall below 25 mmol per liter.

Figure 2. Assessment of Alkalosis.

Q: How is the “delta anion gap” helpful in the evaluation of mixed metabolic acid-base disorders?

A: In high anion-gap metabolic acidosis, the magnitude of the increase in the anion gap (delta AG) is related to the decrease in the bicarbonate ions (delta[HCO3-]). To diagnose a high anion-gap acidosis with concomitant metabolic alkalosis or normal anion-gap acidosis, the so-called delta-delta may be used. The delta gap is the comparison between the increase (delta) in the anion gap above the upper reference value (e.g., 12 mmol per liter) and the change (delta) in the concentration of bicarbonate ions from the lower reference value of bicarbonate ions (e.g., 24 mmol per liter). In ketoacidosis, there is a 1:1 correlation between the rise in the anion gap and the fall in the concentration of bicarbonate. In lactic acidosis, the decrease in the concentration of bicarbonate is 0.6 times the increase in the anion gap (e.g., if the anion gap rises by 10 mmol per liter, the concentration of bicarbonate should decrease by about 6.0 mmol per liter). This difference is probably due to the lower renal clearance of lactate compared with keto-anions. Hydrogen buffering in cells and bone takes time to reach completion. Accordingly, the ratio may be close to 1:1 with “very acute” lactic acidosis (as with seizures or exercise to exhaustion). If delta AG − delta[HCO3-] in ketoacidosis, or 0.6 × delta AG − delta[HCO3-] in lactic acidosis, equals 0+/-5 mmol per liter, a simple anion-gap metabolic acidosis is present. A difference greater than 5 mmol per liter suggests a concomitant metabolic alkalosis, and if the difference is less than −5 mmol per liter, a concomitant normal anion-gap metabolic acidosis is diagnosed.
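
The delta-delta comparison described above reduces to a few lines of arithmetic. The sketch below (a minimal illustration, not from the article) uses the reference values cited in the text (anion gap 12 mmol per liter, bicarbonate 24 mmol per liter), a coefficient of 1.0 for ketoacidosis and 0.6 for lactic acidosis, and the +/-5 mmol per liter interpretation window; the example values are hypothetical.

def delta_delta(anion_gap, hco3, coefficient=1.0, ref_gap=12.0, ref_hco3=24.0):
    delta_ag = anion_gap - ref_gap        # rise in the anion gap above the upper reference value
    delta_hco3 = ref_hco3 - hco3          # fall in bicarbonate below the lower reference value
    difference = coefficient * delta_ag - delta_hco3
    if difference > 5:
        return difference, "suggests concomitant metabolic alkalosis"
    if difference < -5:
        return difference, "suggests concomitant normal anion-gap metabolic acidosis"
    return difference, "consistent with simple anion-gap metabolic acidosis"

# Ketoacidosis example: gap 26, HCO3 10 -> delta AG 14, delta HCO3 14, difference 0 -> simple
print(delta_delta(26, 10, coefficient=1.0))
# Lactic acidosis example: gap 22, HCO3 6 -> 0.6 * 10 - 18 = -12 -> concomitant normal-gap acidosis
print(delta_delta(22, 6, coefficient=0.6))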