Learning to Modify the Chronic Neurologic Burden of Sickle Cell Anemia

Posted by Joshua Allen-Dicker • August 20th, 2014

The patient sitting in front of you is a smiling, well-adjusted 6-year-old boy.  He has a supportive family, eats a healthy diet, and is physically active.  What if, despite all of this, you knew that he had more than a 30 percent chance of developing a condition that would hamper his educational attainment?  Every day, pediatricians caring for children with sickle cell anemia are faced with this reality: around 37% of patients with sickle cell disease develop silent cerebral infarcts during childhood, which place them at significantly increased risk for poor academic achievement, cognitive deficits, and strokes.

In this week’s NEJM, the Silent Cerebral Infarct Transfusion (SIT) Trial examines whether regular blood transfusions in children with known silent cerebral infarcts can reduce the recurrence of cerebral infarcts.  This multicenter, randomized, controlled trial recruited children aged 5 to 15 years with hemoglobin SS or hemoglobin Sβ0-thalassemia and a silent infarct on screening head MRI.

Study participants were randomized to monthly transfusions (with a target hemoglobin level greater than 9 g/dL and a hemoglobin S level of 30% or less of total hemoglobin) or to observation.  The primary endpoint was infarct recurrence, defined as the presence of a new infarct or enlargement of a previously noted infarct on the brain MRI performed at study exit.  Additionally, SIT monitored cognitive outcomes and safety outcomes, including transfusion reactions, ferritin levels, development of alloantibodies, and the need for permanent central venous catheter access.

Of the 196 children who met study entry criteria, 99 were assigned to the transfusion group and 97 to the observation group.  After a mean follow-up of about 3 years, the incidence of infarct recurrence was 4.8 per 100 person-years in the observation group versus 2.0 per 100 person-years in the transfusion group (P=0.04).  The number needed to treat was 13.
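As a rough back-of-envelope check (not the trial’s actual calculation, which used cumulative incidence over each patient’s own follow-up), the reported number needed to treat is roughly consistent with the incidence rates above, if one assumes constant rates over the ~3-year mean follow-up:

```python
# Back-of-envelope NNT check; assumes constant incidence rates
# over the ~3-year mean follow-up (a simplification).
rate_observation = 4.8 / 100  # recurrences per person-year, observation group
rate_transfusion = 2.0 / 100  # recurrences per person-year, transfusion group
years = 3

# Approximate absolute risk reduction over the follow-up period
arr = (rate_observation - rate_transfusion) * years  # ~0.084

# Number needed to treat = 1 / absolute risk reduction
nnt = 1 / arr
print(round(nnt))  # ≈ 12, in line with the reported NNT of 13
```

The small discrepancy with the published figure of 13 reflects the constant-rate simplification, not an error in the trial.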

There was no significant difference in measured cognitive outcomes between the two groups.  Patients in the transfusion group had a five-fold higher rate of blood-transfusion reactions and a fourteen-fold higher rate of elevated ferritin levels (greater than 1500 ng/mL).  Additionally, 11 patients in the transfusion group required tunneled venous catheter placement and 4 developed alloantibodies, compared with 0 patients for both outcomes in the observation group.

So does the SIT trial prove that transfusion for secondary prevention of silent infarcts in children with sickle cell anemia is ready for prime time?  SIT’s encouraging results represent an important realization: the chronic neurologic burden of sickle cell anemia may be modifiable.  Still, they should be viewed in the context of treatment risks and implementation uncertainties.

In his accompanying editorial, Dr. Martin Steinberg states, “The study raises important questions for pediatricians and especially for internists who see these patients after many years of treatment, when untoward consequences of chronic transfusion can be present.”  For patients undergoing transfusion for secondary prevention, we will need long-term mitigation strategies for alloimmunization, complications of permanent central venous catheter access, and iron overload.  We also lack clarity on the most appropriate duration of transfusion therapy and on whether transfusion strategies can have meaningful effects on longitudinal neurocognitive function.

Despite these concerns, Dr. Steinberg’s commentary remains hopeful about the promise of future long-term implementation trials.  As NEJM Deputy Editor Dan Longo adds, “Other methods of suppressing sickle globin gene expression are on the horizon; however, for the time being, transfusion appears to be capable of halving the neurologic consequences of sickle cell disease.  The late effects of transfusion are either rare (alloimmunization) or treatable (iron overload).  The same cannot be said for strokes.”

Benefits and Risks of Salt Consumption

Posted by Karen Buckley • August 19th, 2014

Are we bartering our own good health for the seductive flavor of salt?  View the latest NEJM Quick Take, which summarizes current guidelines and statistics on salt consumption and its relation to cardiovascular disease and death.

NEJM Quick Takes are brief animations that summarize the key findings of an original research article and their broader implications, and are narrated by our Editor-in-Chief, Jeffrey Drazen. They are now gathered in one place for you to browse.

Watch all eight, and look for a new one in October.

Syndromes of Thrombotic Microangiopathy

Posted by Sara Fazio • August 15th, 2014

A new review article covers the diverse pathophysiological pathways that can lead to microangiopathic hemolytic anemia and a procoagulant state with or without damage to the kidneys and other organs. An interactive graphic shows the nine primary syndromes of thrombotic microangiopathy (TMA), and narrated animations describe the causes, clinical features, initial management, and underlying mechanisms for each syndrome.

The TMA syndromes are extraordinarily diverse. They may be hereditary or acquired. They occur in children and adults. The onset can be sudden or gradual. Despite their diversity, TMA syndromes are united by common, defining clinical and pathological features. The clinical features include microangiopathic hemolytic anemia, thrombocytopenia, and organ injury.

Clinical Pearls  

What are the features of ADAMTS13 deficiency-mediated TMA?

Acquired TTP is an autoimmune disorder caused by autoantibody inhibition of ADAMTS13 activity. The incidence of acquired TTP is much greater in adults (2.9 cases per 1 million per year) than in children (0.1 cases per 1 million per year). Factors that are associated with an increased frequency of this disorder include an age of 18 to 50 years, black race, and female sex. Among the primary TMA syndromes, TTP is unique for rarely causing severe acute kidney injury. The clinical features of hereditary TTP are recurrent episodes of microangiopathic hemolytic anemia and thrombocytopenia, often with neurologic abnormalities or other signs of organ injury.

Figure 3. Algorithm for the Evaluation of Children and Adults Presenting with Microangiopathic Hemolytic Anemia and Thrombocytopenia.

Table 1. Primary Thrombotic Microangiopathy (TMA) Syndromes.  

What are the features of complement-mediated TMA?  

Complement-mediated TMA results from uncontrolled activation of the alternative pathway of complement. Unlike the other two pathways of complement activation, the alternative pathway is constitutively active as a result of spontaneous hydrolysis of C3 to C3b. In the absence of normal regulation, C3b deposition on tissues may increase markedly, resulting in increased formation of the C5b-9 terminal complement complex (also called the membrane-attack complex) and injury of normal cells. Acute kidney injury and hypertension are prominent abnormalities in complement-mediated TMA. Current diagnostic criteria are those that were used in clinical trials involving a total of 37 patients, which supported the approval of eculizumab (a humanized monoclonal antibody that blocks the generation of C5a and C5b) for the treatment of “atypical HUS” in 2011. These criteria include all of the following: a serum creatinine level at or above the upper limit of the normal range, microangiopathic hemolytic anemia, thrombocytopenia, ADAMTS13 activity of 5% or more, and negative stool tests for Shiga toxin-producing infection.

Morning Report Questions  

Q: What are the features of Shiga toxin-mediated TMA, also called hemolytic uremic syndrome?

A: Multiple E. coli strains produce Shiga toxin; E. coli O157:H7 is the most common pathogen associated with ST-HUS in Europe and the Americas. Shigella dysenteriae type 1 remains an endemic cause of ST-HUS in other countries. Although large outbreaks have brought ST-HUS to public attention, most occurrences are sporadic. ST-HUS is much more common among children (median age, 2 years), in whom mortality is 3%. ST-HUS in adults is more severe, with higher mortality. Severe abdominal pain and diarrhea, often bloody, begin several days after contaminated food is consumed. Thrombocytopenia and renal failure begin as gastrointestinal symptoms resolve. Shiga toxin is identified by means of stool analyses during the acute colitis phase but may not be identifiable when ST-HUS begins. Treatment remains supportive. Early, aggressive hydration has a renal protective role. Patients commonly require dialysis. The benefits of plasma exchange and anticomplement treatment are uncertain.

Q: What medications have been associated with TMA?        

A: Although many other drugs have been reported to be associated with TMA, only quinine-associated TMA has been supported by documentation of drug-dependent antibodies. Quetiapine and gemcitabine are the only other drugs for which an association with TMA has been supported by recurrence of acute episodes upon repeated exposures.

37-Year-Old Man with Ulcerative Colitis and Bloody Diarrhea

Posted by Sara Fazio • August 15th, 2014

In the latest Case Record of the Massachusetts General Hospital, a 37-year-old man with ulcerative colitis was admitted to the hospital because of abdominal cramping, diarrhea, hematochezia, fever to a peak temperature of 38.8°C, and drenching night sweats. Several weeks earlier, he had performed home fecal transplantation.

Bloody diarrhea is characteristic of infections caused by Escherichia coli O157:H7 and Shigella. Patients with enterohemorrhagic E. coli infection often do not have fevers; the associated hemolytic-uremic syndrome is more common in children.

Clinical Pearls

What are the features of fecal microbiota transplantation (FMT)?

The purpose of FMT is to transfer normal intestinal flora from a healthy donor to a person with intestinal dysbiosis. The most common and now widely accepted indication for this novel therapy is recurrent C. difficile colitis. Many observational series and a few randomized, controlled trials have shown that this procedure is more than 90% effective in resolving relapsing and recurring C. difficile infection. A fecal slurry made from healthy donor stool can be administered by colonoscope, enema, nasogastric or nasoduodenal tube, or capsule. Although there are accepted standards for screening donors and donor stool, there is very limited information related to transmission of infections through FMT.

What percentage of adults in the United States are seropositive for CMV?

Fifty to 90% of adults in the United States who are older than 35 years of age are seropositive for CMV.

Morning Report Questions

Q: What are the most common infections in patients with ulcerative colitis?

A: The most common infections in patients with ulcerative colitis are those due to C. difficile or CMV. A varying but substantial fraction of C. difficile infections that occur in patients with IBD may occur in the absence of traditional risk factors. CMV colitis may complicate underlying ulcerative colitis; this occurs in up to one third of patients with glucocorticoid-refractory disease. The majority of cases represent reactivation of previous CMV infection. Primary CMV infection is infrequent and often manifested by systemic signs of fever, malaise, and sweats. Laboratory findings may include atypical lymphocytes and elevated aminotransferase levels.

Q: In a patient with inflammatory bowel disease, what is the best way to diagnose CMV colitis?

A: In a patient with IBD, there are several assays that may aid in the diagnosis of CMV colitis. These include blood-based assays, such as those that detect CMV antigenemia or detect CMV DNA by polymerase chain reaction (PCR). However, detection of the viral protein or DNA does not definitively discriminate between CMV infection and CMV colitis; the latter diagnosis relies on the detection of colonic injury. Furthermore, a negative blood test does not rule out gastrointestinal disease. Several methods can be used to detect CMV in colonic biopsy specimens, including hematoxylin and eosin staining, immunohistochemical staining, and detection of viral nucleic acids by PCR. Among these, hematoxylin and eosin staining is the least sensitive. In addition, viral cytopathic changes are generally subtle, and hence immunohistochemical staining is a crucial ancillary test for the detection of CMV. A PCR-based assay offers the highest level of sensitivity; however, the clinical significance of the detection of CMV DNA in the absence of detectable colonic injury is uncertain.

Visit NEJM.org for the Latest on the Ebola Outbreak

Posted by Karen Buckley • August 14th, 2014

Be sure to visit the NEJM Infectious Disease page, which brings together new and recent articles on Ebola and features a HealthMap that tracks publicly reported confirmed and suspected cases of Ebola throughout the world. The map is automatically updated to provide up-to-the-minute data on the outbreak.

A new Perspective article from NIH’s Anthony Fauci suggests we need sound public health practices, engagement with affected communities, and considerable international assistance and global solidarity to defeat Ebola in West Africa.

Another Perspective article from NIAID’s Heinz Feldmann says the outbreak has demonstrated our public health systems’ limited ability to respond to rare, highly virulent communicable diseases.  A Brief Report looks at emergence of the outbreak of Ebola in Guinea back in March.

We’ll be posting more content in this space as it becomes available, so check back regularly.

A Salty Subject: Sodium Consumption and Cardiovascular Health

Posted by Rena Xu • August 13th, 2014

Salt has long been a staple of life.  Once upon a time, it was a form of currency; roads were built to transport it; cities arose to produce and trade it.  And, of course, people ate it.  Today, we continue to consume it the whole world over.

It’s hard to believe something so integral to the human experience, spanning centuries and civilizations, might be bad for the body.  But as the Mrs. Dashes of the world would have us know, salt consumption has been linked to high blood pressure, which in turn has been linked to heart disease — a reason, perhaps, to keep one’s salt intake in check.

Just how much salt are we consuming?  And how much damage can we attribute to the delicious granules that top our pretzels, coat our fries, and cure our meats?  A group of researchers from the Global Burden of Diseases Nutrition and Chronic Diseases Expert Group (NUTRICODE) used survey data and statistical models to estimate the sodium consumption of adults from 187 nations.  They also performed a meta-analysis of over a hundred randomized interventional studies to calculate the effects of sodium on blood pressure and analyzed pooled data from cohort studies to calculate the effects of high blood pressure, in turn, on cardiovascular mortality.

Their findings, published in this week’s NEJM, might make you think twice before reaching for the saltshaker.  In 2010, the average daily sodium consumption was 3.95 grams, well above the World Health Organization recommendation of 2 grams per day.  In fact, 181 out of the 187 nations — accounting for a whopping 99.2% of adults — exceeded the recommended intake level.  The consequence?  Nearly one in every ten deaths from cardiovascular causes in 2010, the authors estimated, could be attributed to excess sodium consumption.  That translated to 1.65 million deaths globally.  What’s worse, over 40% of these deaths occurred prematurely (i.e., in people younger than 70 years of age).
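The two headline figures can be cross-checked against each other. A minimal sketch, with the caveat that the “nearly one in every ten” share is treated here as an approximate 10% rather than the paper’s exact estimate:

```python
# Cross-check of the attributable-mortality figures; the 10% share
# is an approximation of "nearly one in every ten" deaths.
attributable_deaths = 1.65e6   # CV deaths attributed to excess sodium, 2010
attributable_share = 0.10      # approximate fraction of all CV deaths

# Implied global total of cardiovascular deaths in 2010
implied_total_cv_deaths = attributable_deaths / attributable_share
print(f"{implied_total_cv_deaths / 1e6:.1f} million")  # ≈ 16.5 million
```

That implied total is in the right range for global cardiovascular mortality in 2010, which is why the two figures hang together.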

While no region of the world was spared, there was considerable variability across geographies in sodium-associated cardiovascular mortality rates.  The highest rates were in Central Asia and Eastern and Central Europe.  Among individual countries, Georgia had the highest rate (1967 deaths per 1 million adults per year), while Kenya had the lowest (4 deaths per 1 million adults per year).  Perhaps most striking, more than 80% of sodium-related cardiovascular deaths around the world — four out of every five — occurred in low- or middle-income countries.

There may be a silver lining.  The authors also found that reductions in sodium intake were linked proportionally to reductions in blood pressure (P<0.001 for a linear dose-response relationship).  For a white, normotensive 50-year-old, for instance, a reduction of 2.3 grams per day lowered systolic blood pressure by 3.74 mmHg (95% CI, 2.29 to 5.18).  The exact effect varied by age and race, with greater reductions in older people (vs. younger people), blacks (vs. whites), and people with hypertension (vs. without).
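Because the dose-response relationship is linear, the implied per-gram effect for that reference case is simple arithmetic. This sketch uses only the figures quoted above:

```python
# Implied per-gram blood-pressure effect for the reference case
# (white, normotensive, 50 years old), under the reported linear
# dose-response assumption.
sodium_reduction_g = 2.3   # sodium reduction, grams per day
sbp_reduction_mmhg = 3.74  # corresponding systolic BP reduction

per_gram = sbp_reduction_mmhg / sodium_reduction_g
print(round(per_gram, 2))  # ≈ 1.63 mmHg per gram per day
```

For other ages and races the slope differs, as the paragraph above notes, so this number applies only to that reference case.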

Does that mean low sodium intake translates to better health?  Not necessarily, according to two other articles published in this week’s NEJM.  The Prospective Urban Rural Epidemiology (PURE) study collected fasting morning urine specimens from over 100,000 adults, representing 18 different countries and a range of income levels, and calculated 24-hour urinary sodium excretion (a proxy of sorts for sodium consumption).  This was correlated with blood pressure.

The investigators found a non-linear relationship.  For people who excreted a lot of sodium (in this study, defined as over five grams per day), there was a strong association, meaning each additional gram of sodium was linked to a steep rise in blood pressure (2.58 mmHg per gram).  In contrast, for people with low sodium excretion (less than three grams per day), the association with blood pressure was not statistically significant (0.74 mmHg per gram; P=0.19).  Remarkably, only ten percent of study participants fell into this low-sodium category, and only four percent had sodium excretion levels consistent with the current U.S. guidelines for sodium intake.

Based on these results, Dr. Suzanne Oparil of the University of Alabama at Birmingham suggests in an accompanying editorial that a low-sodium diet might not be the most useful public health recommendation.  She writes, “The authors concluded from the findings that a very small proportion of the worldwide population consumes a low-sodium diet and that sodium intake is not related to blood pressure in these persons, calling into question the feasibility and usefulness of reducing dietary sodium as a population-based strategy for reducing blood pressure.”

The PURE study investigators also tested the correlation between sodium excretion and a composite outcome of death and major cardiovascular events (heart attacks, strokes, heart failure).  They observed that, compared to a reference sodium excretion range of four to six grams per day, a higher level of sodium excretion (seven or more grams per day) was linked to a greater risk of the composite outcome (odds ratio 1.15; 95% CI, 1.02 to 1.30).  But, when they looked at people with very low sodium excretion (below three grams per day), they found an increased risk as well (odds ratio, 1.27; 95% CI, 1.12 to 1.44).

Complicating matters further, the authors also looked at the urinary excretion of potassium.  Compared to a reference level of 1.5 grams per day, a higher potassium excretion level was linked to a reduced risk of the composite outcome.  “The alternative approach of recommending high-quality diets rich in potassium might achieve greater health benefits, including blood-pressure reduction, than aggressive sodium reduction alone,” Oparil writes.

To the scholars in us, these findings are an inspiration for further research.  But how should we apply this knowledge to our daily lives, when we’re sitting in diners wondering whether to shake or not to shake?  It may be presumptuous to read too much into the findings, to infer causality where it has yet to be established.  Still, as a species, we’re better equipped today than we’ve ever been to make smart choices about our diet.  And until more data become available, moderation may be the way to go.

The results of this new research are also summarized in a short animation.

View the latest NEJM Quick Take now.

A Gut Instinct

Posted by Sara Fazio • August 8th, 2014

In the latest Clinical Problem-Solving article, a 30-year-old female physician presented to the emergency department in mid-August with a 4-day history of anorexia, nausea, vomiting, and diarrhea. She had no fever or respiratory symptoms but had mild abdominal discomfort.

Tuberculosis is a well-recognized occupational hazard for those involved in health care. In a systematic review that included 15 studies conducted in high-income countries, health care workers had a median annual risk of pulmonary tuberculosis infection of 1.1% (range, 0.2 to 12), as compared with a risk of 0.1 to 0.2% in the general population.

Clinical Pearls

How has the incidence of extrapulmonary tuberculosis changed in the United States?

There has been a steady decline in cases of tuberculosis in the United States, from 52.6 cases per 100,000 population in 1953 to 3.6 cases per 100,000 population in 2010. However, the proportion of cases of extrapulmonary tuberculosis per total cases of tuberculosis increased from 7.6% in 1962 to 20% or more since the late 1990s. The risk of extrapulmonary tuberculosis is reported to be increased among women, Asian and foreign-born persons, and health care workers.

What are the epidemiology and features of intraabdominal tuberculosis?

Intraabdominal tuberculosis, including peritoneal and mesenteric lymph-node involvement, is the sixth most common type of extrapulmonary tuberculosis reported in the United States. The diagnosis of intraabdominal tuberculosis is challenging, owing to its protean and nonspecific manifestations. In a large case series, up to two thirds of patients had negative tuberculin skin tests, and many did not have respiratory symptoms. Systemic and constitutional symptoms are frequent, as are abdominal pain and distention. The majority of patients with tuberculous peritonitis have ascites, which results from fluid exudation from peritoneal surfaces; only about 10% of patients present with the “dry type” of tuberculous peritonitis, characterized by a doughy abdomen, adhesions, fibrosis, and the absence of ascites. Often there is a long delay between symptom onset and diagnosis.

Morning Report Questions

Q: What is the reported sensitivity of tuberculin skin testing in cases of active tuberculosis?  

A: In the case of active tuberculosis, the reported sensitivity of tuberculin skin testing is highly variable and is generally estimated from culture-confirmed cases; false-negative test results are reported in up to 25% of cases. Tuberculin skin tests should not be considered reliable tests for the diagnosis of active disease.

Q: What are the guidelines for treatment of extrapulmonary tuberculosis?        

A: Guidelines for the treatment of extrapulmonary tuberculosis closely mirror those for the treatment of pulmonary tuberculosis. Currently, for susceptible tuberculosis, four-drug therapy for 2 months is recommended, followed by two-drug therapy for 4 months or longer, depending on the extrapulmonary site (e.g., the meninges require longer therapy).

Kidney Transplantation in Children

Posted by Sara Fazio • August 8th, 2014

A new review discusses unique aspects of kidney transplantation in children that necessitate specialized approaches and have resulted in clinical advances leading to higher success rates in young children than in any other age group.

The most common primary causes of kidney failure are congenital or inherited disorders such as renal dysplasia, obstructive uropathies, or reflux nephropathy in young children and acquired glomerular diseases such as focal segmental glomerulosclerosis and lupus nephritis in older children. In contrast, the most common primary renal diseases that lead to end-stage kidney disease in adults are diabetic nephropathy, hypertension, and autosomal dominant polycystic kidney disease, which rarely cause end-stage kidney disease in children.

Clinical Pearls

• How should immunizations be handled in the pediatric transplant recipient?

Children require multiple vaccinations during early childhood to protect them from preventable infectious diseases. However, vaccines may not be effective if administered to an immunocompromised patient. Therefore, a vigorous effort to immunize children completely before transplantation is critical. Because children with end-stage kidney disease often have a suboptimal immune response and reduced duration of immunity, higher initial doses, extra doses, and antibody titer monitoring with booster doses of vaccines may be needed. In the period after transplantation, the administration of live vaccines is generally avoided, but other immunizations may be given after immunosuppressive medications have reached low maintenance levels, typically at 6 to 12 months after transplantation. Injectable influenza vaccine should be given annually.

• Is size and age matching required in pediatric kidney transplant recipients?

Unlike heart and liver allografts, the kidney allograft is placed in a different location from the failed native organ, and the native organ is often left in place. Thus, size and age matching is generally not required in kidney transplantation. In fact, matching very young donors to very young recipients was previously associated with a very high rate of graft loss, often due to thrombosis. On the basis of those adverse results, pediatric programs now transplant adult kidneys into small children once the recipient has reached a sufficient size, typically 6.5 to 10.0 kg of body weight. An infant’s peritoneal cavity has enough space to accommodate an adult kidney without compressing the allograft. However, the youngest pediatric recipients have an allograft-size mismatch that leads to a high glomerular filtration rate and makes interpretation of serum creatinine results more difficult, since acute rejection may initially occur without an elevation of the serum creatinine level. Kidneys from deceased donors who were very small children are no longer allocated to small children but are, in fact, now transplanted en bloc (both kidneys together, attached to a single segment of the aorta and vena cava) into adults with excellent results.

Morning Report Questions

Q: What concerns are there about linear growth in children who have chronic kidney failure and receive a kidney transplant?

A: Children are in a state of active growth. Chronic kidney failure can lead to severe growth failure, often with associated loss of self-esteem. Children with kidney failure were once approximately 2.5 SD below the expected height for their age at the time of transplantation. Improved nutrition before transplantation and aggressive use of recombinant human growth hormone have reduced, although not eliminated, this height deficit. Renal transplantation generally improves linear growth but does not completely restore it. The greatest recovery in growth is seen in the youngest children, and the least is seen in adolescents. The use of glucocorticoid withdrawal or avoidance protocols and the administration of growth hormone after transplantation may further improve growth recovery.

Q: What have clinical trials shown about pediatric immunosuppression, and how do medications differ from those given to adults?

A: Trials have shown that high doses of immunosuppressive drugs to compensate for glucocorticoid withdrawal can lead to unacceptable rates of post-transplantation lymphoproliferative disorder (PTLD), that glucocorticoid avoidance is not immunologically detrimental, although it does not ameliorate chronic histologic damage, and that tacrolimus is associated with a significantly lower rate of acute rejection at 6 months than is cyclosporine. Pharmacokinetic studies showed that cyclosporine has a shorter half-life in children than in adults and requires dosing three times daily. Similarly, sirolimus has a shorter half-life in children than in adults and often requires twice-daily dosing. The area under the curve of dose-normalized mycophenolic acid is higher in children than has been commonly observed in adults. In glucocorticoid-free protocols as compared with protocols that include glucocorticoids, the use of mycophenolate mofetil is associated with more frequent and severe leukopenia, anemia, and gastrointestinal disturbances. B-cell depletion with the lytic chimeric mouse-human anti-CD20 antibody rituximab is used increasingly in pediatric kidney-transplant recipients. Data have shown full recovery of the B-cell pool 15 months after rituximab treatment in children, whereas recovery began only at 24 months in adults and was never complete.

Novel Ways to Detect Creutzfeldt-Jakob Disease?

Posted by Daniela Lamas • August 6th, 2014

Anyone who has eaten a burger has had a sneaking fear, however irrational: Could I have been exposed to mad cow? And how could I possibly find out before symptoms of the disease take hold?

As it is, the only way to confirm the diagnosis of this fatal disease is by direct examination of brain tissue, obtained by brain biopsy or at autopsy.  But two studies in this week’s issue of NEJM report preliminary data for novel ways to detect the abnormally folded prion protein in the urine and nasal brushings of affected patients.  They offer hope for noninvasive methods to diagnose both variant Creutzfeldt-Jakob disease (vCJD), the so-called “mad cow disease,” and the sporadic form of CJD.

In one study, Christina Orrú and colleagues investigated the accuracy of an amplification technique called “real-time quaking-induced conversion” for diagnosing sporadic CJD in living patients. This assay works by causing the diseased prion protein to react with recombinant prion protein to form amyloid, which is detected with a dye. The technique has shown promise for detecting CJD in cerebrospinal fluid samples, with a sensitivity of 80 to 90 percent. At the same time, the abnormal prion protein is known to accumulate in the olfactory epithelium of patients with CJD, leading investigators to ask whether that might be an easier-to-reach diagnostic site.

With that background, the authors enrolled 31 patients with rapidly progressive dementia due to definite or probable CJD and 43 patients with other neurodegenerative disorders. The investigators took nasal brushing samples and CSF samples from these patients and found that the sensitivity to detect CJD was 97 percent in nasal brushings, compared to 77 percent in CSF samples. Both tests were highly specific for CJD.

And olfactory mucosa isn’t the only unexpected place to find the misfolded prion proteins, an accompanying paper in this week’s NEJM suggests. Fabio Moda and colleagues took urine samples from 14 patients with variant CJD and more than 200 patients with other neurologic diseases, including the sporadic form of CJD. Using a technique that allows amplification of “minute” samples of prion protein, the authors detected abnormal prion proteins in 13 of the 14 patients with variant CJD and in none of the patients with other diseases.

Although questions remain (for instance, how accurate these tests will prove to be in making the diagnosis, and how early in the disease process the prion protein can be detected), both studies suggest promise for new ways to diagnose both variant and sporadic CJD from affected patients’ olfactory mucosa and urine.  These findings could point the way toward further research into a noninvasive way to establish a definitive diagnosis of this devastating disease.

For more on this topic, see the accompanying editorial from the University of Melbourne’s Dr. Colin Masters.

Brain Abscess

Posted by Sara Fazio • August 1st, 2014

Despite advances in diagnostic techniques and treatment, brain abscess remains a challenging clinical problem with substantial case fatality rates. Delays in diagnosis and treatment can result in a poor outcome. A new review summarizes current approaches to effective treatment.

Despite advances in imaging techniques, laboratory diagnostics, surgical interventions, and antimicrobial treatment, brain abscess remains a challenging clinical problem with substantial case fatality rates. Brain abscess can be caused by bacteria, mycobacteria, fungi, or parasites (protozoa and helminths), and the reported incidence ranges from 0.4 to 0.9 cases per 100,000 population. Rates are increased in immunosuppressed patients.

Clinical Pearls

What is the epidemiology of brain abscesses?

Severe immunocompromise, resulting from immunosuppressive therapy in patients who have undergone solid-organ or hematopoietic stem-cell transplantation or from HIV infection, is often associated with tuberculosis or nonbacterial causes of infection, such as fungi or parasites. HIV infection is associated with brain abscess caused by Toxoplasma gondii, but it also predisposes patients to infection with Mycobacterium tuberculosis. Patients who have received solid-organ transplants are at risk not only for nocardial brain abscess but also for fungal abscess (e.g., resulting from infection by aspergillus or candida species).

Abscess formation may also occur after neurosurgical procedures or head trauma. In these cases, infection is often caused by skin-colonizing bacteria, such as Staphylococcus aureus and S. epidermidis, or by gram-negative bacilli. Brain abscess due to contiguous spread from parameningeal foci of infection (e.g., the middle ears, mastoids, and sinuses) is frequently caused by streptococcus species, but staphylococcal and polymicrobial abscesses (including those caused by anaerobes and gram-negative bacilli) also occur. Staphylococcus and streptococcus species are often identified in brain abscesses after hematogenous spread, and the microbial flora of brain abscesses resulting from paranasal sinus or dental infection are often polymicrobial.

What is the typical presentation of a patient with a brain abscess?

The most frequent clinical manifestation of brain abscess is headache; fever and altered level of consciousness are frequently absent. Neurologic signs depend on the site of the abscess and can be subtle for days to weeks. Behavioral changes may occur in patients with abscesses in the frontal or right temporal lobes. Patients with abscesses in the brain stem or cerebellum may present with cranial-nerve palsy, gait disorder, or either headache or altered mental status owing to hydrocephalus. Up to 25% of patients present with seizures.

Morning Report Questions

Q: What are the most appropriate diagnostic tools in cases of suspected brain abscess?

A: Cranial imaging should be performed in all patients with suspected brain abscess. Computed tomographic (CT) scanning with contrast enhancement provides a rapid means of detecting the size, number, and localization of abscesses. Magnetic resonance imaging (MRI), combined with diffusion-weighted and apparent-diffusion-coefficient images, is a valuable diagnostic tool for differentiating brain abscess from primary, cystic, or necrotic tumors. One prospective study involving 115 patients with 147 cystic brain lesions, including 97 patients with brain abscess, showed that diffusion-weighted imaging had a sensitivity and a specificity of 96% for the differentiation of brain abscesses from primary or metastatic cancers (positive predictive value, 98%; negative predictive value, 92%). Cultures of blood and cerebrospinal fluid identify the causative pathogen in approximately one quarter of patients. Cultures of cerebrospinal fluid may be valuable in patients with coexisting meningitis. However, the risk of brain herniation must be considered in these patients.
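Positive and negative predictive values depend not only on sensitivity and specificity but also on how common the disease is in the population being tested, via Bayes’ rule. A minimal sketch of that relationship, assuming for illustration that roughly 97 of the 147 lesions were abscesses (the study reports patient-level, not lesion-level, counts):

```python
def predictive_values(sens: float, spec: float, prevalence: float) -> tuple[float, float]:
    """Derive PPV and NPV from sensitivity, specificity, and pretest prevalence."""
    ppv = sens * prevalence / (sens * prevalence + (1 - spec) * (1 - prevalence))
    npv = spec * (1 - prevalence) / (spec * (1 - prevalence) + (1 - sens) * prevalence)
    return ppv, npv

# Study figures: diffusion-weighted imaging with 96% sensitivity and 96% specificity
ppv, npv = predictive_values(0.96, 0.96, 97 / 147)
print(f"PPV ~ {ppv:.0%}, NPV ~ {npv:.0%}")
```

With that assumed prevalence, the derived values land close to the reported 98% and 92%, a useful sanity check when reading diagnostic-accuracy studies: the same test applied in a lower-prevalence setting would have a noticeably lower PPV.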

Figure 3. Imaging Studies of Brain Abscess. 

Q: How should a brain abscess be managed?

A: Since 27% of brain abscesses are polymicrobial, broad-spectrum antimicrobial therapy is advised until the results of culture of the abscess itself are known or until repeated aerobic and anaerobic cultures from blood or other sites of infection show no other pathogen. An abscess size of more than 2.5 cm in diameter has been recommended as an indication for neurosurgical intervention, but data from comparative studies are lacking, and this size cannot be regarded as a definitive indication for aspiration. Anticonvulsant treatment is not routinely indicated in patients with brain abscess. Focal neurologic deficits may develop in response to abscess growth or surrounding edema. Adjunctive glucocorticoid therapy may reduce cerebral edema and is used in about half of patients with brain abscess. Since data from randomized studies are lacking and glucocorticoids may reduce passage of antimicrobial agents into the central nervous system, their use should be limited to patients with profound edema that is likely to lead to cerebral herniation.