VIDEO: Withdrawing antipsychotics is safe and feasible in long-term care


TORONTO – Antipsychotics can be safely withdrawn from many dementia patients in long-term care facilities, two new studies from Australia and Canada have determined.

When the drugs were withdrawn and supplanted with behavior-centered care in the Australian study, 80% of patients experienced no relapse of symptoms, Henry Brodaty, MD, DSc, said at the Alzheimer’s Association International Conference 2016.


Dr. Henry Brodaty

“We saw no significant changes at all in agitation, aggression, delusions, or hallucinations,” Dr. Brodaty, the Scientia Professor of Ageing and Mental Health, University of New South Wales, Australia, said in an interview. “Were we surprised at this? No. Because for the majority of these patients, the medications were inappropriately prescribed.”

The 12-month Australian study is still in the process of tracking outcomes after antipsychotic withdrawal. But the Canadian study found great benefits, said Selma Didic, an improvement analyst with the Canadian Foundation for Healthcare Improvement in Ottawa. “We saw falls decrease by 20%. The incidence of verbal abuse and socially disruptive behavior actually decreased as well.”

In fact, she said, patients who discontinued the medications actually started behaving better than the comparator group that stayed on them.

The Australian experience

Dr. Brodaty discussed HALT (Halting Antipsychotic Use in Long-Term Care), a single-arm, 12-month longitudinal study carried out in 23 nursing homes in New South Wales.

The study team worked with nursing leadership at each facility to identify patients who might be eligible for the program. To enroll, each patient’s family and general physician had to agree to a trial of deprescribing. Physicians were instructed to wean patients off the medication by halving the dose once a week. Most patients were able to stop within a couple of weeks, Dr. Brodaty said.
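
As a back-of-the-envelope illustration of that weekly halving schedule, here is a minimal sketch; the drug, starting dose, and stopping threshold are hypothetical examples, not HALT protocol values, and real deprescribing is directed by the prescriber:

```python
# Illustrative sketch only: a weekly dose-halving taper like the one described
# for HALT. The starting dose and the minimum practical dose are invented
# examples, not values from the study.

def halving_taper(start_dose_mg: float, min_dose_mg: float) -> list[float]:
    """Return the weekly doses, halving each week until below a practical minimum."""
    doses = []
    dose = start_dose_mg
    while dose >= min_dose_mg:
        doses.append(dose)
        dose /= 2.0
    return doses

# Hypothetical example: risperidone 1.0 mg/day, stopping once below 0.25 mg/day.
print(halving_taper(1.0, 0.25))  # [1.0, 0.5, 0.25] -> off the drug by week 4
```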

Getting buy-in wasn’t always easy, he noted. “Some families didn’t want to rock the boat, and some physicians were resistant” to the idea. Overall, “families and nurses were very, very worried” about the prospect of dropping drugs that were seen as helpful in everyday patient management.

But getting rid of the medications was just half the picture. Training nurses and care staff to intervene in problematic behaviors without resorting to drugs was just as important. A nurse-leader at each facility received training in person-centered care, and then trained the rest of the staff. This wasn’t always an easy idea to embrace, either, Dr. Brodaty said, especially since nursing staff often leads the discussion about the need for drugs to manage behavioral problems.

“Nursing staff are very task oriented, focused on dressing, bathing, eating, and toileting. They work very hard, and they don’t always have time to sit down and talk to resistant patients. It takes a much different attitude to show that you can actually save time by spending time and engaging the patient.”

He related one of his favorite illustrative stories – the milkman who caused a ruckus at bath time. “He got upset and aggressive every night when being put to bed and every morning when being given a shower. The staff spoke to his wife about it. She said that for 40 years, he was accustomed to getting up at 4 a.m. to deliver the milk. He would take a bath at night and get on his track suit and go to bed. Then at 4 a.m., he would get up and be ready to jump in the truck and go.”

When the staff started letting him shower at night and go to bed in his track suit, the milkman’s behavior improved without the need for antipsychotic medications.

“This is what we mean by ‘person-centered care,’ ” Dr. Brodaty said. “We use the ABC paradigm: Addressing the antecedent to the behavior, then the behavior, and then the consequences of the behavior.”

The intervention cohort comprised 139 patients with a mean age of 85 years; most were women. The vast majority (93%) had a diagnosis of dementia. About one-third had Alzheimer’s disease and one-third had vascular dementia. The remainder had other diagnoses, including frontotemporal dementia, Lewy body dementia, and Parkinson’s disease. Common comorbid conditions included depression (56%) and previous stroke (36%). None of the patients had a diagnosis of psychosis.

Risperidone was the most common antipsychotic medication (85%). Other medications were olanzapine, quetiapine, and haloperidol. About 30% had come to the facility on the medication; the rest had been started on it after admission.

Despite the national recommendation to review antipsychotic use every 12 weeks, patients had been on their current antipsychotic for an average of 2 years, and on their current dose for 1 year. In reviewing medications, Dr. Brodaty also found a “concerning” lack of informed consent. In Australia, informed consent for antipsychotic drugs can be given by a family member, but 84% of patients had no documented consent at all.


Of the original group, 125 entered the deprescribing protocol. Of these, 26 (21%) have since resumed their medications, but the remaining 79% have done well, without relapse of their symptoms or problematic behaviors. An ongoing medication review suggests there has been no concomitant upswing in other psychotropic medications, including benzodiazepines.

Neuropsychiatric symptoms remained stable from baseline. The mean total group score on the Neuropsychiatric Inventory (NPI) has not changed from its baseline of 30. The mean NPI agitation/aggression subscale score has remained about 6, and the mean group score on the Cohen-Mansfield Agitation Inventory about 56. The NPI delusions subscale increased and the hallucinations subscale decreased slightly, but neither change was significant, Dr. Brodaty said.

“Look, we all know antipsychotics are bad for old people, and we all know they are overprescribed,” he said. “Inappropriate use of these medications is an old story, yet we’re still talking about it. Why is this? We have the knowledge now, and we have to build on this knowledge so that we can change practice.”

The Canadian experience

Ms. Didic described a year-long quality improvement initiative at 24 long-term care facilities that sought to improve antipsychotic prescribing for their dementia patients.

The program, which was sponsored by the Canadian Foundation for Healthcare Improvement, used a “train-the-trainer” approach to spread support for antipsychotic deprescribing.


Selma Didic

The foundation deployed 15 interdisciplinary teams, which comprised 180 members, including physicians, nurses, pharmacists, recreational therapists, and “clinical champions” who took the methodology directly into participating facilities. Interactive webinars on patient-centered care and deprescribing protocols were part of the process, Ms. Didic said.

In all, 416 patients were included in the outcomes report. Within 12 months, antipsychotics were discontinued entirely in 74 patients (18%), and the dosage was reduced in another 148 (36%).

The benefits of these changes were striking, Ms. Didic said. There were fewer falls and reductions in verbal abuse, care resistance, and socially inappropriate behaviors. These issues either remained the same or got worse in patients who did not decrease antipsychotics. Again, there was no concomitant increase in other psychotropic medications.

The results show that changing the focus from medication-first to behavior-first care is institutionally feasible, Ms. Didic said.

Staff members’ assessments of the program and its personal and institutional impact were positive:

• 91% said they instituted regular medication reviews for every resident.

• 92% said old ways of doing things were adjusted to accommodate the new type of care.

• 94% said the new person-centered care was now a standard way of working.

• 84% said the project improved their ability to lead.

• 80% said it improved their ability to communicate.

“Currently, our teams are now spreading and sharing these resources and tools, serving as advisers, and organizing clinical training and workshops” for other Canadian nursing homes that want to adopt the strategy, she said.

Dr. Richard Caselli, professor of neurology at the Mayo Clinic, Scottsdale, Ariz., commented on the issues surrounding antipsychotic prescribing in long-term care facilities in a video interview.

Neither Ms. Didic nor Dr. Brodaty had any financial declarations.

The video associated with this article is no longer available on this site. Please view all of our videos on the MDedge YouTube channel.

msullivan@frontlinemedcom.com

On Twitter @alz_gal


VIDEO: Smell test reflects brain pathologies, risk of Alzheimer’s progression


TORONTO – A scratch-and-sniff test that asks subjects to identify 40 odors and ranks olfaction is almost as powerful a predictor of Alzheimer’s disease as is a positive test for amyloid.

Low scores on the University of Pennsylvania Smell Identification Test (UPSIT) are linked to a thinning of the entorhinal cortex – the brain region where amyloid plaques are thought to first appear as Alzheimer’s disease takes hold, Seonjoo Lee, PhD, reported at the Alzheimer’s Association International Conference.

“The findings indirectly suggest that impairment in odor identification may precede thinning in the entorhinal cortex in the early clinical stage of AD,” she concluded.

According to William Kreisl, MD, poor UPSIT scores are also related to brain levels of amyloid beta and are almost as predictive of cognitive decline.

These sensory changes appear to be among the earliest manifestations of Alzheimer’s disease, the researchers said at the Alzheimer’s Association International Conference. Their studies also suggest that the test has a place in the clinic as an easy and inexpensive screening tool for patients with memory complaints, Dr. Kreisl said.

The amyloid biomarker tests currently available are not suitable for wide dissemination. Amyloid brain scans are currently investigational; they are also invasive, expensive, and not covered by Medicare or any private insurance. Lumbar punctures are also invasive and expensive, and almost universally disliked by patients. Additionally, there is little consensus on how to interpret CSF amyloid levels.

“We need easy, noninvasive biomarkers that can be deployed in the clinic for patients who are concerned about their risk of memory decline,” said Dr. Kreisl of Columbia University Medical Center, New York. “Odor identification testing may prove to be a useful tool in helping physicians counsel patients who are concerned about this.”

His study concluded that the UPSIT predicted Alzheimer’s disease almost as well as invasive amyloid biomarkers. The scratch-and-sniff test asks subjects to identify 40 odors and ranks olfaction as normal or as mildly, moderately, or severely impaired.

Dr. Kreisl examined the relationship between UPSIT and brain amyloid beta in 84 subjects, 58 of whom had mild cognitive impairment (MCI) at baseline. All of these subjects had either an amyloid brain scan or a lumbar puncture to measure amyloid in cerebrospinal fluid. They were followed for at least 6 months.

At follow-up, 67% of participants showed cognitive decline. After correcting for age, gender, and education, patients who were amyloid positive on imaging or in CSF were more than seven times as likely to have experienced cognitive decline (odds ratio, 7.3). Overall, UPSIT score alone didn’t predict cognitive decline, Dr. Kreisl said. However, when scores were dichotomized, patients scoring less than 35 on the 40-item test were four times more likely to show cognitive decline than were those scoring 35 or higher (OR, 4.0).

In fact, these low UPSIT scores were much more common among amyloid-positive patients. Of the 38 patients who were positive for amyloid beta on either diagnostic test, 32 had a UPSIT score of less than 35, while 6 had a score of 35 or higher. Among the 46 amyloid-negative patients, 28 had low UPSIT scores and 18 had normal scores.
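
For readers who want to see how such counts translate into an odds ratio, here is a minimal sketch. Note that the crude figure from this 2x2 table (about 3.4) differs from the OR of 4.0 quoted above, which was adjusted for age, gender, and education:

```python
# Crude (unadjusted) odds ratio from the 2x2 counts reported above. The ORs
# quoted in the story were adjusted for age, gender, and education, so they
# differ from this raw calculation.
from scipy.stats import fisher_exact

#        UPSIT < 35   UPSIT >= 35
table = [[32, 6],    # amyloid positive (n = 38)
         [28, 18]]   # amyloid negative (n = 46)

odds_ratio, p_value = fisher_exact(table)
print(f"crude OR = {odds_ratio:.2f}, Fisher exact p = {p_value:.3f}")
# crude OR = 3.43: in odds terms, low scores were roughly 3-4 times as common
# among the amyloid-positive patients in this sample.
```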

Combining amyloid status and UPSIT score in a single predictive model didn’t increase accuracy above that of either variable alone, which suggests that olfactory dysfunction is not driven entirely by amyloid brain pathology.

“This makes sense because other factors like neurofibrillary tangle burden and other neurodegeneration are also involved in influencing how the UPSIT score predicts memory decline,” he said.

In a separate study, Dr. Lee examined the relationship of UPSIT performance to entorhinal cortical thickness in 397 cognitively normal subjects enrolled in the Washington Heights-Inwood Columbia Aging Project. These subjects took the UPSIT and had magnetic resonance brain imaging at baseline and at 4 years’ follow-up. Over that time, 50 transitioned to dementia, 49 of whom were diagnosed with Alzheimer’s disease. Another 79 subjects experienced cognitive decline, defined as a drop of at least one standard deviation in the average of three cognitive composite scores covering the memory, language, and visuospatial domains.
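
A minimal sketch of that decline criterion, using invented numbers (the study’s actual standardization may differ):

```python
# Invented-data sketch of the decline definition above: a drop of at least one
# standard deviation in the average of three domain composites. The exact
# standardization used in the study may differ.
import numpy as np

def declined(baseline: np.ndarray, followup: np.ndarray, cohort_sd: float = 1.0) -> bool:
    """True if the mean composite score fell by at least one cohort SD."""
    return (baseline.mean() - followup.mean()) >= cohort_sd

baseline = np.array([0.10, -0.20, 0.05])   # memory, language, visuospatial (z scores)
followup = np.array([-1.10, -1.30, -0.90])
print(declined(baseline, followup))  # True: the mean fell from -0.02 to -1.10
```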

In comparing the groups with and without dementia, Dr. Lee found significant differences in the follow-up UPSIT score (23 vs. 27) and entorhinal cortical thickness (2.9 vs. 3.1 mm).

A decrease of one standard deviation in UPSIT score was associated with a significant 47% increase in the risk of dementia, while a one-standard-deviation decrease in entorhinal cortical thickness was associated with a 22% increase in risk. However, Dr. Lee said, the interaction of entorhinal thickness and UPSIT score was significant only in the subjects who transitioned to dementia.


Dr. Richard Caselli, professor of neurology at the Mayo Clinic, Scottsdale, Ariz., commented on the limited clinical utility of the UPSIT in a video interview.

Neither Dr. Lee nor Dr. Kreisl had any financial declarations.

The video associated with this article is no longer available on this site. Please view all of our videos on the MDedge YouTube channel.

msullivan@frontlinemedcom.com

On Twitter @alz_gal


H. pylori’s relationship with gastric cancer? It’s complicated


CHICAGO – Does eradicating Helicobacter pylori prevent gastric cancer?

The answer is yes, sometimes – but it depends on where you live and what other bacteria coexist in your gut microbiome.

Dr. Richard M. Peek Jr.

The overall view is a positive one, Richard M. Peek Jr., MD, said at the meeting sponsored by the American Gastroenterological Association. A very large, recent meta-analysis confirms it (Gastroenterology. 2016. doi:10.1053/j.gastro.2016.01.028). Comprising 24 studies and 48,000 subjects, the meta-analysis determined that eradicating the bacterium in infected people significantly cut the incidence of gastric cancer.

That’s great news – but there’s a big caveat, said Dr. Peek of Vanderbilt University, Nashville. “The benefit was dependent on what your baseline risk was. For those with a high baseline risk, the benefit was tremendous. For those with a low baseline risk, it was not statistically significant.”

There are long-term data suggesting that treating H. pylori sooner rather than later is the way to go. A 2005 study followed more than 700 patients with preneoplastic gastric lesions for 12 years. It found that the treatment effect was cumulative: The longer the patient was free of H. pylori, the more reliably healing occurred (Gut. 2005. doi:10.1136/gut.2005.072009).

At baseline, the patients were randomized to nutritional supplements or to a combination of amoxicillin, metronidazole, and bismuth subsalicylate. At 6 years, the trial was unblinded and all patients were offered treatment. Patients were followed for another 6 years. Those who were H. pylori negative at 12 years had 15% more regression and 14% less progression than subjects who were positive at 12 years. Among those who received anti–H. pylori treatment at the 6-year mark, the effect was smaller and nonsignificant.

Perhaps surprisingly, though, the biggest benefit of H. pylori treatment is seen in the antrum of the stomach, not in the corpus. Another meta-analysis, this one of 16 studies, found very consistent reductions in the severity of intestinal metaplasia in the antrum after antibiotic treatment – but no difference at all in corpus metaplasia. The reason for that finding isn’t at all clear, the authors of that paper noted (World J Gastroenterol. 2014. doi:10.3748/wjg.v20.i19.5903).

The bacteria–metaplasia–cancer link gets even more complicated when H. pylori is viewed as a contributing member of society, rather than a hermit. The bacterium seems to be a bully in the neighborhood, radically altering the normal gastric microbiome, Dr. Peek said.

In the absence of H. pylori, the gastric microbiome is much more diverse, consisting of about 50% Actinobacteria and 25% Firmicutes. Bacteroidetes and Proteobacteria make up most of the remainder, with a small population of Cyanobacteria as well. In its presence, Proteobacteria – a gram-negative phylum that includes a wide variety of pathogens – almost completely displace the beneficial bacteria.

Researchers saw this change in action in 2011, when a group at the Massachusetts Institute of Technology, Cambridge, inoculated two mouse populations with H. pylori and followed them for gastric neoplasms (Gastroenterology. 2011. doi:10.1053/j.gastro.2010.09.048). All the mice were genetically engineered to overexpress human gastrin, a characteristic that invariably leads them to develop gastric cancers.

One group comprised germ-free mice raised in sterile environments. The control group was free of pathogens, but lived in a conventional environment and so had normal gastric flora. Both groups were inoculated with H. pylori.

By 11 months, the microbiome of the control group was strikingly different. It showed a significant increase in the number of Firmicutes bacteria in the stomach, with an associated decrease in the number and variety of other bacteria, including Bacteroidetes. This was especially interesting when viewed in relation to the rate of gastric neoplasia, Dr. Peek said.

These mice are programmed to develop gastric cancer by 6 months of age – and that is what happened in the control mice, which had H. pylori plus other gastric microbes. But the germ-free mice that were monoinfected with H. pylori showed a much different progression of disease. At 7 months, most showed only mild hypergastrinemia. Conversely, by 7 months, all of the H. pylori–infected control mice had developed gastric intraepithelial neoplasia, 80% of it high grade. Only 10% of the monoinfected mice developed cancer, and all of it was low grade.

“It looks like there is active collaboration between H. pylori and other bacteria in the stomach,” resulting in this increased cancer risk, Dr. Peek said.

It’s a collaboration that reaches deep into the tumors themselves, he said. “A very interesting study a couple of years ago searched cancer genomes for the presence of bacterial DNA, and found that gastric cancers incorporated the second-highest amount of microbial DNA into their cancer genomes. But it wasn’t just H. pylori. Many other species had integrated their DNA into these tumors.”


That study, published in 2013, was the first to show that bacterial DNA integrates into cancer genomes and may affect carcinogenesis. Acute myeloid leukemia showed the highest integration of bacterial DNA, but gastric adenocarcinoma was a close second. Most of the integrated sequences came from Proteobacteria lineages (83%), with a third of those represented by Pseudomonas, particularly P. fluorescens and P. aeruginosa. Both of those species have been shown to promote gastric tumorigenesis in rats. All of the DNA integrations occurred in five genes, four of which are already known to be upregulated in gastric cancer (PLoS Comput Biol. 2013;9[6]:e1003107).

Interestingly, only a few of the sample reads showed DNA integration from H. pylori itself.

This reduction in gastric microbial diversity could be an important key to H. pylori’s relation to gastric cancer, Dr. Peek said. He examined this in residents of two towns in Colombia, South America: Tumaco, where the risk of gastric cancer is low, and Tuquerres, where it’s 25 times higher (Sci Rep. 2016. doi:10.1038/srep18594).

What was different was the gastric microbiome of the residents. Those living in low-risk Tumaco had much more microbial diversity: 361 bacterial groups, compared with 194 in Tuquerres. And 16 of these groups – representative of what’s usually considered a healthy microbiome – were absent in the high-risk subjects. But Tuquerres residents had two bacteria that weren’t found in Tumaco residents, including Leptotrichia wadei, which has been associated with necrotizing enterocolitis.

There was no difference, however, in the prevalence of H. pylori between these high- and low-risk groups.
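
To make the diversity comparison above concrete, here is a hypothetical sketch of two standard ways such differences are quantified: richness, a simple count of taxa (as in the 361 vs. 194 figures), and the Shannon index, which also weights evenness. The abundance vectors are invented, not data from the Colombian study:

```python
# Hypothetical sketch: two common ways to quantify microbiome diversity.
# Richness is a simple count of distinct taxa; the Shannon index also rewards
# an even spread of abundance. These abundances are invented examples.
import math

def richness(abundances: list[int]) -> int:
    return sum(1 for a in abundances if a > 0)

def shannon(abundances: list[int]) -> float:
    total = sum(abundances)
    return -sum((a / total) * math.log(a / total) for a in abundances if a > 0)

low_risk  = [40, 30, 15, 10, 5]   # more taxa, more evenly spread
high_risk = [85, 10, 5]           # fewer taxa, one dominant organism

print(richness(low_risk), round(shannon(low_risk), 2))    # 5 1.39
print(richness(high_risk), round(shannon(high_risk), 2))  # 3 0.52
```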

These new findings illustrate an increasingly complicated interplay of bacteria and gastric cancer, Dr. Peek said. But they also provide a new direction for research.

“We have a framework now where we can move forward and try to understand how some of these other strains impact gastric cancer risk,” he said.

Dr. Peek had no relevant financial disclosures.

msullivan@frontlinemedcom.com

On Twitter @alz_gal

References

Meeting/Event
Author and Disclosure Information

Publications
Topics
Legacy Keywords
Freston Conference 2016, gastric cancer, H. pylori, DNA
Sections
Author and Disclosure Information

Author and Disclosure Information

Meeting/Event
Meeting/Event

CHICAGO – Does eradicating Helicobacter pylori prevent gastric cancer?

The answer is yes, sometimes, but it depends on where you live, and what other bacteria coexist in your gut microbiome.

Dr. Richard M. Peek Jr.

The overall view is a positive one, Richard M. Peek Jr., MD, said at the at the meeting sponsored by the American Gastroenterological Association. A very large, recent meta-analysis confirms it (Gastroenterology. 2016. doi:10.1053/j.gastro.2016.01.028). Comprising 24 studies and 48,000 subjects, the meta-analysis determined that eradicating the bacteria in infected people cut gastric cancer incidence significantly.

That’s great news – but there’s a big caveat, said Dr. Peek of Vanderbilt University, Nashville. “The benefit was dependent on what your baseline risk was. For those with a high baseline risk, the benefit was tremendous. For those with a low baseline risk, it was not statistically significant.”

There are long-term data suggesting that treating H. pylori sooner rather than later is the way to go. A 2005 study followed more than 700 patients with preneoplastic gastric lesions for 12 years. It found that the treatment effect was cumulative: The longer the patient was free of H. pylori, the more reliably healing occurred (Gut. 2005. doi:10.1136/gut.2005.072009).

At baseline, the patients were randomized to nutritional supplements or to a combination of amoxicillin, metronidazole, and bismuth subsalicylate. At 6 years, the trial was unblinded and all patients were offered treatment. Patients were followed for another 6 years. Those who were H. pylori negative at 12 years had 15% more regression and 14% less progression than subjects who were positive at 12 years. Among those who received anti–H. pylori treatment at the 6-year mark, the effect was smaller and nonsignificant.


CHICAGO – Does eradicating Helicobacter pylori prevent gastric cancer?

The answer is yes, sometimes, but it depends on where you live, and what other bacteria coexist in your gut microbiome.

Dr. Richard M. Peek Jr.

The overall view is a positive one, Richard M. Peek Jr., MD, said at the meeting sponsored by the American Gastroenterological Association. A very large, recent meta-analysis confirms it (Gastroenterology. 2016. doi:10.1053/j.gastro.2016.01.028). Comprising 24 studies and 48,000 subjects, the meta-analysis determined that eradicating the bacteria in infected people significantly cut the incidence of gastric cancer.

That’s great news – but there’s a big caveat, said Dr. Peek of Vanderbilt University, Nashville. “The benefit was dependent on what your baseline risk was. For those with a high baseline risk, the benefit was tremendous. For those with a low baseline risk, it was not statistically significant.”
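To see why the benefit tracks baseline risk, it helps to write out the standard absolute-risk arithmetic; the numbers below are illustrative assumptions, not figures from the meta-analysis. If eradication multiplies an infected person’s baseline gastric cancer risk $p_0$ by a relative risk $RR$, then

$$\mathrm{ARR} = p_0\,(1 - RR), \qquad \mathrm{NNT} = \frac{1}{\mathrm{ARR}}.$$

With a hypothetical $RR = 0.6$, a high-risk population with $p_0 = 2\%$ gets an absolute risk reduction of 0.8% (125 people treated per cancer prevented), while a low-risk population with $p_0 = 0.2\%$ gets only 0.08% (1,250 treated per cancer prevented) – a benefit small enough to disappear into statistical noise.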

There are long-term data suggesting that treating H. pylori sooner rather than later is the way to go. A 2005 study followed more than 700 patients with preneoplastic gastric lesions for 12 years. It found that the treatment effect was cumulative: The longer the patient was free of H. pylori, the more reliably healing occurred (Gut. 2005. doi:10.1136/gut.2005.072009).

At baseline, the patients were randomized to nutritional supplements or to a combination of amoxicillin, metronidazole, and bismuth subsalicylate. At 6 years, the trial was unblinded and all patients were offered treatment. Patients were followed for another 6 years. Those who were H. pylori negative at 12 years had 15% more regression and 14% less progression than subjects who were positive at 12 years. Among those who received anti–H. pylori treatment at the 6-year mark, the effect was smaller and nonsignificant.

Perhaps surprisingly, though, the biggest bang for the buck from H. pylori treatment is seen in the antrum of the stomach, not in the corpus. Another meta-analysis, this one of 16 studies, found very consistent reductions in the severity of intestinal metaplasia in the antrum after antibiotic treatment – but no difference at all in corpus metaplasia. The reason for that finding isn’t at all clear, the authors of that paper noted (World J Gastroenterol. 2014. doi:10.3748/wjg.v20.i19.5903).

The bacteria–metaplasia–cancer link gets even more complicated when H. pylori is viewed as a contributing member of society, rather than a hermit. The bacterium seems to be a bully in the neighborhood, radically altering the normal gastric microbiome, Dr. Peek said.

In the absence of H. pylori, the gastric microbiome is much more diverse, consisting of about 50% Actinobacteria and 25% Firmicutes species. Bacteroidetes and Proteobacteria species make up most of the remainder, with a small population of Cyanobacteria as well. In its presence, Proteobacteria – a gram-negative phylum that includes a wide variety of pathogens – almost completely crowd out the beneficial bacteria.

Researchers saw this change in action in 2011, when a group at the Massachusetts Institute of Technology, Cambridge, inoculated two mouse populations with H. pylori and followed them for gastric neoplasms (Gastroenterology. 2011. doi:10.1053/j.gastro.2010.09.048). All the mice were genetically engineered to overexpress human gastrin, a characteristic that invariably leads them to develop gastric cancers.

One group comprised germ-free mice raised in sterile environments. The control group was free of pathogens, but lived in a conventional environment and so had normal gastric flora. Both groups were inoculated with H. pylori.

By 11 months, the microbiome of the control group was strikingly different. It showed a significant increase in the number of Firmicutes bacteria in the stomach, with an associated decrease in the number and variety of other bacteria including Bacteroides. This was especially interesting when viewed in relation to the rate of gastric neoplasia, Dr. Peek said.

These mice are programmed to develop gastric cancer by 6 months of age – and that is what happened in the control mice, which harbored H. pylori plus other gastric microbes: by 7 months, all of them had developed gastric intraepithelial neoplasia, 80% of it high grade. The germ-free mice that were monoinfected with H. pylori showed a much slower disease course. At 7 months, most showed only mild hypergastrinemia; just 10% had developed cancer, and all of it was low grade.

“It looks like there is active collaboration between H. pylori and other bacteria in the stomach,” resulting in this increased cancer risk, Dr. Peek said.

It’s a collaboration that reaches deep into the tumors themselves, he said. “A very interesting study a couple of years ago searched cancer genomes for the presence of bacterial DNA, and found that gastric cancers incorporated the second-highest amount of microbial DNA into their cancer genomes. But it wasn’t just H. pylori. Many other species had integrated their DNA into these tumors.”

That study, published in 2013, was the first to show on a large scale that bacterial DNA can integrate into human tumor genomes. Acute myeloid leukemia showed the highest integration of bacterial DNA, but gastric adenocarcinoma was a close second. Most of the integrated sequences were of Proteobacteria lineages (83%), with a third of those represented by Pseudomonas, particularly P. fluorescens and P. aeruginosa; both species have been shown to promote gastric tumorigenesis in rats. All of the DNA integrations occurred in five genes, four of which are already known to be upregulated in gastric cancer (PLoS Comput Biol. 2013;9[6]:e1003107).

Interestingly, only a few of the sample reads turned up integrated DNA from H. pylori itself.

This reduction in gastric microbial diversity could be an important key to H. pylori’s relation to gastric cancer, Dr. Peek said. He examined this in residents of two towns in Colombia, South America: Tumaco, where the risk of gastric cancer is low, and Tuquerres, where it’s 25 times higher (Sci Rep. 2016. doi:10.1038/srep18594).

What was different was the gastric microbiome of residents. Those living in low-risk Tumaco had much more microbial diversity: 361 bacterial taxa, compared with 194 in Tuquerres. And 16 of these groups – representative of what’s usually considered a healthy microbiome – were absent in the high-risk subjects. But Tuquerres residents had two bacteria that weren’t found in Tumaco residents, including Leptotrichia wadei, which has been associated with necrotizing enterocolitis.
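Studies in this space typically summarize such differences with taxon richness (the 361-vs.-194 count above) and with diversity indices such as Shannon’s H'. The sketch below – illustrative read counts, not data from the Colombian study – shows how a community dominated by a single organism scores low on both measures:

```python
import math
from collections import Counter

def richness(counts):
    """Number of distinct taxa observed (the '361 vs. 194' style of figure)."""
    return sum(1 for c in counts.values() if c > 0)

def shannon(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over taxon proportions."""
    total = sum(counts.values())
    return -sum((c / total) * math.log(c / total)
                for c in counts.values() if c > 0)

# Hypothetical read counts: an even, diverse community vs. one
# dominated by a single taxon.
diverse = Counter({f"taxon_{i}": 10 for i in range(50)})
dominated = Counter({"dominant": 960, **{f"taxon_{i}": 2 for i in range(20)}})

print(richness(diverse), round(shannon(diverse), 2))      # 50 3.91
print(richness(dominated), round(shannon(dominated), 2))  # 21 0.29
```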

There was no difference, however, in the prevalence of H. pylori between these high- and low-risk groups.

These new findings illustrate an increasingly complicated interplay of bacteria and gastric cancer, Dr. Peek said. But they also provide a new direction for research.

“We have a framework now where we can move forward and try to understand how some of these other strains impact gastric cancer risk,” he said.

Dr. Peek had no relevant financial disclosures.

msullivan@frontlinemedcom.com

On Twitter @Alz_Gal


Gastroesophageal cancers continue to make their mark globally


CHICAGO – Severe intestinal metaplasia can progress to adenocarcinoma in a small number of patients over 10 years, whether it’s in the esophagus or in the stomach.

Studies emerging from around the world find the same patterns and similar rates of progression in both diseases: 2% for esophageal and 3%-5% for gastric cancers over 10 years, no matter where the studies are conducted, Ernst Kuipers, MD, PhD, said at the meeting sponsored by the American Gastroenterological Association.

“It doesn’t matter if you’re in an area with a high rate or a low rate,” of gastroesophageal cancer, said Dr. Kuipers, professor of gastroenterology and hepatology at Erasmus University Medical Center, Rotterdam, the Netherlands. “The risk is about the same.”

On the flip side, global data also confirm that surveillance and treatment mitigate the risks. Following at-risk patients endoscopically means tumors are found earlier. And when a severely dysplastic lesion is removed, the risk of recurrence is very low – less than 1%.

These findings of a relatively constant rate of progression from gastroesophageal dysplasia to adenocarcinoma somewhat contradict the idea that these cancers are mostly a concern in Asian countries and are fading away in Western countries. There has indeed been a dramatic decrease in stomach cancer in the last century – in the early 1900s, Dr. Kuipers said, up to 40% of cancers reported in Germany were gastric. The reasons for the decrease are many: improved diet, improved hygiene, and widespread use of antibiotics are all factors. But the disease does still exist, especially among some ethnic/racial groups.

“The U.S. Surveillance, Epidemiology, and End Results (SEER) database shows that there is still a lot of it out there, and there’s huge disparity within groups, so we have to look at this from a broader perspective.”

Overall, the U.S. rates of esophageal and gastric cancer are about 8 and 10/100,000, respectively. In whites, those rates are about 8 and 9/100,000, but they are much higher in blacks, Asians, Native Americans, and Hispanics, with gastric cancer rates hovering around 14/100,000.

A 2010 meta-analysis found that Barrett’s metaplasia progressed to esophageal adenocarcinoma at a rate of 6.3/1,000 patients per year, but that number in particular came from analysis of tertiary care cohorts (Clin Gastroenterol Hepatol. 2010. doi: 10.1016/j.cgh.2009.10.010).

A 2015 analysis found lower rates of progression – about 2/1,000 patients per year in patients with short-segment Barrett’s, and about 3/1,000 patients per year in long-segment Barrett’s patients who have no dysplasia. “That means if you’re following 300 patients, one of these will convert to cancer every year,” Dr. Kuipers said (Gut. 2015. doi: 10.1136/gutjnl-2013-305506).
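That quote is just the rate arithmetic: with 300 patients followed at 2-3 progressions per 1,000 patient-years,

$$300 \times (0.002\text{–}0.003) = 0.6\text{–}0.9 \approx 1 \text{ cancer per year.}$$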

The risk appears much higher for patients with dysplastic Barrett’s, although the data vary widely. “Some report low progression rates, but some report these patients have a 50% or higher risk of progression within a few years. This variation depends on how selective one is in diagnosing low-grade dysplasia.”

A Dutch nationwide study of 42,200 patients with Barrett’s found that 4% progressed to adenocarcinoma over 10 years, for an annual progression rate of 0.4%. But among the small group of those with low-grade dysplasia, more than 10% had progressed by 10 years – a 1% annual progression rate (Gut. 2010. doi: 10.1136/gut.2009.176701).
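As a consistency check – assuming, as these cohort summaries implicitly do, a roughly constant annual hazard – the 10-year and annual figures match:

$$1-(1-0.004)^{10} \approx 3.9\% \approx 4\%, \qquad 1-(1-0.01)^{10} \approx 9.6\% \approx 10\%.$$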

An Irish national study found strikingly similar results. The progression risk in patients with Barrett’s metaplasia was 1.6/1,000 patients per year overall, but 2.7/1,000 patients per year in those with confirmed intestinal metaplasia (J Natl Cancer Inst. 2011. doi: 10.1093/jnci/djr203).

“So, if we have some idea of progression rate, is there evidence that we could identify and treat these cancers earlier if our patients are under surveillance?” Dr. Kuipers said. “Well the answer is ‘yes.’ ”

He cited a very recent study by his colleagues at Erasmus University (Gut. 2016. doi: 10.1136/gutjnl-2014-308802). Investigators determined that in Barrett’s patients who were followed endoscopically, esophageal cancers were identified at much earlier stages than among the general population; 66% of the neoplasias were identified at the high-grade dysplasia stage, 26% at stage 1. The remainder were stage 2; there were no stage 3 or 4 cancers. In the general population, numbers were reversed: 45% were stage 4 when identified, 25% stage 3, 18% stage 2, and only a few at stages 1 or high-grade dysplasia.

Gastric cancer shows the same consistency of incidence and relation to baseline premalignant severity. A Dutch study of 98,000 cases found that, within 5 years of diagnosis, the annual incidence of gastric cancer was 0.1% for patients with atrophic gastritis, 0.25% for intestinal metaplasia, 0.6% for mild to moderate dysplasia, and 6% for severe dysplasia (Gastroenterology. 2008. doi: 10.1053/j.gastro.2008.01.071).

A Swedish study last year found a progression rate of 3% over 10 years for patients with extensive intestinal metaplasia (BMJ. 2015. doi: 10.1136/bmj.h3867).

“And this year, from Los Angeles, we saw a study of 4,300 patients with extensive intestinal metaplasia and a very similar progression rate of close to 5% in 10 years (Am J Gastroenterol. 2016. doi: 10.1038/ajg.2016.188). It’s the same findings, everywhere,” he said, adding that a team from Iran presented almost identical numbers for gastric cancer incidence at this year’s Digestive Disease Week.

Again, follow-up improves outcomes. A gastric cancer endoscopy screening program for high-risk people significantly improved survival compared with community-identified patients (80% vs. 60% at 60 months after diagnosis) (J Gastroenterol Hepatol. 2014. doi: 10.1111/jgh.12387).

These data contribute strongly to recent guidelines for managing patients with premalignant stomach conditions. The European Society of Gastrointestinal Endoscopy recommends that those with extensive intestinal metaplasia undergo endoscopy every 3 years (Endoscopy. 2012. doi: 10.1055/s-0031-1291491). Last year, the Kyoto global consensus report on Helicobacter pylori gastritis (Gut. 2015. doi: 10.1136/gutjnl-2015-309252) recommended that patients with extensive gastric atrophy be offered endoscopic surveillance.

There is no place for general population screening, Dr. Kuipers said in an interview. “I would not advocate actual population screening – i.e., offer the general population or risk groups a first screening endoscopy. There is at present no indication [for] this. This is different, however, from surveillance of patients who happen to be diagnosed with advanced intestinal metaplasia of the stomach, or long-segment Barrett’s esophagus, because this allows us to early detect development of neoplasia (high-grade dysplasia and cancer), which then allows for less invasive treatment, and better outcomes.”

He had no financial disclosures.

msullivan@frontlinemedcom.com


GERD – new thinking turns pathology away from acid injury to inflammatory overdrive


CHICAGO – A new model of gastroesophageal reflux disease (GERD) paints it as a disease caused by inflammatory molecules, rather than a reaction to an acid-inflicted wound.

And rather than esophagitis due to GERD being a top-down process, from surface epithelium to submucosa, multiple lines of evidence now suggest it is a bottom-up phenomenon sparked by activation of a hypoxia-inducible factor that occurs when esophageal epithelium is exposed to acidic bile salts, Rhonda Souza, MD, said at the meeting sponsored by the American Gastroenterological Association.

Michele G. Sullivan/Frontline Medical News
Dr. Rhonda Souza and Dr. Stuart Spechler are building a new model of GERD.

“We’re proposing that reflux is a cytokine-mediated injury,” said Dr. Souza of the University of Texas Southwestern Medical Center, Dallas, and the Dallas VA Medical Center. “The reflux of acid and bile doesn’t destroy the epithelial cells directly, but induces them to produce proinflammatory cytokines. These cytokines attract lymphocytes first, which induce the basal cell proliferation characteristic of GERD. Ultimately, it’s these inflammatory cells that mediate the epithelial injury – not the direct caustic effects of gastric acid.”

Dr. Souza and her colleagues, including Dr. Stuart Spechler and Dr. Kerry Dunbar, also of UT Southwestern and the Dallas VA Medical Center, have been building this case for several years, beginning with a surgical rat model of GERD. Their histologic findings in this model have been recapitulated in human cell lines and, most recently, in a clinical trial of 12 patients (JAMA. 2016. doi:10.1001/jama.2016.5657).

The rat model, published in 2009 (Gastroenterology. 2009. doi:10.1053/j.gastro.2009.07.055), provided one of the first looks at the very early pathogenesis of acute GERD.

Rats underwent esophagoduodenostomy, a procedure that left the stomach in place so that both gastric and duodenal contents could reflux into the esophagus, thus ensuring immediate esophageal exposure to acid and bile acids. But the investigators were puzzled as to why it took weeks to see changes in the esophageal surface. “The epithelial mucosa stayed intact for far longer than it should have – up to 3 weeks – if acid simply caused a caustic injury as the mechanism of cell death and replacement,” Dr. Souza said.

What she did see, however, was a rapid migration of T cells into the submucosa. “By postoperative week 3, we observed profound basal cell and papillary hyperplasia, but the surface cells were still intact, so this hyperplasia was not due to the death of surface cells.”

The team proceeded to an in vitro model using esophageal squamous cell lines established from endoscopic biopsies obtained from GERD patients. When the squamous cells were exposed in culture to acidic bile salts, they ramped up their production of several proinflammatory cytokines, including interleukin-8 and interleukin-1b. The proinflammatory cytokines released into the surrounding media were potent recruitment signals for lymphocytes.

The researchers saw this same signaling response in their rat model. “We saw a dramatic increase in IL-8 by postoperative week 2. It was in the intracellular spaces between cells at the epithelial surface and in the cell cytoplasm, and we also saw it in the submucosa and in the lamina propria.”

Acute reflux esophagitis has been almost impossible to observe in humans, Dr. Souza said, because most patients don’t seek medical attention until they’ve had months or years of acid reflux symptoms. By then, the injury response to gastroesophageal reflux has become chronic and well established.

The human study, published in May, confirmed the findings in the rat model. It comprised 12 patients with severe GERD who had been on twice-daily proton pump inhibitor (PPI) therapy for at least 1 month. Successful PPI treatment heals reflux esophagitis rapidly, and healing was endoscopically confirmed at baseline in all these patients. Then they stopped their medication so that the damage would begin again. Dr. Souza and her colleagues could travel back in time, clinically speaking, and track the histopathologic changes as they occurred. Within 2 weeks, esophagitis had reappeared in every patient: Three had the least-severe LA (Los Angeles) grade A, four had LA grade B, and five had LA grade C esophagitis, with extensive mucosal breaks.

“We know from older studies that within 6 months of going off of PPIs, most patients with reflux esophagitis develop it again, but we weren’t sure we would get this response within 2 weeks. It was surprising that not only did everyone get it, but that a few were so severe,” Dr. Souza said.

Biopsies at weeks 1 and 2 showed the same kind of inflammatory signaling seen in the rats. Again, the responding cells were almost exclusively lymphocytes; neutrophils and eosinophils were very rare or absent in all specimens. The team also observed basal cell and papillary hyperplasia and areas of spongiosis, even though the surface cells were still intact.

The lymphocyte-predominant response is the key to this new pathogenic theory, Dr. Souza wrote in her JAMA paper.

“If the traditional notion were true, that acute GERD is caused by refluxed acid directly inflicting lethal, chemical injury to surface epithelial cells, then basal cell and papillary hyperplasia would have been expected only in areas with surface erosions, and the infiltrating inflammatory cells would have been granulocytes primarily.”

She also suggested that PPIs may be healing esophagitis not simply by preventing acid reflux, but by exerting anti-inflammatory properties.

“Cytokines like IL-8 may also have proliferative effects which might have contributed to esophageal basal cell and papillary hyperplasia observed in the absence of surface erosions. In esophageal epithelial cells in culture, PPIs inhibit secretion of IL-8 through acid-independent mechanisms. This observation raises the interesting possibility that anti-inflammatory PPI effects, independent of their effects on acid inhibition, might contribute to GERD healing by PPIs.”

Dr. Souza said she continues to investigate, focusing now on how the initial insult of acidic bile salts on esophageal epithelium stimulates this inflammatory response. The key may be in a small protein called hypoxia-inducible factor-2 alpha (HIF-2a), one of a family of transcription factors that enable cells to respond to hypoxic stress.

Under normal oxygen conditions, HIF levels are kept low by an enzyme called prolyl hydroxylase. This enzyme is inactivated under hypoxic conditions, or in the presence of reactive oxygen species. HIF levels then rise and, among other functions, stimulate a strong inflammatory response. Inflamed tissues like those seen in esophagitis are frequently hypoxic, Dr. Souza said, and this state could be activating HIFs.

She examined HIF levels in her 12-patient cohort. These results were presented earlier this year at the Digestive Disease Week meeting in San Diego.

“At weeks 1 and 2, we found large associations between HIF-2a and increases in a number of proinflammatory cytokines including IL-8 and intercellular adhesion molecule–1,” a protein that facilitates leukocyte migration. Preliminary studies of HIF-2a inhibition in esophageal squamous cells in culture exposed to acidic bile salts show promising results as a potential therapeutic strategy to reduce proinflammatory cytokine expression. It is conceivable that anti-inflammatory therapies directed at HIF-2a may be on the horizon for the prevention and treatment of reflux esophagitis, she added.

Neither Dr. Souza nor her colleagues had any relevant financial disclosures.

msullivan@frontlinemedcom.com

On Twitter @Alz_Gal

Mycobiome much more diverse in children than in adults


The normal fungal communities that inhabit healthy skin are much more diverse in children than adults, a new study has discovered.

That diversity dwindles, however, around puberty, when the lipophilic genus Malassezia surges in abundance. This is probably mediated by the increase in sebaceous gland activity and the change in sebum composition that occur around sexual maturity, Jay-Hyun Jo, PhD, wrote (J Invest Dermatol. 2016 Jul 28; doi: 10.1016/j.jid.2016.05.130).

The diversity of the childhood mycobiome may also play into the higher prevalence of fungal skin diseases in children, wrote Dr. Jo of the National Cancer Institute.

“Several fungal skin infections (dermatophytoses), such as tinea capitis and tinea corporis, are more frequently seen in children. This epidemiological dichotomy in fungal infections may relate to the physiologic characteristics of younger skin, which appears more permissive to colonization by diverse fungi.”

The researchers used the fungal internal transcribed spacer–1 (ITS1) sequence to pinpoint the taxonomic details of the mycobiome of 14 healthy children and 19 healthy adults. They looked at samples from 10 sites on each subject: the external auditory canal, forehead, occiput, retroauricular crease, back, manubrium, antecubital fossa, inguinal crease, volar forearm, and nares.
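The percentages that follow are relative abundances – the fraction of classified ITS1 reads assigned to each taxon at a given site. A minimal sketch of that tally (hypothetical counts, not the study’s data):

```python
# Hypothetical ITS1 read counts per skin site, after taxonomic assignment.
reads = {
    "forehead":      {"Malassezia": 9500, "Aspergillus": 300, "Epicoccum": 200},
    "volar_forearm": {"Malassezia": 4200, "Aspergillus": 3000, "Phoma": 2800},
}

def relative_abundance(site_counts):
    """Convert raw read counts at one site into per-taxon fractions."""
    total = sum(site_counts.values())
    return {taxon: count / total for taxon, count in site_counts.items()}

for site, counts in reads.items():
    fractions = relative_abundance(counts)
    print(site, {taxon: f"{p:.0%}" for taxon, p in fractions.items()})
# forehead {'Malassezia': '95%', 'Aspergillus': '3%', 'Epicoccum': '2%'}
# volar_forearm {'Malassezia': '42%', 'Aspergillus': '30%', 'Phoma': '28%'}
```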

Malassezia monopolized the adult samples, constituting 80%-99% of the communities on each skin site. In children, however, Malassezia was much less common, comprising 35%-76% of the samples of each site.

However, children boasted a much more diverse mycobiome. Other constituents included members of the Ascomycota, Aspergillus, Epicoccum, and Phoma taxa. Ascomycota species were found on 40% of samples from children, compared with 9.5% of samples from adults. Children also played host to communities of Epicoccum, Cladosporium, and Cryptococcus.

There were individual variations in diversity, however, the authors noted. “For clinical samples from children, decreased diversity was correlated with increased relative abundance of Malassezia, especially on sebaceous sites. Given the predominance of Malassezia on sebaceous skin, it is possible that reduction in diversity was attributed to relative overexpansion of Malassezia.”

The team also discovered gender differences in the mycobiome of children. The sebaceous skin sites of boys were much more likely to host species of Epicoccum and Cryptococcus. Girls showed an early enrichment of Malassezia. “These results suggested that gender may affect mycobiome structures during sexual maturation.”

“Since Malassezia is an obligatory lipophilic fungus, differential Malassezia abundance might be due to the full activation of sebaceous glands during puberty,” they theorized. “Therefore, it would be intriguing to identify the sebaceous gland activity and sebum signatures during childhood in conjunction with sequence-based mycobiome analysis.”

The National Institutes of Health funded the study. Dr. Jo had no financial disclosures.

msullivan@frontlinemedcom.com

On Twitter @Alz_Gal

Vitals

Key clinical point: The mycobiome of children is much more diverse than that of adults.

Major finding: Malassezia species comprised 80%-99% of the adult mycobiome, while numerous other taxa were found on children’s skin.

Data source: The taxonomic analysis comprised 19 healthy adults and 14 healthy children.

Disclosures: The National Institutes of Health funded the study. Dr. Jo had no financial disclosures.

Changes in HIV-related cancers reflect changes in HIV patient care

Article Type
Changed
Fri, 01/18/2019 - 16:07
Display Headline
Changes in HIV-related cancers reflect changes in HIV patient care

The incidence of Kaposi’s sarcoma among HIV patients has declined in the antiretroviral era, but the cancers are now presenting in older patients, after treatment starts, and in patients who have undetectable levels of HIV RNA.

Changes were also seen in the appearance of non-Hodgkin lymphoma, which is now occurring in patients with higher CD4 counts and lower HIV viral loads.

Although the findings of this large database study were largely driven by the fact that more patients are on antiretroviral therapy (ART) and in active clinical care, biological forces may also be at work, wrote Elizabeth L. Yanik, PhD, of the National Cancer Institute, Rockville, Md., and her associates (J Clin Oncol. 2016 Aug 9. doi: 10.1200/JCO.2016.67.6999).

“For example, cancers that develop in patients with HIV infection after immune recovery may manifest genetic or epigenetic changes that facilitate evasion from the immune system … [and] given that human herpesvirus-8 and Epstein-Barr virus are genetically heterogeneous, another possibility is that patients in whom Kaposi’s or non-Hodgkin lymphoma develops after immune reconstitution may be infected with more pathogenic strains.”

Dr. Yanik and her colleagues mined data from the Centers for AIDS Research Network of Integrated Clinical Systems (CNICS), whose 24,901 patients were followed from 1996 to 2011. Among them, 446 cases of Kaposi’s sarcoma (KS) and 258 cases of non-Hodgkin lymphoma (NHL) developed. Overall, KS and NHL incidence rates decreased 5% and 8% per year, respectively.

The proportion of KS diagnosed during routine care increased significantly, from 32% to 49%, reflecting the fact that more HIV patients continue to enter active clinical settings. The diagnostic setting of NHL did not change significantly over the study period, with 64% of cases being diagnosed in routine care in the latter years. From the beginning to the end of the study period, patient median age at diagnosis increased for both KS (from 37 to 42 years) and NHL (from 40 to 46 years).

The authors said this is a direct result of changing care patterns. “The proportion of KS cases diagnosed among patients who received ART increased not because KS incidence increased in patients who received ART but because of the growing fraction of the HIV population administered ART.”

As the study period progressed, the proportion of KS cases appearing 6 months or longer after ART initiation rose from 26% in the early years to 60% in the latter years. This change didn’t occur with NHL cases; 68% of them were diagnosed at least 6 months after ART began.

The mean CD4 count at diagnosis increased with time for both KS and NHL. During 2007-2011, 15% of KS cases and 24% of NHL cases were diagnosed at CD4 counts of 500 cells/μL or more, whereas less than half were diagnosed at CD4 counts less than 200 cells/μL, the authors observed.

Both cancers began to appear during periods of decreased viral load as the study progressed, although the decrease was only significant for NHL. However, from 2007 to 2011, 29% of KS cases and 51% of NHL cases were diagnosed when HIV RNA was suppressed to 500 copies/mL or lower.

Again, the authors related this to improved clinical care. “These clinical characteristics and the changes in the underlying HIV population are inherently related. Improvements in ART access and earlier initiation lead to earlier suppression of HIV RNA and, ultimately, higher CD4 counts,” they said.

msullivan@frontlinemedcom.com

On Twitter @Alz_Gal

Vitals

Key clinical point: Kaposi’s sarcoma and non-Hodgkin lymphoma are appearing in older, less-immunosuppressed patients.

Major finding: From 1996 to 2011, median patient age at diagnosis increased for both Kaposi’s sarcoma (from 37 to 42 years) and non-Hodgkin lymphoma (from 40 to 46 years).

Data source: The database study examined 446 cases of Kaposi’s sarcoma and 258 cases of non-Hodgkin lymphoma.

Disclosures: The National Cancer Institute headed the study; Dr. Yanik had no financial disclosures.

Retinal nerve fiber layer thinning predicts cognitive decline

Article Type
Changed
Fri, 01/18/2019 - 16:06
Display Headline
Retinal nerve fiber layer thinning predicts cognitive decline

TORONTO – A thinner-than-normal layer of retinal nerve fibers in the eye is now linked with cognitive decline – another suggestion that extracranial physical findings could be leveraged into dementia screening tools.

The findings were seen in a cohort of 32,000 people enrolled in the U.K. Biobank, an ongoing prospective study following half a million people and collecting data on cancer, heart disease, stroke, diabetes, arthritis, osteoporosis, eye disorders, depression, and dementia.

The correlation between retinal nerve fiber thickness and cognition was observed in the large cohort at baseline, Fang Sarah Ko, MD, said during a press briefing at the Alzheimer’s Association International Conference 2016. But after following 1,251 of these subjects for 3 years, she and her colleagues found that the correlation continued unabated.

Courtesy National Eye Institute
Optical coherence tomography machine used to provide an overview of the retina's structure.

“It’s amazing that we found this in such a healthy population,” Dr. Ko said during the briefing. “We wouldn’t have expected in just 3 years to see any cognitive decline in this cohort, much less measurable cognitive decline with a significant association with retinal nerve fiber layer thickness.”

Dr. Ko, an ophthalmologist in private practice in Tallahassee, Fla., said later during her main presentation of the study that the finding suggests a possible role for retinal imaging as a cognitive health screen.

“Thinner nerve fiber layer was associated with worse performance on memory, reasoning, and reaction time at baseline, and with a decline in each of these tests over time,” she said. “It may be that the nerve fiber layer could be used as a biomarker,” because it is easy to observe and measure with equipment available in most ophthalmology offices. “I would say the potential for clinical use is quite high.”

The U.K. Biobank recruits all of its subjects through the U.K. National Health Service patient registry. All undergo a standard battery of tests; among them are tests of cognitive function and spectral-domain optical coherence tomography (SD-OCT) of the eye. SD-OCT is an increasingly common method of imaging the retina, producing three-dimensional images of extremely fine resolution.

The 32,000 subjects included in the baseline cohort were all free of diabetes and ocular or neurological disease, and they had normal intraocular pressure. They undertook four tests of cognition: prospective memory, pairs matching, numeric and verbal reasoning, and reaction time. The relationship between these test results and retinal nerve fiber thickness was adjusted for age, sex, race, socioeconomic status, height, refraction, and intraocular pressure.

At baseline, the mean retinal nerve fiber layer was significantly thinner among subjects with abnormal scores on any of the cognitive tests. On the prospective memory test, the layer was an average of 53.3 micrometers for subjects who had correct first-time recall, 52.5 micrometers for those with correct second-time recall, and 51.9 micrometers for those who did not recall. The layer was also significantly thinner in subjects who had low scores on pairs matching, numeric and verbal reasoning, and reaction times.

And the relationship between test results and retinal nerve fiber thinning appeared additive, Dr. Ko said. For each test that a subject failed, the layer was about 1 micrometer thinner. In the multivariate analysis, a thinner retinal nerve fiber layer was associated with worse performance on all of the tests: The layer was 0.13 micrometer thinner for each incorrect match on pairs matching; 0.14 micrometer thinner for every 2 points lower in score on numeric and verbal reasoning; and 0.14 micrometer thinner for every 100-millisecond increase in reaction time.

The 3-year follow-up data confirmed that these baseline findings persisted, and predicted cognitive decline. “Again, this was true after controlling for all the variables,” Dr. Ko said. “We found that those with the thinnest layers at baseline got worse on more of the tests, compared to those who had the thickest nerve fiber layers at baseline.”

Although this is the first time retinal nerve fiber thickness has been shown to predict cognitive decline, its association with cognition has been studied for a few years. A 2015 meta-analysis found 17 studies comparing the marker between patients with Alzheimer’s and healthy controls and 5 studies comparing patients with mild cognitive impairment (MCI) and healthy controls (Alzheimers Dement (Amst). 2015 Apr 23;1[2]:136-43). All of these found significant retinal nerve fiber thinning in Alzheimer’s and MCI patients.

The lead author of that paper, Kelsey Thompson of the University of Edinburgh (United Kingdom), said the retinal ganglion cell axons can be seen as a sentinel marker for neurodegeneration in the brain.

“Retinal nerve fiber layer thinning in [Alzheimer’s disease] has been hypothesized to occur because of retrograde degeneration of the retinal ganglion cell axons, and these changes have been suggested to occur even before memory is affected. There is also a suggestion that neuroretinal atrophy may occur as a result of amyloid-beta plaque deposits within the retina, although this hypothesis remains more speculative.”

Dr. Ko had no financial declarations.

msullivan@frontlinemedcom.com

On Twitter @alz_gal


Vitals

Key clinical point: Thinning of the retinal nerve fiber layer was associated with poorer cognitive performance and predicted cognitive decline as well.

Major finding: On a prospective memory test, the layer was an average of 53.3 micrometers for subjects who had correct first-time recall, vs. 51.9 micrometers for those who did not recall.

Data source: The study comprised 32,000 patients at baseline, of whom 1,251 were followed for 3 years.

Disclosures: Dr. Ko had no financial disclosures.

Early Alzheimer’s treatment decreases both costs and mortality

Article Type
Changed
Fri, 01/18/2019 - 16:06
Display Headline
Early Alzheimer’s treatment decreases both costs and mortality

TORONTO – For patients with Alzheimer’s disease, early treatment may translate into lower health care costs and better survival, based on study results reported at the Alzheimer’s Association International Conference 2016.

A review of Medicare claims data from more than 1,300 patients who received a diagnosis of Alzheimer’s disease during 2010-2013 found that patients who got standard anti-dementia therapy within a month of an Alzheimer’s diagnosis had a 28% lower risk of dying by 6 months than did patients who weren’t treated. And while their health care costs spiked at the time of diagnosis, monthly costs were consistently lower, yielding an overall savings of about $1,700 by the end of the study.

It’s not that the drugs themselves exerted any lifesaving effects, said study co-author Christopher Black, associate director of outcomes research at Merck Research Laboratories, Rahway, N.J. Rather, the observed benefit is probably because the patients who got treated also then got consistent medical attention for health-threatening comorbidities.

“This is an important caveat,” Mr. Black said in an interview at the meeting. “We are not saying that anti-dementia treatment is causing longer survival. It’s a proxy for better care. Typically, dementia patients die of complications from comorbidities that are exacerbated by Alzheimer’s symptoms.”

Mr. Black and his colleagues examined health care costs and utilization during the 12 months before diagnosis and the 6 months after diagnosis, along with mortality. They identified 6,553 incident Alzheimer’s patients; just 35% received a prescription for an anti-dementia medication within the month after diagnosis. Most treated patients (67%) got donepezil. Other prescribed medications were memantine (19%), rivastigmine (12%), and galantamine (2%).

There were several significant differences between the treated and untreated groups. Untreated patients were older (84 vs. 81 years) and had a significantly higher Charlson comorbidity index (3.5 vs. 3.2). To account for these differences, the researchers used propensity score matching on the basis of medical comorbidities, age, and other demographics to compare 694 patients from each group.

At the end of 6 months, treated patients had a 28% lower risk of dying (hazard ratio, 0.72) than did non-treated patients. Health care costs and utilization showed significant differences as well. Before diagnosis, monthly medical costs were similar (average $665). During the month of diagnosis, costs surged for both groups, although more so in the untreated group (mean $6,711 vs. $5,535). This probably reflected the general health differences between the two groups, as well as the cost of transitioning into a hospital or from a hospital to a long-term care facility, Mr. Black said.

“This spike at the time of diagnosis is important,” he said in an interview. “The major cost driver was inpatient hospitalization and skilled nursing home placement, and this was driven by 10% of the patients.”

After the first month, costs declined and stabilized in each group. However, the between-group differences remained. Treated patients had fewer than half as many hospice visits per month (0.04 vs. 0.09; P = .0001). Monthly costs overall were lower, but not to a statistically significant extent ($2,207 vs. $2,349; P = .3037). Total health care costs by the end of the follow-up period averaged about $1,700 less in treated patients.

“Even after adjusting for demographic and clinical differences, results suggested that treated Alzheimer’s patients had lower all-cause health care costs and lower mortality rates compared to untreated patients,” Mr. Black said. “The arguments for early treatment are myriad, but this study shows greater survival and less all-cause health care costs among those receiving treatment for dementia. These results indicate that choosing not to treat, or even a delay in starting treatment, may lead to less favorable results. Early diagnosis and time to treatment should be a priority for policymakers, physicians and the public.”

msullivan@frontlinemedcom.com

On Twitter @alz_gal

Display Headline
Early Alzheimer’s treatment decreases both costs and mortality
Article Source

AT AAIC 2016

Vitals

Key clinical point: Alzheimer’s patients who receive early medical treatment have a lower risk of dying and incur lower health care costs.

Major finding: By 6 months after diagnosis, treated patients had a 28% lower risk of dying and had incurred about $1,700 less in health care expenditures.

Data source: A Medicare claims database study that identified 6,553 incident Alzheimer’s patients and compared 694 treated and 694 untreated patients after propensity score matching.

Disclosures: Christopher Black is an associate director of outcomes research at Merck Research Laboratories, Rahway, N.J.