Antireflux surgery may not reduce cancer risk in Barrett’s esophagus

Article Type
Changed
Tue, 12/12/2023 - 15:02

Antireflux surgery may be no more effective than antireflux medication for reducing risk of esophageal adenocarcinoma (EAC) among patients with Barrett’s esophagus, according to a Nordic retrospective study.

Risk of EAC was higher among patients who underwent surgery, and risk appeared to increase over time, suggesting that postoperative patients should continue to participate in surveillance programs, reported lead author Jesper Lagergren, MD, PhD, of the Karolinska Institutet, Stockholm, and colleagues.


“Antireflux surgery with fundoplication increases the ability of the gastroesophageal anatomic and physiological barrier to prevent reflux, and can thus prevent any carcinogenic gastric content from reaching the esophagus, including both acid and bile,” the investigators wrote in Gastroenterology, noting that surgery reduces esophageal acid exposure to a greater degree than medication. “Antireflux surgery may thus prevent esophageal adenocarcinoma better than antireflux medication.”

Three meta-analyses to date, however, have failed to provide consistent support for this hypothesis.

“Most of the studies included in these meta-analyses came from single centers, were of small sample size, examined only one treatment arm, and had a short or incomplete follow-up, and ... were hampered by heterogeneity among the included studies,” they noted.

For the present study, Dr. Lagergren and colleagues analyzed national registry data from 33,939 patients with Barrett’s esophagus in Denmark, Finland, Norway, and Sweden. Out of this group, 542 patients (1.6%) had undergone antireflux surgery, while the remainder were managed with antireflux medication.

In both groups, approximately two-thirds of the patients were men. The median age at enrollment was about a decade higher in the medication group (66 vs. 54 years), and this group also tended to have more comorbidities.

After follow-up of up to 32 years, the absolute rates of EAC were 1.3% and 2.6% in the medication and surgery groups, respectively. Multivariable analysis, with adjustments for sex, age, calendar year, and comorbidities, revealed that postsurgical patients had a 90% increased risk of EAC (hazard ratio [HR], 1.9; 95% CI, 1.1-3.5), versus patients treated with antireflux medication alone.

The relative risk of EAC appeared to increase over time: the hazard ratio was a nonsignificant 1.8 during the 1- to 4-year follow-up period (95% CI, 0.6-5.0), versus a significant, more than fourfold elevation during the 10- to 32-year follow-up period (HR, 4.4; 95% CI, 1.4-13.5).

“In this cohort of patients with Barrett’s esophagus, the risk of esophageal adenocarcinoma did not decrease after antireflux surgery compared with antireflux medication,” the investigators wrote. “Instead, the risk was increased throughout the follow-up among patients having undergone antireflux surgery.”

Dr. Lagergren and colleagues suggested that the relatively higher cancer risk in the surgery group likely stems from the years of acid exposure that preceded the operation.

“[P]erforming antireflux surgery after years of GERD may be too late to enable a cancer-preventative effect, and most of the patients first diagnosed with Barrett’s esophagus reported a history of many years of GERD symptoms,” they wrote, suggesting that carcinogenic processes had already been set in motion by the time surgery was performed.

“[P]atients with Barrett’s esophagus who undergo antireflux surgery remain at an increased risk of esophageal adenocarcinoma and should continue taking part in surveillance programs,” the investigators concluded.

The study was funded by the Swedish Cancer Society, Swedish Research Council, and Stockholm County Council. The investigators disclosed no conflicts of interest.


Esophageal adenocarcinoma (EAC) has been increasing in frequency for decades. EAC’s only known precursor is Barrett’s esophagus (BE), a complication of GERD with chronic esophageal inflammation (reflux esophagitis). Chronic inflammation can predispose to cancer and refluxed acid itself can cause potentially carcinogenic double-strand DNA breaks in Barrett’s metaplasia. PPIs, which block secretion of the gastric acid that causes reflux esophagitis and DNA damage, are recommended to BE patients for cancer prevention. Logical as that practice may seem, meta-analyses have reached contradictory conclusions regarding the cancer-preventive benefits of PPIs. PPIs do not stop the reflux of other potential carcinogens such as bile salts, and thus it has been argued that fundoplication, which blocks the reflux of all gastric material, should be superior to PPIs for cancer prevention. Plausible as that argument sounds, meta-analyses of the generally small and heterogeneous studies on this issue have not found consistently that antireflux surgery is superior to medical therapy for cancer prevention in BE.


Now, a large, population-based cohort study by Åkerström et al. of Nordic BE patients followed for up to 32 years has found that the overall risk of EAC was higher for patients treated with fundoplication than for those treated with medication (adjusted HR, 1.9; 95% CI, 1.1-3.5). Furthermore, the EAC risk increased over time in the surgical patients. Well done as this study was, it has important limitations. The overall BE population was large (n = 33,939), but only 1.6% (542 patients) had antireflux surgery, and only 14 of those developed EAC during follow-up. Those small numbers limit statistical power. Moreover, important residual confounding cannot be excluded. The surgical patients might have had more severe GERD than the medically treated patients, and it is difficult to make a plausible argument for why fundoplication should increase EAC risk. Nevertheless, this study provides a good lesson on why a plausible argument needs supportive evidence before acting on it in clinical practice. While there may be some excellent reasons for recommending antireflux surgery over medication for patients with severe GERD, better esophageal cancer prevention does not appear to be one of them.
 

Stuart Jon Spechler, MD, is chief of the division of gastroenterology and codirector of the Center for Esophageal Diseases at Baylor University Medical Center, and codirector of the Center for Esophageal Research at Baylor Scott & White Research Institute, Dallas, Texas. Dr. Spechler is a consultant for Phathom Pharmaceuticals and ISOThrive, LLC.


FROM GASTROENTEROLOGY


Fewer than 1 out of 4 patients with HCV-related liver cancer receive antivirals

Article Type
Changed
Thu, 12/07/2023 - 18:12

Fewer than one out of four patients with hepatitis C virus (HCV)-related hepatocellular carcinoma (HCC) receive oral interferon-free direct-acting antiviral agents (DAAs), and rates aren’t much better for patients seen by specialists, based on a retrospective analysis of private insurance claims.

The study also showed that patients receiving DAAs lived significantly longer, emphasizing the importance of prescribing these medications to all eligible patients, reported principal investigator Mindie H. Nguyen, MD, AGAF, of Stanford University Medical Center, Palo Alto, California, and colleagues.

“Prior studies have shown evidence of improved survival among HCV-related HCC patients who received DAA treatment, but not much is known about the current DAA utilization among these patients in the general US population,” said lead author Leslie Y. Kam, MD, a postdoctoral scholar in gastroenterology at Stanford Medicine, who presented the findings in November at the annual meeting of the American Association for the Study of Liver Diseases.

To generate real-world data, the investigators analyzed medical records from 3922 patients in Optum’s Clinformatics Data Mart Database. All patients had private medical insurance and received care for HCV-related HCC between 2015 and 2021.

“Instead of using institutional databases which tend to bias toward highly specialized tertiary care center patients, our study uses a large, national sample of HCV-HCC patients that represents real-world DAA treatment rates and survival outcomes,” Dr. Kam said in a written comment.

Within this cohort, fewer than one out of four patients (23.5%) received DAA, a rate that Dr. Kam called “dismally low.”

Patients with either compensated or decompensated cirrhosis had higher treatment rates than those without cirrhosis (24.2% and 24.5%, respectively, vs. 16.2%; P = .001). The investigators noted that more than half of the patients had decompensated cirrhosis, suggesting that HCV-related HCC was diagnosed late in the disease course.

Receiving care from a gastroenterologist or infectious disease physician also was associated with a higher treatment rate. Patients managed by a gastroenterologist alone had a treatment rate of 27.0%, while those who received care from a gastroenterologist or infectious disease doctor alongside an oncologist had a treatment rate of 25.6%, versus just 9.4% for those who received care from an oncologist alone, and 12.4% among those who did not see a specialist of any kind (P = .005).

These findings highlight “the need for a multidisciplinary approach to care in this population,” Dr. Kam suggested.

Echoing previous research, DAAs were associated with extended survival. A significantly greater percentage of patients who received DAAs were alive after 5 years, compared with patients who did not (47.2% vs. 35.2%; P < .001). After adjustment for comorbidities, HCC treatment, race/ethnicity, sex, and age, DAA therapy was associated with a 39% reduction in risk of death (adjusted hazard ratio, 0.61; 95% CI, 0.53-0.69; P < .001).

“There were also racial ethnic disparities in patient survival whether patients received DAA or not, with Black patients having worse survival,” Dr. Kam said. “As such, our study highlights that awareness of HCV remains low as does the use of DAA treatment. Therefore, culturally appropriate efforts to improve awareness of HCV must continue among the general public and health care workers as well as efforts to provide point of care accurate and rapid screening tests for HCV so that DAA treatment can be initiated in a timely manner for eligible patients. Continual education on the use of DAA treatment is also needed.”

Robert John Fontana, MD, AGAF, professor of medicine and transplant hepatologist at the University of Michigan, Ann Arbor, described the findings as “frustrating,” and “not the kind of stuff I like to hear about.

“Treatment rates are so low,” Dr. Fontana said, noting that even among gastroenterologists and infectious disease doctors, who should be well-versed in DAAs, antivirals were prescribed less than 30% of the time.

In an interview, Dr. Fontana highlighted the benefits of DAAs, including their ease-of-use and effectiveness.

“Hepatitis C was the leading reason that we had to do liver transplants in the United States for years,” he said. “Then once these really amazing drugs called direct-acting antivirals came out, they changed the landscape very quickly. It really was a game changer for my whole practice, and, nationally, the practice of transplant.”

Yet, this study and others suggest that these practice-altering agents are being underutilized, Dr. Fontana said. A variety of reasons could explain suboptimal usage, he suggested, including lack of awareness among medical professionals and the public, the recency of DAA approvals, low HCV testing rates, lack of symptoms in HCV-positive patients, and medication costs.

This latter barrier, at least, is dissolving, Dr. Fontana said. Some payers initially restricted which providers could prescribe DAAs, but now the economic consensus has swung in their favor, since curing patients of HCV brings significant health care savings down the line. This financial advantage—theoretically multiplied across 4-5 million Americans living with HCV—has bolstered a multi-institutional effort toward universal HCV screening, with testing recommended at least once in every person’s lifetime.

“It’s highly cost effective,” Dr. Fontana said. “Even though the drugs are super expensive, you will reduce cost by preventing the people streaming towards liver cancer or streaming towards liver transplant. That’s why all the professional societies—the USPSTF, the CDC—they all say, ‘OK, screen everyone.’ ”

Screening may be getting easier soon, Dr. Fontana predicted, as at-home HCV-testing kits are on the horizon, with development and adoption likely accelerated by the success of at-home viral testing during the COVID-19 pandemic.

Beyond broader screening, Dr. Fontana suggested that greater awareness of DAAs is needed both within and beyond the medical community.

He advised health care providers who don’t yet feel comfortable diagnosing or treating HCV to refer to their local specialist.

“That’s the main message,” Dr. Fontana said. “I’m always eternally hopeful that every little message helps.”

The investigators and Dr. Fontana disclosed no conflicts of interest.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

Fewer than one out of four patients with hepatitis C virus (HCV)-related hepatocellular carcinoma (HCC) receive oral interferon-free direct-acting antiviral agents (DAAs), and rates aren’t much better for patients seen by specialists, based on a retrospective analysis of private insurance claims.

The study also showed that patients receiving DAAs lived significantly longer, emphasizing the importance of prescribing these medications to all eligible patients, reported principal investigator Mindie H. Nguyen, MD, AGAF,, of Stanford University Medical Center, Palo Alto, California, and colleagues.

“Prior studies have shown evidence of improved survival among HCV-related HCC patients who received DAA treatment, but not much is known about the current DAA utilization among these patients in the general US population,” said lead author Leslie Y. Kam, MD, a postdoctoral scholar in gastroenterology at Stanford Medicine, who presented the findings in November at the annual meeting of the American Association for the Study of Liver Diseases.

To generate real-world data, the investigators analyzed medical records from 3922 patients in Optum’s Clinformatics Data Mart Database. All patients had private medical insurance and received care for HCV-related HCC between 2015 and 2021.

“Instead of using institutional databases which tend to bias toward highly specialized tertiary care center patients, our study uses a large, national sample of HCV-HCC patients that represents real-world DAA treatment rates and survival outcomes,” Dr. Kam said in a written comment.

Within this cohort, fewer than one out of four patients (23.5%) received DAA, a rate that Dr. Kam called “dismally low.”

Patients with either compensated or decompensated cirrhosis had higher treatment rates than those without cirrhosis (24.2% or 24.5%, respectively, vs. 16.2%; P = .001). The investigators noted that more than half of the patients had decompensated cirrhosis, suggesting that HCV-related HCC was diagnosed late in the disease course.

Receiving care from a gastroenterologist or infectious disease physician also was associated with a higher treatment rate. Patients managed by a gastroenterologist alone had a treatment rate of 27.0%, while those who received care from a gastroenterologist or infectious disease doctor alongside an oncologist had a treatment rate of 25.6%, versus just 9.4% for those who received care from an oncologist alone, and 12.4% among those who did not see a specialist of any kind (P = .005).

These findings highlight “the need for a multidisciplinary approach to care in this population,” Dr. Kam suggested.

Echoing previous research, DAAs were associated with extended survival. A significantly greater percentage of patients who received DAA were alive after 5 years, compared with patients who did not receive DAA (47.2% vs. 35.2%; P less than .001). After adjustment for comorbidities, HCC treatment, race/ethnicity, sex, and age, DAAs were associated with a 39% reduction in risk of death (adjusted hazard ratio, 0.61; 0.53-0.69; P less than .001).


Fewer than one out of four patients with hepatitis C virus (HCV)-related hepatocellular carcinoma (HCC) receive oral interferon-free direct-acting antiviral agents (DAAs), and rates aren’t much better for patients seen by specialists, based on a retrospective analysis of private insurance claims.

The study also showed that patients receiving DAAs lived significantly longer, emphasizing the importance of prescribing these medications to all eligible patients, reported principal investigator Mindie H. Nguyen, MD, AGAF, of Stanford University Medical Center, Palo Alto, California, and colleagues.

“Prior studies have shown evidence of improved survival among HCV-related HCC patients who received DAA treatment, but not much is known about the current DAA utilization among these patients in the general US population,” said lead author Leslie Y. Kam, MD, a postdoctoral scholar in gastroenterology at Stanford Medicine, who presented the findings in November at the annual meeting of the American Association for the Study of Liver Diseases.

To generate real-world data, the investigators analyzed medical records from 3,922 patients in Optum’s Clinformatics Data Mart Database. All patients had private medical insurance and received care for HCV-related HCC between 2015 and 2021.

“Instead of using institutional databases which tend to bias toward highly specialized tertiary care center patients, our study uses a large, national sample of HCV-HCC patients that represents real-world DAA treatment rates and survival outcomes,” Dr. Kam said in a written comment.

Within this cohort, fewer than one out of four patients (23.5%) received DAA, a rate that Dr. Kam called “dismally low.”

Patients with either compensated or decompensated cirrhosis had higher treatment rates than those without cirrhosis (24.2% and 24.5%, respectively, vs. 16.2%; P = .001). The investigators noted that more than half of the patients had decompensated cirrhosis, suggesting that HCV-related HCC was diagnosed late in the disease course.

Receiving care from a gastroenterologist or infectious disease physician also was associated with a higher treatment rate. Patients managed by a gastroenterologist alone had a treatment rate of 27.0%, while those who received care from a gastroenterologist or infectious disease doctor alongside an oncologist had a treatment rate of 25.6%, versus just 9.4% for those who received care from an oncologist alone, and 12.4% among those who did not see a specialist of any kind (P = .005).

These findings highlight “the need for a multidisciplinary approach to care in this population,” Dr. Kam suggested.

Echoing previous research, DAAs were associated with extended survival. A significantly greater percentage of patients who received DAA were alive after 5 years, compared with patients who did not receive DAA (47.2% vs. 35.2%; P < .001). After adjustment for comorbidities, HCC treatment, race/ethnicity, sex, and age, DAAs were associated with a 39% reduction in risk of death (adjusted hazard ratio, 0.61; 0.53-0.69; P < .001).

“There were also racial ethnic disparities in patient survival whether patients received DAA or not, with Black patients having worse survival,” Dr. Kam said. “As such, our study highlights that awareness of HCV remains low as does the use of DAA treatment. Therefore, culturally appropriate efforts to improve awareness of HCV must continue among the general public and health care workers as well as efforts to provide point of care accurate and rapid screening tests for HCV so that DAA treatment can be initiated in a timely manner for eligible patients. Continual education on the use of DAA treatment is also needed.”

Robert John Fontana, MD, AGAF, professor of medicine and transplant hepatologist at the University of Michigan, Ann Arbor, described the findings as “frustrating,” and “not the kind of stuff I like to hear about.

“Treatment rates are so low,” Dr. Fontana said, noting that even among gastroenterologists and infectious disease doctors, who should be well-versed in DAAs, antivirals were prescribed less than 30% of the time.

In an interview, Dr. Fontana highlighted the benefits of DAAs, including their ease-of-use and effectiveness.

“Hepatitis C was the leading reason that we had to do liver transplants in the United States for years,” he said. “Then once these really amazing drugs called direct-acting antivirals came out, they changed the landscape very quickly. It really was a game changer for my whole practice, and, nationally, the practice of transplant.”

Yet, this study and others suggest that these practice-altering agents are being underutilized, Dr. Fontana said. A variety of reasons could explain suboptimal usage, he suggested, including lack of awareness among medical professionals and the public, the recency of DAA approvals, low HCV testing rates, lack of symptoms in HCV-positive patients, and medication costs.

This latter barrier, at least, is dissolving, Dr. Fontana said. Some payers initially restricted which providers could prescribe DAAs, but now the economic consensus has swung in their favor, since curing patients of HCV brings significant health care savings down the line. This financial advantage—theoretically multiplied across 4-5 million Americans living with HCV—has bolstered a multi-institutional effort toward universal HCV screening, with testing recommended at least once in every person’s lifetime.

“It’s highly cost effective,” Dr. Fontana said. “Even though the drugs are super expensive, you will reduce cost by preventing the people streaming towards liver cancer or streaming towards liver transplant. That’s why all the professional societies—the USPSTF, the CDC—they all say, ‘OK, screen everyone.’ ”

Screening may be getting easier soon, Dr. Fontana predicted, as at-home HCV-testing kits are on the horizon, with development and adoption likely accelerated by the success of at-home viral testing during the COVID-19 pandemic.

Beyond broader screening, Dr. Fontana suggested that greater awareness of DAAs is needed both within and beyond the medical community.

He advised health care providers who don’t yet feel comfortable diagnosing or treating HCV to refer to their local specialist.

“That’s the main message,” Dr. Fontana said. “I’m always eternally hopeful that every little message helps.”

The investigators and Dr. Fontana disclosed no conflicts of interest.

AT THE LIVER MEETING

Taste and smell changes linked with worse QOL and cognition in cirrhosis, renal failure

Article Type
Changed
Mon, 12/04/2023 - 12:41

Patients with cirrhosis or renal failure who experience changes in taste and smell may have worse quality of life (QOL) and may be more likely to exhibit cognitive impairment than those who do not exhibit these sensory changes, according to investigators.

Clinicians should screen for changes in taste and smell among patients at risk of cognitive changes, and offer nutritional interventions to support body weight and QOL, reported principal investigator Jasmohan S. Bajaj, MD, AGAF, of Virginia Commonwealth University, Richmond, and colleagues.

Dr. Jasmohan S. Bajaj

“Cirrhosis is linked with poor nutrition, which could partly be due to anorexia in hepatic encephalopathy (HE) and coexistent renal failure,” the investigators wrote in their abstract, which Dr. Bajaj presented in November at the annual meeting of the American Association for the Study of Liver Diseases.

“We wanted to measure how changes in the brain in cirrhosis affect patients’ abilities to smell and taste, and study how that affects their quality of life,” Dr. Bajaj said in a written comment.

To this end, the investigators conducted an observational study involving 59 participants, among whom 22 were healthy, 21 had cirrhosis, and 16 had renal failure requiring dialysis.

“Prior studies individually have shown changes in taste and smell for these two organ failures,” Dr. Bajaj said. “We studied them together as well and linked these to quality of life and individual cognitive tests.”

Of note, individuals with past or current COVID-19, or with current or recent alcohol or tobacco use, were excluded.

Compared with healthy individuals, participants with cirrhosis or renal failure had significantly worse performance on a taste discrimination test, with perceptions of sweet and sour most affected.

Cognitive measurement with the Psychometric Hepatic Encephalopathy Score (PHES) and Stroop tests showed worse scores for patients with disease than for those without. Taste discrimination correlated significantly with both cognitive test scores, regardless of HE or dialysis, whereas smell correlated only with the Stroop test.

Multivariable analysis revealed that better PHES scores and smell discrimination were linked with better taste discrimination. Similarly, better PHES scores and taste discrimination contributed to better smell discrimination. Eating impairment was associated with worse Stroop scores and worse olfactory-related QOL, suggesting that sensory changes, cognitive changes, and eating behaviors were all correlated.

“Health care providers ought to be alert to changes in patients’ eating habits, diet and weight as their liver and kidney disease worsen and as their brain function changes,” Dr. Bajaj said. “Nutritionists and others may be able to assist patients with a healthy diet and suggest ways to improve patients’ reports of their quality of life. Taste and smell are just a few aspects of the complicated assessment of health-related quality of life, brain dysfunction, and nutritional compromise in cirrhosis. We need to be mindful to not just focus on these aspects but to individualize care.”

Adrian M. Di Bisceglie, MD, hepatologist and emeritus professor of internal medicine at Saint Louis University, said the study was “well done,” and called the findings “an interesting little tidbit” that would probably not change his practice as a physician, but could be valuable for designing nutritional interventions.

Saint Louis University
Dr. Adrian M. Di Bisceglie

In an interview, Dr. Di Bisceglie explained that a well-balanced diet with adequate caloric intake can help slow the muscle wasting that occurs with the condition, but creating a tasty menu can be challenging when patients are asked to restrict their sodium intake as a means of reducing fluid retention.

“Salt contributes substantially to the enjoyment of food,” Dr. Di Bisceglie said.

Although the study did not specifically report the salt level in patients’ diets, Dr. Di Bisceglie said the findings highlight the need for low-salt strategies to improve palatability. For example, he suggested increasing umami, or savory flavor, as this can be accomplished without adding a significant amount of salt.

When asked if changes in taste or smell might be used as simple screening tools to detect cognitive impairment in patients with cirrhosis, Dr. Di Bisceglie said that this might be “possible,” but is probably unnecessary.

“There is an easy bedside test that we’ve been using for decades [to predict hepatic encephalopathy], which is reading,” Dr. Di Bisceglie said, noting that patients with cognitive deficits often describe reading paragraphs repeatedly without comprehending what they have read.

The investigators and Dr. Di Bisceglie disclosed no conflicts of interest.


COVID livers are safe for transplant

Article Type
Changed
Mon, 12/04/2023 - 12:22

Transplanting livers from deceased donors who tested positive for SARS-CoV-2 is safe and has no significant impact on short-term outcomes of allografts or recipients, based on a national study with the longest follow-up to date.

Using livers from deceased patients with COVID-19 could be an opportunity to expand organ availability, reported principal investigator Nadim Mahmud, MD, of the University of Pennsylvania, Philadelphia, and colleagues.

Findings were presented in November at the annual meeting of the American Association for the Study of Liver Diseases.

“During the COVID-19 pandemic, a few centers trialed transplanting solid organs from COVID-19 positive donors with promising initial results,” presenting author Roy X. Wang, MD, of the University of Pennsylvania, said in a written comment. “However, these were smaller experiences with short follow-up that were not exclusively focused on liver transplantation. We wanted to explore the safety of liver transplantation from COVID-19 positive donors using a large national dataset with the longest follow up time to date.”

The dataset included 13,096 COVID-negative donors and 299 COVID-positive donors who died between July 2020 and July 2022, with cases and controls matched via propensity scoring. COVID-positive donors were significantly more likely to be younger and to have died of brain death. Beyond these differences, no other significant demographic differences were detected.

After 1 year of follow-up, no statistically significant differences in patient survival (subhazard ratio, 1.11; log-rank P = .70) or allograft survival (hazard ratio, 1.44; log-rank P = .14) were detected when comparing livers transplanted from positive versus negative donors.

“Our findings support and expand upon the results from earlier studies,” Dr. Wang concluded. “Liver transplant from COVID-19-positive donors has acceptable short-term outcomes and may represent an opportunity to expand organ access.”

Still, more work is needed to assess other clinical metrics and long-term outcomes, he added.

“While we were able to show similar patient and graft survival post-transplant between COVID-19-positive and negative donors, rates of other complications were not investigated such as episodes of rejection, liver injury, and hospitalizations,” Dr. Wang said. “Due to data limitations, we are only able to report on outcomes up to 1 year post transplant. Additional investigation will be needed to continue monitoring future outcomes and identifying any differences between recipients of COVID-19-positive and negative donors.”

Timucin Taner, MD, PhD, division chair of transplant surgery at Mayo Clinic, Rochester, Minnesota, said the study is important because it reaffirms the majority opinion among transplant physicians: These livers are safe.

In an interview, Dr. Taner suggested that Dr. Wang’s call for longer-term data is “mostly science speak,” since 1 year of follow-up should be sufficient to determine liver viability.

Mayo Clinic
Dr. Timucin Taner

“If a liver from a COVID-19 donor behaved well for a year, then chances are it’s not going to behave badly [later on] because of the virus at the time of donation,” Dr. Taner said.

He said the reported trends in usage of COVID-positive livers reflect early hesitancy that waned with rising vaccination rates, and recognition that the virus could not be spread via liver donation.

“To date, the only transmission [of SARS-CoV-2] from a transplant has been from a lung transplant,” Dr. Taner said, “and that was back in the days that we didn’t know about this. Other organs don’t transmit the disease, so they are easily usable.”

These new data should further increase confidence among both health care providers and patients, he added.

“[This study is] reassuring to the patients on the waitlist that these organs are very safe to use,” Dr. Taner said. “We as the transplant society are comfortable using them without any hesitation.”

The investigators and Dr. Taner disclosed no conflicts of interest.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

Transplanting livers from deceased donors who tested positive for SARS-CoV-2 is safe and has no significant impact on short-term outcomes of allografts or recipients, based on a national study with the longest follow-up to date.

Using livers from deceased patients with COVID-19 could be an opportunity expand organ availability, reported principal investigator Nadim Mahmud, MD, of the University of Pennsylvania, Philadelphia, and colleagues.

Findings were presented in November at the annual meeting of the American Association for the Study of Liver Diseases.

“During the COVID-19 pandemic, a few centers trialed transplanting solid organs from COVID-19 positive donors with promising initial results,” presenting author Roy X. Wang, MD, of the University of Pennsylvania, said in a written comment. “However, these were smaller experiences with short follow-up that were not exclusively focused on liver transplantation. We wanted to explore the safety of liver transplantation from COVID-19 positive donors using a large national dataset with the longest follow up time to date.”

The dataset included 13,096 COVID-negative donors and 299 COVID-positive donors who died between July 2020 and July 2022, with cases and controls matched via propensity scoring. COVID-positive donors were significantly more likely to be younger and have died of brain death. Beyond this difference in age, no significant demographic differences were detected.

After 1 year of follow-up, no statistically significant differences in patient survival (subhazard ratio, 1.11; log-rank P = .70) or allograft survival (hazard ratio, 1.44; log-rank P = .14) were detected when comparing livers transplanted from positive versus negative donors.

“Our findings support and expand upon the results from earlier studies,” Dr. Wang concluded. “Liver transplant from COVID-19-positive donors has acceptable short-term outcomes and may represent an opportunity to expand organ access.”

Still, more work is needed to assess other clinical metrics and long-term outcomes, he added.

“While we were able to show similar patient and graft survival post-transplant between COVID-19-positive and negative donors, rates of other complications were not investigated such as episodes of rejection, liver injury, and hospitalizations,” Dr. Wang said. “Due to data limitations, we are only able to report on outcomes up to 1 year post transplant. Additional investigation will be needed to continue monitoring future outcomes and identifying any differences between recipients of COVID-19-positive and negative donors.”

Timucin Taner, MD, PhD, division chair of transplant surgery at Mayo Clinic, Rochester, Minnesota, said the study is important because it reaffirms the majority opinion among transplant physicians: These livers are safe.

In an interview, Dr. Taner suggested that Dr. Wang’s call for longer term data is “mostly science speak,” since 1 year of follow-up should be sufficient to determine liver viability.

Mayo Clinic
Dr. Timucin Taner

“If a liver from a COVID-19 donor behaved well for a year, then chances are it’s not going to behave badly [later on] because of the virus at the time of donation,” Dr. Taner said.

He said the reported trends in usage of COVID-positive livers reflect early hesitancy that waned with rising vaccination rates, and recognition that the virus could not be spread via liver donation.

“To date, the only transmission [of SARS-CoV-2] from a transplant has been from a lung transplant,” Dr. Taner said, “and that was back in the days that we didn’t know about this. Other organs don’t transmit the disease, so they are easily usable.”

These new data should further increase confidence among both health care providers and patients, he added.

“[This study is] reassuring to the patients on the waitlist that these organs are very safe to use,” Dr. Taner said. “We as the transplant society are comfortable using them without any hesitation.”

The investigators and Dr. Taner disclosed no conflicts of interest.

Transplanting livers from deceased donors who tested positive for SARS-CoV-2 is safe and has no significant impact on short-term outcomes of allografts or recipients, based on a national study with the longest follow-up to date.

Using livers from deceased patients with COVID-19 could be an opportunity expand organ availability, reported principal investigator Nadim Mahmud, MD, of the University of Pennsylvania, Philadelphia, and colleagues.

Findings were presented in November at the annual meeting of the American Association for the Study of Liver Diseases.

“During the COVID-19 pandemic, a few centers trialed transplanting solid organs from COVID-19 positive donors with promising initial results,” presenting author Roy X. Wang, MD, of the University of Pennsylvania, said in a written comment. “However, these were smaller experiences with short follow-up that were not exclusively focused on liver transplantation. We wanted to explore the safety of liver transplantation from COVID-19 positive donors using a large national dataset with the longest follow up time to date.”

The dataset included 13,096 COVID-negative donors and 299 COVID-positive donors who died between July 2020 and July 2022, with cases and controls matched via propensity scoring. COVID-positive donors were significantly more likely to be younger and to have died of brain death. Beyond age and cause of death, no significant demographic differences were detected.

After 1 year of follow-up, no statistically significant differences in patient survival (subhazard ratio, 1.11; log-rank P = .70) or allograft survival (hazard ratio, 1.44; log-rank P = .14) were detected when comparing livers transplanted from positive versus negative donors.

“Our findings support and expand upon the results from earlier studies,” Dr. Wang concluded. “Liver transplant from COVID-19-positive donors has acceptable short-term outcomes and may represent an opportunity to expand organ access.”

Still, more work is needed to assess other clinical metrics and long-term outcomes, he added.

“While we were able to show similar patient and graft survival post-transplant between COVID-19-positive and negative donors, rates of other complications were not investigated such as episodes of rejection, liver injury, and hospitalizations,” Dr. Wang said. “Due to data limitations, we are only able to report on outcomes up to 1 year post transplant. Additional investigation will be needed to continue monitoring future outcomes and identifying any differences between recipients of COVID-19-positive and negative donors.”

Timucin Taner, MD, PhD, division chair of transplant surgery at Mayo Clinic, Rochester, Minnesota, said the study is important because it reaffirms the majority opinion among transplant physicians: These livers are safe.

In an interview, Dr. Taner suggested that Dr. Wang’s call for longer term data is “mostly science speak,” since 1 year of follow-up should be sufficient to determine liver viability.

Mayo Clinic
Dr. Timucin Taner

“If a liver from a COVID-19 donor behaved well for a year, then chances are it’s not going to behave badly [later on] because of the virus at the time of donation,” Dr. Taner said.

He said the reported trends in usage of COVID-positive livers reflect early hesitancy that waned with rising vaccination rates, and recognition that the virus could not be spread via liver donation.

“To date, the only transmission [of SARS-CoV-2] from a transplant has been from a lung transplant,” Dr. Taner said, “and that was back in the days that we didn’t know about this. Other organs don’t transmit the disease, so they are easily usable.”

These new data should further increase confidence among both health care providers and patients, he added.

“[This study is] reassuring to the patients on the waitlist that these organs are very safe to use,” Dr. Taner said. “We as the transplant society are comfortable using them without any hesitation.”

The investigators and Dr. Taner disclosed no conflicts of interest.

Article Source

AT THE LIVER MEETING

More than one-third of adults in the US could have NAFLD by 2050

Article Type
Changed
Mon, 12/04/2023 - 12:26

More than one out of three adults in the United States could have nonalcoholic fatty liver disease (NAFLD) by 2050, substantially increasing the national clinical burden, according to investigators.

These findings suggest that health care systems should prepare for “large increases” in cases of hepatocellular carcinoma (HCC) and need for liver transplants, reported lead author Phuc Le, PhD, MPH, of the Cleveland Clinic, and colleagues.

Cleveland Clinic
Dr. Phuc Le

“Following the alarming rise in prevalence of obesity and diabetes, NAFLD is projected to become the leading indication for liver transplant in the United States in the next decade,” Dr. Le and colleagues wrote in their abstract for the annual meeting of the American Association for the Study of Liver Diseases. “A better understanding of the clinical burden associated with NAFLD will enable health systems to prepare to meet this imminent demand from patients.”

To this end, Dr. Le and colleagues developed an agent-based state transition model to predict future prevalence of NAFLD and associated outcomes.

In the first part of the model, the investigators simulated population growth in the United States using Census Bureau data, including new births and immigration, from the year 2000 onward. The second part of the model simulated natural progression of NAFLD in adults via 14 associated conditions and events, including steatosis, nonalcoholic steatohepatitis (NASH), HCC, liver transplants, liver-related mortality, and others.

By first comparing simulated findings with actual findings between 2000 and 2018, the investigators confirmed that their model could reliably predict the intended epidemiological parameters.

Next, they turned their model toward the future.

It predicted that the prevalence of NAFLD among US adults will rise from 27.8% in 2020 to 34.3% in 2050. Over the same timeframe, prevalence of NASH is predicted to increase from 20.0% to 21.8%, proportion of NAFLD cases developing cirrhosis is expected to increase from 1.9% to 3.1%, and liver-related mortality is estimated to rise from 0.4% to 1% of all deaths.

The model also predicted that the burden of HCC will increase from 10,400 to 19,300 new cases per year, while liver transplant burden will more than double, from 1,700 to 4,200 transplants per year.

“Our model forecasts substantial clinical burden of NAFLD over the next three decades,” Dr. Le said in a virtual press conference. “And in the absence of effective treatments, health systems should plan for large increases in the number of liver cancer cases and the need for liver transplant.”

During the press conference, Norah Terrault, MD, president of the AASLD from the University of Southern California, Los Angeles, noted that all of the reported outcomes, including increasing rates of liver cancer, cirrhosis, and transplants, are “potentially preventable.”

Keck School of Medicine
Dr. Norah Terrault

Dr. Terrault went on to suggest ways of combating this increasing burden of NAFLD, which she referred to as metabolic dysfunction–associated steatotic liver disease (MASLD), the name now recommended by the AASLD.

“There’s no way we’re going to be able to transplant our way out of this,” Dr. Terrault said. “We need to be bringing greater awareness both to patients, as well as to providers about how we seek out the diagnosis. And we need to bring greater awareness to the population around the things that contribute to MASLD.”

Rates of obesity and diabetes continue to rise, Dr. Terrault said, explaining why MASLD is more common than ever. To counteract these trends, she called for greater awareness of driving factors, such as dietary choices and sedentary lifestyle.

“These are all really important messages that we want to get out to the population, and are really the cornerstones for how we approach the management of patients who have MASLD,” Dr. Terrault said.

In discussion with Dr. Terrault, Dr. Le agreed that increased education may help stem the rising tide of disease, while treatment advances could also increase the odds of a brighter future.

“If we improve our management of NAFLD, or NAFLD-related comorbidities, and if we can develop an effective treatment for NAFLD, then obviously the future would not be so dark,” Dr. Le said, noting promising phase 3 data that would be presented at the meeting. “We are hopeful that the future of disease burden will not be as bad as our model predicts.”

The study was funded by the Agency for Healthcare Research and Quality. The investigators disclosed no conflicts of interest.


Food insecurity increases risk of adolescent MASLD

Article Type
Changed
Mon, 12/04/2023 - 12:21

Adolescents facing food insecurity have a significantly increased risk of metabolic dysfunction-associated steatotic liver disease (MASLD), likely due to overconsumption of low-cost, ultra-processed, unbalanced diets, according to a recent study.

These findings suggest that more work is needed to ensure that eligible adolescents can access Supplemental Nutrition Assistance Program (SNAP) benefits and have opportunities to engage in physical activities through school-associated programs, reported principal investigator Zobair M. Younossi, MD, MPH, professor and chairman of the Beatty Liver and Obesity Research Program, Inova Health System, Falls Church, Virginia, and colleagues.

Dr. Zobair M. Younossi

Dr. Younossi presented the findings in November during a press conference at the annual meeting of the American Association for the Study of Liver Diseases.

“Food insecurity among children is about 10.2% in the United States,” Dr. Younossi said. “[Food insecurity has] been shown to be a risk factor for MASLD among adults, but the data in children and adolescents are really lacking at the moment.”

To address this knowledge gap, Dr. Younossi and colleagues analyzed data from 771 adolescents aged 12-18 years in the National Health and Nutrition Examination Survey (2017-2018). Among these participants, 9.8% reported food insecurity and 10.8% had MASLD. Rates of obesity and central obesity were 22.5% and 45.4%, respectively, while 1.0% had diabetes and 20.9% had prediabetes.

Among adolescents facing food insecurity, more than half (51.5%) did not eat enough food, a vast majority (93.2%) could not access a balanced meal, and almost all (98.9%) relied upon low-cost food for daily sustenance.

The prevalence of MASLD in the food insecure group was almost twice as high as in the food secure group (18.7% vs 9.9%), and advanced fibrosis was about 9 times more common (2.8% vs. 0.3%). Food insecure participants were also more likely to come from a low-income household (70.4% vs. 25.7%) and participate in SNAP (62.4% vs. 25.1%).

Adjusting for SNAP participation, demographic factors, and metabolic disease showed that food insecurity independently increased risk of MASLD by more than twofold (odds ratio [OR], 2.62; 95% CI, 1.07–6.41). The negative effect of food insecurity was almost twice as strong in participants living in a low-income household (OR, 4.79; 95% CI, 1.44–15.86).

“The association between food insecurity and MASLD/NAFLD is most likely the result of not being able to eat a balanced meal and more likely having to purchase low-cost food,” Dr. Younossi said. “Together, these factors may lead to a cycle of overeating along with the overconsumption of ultra-processed foods and sugar-sweetened food and beverages.”

He went on to suggest that more work is needed to remove “systemic and structural barriers” that prevent eligible adolescents from participating in SNAP, while offering support so they can participate in “more physical activity in school and in after-school programs.”

Elliot Benjamin Tapper, MD, associate professor of medicine at the University of Michigan, Ann Arbor, recently published a similar study in the Journal of Clinical Gastroenterology linking food scarcity and MASLD in adults.

Michigan Medicine
Dr. Elliot Benjamin Tapper

In an interview, Dr. Tapper praised this new study by Dr. Younossi and colleagues because it “identifies a serious unmet need” among younger individuals, who may stand to benefit most from early intervention.

“The goal [of screening] is to prevent the development of progressive disease,” Dr. Tapper said. “Our current guidelines for screening for advanced liver disease and people with risk factors focus exclusively on adults. If you waited longer, then there’s a risk that these [younger] people [in the study] would have progressed to a later stage of disease.”

Dr. Tapper predicted increased enthusiasm for MASLD screening among adolescents in response to these findings, but he cautioned that conventional educational intervention is unlikely to yield significant benefit.

“If you’re food insecure, you can’t go out and buy salmon and olive oil to follow the Mediterranean diet,” Dr. Tapper said. “In this era, where the people who are at risk tomorrow are young and food insecure, we have to come up with a way of tailoring our interventions to the means that are available to these patients.”

To this end, health care providers need to collaborate with individuals who have personally dealt with food scarcity to implement practicable interventions.

“Referral to social work has to be paired with some kind of standard teaching,” Dr. Tapper said. “How would I use social and nutritional assistance programs to eat in a liver-healthy way? What can I avoid? [Educational materials] should be written by and edited by people with lived experience; i.e., people who have food insecurity or have walked a mile in those shoes.”

Dr. Younossi disclosed relationships with Merck, Abbott, AstraZeneca, and others. Dr. Tapper disclosed relationships with Takeda, Novo Nordisk, Madrigal, and others.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

Adolescents facing food insecurity have a significantly increased risk of metabolic dysfunction-associated steatotic liver disease (MASLD), likely due to overconsumption of low-cost, ultra-processed, unbalanced diets, according to a recent study.

These findings suggest that more work is needed to ensure that eligible adolescents can access Supplemental Nutrition Assistance Program (SNAP) benefits and have opportunities to engage in physical activities through school-associated programs, reported principal investigator Zobair M. Younossi, MD, MPH, professor and chairman of the Beatty Liver and Obesity Research Program, Inova Health System, Falls Church, Virginia, and colleagues.

Dr. Zobair M. Younossi

Dr. Younossi presented the findings in November during a press conference at the annual meeting of the American Association for the Study of Liver Diseases.

“Food insecurity among children is about 10.2% in the United States,” Dr. Younossi said. “[Food insecurity has] been shown to be a risk factor for MASLD among adults, but the data and children and adolescents are really lacking at the moment.”

To address this knowledge gap, Dr. Younossi and colleagues analyzed data from 771 adolescents aged 12-18 years in the National Health and Nutrition Examination Survey (2017-2018). Among these participants, 9.8% reported food insecurity and 10.8% had MASLD. Rates of obesity and central obesity were 22.5% and 45.4%, respectively, while 1.0% had diabetes and 20.9% had prediabetes.

Among adolescents facing food insecurity, more than half (51.5%) did not eat enough food, a vast majority (93.2%) could not access a balanced meal, and almost all (98.9%) relied upon low-cost food for daily sustenance.

The prevalence of MASLD in the food insecure group was almost twice as high as in the food secure group (18.7% vs 9.9%), and advanced fibrosis was about 9 times more common (2.8% vs. 0.3%). Food insecure participants were also more likely to come from a low-income household (70.4% vs. 25.7%) and participate in SNAP (62.4% vs. 25.1%).

Adjusting for SNAP participation, demographic factors, and metabolic disease showed that food insecurity independently increased risk of MASLD by more than twofold (odds ratio [OR], 2.62; 95% CI, 1.07–6.41). The negative effect of food insecurity was almost twice as strong in participants living in a low-income household (OR, 4.79; 95% CI, 1.44–15.86).

“The association between food insecurity and MASLD/NAFLD is most likely the result of not being able to eat a balanced meal and more likely having to purchase low-cost food,” Dr. Younossi said. “Together, these factors may lead to a cycle of overeating along with the overconsumption of ultra-processed foods and sugar-sweetened food and beverages.”

He went on to suggest that more work is needed to remove “systemic and structural barriers” that prevent eligible adolescents from participating in SNAP, while offering support so they can participate in “more physical activity in school and in after-school programs.”

Elliot Benjamin Tapper, MD, associate professor of medicine at the University of Michigan, Ann Arbor, recently published a similar study in the Journal of Clinical Gastroenterology linking food scarcity and MASLD in adults.

Michigan Medicine
Dr. Elliot Benjamin Tapper

In an interview, Dr. Tapper praised this new study by Dr. Younossi and colleagues because it “identifies a serious unmet need” among younger individuals, who may stand to benefit most from early intervention.

“The goal [of screening] is to prevent the development of progressive disease,” Dr. Tapper said. “Our current guidelines for screening for advanced liver disease and people with risk factors focus exclusively on adults. If you waited longer, then there’s a risk that these [younger] people [in the study] would have progressed to a later stage of disease.”

Dr. Tapper predicted increased enthusiasm for MAFLD screening among adolescents in response to these findings, but he cautioned that conventional educational intervention is unlikely to yield significant benefit.

“If you’re food insecure, you can’t go out and buy salmon and olive oil to follow the Mediterranean diet,” Dr. Tapper said. In this era, where the people who are at risk tomorrow are young and food insecure, we have to come up with a way of tailoring our interventions to the means that are available to these patients.”

To this end, health care providers need to collaborate with individuals who have personally dealt with food scarcity to implement practicable interventions.

“Referral to social work has to be paired with some kind of standard teaching,” Dr. Tapper said. “How would I use social and nutritional assistance programs to eat in a liver-healthy way? What can I avoid? [Educational materials] should be written by and edited by people with lived experience; i.e., people who have food insecurity or have walked a mile in those shoes.”

Dr. Younossi disclosed relationships with Merck, Abbott, AstraZeneca, and others. Dr. Tapper disclosed relationships with Takeda, Novo Nordisk, Madrigal, and others.

Adolescents facing food insecurity have a significantly increased risk of metabolic dysfunction-associated steatotic liver disease (MASLD), likely due to overconsumption of low-cost, ultra-processed, unbalanced diets, according to a recent study.

These findings suggest that more work is needed to ensure that eligible adolescents can access Supplemental Nutrition Assistance Program (SNAP) benefits and have opportunities to engage in physical activities through school-associated programs, reported principal investigator Zobair M. Younossi, MD, MPH, professor and chairman of the Beatty Liver and Obesity Research Program, Inova Health System, Falls Church, Virginia, and colleagues.

Dr. Zobair M. Younossi

Dr. Younossi presented the findings in November during a press conference at the annual meeting of the American Association for the Study of Liver Diseases.

“Food insecurity among children is about 10.2% in the United States,” Dr. Younossi said. “[Food insecurity has] been shown to be a risk factor for MASLD among adults, but the data and children and adolescents are really lacking at the moment.”

To address this knowledge gap, Dr. Younossi and colleagues analyzed data from 771 adolescents aged 12-18 years in the National Health and Nutrition Examination Survey (2017-2018). Among these participants, 9.8% reported food insecurity and 10.8% had MASLD. Rates of obesity and central obesity were 22.5% and 45.4%, respectively, while 1.0% had diabetes and 20.9% had prediabetes.

Among adolescents facing food insecurity, more than half (51.5%) did not eat enough food, a vast majority (93.2%) could not access a balanced meal, and almost all (98.9%) relied upon low-cost food for daily sustenance.

The prevalence of MASLD in the food insecure group was almost twice as high as in the food secure group (18.7% vs 9.9%), and advanced fibrosis was about 9 times more common (2.8% vs. 0.3%). Food insecure participants were also more likely to come from a low-income household (70.4% vs. 25.7%) and participate in SNAP (62.4% vs. 25.1%).

Adjusting for SNAP participation, demographic factors, and metabolic disease showed that food insecurity independently increased risk of MASLD by more than twofold (odds ratio [OR], 2.62; 95% CI, 1.07–6.41). The negative effect of food insecurity was almost twice as strong in participants living in a low-income household (OR, 4.79; 95% CI, 1.44–15.86).

“The association between food insecurity and MASLD/NAFLD is most likely the result of not being able to eat a balanced meal and more likely having to purchase low-cost food,” Dr. Younossi said. “Together, these factors may lead to a cycle of overeating along with the overconsumption of ultra-processed foods and sugar-sweetened food and beverages.”

He went on to suggest that more work is needed to remove “systemic and structural barriers” that prevent eligible adolescents from participating in SNAP, while offering support so they can participate in “more physical activity in school and in after-school programs.”

Elliot Benjamin Tapper, MD, associate professor of medicine at the University of Michigan, Ann Arbor, recently published a similar study in the Journal of Clinical Gastroenterology linking food scarcity and MASLD in adults.

Michigan Medicine
Dr. Elliot Benjamin Tapper

In an interview, Dr. Tapper praised this new study by Dr. Younossi and colleagues because it “identifies a serious unmet need” among younger individuals, who may stand to benefit most from early intervention.

“The goal [of screening] is to prevent the development of progressive disease,” Dr. Tapper said. “Our current guidelines for screening for advanced liver disease and people with risk factors focus exclusively on adults. If you waited longer, then there’s a risk that these [younger] people [in the study] would have progressed to a later stage of disease.”

Dr. Tapper predicted increased enthusiasm for MASLD screening among adolescents in response to these findings, but he cautioned that conventional educational intervention is unlikely to yield significant benefit.

“If you’re food insecure, you can’t go out and buy salmon and olive oil to follow the Mediterranean diet,” Dr. Tapper said. “In this era, where the people who are at risk tomorrow are young and food insecure, we have to come up with a way of tailoring our interventions to the means that are available to these patients.”

To this end, health care providers need to collaborate with individuals who have personally dealt with food scarcity to implement practicable interventions.

“Referral to social work has to be paired with some kind of standard teaching,” Dr. Tapper said. “How would I use social and nutritional assistance programs to eat in a liver-healthy way? What can I avoid? [Educational materials] should be written by and edited by people with lived experience; i.e., people who have food insecurity or have walked a mile in those shoes.”

Dr. Younossi disclosed relationships with Merck, Abbott, AstraZeneca, and others. Dr. Tapper disclosed relationships with Takeda, Novo Nordisk, Madrigal, and others.

AT THE LIVER MEETING

Pancreatic cystic neoplasms rarely turn cancerous, study shows

Article Type
Changed
Tue, 12/05/2023 - 21:37

Individuals with intraductal papillary mucinous neoplasms (IPMNs) that lack “worrisome or high-risk features” have no greater risk of pancreatic cancer than individuals without IPMNs, based on a retrospective cohort study from Mayo Clinic.

These findings, if validated in a larger population, could challenge current surveillance practices for IPMNs, reported researchers who were led by Shounak Majumder, MD, a gastroenterologist in the pancreas clinic at Mayo Clinic, Rochester, Minn. The study was published in JAMA Network Open.

“Among intraductal papillary mucinous neoplasms (IPMNs) that were Fukuoka negative at baseline, fewer than 10% developed worrisome or high-risk features on follow-up. Pancreatic cancer development in IPMN was a rare event overall,” the authors wrote.

Dr. Shounak Majumder

“Current international consensus guidelines for the management of IPMNs recommend image-based surveillance with the aim to detect clinical and imaging features of advanced neoplasia,” the authors wrote. Yet “there are no population-based estimates of the burden of pancreatic cancer in individuals with IPMNs or the proportion of pancreatic cancers that develop from or adjacent to an IPMN.”

Researchers aimed to address this knowledge gap with a population-based cohort study. Drawing data from the Rochester Epidemiology Project, which includes longitudinal medical records from residents of Olmsted County, Minn., investigators identified two cohorts. The first group comprised 2,114 patients 50 years old or older who had undergone abdominal CT scans between 2000 and 2015, among whom 231 (10.9%) had IPMNs. The second cohort included 320 patients diagnosed with pancreatic cancer between 2000 and 2019, among whom 31 (9.8%) had IPMNs.

Further analysis showed that 81% of the patients with IPMNs in the first cohort lacked Fukuoka high-risk or worrisome features. Within this subgroup, the incidence rate of pancreatic cancer per 100 person-years was not significantly different from that among individuals without IPMNs.

“Although the risk of IPMN-PC has been extensively described, our population-based study further demonstrates that most IPMNs did not progress in Fukuoka stage and did not transform into pancreatic cancer; a similar message was expressed by the current American Gastroenterological Association pancreatic cyst guidelines, published in 2015, and studies published in 2022 and 2016,” the investigators wrote.

Analyzing the cohort of 320 patients with pancreatic cancer showed those with IPMNs had significantly better outcomes than those without IPMNs, including longer survival and lower rate of metastatic disease upon diagnosis. These findings align with previous research, the investigators wrote.

In an accompanying editorial, Stefano Crippa, MD, PhD, of Istituto di Ricovero e Cura a Carattere Scientifico San Raffaele Scientific Institute, Milan, and colleagues offered their perspective on the findings.

“Although results of this study should be validated in larger cohorts, they represent useful clinical data from an unselected population-based cohort that helps challenge current IPMN surveillance policies that recommend lifetime active surveillance for all fit individuals,” they wrote. “Currently, we can use follow-up data from studies like this one to identify patients with IPMNs who are not at risk of progression based on clinical-radiological parameters. We can furthermore start selecting subgroups of patients with limited life expectancy due to age or comorbidities to be considered for surveillance discontinuation.”

Timothy Louis Frankel, MD, a gastrointestinal surgeon at the University of Michigan, Ann Arbor, specializing in malignancies, said the findings are most useful for reassuring patients who have been diagnosed with an IPMN.

“The real take-home message is that in the absence of worrisome features people [with an IPMN] should feel comfortable that their risk is no higher than the general population for developing pancreatic cancer,” Dr. Frankel said in an interview.

Before any changes to surveillance can be considered, however, Dr. Frankel echoed the investigators’ call for a larger study, noting the relatively small population, most of whom (92%) were White.

“We do know that pancreas cancer and pancreas diseases vary significantly by race,” Dr. Frankel said. “So we do need to be a little bit cautious about changing the way that we manage patients based on a fairly homogeneous subset.”

He also pointed out that two patients had IPMNs that developed worrisome features over time.

“They actually went from no risk features to having features that put them at risk,” Dr. Frankel said. “Those are patients who were saved by surveillance. So I’m not sure that this study was necessarily designed to let us know if and when we can stop following these lesions.”

Study authors had no relevant disclosures. The editorial writers reported no conflicts of interest.


FROM JAMA NETWORK OPEN

U.S. kids are taking melatonin for sleep, despite evidence gap

Article Type
Changed
Tue, 11/28/2023 - 10:44

Melatonin usage has become increasingly common among children in the United States, with almost one in five kids over the age of 5 having taken the sleep aid in the past 30 days, according to a recent study.

These findings should prompt clinicians to discuss with parents the various factors that could be driving sleep disturbances, and potential safety issues associated with melatonin usage, lead author Lauren E. Hartstein, PhD, a postdoctoral fellow in the Sleep and Development Lab at the University of Colorado, Boulder, and colleagues reported.

Dr. Lauren E. Hartstein

Writing in JAMA Pediatrics, the investigators noted that melatonin products are notorious for mislabeling, with active ingredient quantities as much as three times higher than the labeled amount. This issue is particularly concerning, they added, as calls to poison control for melatonin ingestion jumped more than fivefold from 2012 to 2021, with most cases involving children younger than 5 years. Meanwhile, scant evidence is available to characterize intentional usage in the same population.

“Current data are lacking on the prevalence of melatonin use and the frequency, dosing, and timing of melatonin administration in U.S. youth,” Dr. Hartstein and colleagues wrote.

To address this knowledge gap, the investigators conducted an online survey of parents with children and adolescents aged 1.0-13.9 years. The survey asked parents to report any melatonin usage in their children in the past 30 days.

Parents reporting melatonin usage were asked about frequency, dose, timing of administration before bedtime, and duration of use.

Findings were reported within three age groups: preschool (1-4 years), school aged (5-9 years), and preteen (10-13 years).

The survey revealed that almost one in five children in the older age groups were using melatonin, with a rate of 18.5% in the school-aged group and 19.4% in the preteen group. In comparison, 5.6% of preschool children had received melatonin for sleep in the past 30 days.
 

A significant uptick in usage

These findings point to a significant uptick in usage, according to Dr. Hartstein and colleagues, who cited a 2017-2018 study that found just 1.3% of U.S. children had taken melatonin in the past 30 days.

In the present study, melatonin was typically administered 30 minutes before bedtime, most often as a gummy (64.3%) or chewable tablet (27.0%).

Frequency of administration was similar between age groups and trended toward a bimodal pattern, with melatonin often given either 1 day per week or 7 days per week.

Median dose increased significantly with age, from 0.5 mg in the preschool group to 1.0 mg in the school-aged group and 2.0 mg in the preteen group. Median duration also showed a significant upward trend, with 12-month, 18-month, and 21-month durations, respectively, for ascending age groups.

The investigators concluded that melatonin usage among U.S. adolescents and children is “exceedingly common,” despite a lack of evidence to support long-term safety or guide optimal dosing.
 

Is melatonin use masking other sleep issues?

“Widespread melatonin use across developmental stages may suggest a high prevalence of sleep disruption, which deserves accurate diagnosis and effective treatment,” Dr. Hartstein and colleagues wrote. “Dissemination of information regarding safety concerns, such as overdose and supplement mislabeling, is necessary. Clinicians should discuss with parents the factors associated with sleep difficulties and effective behavioral strategies.”

Large-scale, long-term studies are needed, they added, to generate relevant safety and efficacy data, and to characterize the factors driving melatonin administration by parents.

courtesy UCLA
Dr. Alfonso J. Padilla

“Studies like these add to our knowledge base and give us insight into what patients or parents may be doing that can impact overall health,” said Alfonso J. Padilla, MD, assistant clinical professor of sleep medicine at the University of California, Los Angeles, in a written comment. “Often, in normal encounters with our patients we may not be able to gather this information easily. It may help open conversations about sleep issues that are not being addressed.”

Dr. Padilla suggested that parents may assume melatonin is safe, even though it is not regulated by the Food and Drug Administration, when in fact they could be negatively impacting their children’s sleep. He noted that short-term risks include altered circadian rhythm and vivid dreams or nightmares, while long-term safety remains unclear.

“As a sleep physician, I use melatonin for specific indications only,” Dr. Padilla said. “I may use it in small children that are having difficulty falling asleep, especially in children with autism or special needs. I also use it for help in adjustment in circadian rhythm, especially in adolescents.”

He recommends melatonin, he added, only when he has taken a complete case history and melatonin is suitable for that patient.

Typically, it’s not.

“Most often a medication is not the answer for the sleep concern that parents are having about their child,” he said.

The investigators disclosed grants from the Eunice Kennedy Shriver National Institute of Child Health and Human Development and the Colorado Clinical and Translational Science Award Program of the National Center for Advancing Translational Sciences of the National Institutes of Health. They reported no conflicts of interest.

Publications
Topics
Sections

Melatonin usage has become increasingly common among children in the United States, with almost one in five kids over the age of 5 having taken the sleep aid in the past 30 days, according to a recent study.

These findings should prompt clinicians to discuss with parents the various factors that could be driving sleep disturbances, and potential safety issues associated with melatonin usage, lead author Lauren E. Hartstein, PhD, a postdoctoral fellow in the Sleep and Development Lab at the University of Colorado, Boulder, and colleagues reported.

Dr. Lauren E. Hartstein

Writing in JAMA Pediatrics, the investigators noted that melatonin products are notorious for mislabeling, with active ingredient quantities as much as three times higher than the labeled amount. This issue is particularly concerning, they added, as calls to poison control for melatonin ingestion jumped more than fivefold from 2012 to 2021, with most cases involving children younger than 5 years. Meanwhile, scant evidence is available to characterize intentional usage in the same population.

“Current data are lacking on the prevalence of melatonin use and the frequency, dosing, and timing of melatonin administration in U.S. youth,” Dr. Hartstein and colleagues wrote.

To address this knowledge gap, the investigators conducted an online survey of parents with children and adolescents aged 1.0-13.9 years. The survey asked parents to report any melatonin usage in their children in the past 30 days.

Parents reporting melatonin usage were asked about frequency, dose, timing of administration before bedtime, and duration of use.

Findings were reported within three age groups: preschool (1-4 years), school aged (5-9 years), and preteen (10-13 years).

The survey revealed that almost one in five children in the older age groups were using melatonin, with a rate of 18.5% in the school-aged group and 19.4% in the preteen group. In comparison, 5.6% of preschool children had received melatonin for sleep in the past 30 days.
 

A significant uptick in usage

These findings point to a significant uptick in usage, according to Dr. Hartstein and colleagues, who cited a 2017-2018 study that found just 1.3% of U.S. children had taken melatonin in the past 30 days.

In the present study, melatonin was typically administered 30 minutes before bedtime, most often as a gummy (64.3%) or chewable tablet (27.0%).

Frequency of administration was similar between age groups and trended toward a bimodal pattern, with melatonin often given either 1 day per week or 7 days per week.

Median dose increased significantly with age, from 0.5 mg in the preschool group to 1.0 mg in the school-aged group and 2.0 mg in the preteen group. Median duration also showed a significant upward trend, with 12-month, 18-month, and 21-month durations, respectively, for ascending age groups.

The investigators concluded that melatonin usage among U.S. adolescents and children is “exceedingly common,” despite a lack of evidence to support long-term safety or guide optimal dosing.
 

Is melatonin use masking other sleep issues?

“Widespread melatonin use across developmental stages may suggest a high prevalence of sleep disruption, which deserves accurate diagnosis and effective treatment,” Dr. Hartstein and colleagues wrote. “Dissemination of information regarding safety concerns, such as overdose and supplement mislabeling, is necessary. Clinicians should discuss with parents the factors associated with sleep difficulties and effective behavioral strategies.”

Large-scale, long-term studies are needed, they added, to generate relevant safety and efficacy data, and to characterize the factors driving melatonin administration by parents.

courtesy UCLA
Dr. Alfonso J. Padilla

“Studies like these add to our knowledge base and give us insight into what patients or parents may be doing that can impact overall health,” said Alfonso J. Padilla, MD, assistant clinical professor of sleep medicine at the University of California, Los Angeles, in a written comment. “Often, in normal encounters with our patients we may not be able to gather this information easily. It may help open conversations about sleep issues that are not being addressed.”

Dr. Padilla suggested that parents may believe that melatonin is safe because it is not regulated by the Food and Drug Administration, when in fact they could be negatively impacting their children’s sleep. He noted that short-term risks include altered circadian rhythm and vivid dreams or nightmares, while long-term safety remains unclear.

“As a sleep physician, I use melatonin for specific indications only,” Dr. Padilla said. “I may use it in small children that are having difficulty falling asleep, especially in children with autism or special needs. I also use it for help in adjustment in circadian rhythm, especially in adolescents.”

He recommends melatonin, he added, if he has a complete case history, and melatonin is suitable for that patient.

Typically, it’s not.

“Most often a medication is not the answer for the sleep concern that parents are having about their child,” he said.

The investigators disclosed grants from the Eunice Kennedy Shriver National Institute of Child Health and Human Development and the Colorado Clinical and Translational Science Award Program of the National Center for Advancing Translational Sciences of the National Institutes of Health. They reported no conflicts of interest.

Melatonin usage has become increasingly common among children in the United States, with almost one in five kids over the age of 5 having taken the sleep aid in the past 30 days, according to a recent study.

These findings should prompt clinicians to discuss with parents the various factors that could be driving sleep disturbances, and potential safety issues associated with melatonin usage, lead author Lauren E. Hartstein, PhD, a postdoctoral fellow in the Sleep and Development Lab at the University of Colorado, Boulder, and colleagues reported.

Dr. Lauren E. Hartstein

Writing in JAMA Pediatrics, the investigators noted that melatonin products are notorious for mislabeling, with active ingredient quantities as much as three times higher than the labeled amount. This issue is particularly concerning, they added, as calls to poison control for melatonin ingestion jumped more than fivefold from 2012 to 2021, with most cases involving children younger than 5 years. Meanwhile, scant evidence is available to characterize intentional usage in the same population.

“Current data are lacking on the prevalence of melatonin use and the frequency, dosing, and timing of melatonin administration in U.S. youth,” Dr. Hartstein and colleagues wrote.

To address this knowledge gap, the investigators conducted an online survey of parents with children and adolescents aged 1.0-13.9 years. The survey asked parents to report any melatonin usage in their children in the past 30 days.

Parents reporting melatonin usage were asked about frequency, dose, timing of administration before bedtime, and duration of use.

Findings were reported within three age groups: preschool (1-4 years), school aged (5-9 years), and preteen (10-13 years).

The survey revealed that almost one in five children in the older age groups were using melatonin, with a rate of 18.5% in the school-aged group and 19.4% in the preteen group. In comparison, 5.6% of preschool children had received melatonin for sleep in the past 30 days.
 

A significant uptick in usage

These findings point to a significant uptick in usage, according to Dr. Hartstein and colleagues, who cited a 2017-2018 study that found just 1.3% of U.S. children had taken melatonin in the past 30 days.

In the present study, melatonin was typically administered 30 minutes before bedtime, most often as a gummy (64.3%) or chewable tablet (27.0%).

Frequency of administration was similar between age groups and trended toward a bimodal pattern, with melatonin often given either 1 day per week or 7 days per week.

Median dose increased significantly with age, from 0.5 mg in the preschool group to 1.0 mg in the school-aged group and 2.0 mg in the preteen group. Median duration also showed a significant upward trend, with 12-month, 18-month, and 21-month durations, respectively, for ascending age groups.

The investigators concluded that melatonin usage among U.S. adolescents and children is “exceedingly common,” despite a lack of evidence to support long-term safety or guide optimal dosing.
 

Is melatonin use masking other sleep issues?

“Widespread melatonin use across developmental stages may suggest a high prevalence of sleep disruption, which deserves accurate diagnosis and effective treatment,” Dr. Hartstein and colleagues wrote. “Dissemination of information regarding safety concerns, such as overdose and supplement mislabeling, is necessary. Clinicians should discuss with parents the factors associated with sleep difficulties and effective behavioral strategies.”

Large-scale, long-term studies are needed, they added, to generate relevant safety and efficacy data, and to characterize the factors driving melatonin administration by parents.

courtesy UCLA
Dr. Alfonso J. Padilla

“Studies like these add to our knowledge base and give us insight into what patients or parents may be doing that can impact overall health,” said Alfonso J. Padilla, MD, assistant clinical professor of sleep medicine at the University of California, Los Angeles, in a written comment. “Often, in normal encounters with our patients we may not be able to gather this information easily. It may help open conversations about sleep issues that are not being addressed.”

Dr. Padilla suggested that parents may assume melatonin is safe, not realizing that it is not regulated by the Food and Drug Administration and that they could be negatively impacting their children’s sleep. He noted that short-term risks include altered circadian rhythm and vivid dreams or nightmares, while long-term safety remains unclear.

“As a sleep physician, I use melatonin for specific indications only,” Dr. Padilla said. “I may use it in small children that are having difficulty falling asleep, especially in children with autism or special needs. I also use it for help in adjustment in circadian rhythm, especially in adolescents.”

He recommends melatonin, he added, only when he has a complete case history and melatonin is suitable for that patient.

Typically, it’s not.

“Most often a medication is not the answer for the sleep concern that parents are having about their child,” he said.

The investigators disclosed grants from the Eunice Kennedy Shriver National Institute of Child Health and Human Development and the Colorado Clinical and Translational Science Award Program of the National Center for Advancing Translational Sciences of the National Institutes of Health. They reported no conflicts of interest.

Publications
Publications
Topics
Article Type
Sections
Article Source

FROM JAMA PEDIATRICS

Disallow All Ads
Content Gating
No Gating (article Unlocked/Free)
Alternative CME
Disqus Comments
Default
Use ProPublica
Hide sidebar & use full width
render the right sidebar.
Conference Recap Checkbox
Not Conference Recap
Clinical Edge
Display the Slideshow in this Article
Medscape Article
Display survey writer
Reuters content
Disable Inline Native ads
WebMD Article

Liver-resident T cells provide early protection against Listeria infection

Article Type
Changed
Mon, 11/13/2023 - 09:50

Liver-resident gamma delta T cells that produce interleukin (IL)-17 coordinate with hepatic macrophages to offer early protection against Listeria monocytogenes infection, according to investigators.

These findings suggest that gamma delta T17 cells could be a target for novel cell-based therapies against liver diseases, reported lead author Yanan Wang, PhD, of Shandong University, Jinan, China, and colleagues.

“Gamma delta T cells are located in mucosal tissues and other peripheral lymphoid tissues and are considered to act as the first line of defense within the immune system,” the investigators wrote in Cellular and Molecular Gastroenterology and Hepatology. “Several studies have reported that IL-17A produced by gamma delta T cells plays a critical role in host defense after Listeria monocytogenes [infection] in the liver. However, in those studies, the details of the phenotypes, dynamic changes, proliferation activity, and cytokine production of the responding gamma delta T cell populations in the overall process of hepatic infection are unclear, and how they accumulated into the infection sites has not been elucidated.”

To address this knowledge gap, Dr. Wang and colleagues conducted a series of experiments involving gamma delta T cells from murine liver samples.

First, using single-cell RNA-sequencing (scRNA-seq), the investigators identified six clusters of hepatic gamma delta T cells.

“[This first step] revealed the unique gene expression characteristics and indicated the possible important roles in immune responses of hepatic gamma delta T17 cells,” they noted.

Next, the investigators measured expression of CD44 and CD27 in liver gamma delta cells.

“Expression of CD44 and CD27 has been used to distinguish IL-17A–, interferon gamma–producing, and other subsets of gamma delta T cells in the thymus, lymph nodes, lungs, and other peripheral lymphoid tissues,” they wrote.

These efforts revealed three subsets of hepatic gamma delta T cells, of which CD44hiCD27– gamma delta T cells were most abundant. Further analysis revealed expression profiles consistent with liver residency.

The next phases of the study characterized the immune roles of hepatic gamma delta T cells.

A comparison of Listeria monocytogenes infection in wild-type versus T-cell antigen receptor knockout mice, for example, showed that knockout mice had significantly greater weight loss, higher bacterial load in the liver, and shorter survival times than wild-type mice.

“As expected, the proportion and absolute numbers of gamma delta T cells in the liver of wild-type mice increased at day 3 and reached a peak at day 7 after infection,” the investigators wrote. “These data suggested that hepatic gamma delta T cells proliferated after infection and contributed to Lm clearance.”

Parabiosis experiments showed that the increase in CD44hiCD27– gamma delta T cells in the livers of Listeria monocytogenes–infected mice was due to migration and proliferation of liver-resident gamma delta T cells rather than recruitment of circulating gamma delta T cells. A transwell assay revealed that Kupffer cells and monocyte-derived macrophages promoted migration of CD44hiCD27– gamma delta T cells upon infection.

“Our study provides additional insight into liver-resident lymphocytes and will aid in targeting such tissue-resident lymphocyte populations to promote local immune surveillance,” the investigators concluded.

The study was supported by grants from the National Natural Science Foundation of China and the Shandong Provincial Natural Science Foundation. The investigators disclosed no conflicts of interest.



FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY


AGA publishes CPU for AI in colon polyp diagnosis and management

Article Type
Changed
Fri, 11/10/2023 - 09:07

The American Gastroenterological Association has published a Clinical Practice Update (CPU) on artificial intelligence (AI) for diagnosing and managing colorectal polyps.

The CPU, authored by Jason Samarasena, MD, of UCI Health, Orange, Calif., and colleagues, draws on recent studies and clinical experience to discuss ways that AI is already reshaping colonoscopy, and what opportunities may lie ahead.

Dr. Jason Samarasena

“As with any emerging technology, there are important questions and challenges that need to be addressed to ensure that AI tools are introduced safely and effectively into clinical endoscopic practice,” they wrote in Gastroenterology.

With advances in processing speed and deep-learning technology, AI “computer vision” can now analyze live video of a colonoscopy in progress, enabling computer-aided detection (CADe) and computer-aided diagnosis (CADx), which the panelists described as the two most important developments in the area.
 

CADe

“In the last several years, numerous prospective, multicenter studies have found that real-time use of AI CADe tools during colonoscopy leads to improvements in adenoma detection and other related performance metrics,” Dr. Samarasena and colleagues wrote.

CADe has yielded mixed success in real-world practice, however, with some studies reporting worse detection metrics after implementing the new technology. Dr. Samarasena and colleagues offered a variety of possible explanations for these findings, including a “ceiling effect” among highly adept endoscopists, reduced operator vigilance caused by false confidence in the technology, and potential confounding inherent to unblinded trials.

CADe may also increase health care costs and burden, they suggested, as the technology tends to catch small benign polyps, prompting unnecessary resections and shortened colonoscopy surveillance intervals.
 

CADx

These unintended consequences of CADe may be counteracted by CADx, which uses computer vision to predict which lesions have benign histology, enabling “resect-and-discard” or “diagnose-and-leave” strategies.

Such approaches could significantly reduce rates of polypectomy and/or histopathology, saving an estimated $33 million–$150 million per year, according to the update.

Results of real-time CADx clinical trials have been “encouraging,” Dr. Samarasena and colleagues wrote, noting that emerging technology compatible with white-light endoscopy can achieve a negative predictive value of almost 98% for lesions less than 5 mm in diameter, potentially reducing the polypectomy rate by almost half.

“Increasing endoscopist confidence in optical diagnosis may be an important step toward broader implementation of leave in situ and resect-and-discard strategies, but successful implementation will also require CADx tools that seamlessly integrate the endoscopic work flow, without the need for image enhancement or magnification,” the panelists wrote.

Reimbursement models may also need to be reworked, they suggested, as many GI practices depend on a steady stream of revenue from pathology services.
 

Computer-aided quality assessment systems

Beyond optical detection and diagnosis, AI tools are also being developed to improve colonoscopy technique.

Investigators are studying quality assessment systems that use AI to offer feedback on a range of endoscopist skills, including colonic-fold evaluation, level of mucosal exposure, and withdrawal time, the latter of which is visualized by a “speedometer” that “paints” the mucosa with “a graphical representation of the colon.”

“In the future, these types of AI-based systems may support trainees and lower-performing endoscopists to reduce exposure errors and, more broadly, may empower physician practices and hospital systems with more nuanced and actionable data on an array of factors that contribute to colonoscopy quality,” the panelists wrote.
 

Looking ahead

Dr. Samarasena and colleagues concluded by suggesting that the AI tools now in use and in development are just the beginning of a wave of technology that will revolutionize how colonoscopies are performed.

“Eventually, we predict an AI suite of tools for colonoscopy will seem indispensable, as a powerful adjunct to support safe and efficient clinical practice,” they wrote. “As technological innovation progresses, we can expect that the future for AI in endoscopy will be a hybrid model, where the unique capabilities of physicians and our AI tools will be seamlessly intertwined to optimize patient care.”

This CPU was commissioned and approved by the AGA Institute Clinical Practice Updates Committee and the AGA Governing Board. The investigators disclosed relationships with Olympus, Neptune Medical, Conmed, and others.



FROM GASTROENTEROLOGY
