Lasting norovirus immunity may depend on T cells
Protection against norovirus gastroenteritis is supported in part by norovirus-specific CD8+ T cells that reside in peripheral, intestinal, and lymphoid tissues, according to investigators.
These findings, and the molecular tools used to discover them, could guide development of a norovirus vaccine and novel cellular therapies, according to lead author Ajinkya Pattekar, MD, of the University of Pennsylvania, Philadelphia, and colleagues.
“Currently, there are no approved pharmacologic therapies against norovirus, and despite several promising clinical trials, an effective vaccine is not available,” the investigators wrote in Cellular and Molecular Gastroenterology and Hepatology. This gap may stem from an incomplete understanding of norovirus immunity, they suggested.
They noted that most previous research has focused on humoral immunity, which appears to vary between individuals: some people mount a strong antibody response, while others achieve only partial protection. Depending on which studies are examined, this type of defense may last years or fade within weeks to months, and the investigators noted that “immune mechanisms other than antibodies may be important for protection against noroviruses.”
Specifically, cellular immunity may be at work. A 2020 study involving volunteers showed that T cells were cross-reactive to a type of norovirus the participants had never been exposed to.
“These findings suggest that T cells may target conserved epitopes and could offer cross-protection against a broad range of noroviruses,” Dr. Pattekar and colleagues wrote.
To test this hypothesis, they first collected peripheral blood mononuclear cells (PBMCs) from three healthy volunteers with unknown norovirus exposure history. Serum samples were then screened for functional norovirus antibodies by testing their ability to block binding between virus-like particles (VLPs) and histo–blood group antigens (HBGAs). This revealed disparate profiles of blocking antibodies against various norovirus strains: donors 1 and 2 had antibodies against multiple strains, whereas donor 3 lacked norovirus antibodies entirely. Further testing showed that this latter individual was a nonsecretor with limited exposure history.
Next, the investigators tested donor PBMCs for norovirus-specific T-cell responses with use of overlapping libraries of peptides for each of the three norovirus open reading frames (ORF1, ORF2, and ORF3). T-cell responses, predominantly involving CD8+ T cells, were observed in all donors. While donor 1 had the greatest response to ORF1, donors 2 and 3 had responses that focused on ORF2.
“Thus, norovirus-specific T cells targeting ORF1 and ORF2 epitopes are present in peripheral blood from healthy donors regardless of secretor status,” the investigators wrote.
To better characterize T-cell epitopes, the investigators subdivided the overlapping peptide libraries into smaller pools of shorter peptides, then exposed donor PBMCs to these component pools. This revealed eight HLA class I–restricted epitopes derived from a genogroup II.4 (GII.4) pandemic norovirus strain; this group of variants has been responsible for all six norovirus pandemics since 1996.
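For readers unfamiliar with this mapping approach, the sketch below illustrates how an overlapping peptide library can be generated from a viral protein and subdivided into smaller pools for screening. It is only a rough illustration under generic assumptions; the peptide length, overlap, pool size, and protein fragment shown here are hypothetical and are not the parameters used in the study.

```python
# Rough sketch: tile a protein into overlapping peptides and split the
# library into smaller pools for T-cell epitope mapping. All parameters
# (15-mers, 11-residue overlap, pools of 10) are illustrative only.

def overlapping_peptides(protein_seq, length=15, step=4):
    """Tile the protein sequence into overlapping peptides."""
    return [protein_seq[i:i + length]
            for i in range(0, max(len(protein_seq) - length + 1, 1), step)]

def split_into_pools(peptides, pool_size=10):
    """Group peptides into smaller pools that can be tested separately."""
    return [peptides[i:i + pool_size] for i in range(0, len(peptides), pool_size)]

# Hypothetical protein fragment standing in for an ORF2 (capsid) sequence.
orf2_fragment = "MKMASNDAAPSTDGAAGLVPESNNEVMALEPVAGAALAAPVTGQNNIIDPWIRANFVQAP"
library = overlapping_peptides(orf2_fragment)
pools = split_into_pools(library)
print(f"{len(library)} peptides split into {len(pools)} pools")
# Pools that elicit a CD8+ T-cell response (e.g., by interferon-gamma ELISpot)
# would then be deconvoluted peptide by peptide to pinpoint the minimal epitope.
```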
Closer examination of these eight epitopes showed that they were “broadly conserved beyond GII.4.” Only one epitope, which was nondominant, exhibited variation in the C-terminal aromatic anchor. The investigators therefore identified seven immunodominant CD8+ epitopes, which they considered “valuable targets for vaccine and cell-based therapies.”
“These data further confirm that epitope-specific CD8+ T cells are a universal feature of the overall norovirus immune response and could be an attractive target for future vaccines,” the investigators wrote.
Additional testing involving samples of spleen, mesenteric lymph nodes, and duodenum from deceased individuals showed the presence of norovirus-specific CD8+ T cells, which were particularly abundant in intestinal tissue and displayed distinct phenotypes and functional properties in different tissue types.
“Future studies using tetramers and intestinal samples should build on these observations and fully define the location and microenvironment of norovirus-specific T cells,” the investigators wrote. “If carried out in the context of a vaccine trial, such studies could be highly valuable in elucidating tissue-resident memory correlates of norovirus immunity.”
The study was funded by the National Institutes of Health, the Wellcome Trust, and Deutsche Forschungsgemeinschaft. The investigators reported no conflicts of interest.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
Pediatric NAFLD almost always stems from excess body weight, not other etiologies
Nonalcoholic fatty liver disease (NAFLD) in children is almost always caused by excess body weight, not other etiologies, based on a retrospective analysis of 900 patients.
Just 2% of children with overweight or obesity and suspected NAFLD had other causes of liver disease, and none tested positive for autoimmune hepatitis (AIH), reported lead author Toshifumi Yodoshi, MD, PhD, of Cincinnati Children’s Hospital Medical Center, and colleagues.
“Currently, recommended testing of patients with suspected NAFLD includes ruling out the following conditions: AIH, Wilson disease, hemochromatosis, alpha-1 antitrypsin [A1AT] deficiency, viral hepatitis, celiac disease, and thyroid dysfunction,” the investigators wrote in Pediatrics.
Yet evidence supporting this particular battery of tests is scant; just one previous pediatric study has estimated the prevalence of other liver diseases among children with suspected NAFLD. The study showed that the second-most common etiology, after NAFLD, was AIH, at a rate of 4%.
But “the generalizability of these findings is uncertain,” noted Dr. Yodoshi and colleagues, as the study was conducted at one tertiary center in the western United States, among a population that was predominantly Hispanic.
This uncertainty spurred the present study, which was conducted at two pediatric centers: Cincinnati Children’s Hospital Medical Center (2009-2017) and Yale New Haven (Conn.) Children’s Hospital (2012-2017).
The final analysis involved 900 patients aged 18 years or younger with suspected NAFLD based on hepatic steatosis detected via imaging and/or elevated serum aminotransferases. Demographically, a slight majority of the patients were boys (63%), and approximately one-quarter (26%) were Hispanic. Median BMI z score was 2.45, with three out of four patients (76%) exhibiting severe obesity. Out of 900 patients, 358 (40%) underwent liver biopsy, among whom 46% had confirmed nonalcoholic steatohepatitis.
All patients underwent testing to exclude the aforementioned conditions using various diagnostics, revealing that just 2% of the population had etiologies other than NAFLD. Specifically, 11 children had thyroid dysfunction (1.2%), 3 had celiac disease (0.4%), 3 had A1AT deficiency (0.4%), 1 had hemophagocytic lymphohistiocytosis, and 1 had Hodgkin’s lymphoma. None of the children had Wilson disease, hepatitis B or C, or AIH.
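For context, summing the listed diagnoses shows how the overall 2% figure arises; this is back-of-the-envelope arithmetic based on the counts above, not a calculation reported by the authors:

\[
\frac{11 + 3 + 3 + 1 + 1}{900} \;=\; \frac{19}{900} \;\approx\; 2.1\%.
\]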
Dr. Yodoshi and colleagues highlighted the absence of AIH, noting that 13% of the patients had autoantibodies associated with AIH, but “none met composite criteria.” This contrasts with the previous study from 2013, which found an AIH rate of 4%.
“Nonetheless,” the investigators went on, “NAFLD remains a diagnosis of exclusion, and key conditions that require specific treatments must be ruled out in the workup of patients with suspected NAFLD. In the future, the cost-effectiveness of this approach will need to be investigated.”
Interpreting the findings, Francis E. Rushton, MD, of Beaufort (S.C.) Memorial Hospital, emphasized the implications for preventive and interventional health care.
“This study showing an absence of etiologies other than obesity in overweight children with NAFLD provides further impetus for pediatricians to work on both preventive and treatment regimens for weight issues,” Dr. Rushton said. “Linking community-based initiatives focused on adequate nutritional support with pediatric clinical support services is critical in solving issues related to overweight in children. Tracking BMI over time and developing healthy habit goals for patients are key parts of clinical interventions.”
The study was funded by the National Institutes of Health. The investigators reported no conflicts of interest.
FROM PEDIATRICS
Maternal caffeine consumption, even small amounts, may reduce neonatal size
For pregnant women, just half a cup of coffee a day may reduce neonatal birth size and body weight, according to a prospective study involving more than 2,000 women.
That’s only about 50 mg of caffeine a day, which falls below the upper threshold of 200 mg set by the American College of Obstetricians and Gynecologists, lead author Jessica Gleason, PhD, MPH, of the Eunice Kennedy Shriver National Institute of Child Health and Human Development, Bethesda, Md., and colleagues reported.
“Systematic reviews and meta-analyses have reported that maternal caffeine consumption, even in doses lower than 200 mg, is associated with a higher risk for low birth weight, small for gestational age (SGA), and fetal growth restriction, suggesting there may be no safe amount of caffeine during pregnancy,” the investigators wrote in JAMA Network Open.
Findings to date have been inconsistent, with a 2014 meta-analysis reporting contrary or null results in four out of nine studies.
Dr. Gleason and colleagues suggested that such discrepancies may stem from uncontrolled confounding in some studies, such as smoking, as well as the limitations of self-reported intake, which fails to capture variations in caffeine content between beverages and differences in caffeine metabolism between individuals.
“To our knowledge, no studies have examined the association between caffeine intake and neonatal anthropometric measures beyond weight, length, and head circumference, and few have analyzed plasma concentrations of caffeine and its metabolites or genetic variations in the rate of metabolism associated with neonatal size,” the investigators wrote.
Dr. Gleason and colleagues set out to address this knowledge gap with a prospective cohort study including 2,055 nonsmoking women at low risk of birth defects who presented at 12 centers between 2009 and 2013. Mean participant age was 28.3 years, and mean body mass index was 23.6. Races and ethnicities were represented almost evenly across four groups: Hispanic (28.2%), White (27.4%), Black (25.2%), and Asian/Pacific Islander (19.2%). Rate of caffeine metabolism was defined by the single-nucleotide variant rs762551 (CYP1A2*1F), according to which slightly more women had slow metabolism (52.7%) than fast metabolism (47.3%).
Women were enrolled at 8-13 weeks’ gestational age, at which time they underwent interviews and blood draws, allowing for measurement of caffeine and paraxanthine plasma levels, as well as self-reported caffeine consumption during the preceding week.
Over the course of six visits, fetal growth was observed via ultrasound. Medical records were used to determine birth weights and neonatal anthropometric measures, including fat and skin fold mass, body length, and circumferences of the thigh, arm, abdomen, and head.
Neonatal measurements were compared with plasma levels of caffeine and paraxanthine, both continuously and as quartiles (Q1, ≤ 28.3 ng/mL; Q2, 28.4-157.1 ng/mL; Q3, 157.2-658.8 ng/mL; Q4, > 658.8 ng/mL). Comparisons were also made with self-reported caffeine intake.
Women who reported drinking 1-50 mg of caffeine per day had neonates with smaller subscapular skin folds (beta = –0.14 mm; 95% confidence interval, –0.27 to –0.01 mm), while those who reported more than 50 mg per day had newborns with lower birth weight (beta = –66 g; 95% CI, –121 to –10 g), smaller mid-upper thigh circumference (beta = –0.32 cm; 95% CI, –0.55 to –0.09 cm), smaller anterior thigh skin fold (beta = –0.24 mm; 95% CI, –0.47 to –0.01 mm), and smaller mid-upper arm circumference (beta = –0.17 cm; 95% CI, –0.31 to –0.02 cm).
Caffeine plasma concentrations supported these findings.
Compared with women who had caffeine plasma concentrations in the lowest quartile, those in the highest quartile gave birth to neonates with shorter length (beta = –0.44 cm; P = .04 for trend) and lower body weight (beta = –84.3 g; P = .04 for trend), as well as smaller mid-upper arm circumference (beta = –0.25 cm; P = .02 for trend), mid-upper thigh circumference (beta = –0.29 cm; P = .07 for trend), and head circumference (beta = –0.28 cm; P < .001 for trend). A comparison of lower and upper paraxanthine quartiles revealed similar trends, as did analyses of continuous measures.
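Although the exact model specification is not given in this summary, betas of this kind typically come from an adjusted linear regression of roughly the following form, in which each coefficient represents the mean difference in a neonatal measure associated with the exposure contrast (for example, highest vs. lowest plasma caffeine quartile) after holding covariates constant; the notation below is ours, not the authors’:

\[
\text{measure}_i \;=\; \beta_0 \;+\; \beta_1\,\text{caffeine}_i \;+\; \boldsymbol{\gamma}^{\top}\mathbf{z}_i \;+\; \varepsilon_i .
\]

Under that reading, beta = –84.3 g for the highest versus lowest quartile corresponds to an average birth weight roughly 84 g lower after adjustment.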
“Our results suggest that caffeine consumption during pregnancy, even at levels much lower than the recommended 200 mg per day of caffeine, may be associated with decreased fetal growth,” the investigators concluded.
Sarah W. Prager, MD, of the University of Washington, Seattle, suggested that the findings “do not demonstrate that caffeine has a clinically meaningful negative clinical impact on newborn size and weight.”
She noted that there was no difference in the rate of SGA between plasma caffeine quartiles, and that most patients were thin, which may not accurately represent the U.S. population.
“Based on these new data, my take home message to patients would be that increasing amounts of caffeine can have a small but real impact on the size of their baby at birth, though it is unlikely to result in a diagnosis of SGA,” she said. “Pregnant patients may want to limit caffeine intake even more than the ACOG recommendation of 200 mg per day.”
According to Robert M. Silver, MD, of the University of Utah Health Sciences Center, Salt Lake City, “data from this study are of high quality, owing to the prospective cohort design, large numbers, assessment of biomarkers, and sophisticated analyses.”
Still, he urged a cautious interpretation from a clinical perspective.
“It is important to not overreact to these data,” he said. “The decrease in fetal growth associated with caffeine is small and may prove to be clinically meaningless. Accordingly, clinical recommendations regarding caffeine intake during pregnancy should not be modified solely based on this study.”
Dr. Silver suggested that the findings deserve additional investigation.
“These observations warrant further research about the effects of caffeine exposure during pregnancy,” he said. “Ideally, studies should assess the effect of caffeine exposure on fetal growth in various pregnancy epochs as well as on neonatal and childhood growth.”
The study was funded by the Intramural Research Program of the NICHD. Dr. Gerlanc is an employee of The Prospective Group, which was contracted to provide statistical support.
FROM JAMA NETWORK OPEN
Preterm infant supine sleep positioning becoming more common, but racial/ethnic disparities remain
Although supine sleep positioning of preterm infants is becoming more common, racial disparities remain, according to a retrospective analysis involving more than 66,000 mothers.
Non-Hispanic Black preterm infants were 39%-56% less likely to sleep on their backs than were non-Hispanic White preterm infants, reported lead author Sunah S. Hwang, MD, MPH, of the University of Colorado, Aurora, and colleagues.
According to the investigators, these findings may explain, in part, why the risk of sudden unexpected infant death (SUID) is more than twofold higher among non-Hispanic Black preterm infants than non-Hispanic White preterm infants.
“During the first year of life, one of the most effective and modifiable parental behaviors that may reduce the risk for SUID is adhering to safe infant sleep practices, including supine sleep positioning or back-sleeping,” wrote Dr. Hwang and colleagues. The report is in the Journal of Pediatrics. “For the healthy-term population, research on the racial/ethnic disparity in adherence to safe sleep practices is robust, but for preterm infants who are at much higher risk for SUID, less is known.”
To address this knowledge gap, the investigators conducted a retrospective study using data from the Pregnancy Risk Assessment Monitoring System (PRAMS), a population-based perinatal surveillance system. The final dataset involved 66,131 mothers who gave birth to preterm infants in 16 states between 2000 and 2015. The sample size was weighted to 1,020,986 mothers.
The investigators evaluated annual marginal prevalence of supine sleep positioning in two cohorts: early preterm infants (gestational age less than 34 weeks) and late preterm infants (gestational age 34-36 weeks). The primary outcome was supine sleep positioning, defined as consistently placing the infant on his or her back to sleep, to the exclusion of other positions (i.e., prone or side). Mothers were grouped by race/ethnicity into four categories: non-Hispanic Black, non-Hispanic White, Hispanic, and other. Several other maternal and infant characteristics were recorded, including marital status, maternal age, education, insurance prior to birth, history of previous live birth, method of delivery, birth weight, and sex.
From 2000 to 2015, the overall adjusted odds of supine sleep positioning increased by 8.5% in the early preterm group and 5.2% in the late preterm group. This intergroup difference may be due to disparate levels of in-hospital education, the investigators suggested.
“Perhaps the longer NICU hospitalization for early preterm infants compared with late preterm infants affords greater opportunities for parental education and engagement about safe sleep practices,” they wrote.
Among early preterm infants, the adjusted odds increased by 7.3%, 7.7%, and 10.0% for non-Hispanic Black, Hispanic, and non-Hispanic White mothers, respectively. For late preterm infants, the corresponding increases were 5.9%, 4.8%, and 5.8%.
Despite these improvements, racial disparities were still observed. Non-Hispanic Black mothers reported lower rates of supine sleep positioning than non-Hispanic White mothers for both early preterm infants (odds ratio [OR], 0.61; P < .0001) and late preterm infants (OR, 0.44; P < .0001).
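These odds ratios are the likely source of the 39%-56% range cited earlier; treating each odds ratio as an approximate measure of relative likelihood (a simplification on our part), the arithmetic is:

\[
(1 - 0.61)\times 100\% = 39\%, \qquad (1 - 0.44)\times 100\% = 56\%.
\]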
These disparities seem “to be in line with racial/ethnic disparity trends in infant mortality and in SUID rates that have persisted for decades among infants,” the investigators wrote.
To a lesser degree, and without reaching statistical significance, Hispanic mothers reported lower odds of supine sleep positioning than non-Hispanic White mothers for both early preterm infants (OR, 0.80; P = .1670) and late preterm infants (OR, 0.81; P = .1054).
According to Dr. Hwang and colleagues, more specific demographic data are needed to accurately describe supine sleep positioning rates among Hispanic mothers, partly because of the heterogeneity of this cohort.
“A large body of literature has shown significant variability by immigrant status and country of origin in several infant health outcomes among the Hispanic population,” the investigators wrote. “This study was unable to stratify the Hispanic cohort by these characteristics and thus the distribution of supine sleep positioning prevalence across different Hispanic subgroups could not be demonstrated in this study.”
The investigators also suggested that interventional studies are needed.
“Additional efforts to understand the barriers and facilitators to SSP [supine sleep positioning] adherence among all preterm infant caregivers, particularly non-Hispanic Black and Hispanic parents, are needed so that novel interventions can then be developed,” they wrote.
According to Denice Cora-Bramble, MD, MBA, chief diversity officer at Children’s National Hospital and professor of pediatrics at George Washington University, Washington, the observed improvements in supine sleep positioning may predict lower rates of infant mortality, but more work in the area is needed.
“In spite of improvement in infants’ supine sleep positioning during the study period, racial/ethnic disparities persisted among non-Hispanic Blacks and Hispanics,” Dr. Cora-Bramble said. “That there was improvement among the populations included in the study is significant because of the associated and expected decrease in infant mortality. However, the study results need to be evaluated within the context of [the study’s] limitations, such as the inclusion of only sixteen states in the data analysis. More research is needed to understand and effectively address the disparities highlighted in the study.”
The investigators and Dr. Cora-Bramble reported no conflicts of interest.
Although supine sleep positioning of preterm infants is becoming more common, racial disparities remain, according to a retrospective analysis involving more than 66,000 mothers.
Non-Hispanic Black preterm infants were 39%-56% less likely to sleep on their backs than were non-Hispanic White preterm infants, reported lead author Sunah S. Hwang, MD, MPH, of the University Colorado, Aurora, and colleagues.
According to the investigators, these findings may explain, in part, why the risk of sudden unexpected infant death (SUID) is more than twofold higher among non-Hispanic Black preterm infants than non-Hispanic White preterm infants.
“During the first year of life, one of the most effective and modifiable parental behaviors that may reduce the risk for SUID is adhering to safe infant sleep practices, including supine sleep positioning or back-sleeping,” wrote Dr. Hwang and colleagues. The report is in the Journal of Pediatrics. “For the healthy-term population, research on the racial/ethnic disparity in adherence to safe sleep practices is robust, but for preterm infants who are at much higher risk for SUID, less is known.”
To address this knowledge gap, the investigators conducted a retrospective study using data from the Pregnancy Risk Assessment Monitoring System (PRAMS), a population-based perinatal surveillance system. The final dataset involved 66,131 mothers who gave birth to preterm infants in 16 states between 2000 and 2015. The sample size was weighted to 1,020,986 mothers.
The investigators evaluated annual marginal prevalence of supine sleep positioning among two cohorts: early preterm infants (gestational age less than 34 weeks) and late preterm infants (gestational age 34-36 weeks). The primary outcome was rate of supine sleep positioning, a practice that must have been followed consistently, excluding other positions (i.e. prone or side). Mothers were grouped by race/ethnicity into four categories: non-Hispanic Black, non-Hispanic White, Hispanic, and other. Several other maternal and infant characteristics were recorded, including marital status, maternal age, education, insurance prior to birth, history of previous live birth, insurance, method of delivery, birth weight, and sex.
From 2000 to 2015, the overall adjusted odds of supine sleep positioning increased by 8.5% in the early preterm group and 5.2% in the late preterm group. This intergroup difference may be due to disparate levels of in-hospital education, the investigators suggested.
“Perhaps the longer NICU hospitalization for early preterm infants compared with late preterm infants affords greater opportunities for parental education and engagement about safe sleep practices,” they wrote.
Among early preterm infants, odds percentages increased by 7.3%, 7.7%, and 10.0% for non-Hispanic Black, Hispanic, and non-Hispanic White mothers, respectively. For late preterm infants, respective rates increased by 5.9%, 4.8%, and 5.8% for non-Hispanic Black, Hispanic, and non-Hispanic White mothers.
Despite these improvements, racial disparities were still observed. Non-Hispanic Black mothers reported lower rates of supine sleep positioning for both early preterm infants (odds ratio [OR], 0.61; P less than .0001) and late preterm infants (OR, 0.44; P less than .0001) compared with non-Hispanic White mothers.
These disparities seem “to be in line with racial/ethnic disparity trends in infant mortality and in SUID rates that have persisted for decades among infants,” the investigators wrote.
To a lesser degree, and lacking statistical significance, Hispanic mothers reported lower odds of supine sleep positioning than the odds of White mothers for both early preterm infants (OR, 0.80; P = .1670) and late preterm infants (OR, 0.81; P = .1054).
According to Dr. Hwang and colleagues, more specific demographic data are needed to accurately describe supine sleep positioning rates among Hispanic mothers, partly because of the heterogeneity of this cohort.
“A large body of literature has shown significant variability by immigrant status and country of origin in several infant health outcomes among the Hispanic population,” the investigators wrote. “This study was unable to stratify the Hispanic cohort by these characteristics and thus the distribution of supine sleep positioning prevalence across different Hispanic subgroups could not be demonstrated in this study.”
The investigators also suggested that interventional studies are needed.
“Additional efforts to understand the barriers and facilitators to SSP [supine sleep positioning] adherence among all preterm infant caregivers, particularly non-Hispanic Black and Hispanic parents, are needed so that novel interventions can then be developed,” they wrote.
According to Denice Cora-Bramble, MD, MBA, chief diversity officer at Children’s National Hospital and professor of pediatrics at George Washington University, Washington, the observed improvements in supine sleep positioning may predict lower rates of infant mortality, but more work in the area is needed.
“In spite of improvement in infants’ supine sleep positioning during the study period, racial/ethnic disparities persisted among non-Hispanic Blacks and Hispanics,” Dr. Cora-Bramble said. “That there was improvement among the populations included in the study is significant because of the associated and expected decrease in infant mortality. However, the study results need to be evaluated within the context of [the study’s] limitations, such as the inclusion of only sixteen states in the data analysis. More research is needed to understand and effectively address the disparities highlighted in the study.”
The investigators and Dr. Cora-Bramble reported no conflicts of interest.
FROM JOURNAL OF PEDIATRICS
Time is of the essence: DST up for debate again
Seasonal time change is now up for consideration in the U.S. Congress, prompting sleep medicine specialists to weigh in on the health impact of a major policy change.
As lawmakers in Washington propose an end to seasonal time changes by permanently establishing daylight saving time (DST), the American Academy of Sleep Medicine (AASM) is pushing for a Congressional hearing so scientists can present evidence in favor of converse legislation – to make standard time the new norm.
According to the AASM, seasonal time changes in either direction disrupt sleep and circadian rhythms; however, the switch from standard time to DST incurs more risk.
“Current evidence best supports the adoption of year-round standard time, which aligns best with human circadian biology and provides distinct benefits for public health and safety,” the AASM noted in a 2020 position statement on DST.
The statement cites a number of studies that have reported associations between the switch to DST and acute, negative health outcomes, including higher rates of hospital admission, cardiovascular morbidity, atrial fibrillation, and stroke. The time shift has been associated with a spectrum of cellular, metabolic, and circadian derangements, ranging from increased production of inflammatory markers to higher blood pressure and sleep loss. These biological effects may have far-reaching consequences, including increased rates of fatal motor vehicle accidents in the days following the time change and even increased volatility in the stock market, which may stem from cognitive deficits.
U.S. Senator Marco Rubio (R-Fla.) and others in the U.S. Congress have reintroduced the 2019 Sunshine Protection Act, legislation that would make DST permanent across the country. According to a statement on Sen. Rubio’s website, “The bill reflects the Florida legislature’s 2018 enactment of year-round DST; however, for Florida’s change to apply, a change in the federal statute is required. Fifteen other states – Arkansas, Alabama, California, Delaware, Georgia, Idaho, Louisiana, Maine, Ohio, Oregon, South Carolina, Tennessee, Utah, Washington, and Wyoming – have passed similar laws, resolutions, or voter initiatives, and dozens more are looking. The legislation, if enacted, would apply to those states [that] currently participate in DST, which most states observe for eight months out of the year.”
A stitch in time
“The sudden change in clock time disrupts sleep/wake patterns, decreasing total sleep time and sleep quality, leading to decrements in daytime cognition,” said Kannan Ramar, MBBS, MD, president of the AASM and a sleep medicine specialist at Mayo Clinic, Rochester, Minn.
Emphasizing this point, Dr. Ramar noted a recent study that reported an 18% increase in “patient safety-related incidents associated with human error” among health care workers within a week of the spring time change.
“Irregular bedtimes and wake times disrupt the timing of our circadian rhythms, which can lead to symptoms of insomnia or long-term, excessive daytime sleepiness. Lack of sleep can lead to numerous adverse effects on our minds, including decreased cognitive function, trouble concentrating, and general moodiness,” Dr. Ramar said.
He noted that these impacts may be more significant among certain individuals.
“The daylight saving time changes can be especially problematic for any populations that already experience chronic insufficient sleep or other sleep difficulties,” Dr. Ramar said. “Populations at greatest risk include teenagers, who tend to experience chronic sleep restriction during the school week, and night shift workers, who often struggle to sleep well during daytime hours.”
While fewer studies have evaluated the long-term effects of seasonal time changes, the AASM position statement cited evidence that “the body clock does not adjust to daylight saving time after several months,” possibly because “daylight saving time is less well-aligned with intrinsic human circadian physiology, and it disrupts the natural seasonal adjustment of the human clock due to the effect of late-evening light on the circadian rhythm.”
According to the AASM, permanent DST, as proposed by Sen. Rubio and colleagues, could “result in permanent phase delay, a condition that can also lead to a perpetual discrepancy between the innate biological clock and the extrinsic environmental clock, as well as chronic sleep loss due to early morning social demands that truncate the opportunity to sleep.” This mismatch between sleep/wake cycles and social demands, known as “social jet lag,” has been associated with chronic health risks, including metabolic syndrome, obesity, depression, and cardiovascular disease.
Cardiac impacts of seasonal time change
Muhammad Adeel Rishi, MD, a sleep specialist at Mayo Clinic, Eau Claire, Wis., and lead author of the AASM position statement, highlighted cardiovascular risks in a written statement for this article, noting increased rates of heart attack following the spring time change, and a higher risk of atrial fibrillation.
“Mayo Clinic has not taken a position on this issue,” Dr. Rishi noted. Still, he advocated for permanent standard time as the author of the AASM position statement and vice chair of the AASM public safety committee.
Jay Chudow, MD, and Andrew K. Krumerman, MD, of Montefiore Medical Center, New York – lead author and principal author, respectively, of a recent study that reported increased rates of atrial fibrillation admissions after DST transitions – took the same stance.
“We support elimination of seasonal time changes from a health perspective,” they wrote in a joint comment. “There is mounting evidence of a negative health impact with these seasonal time changes related to effects on sleep and circadian rhythm. Our work found the spring change was associated with more admissions for atrial fibrillation. This added to prior evidence of increased cardiovascular events related to these time changes. If physicians counsel patients on reducing risk factors for disease, shouldn’t we do the same as a society?”
Pros and cons
Not all sleep experts are convinced. Mary Jo Farmer, MD, PhD, FCCP, a sleep specialist and director of pulmonary hypertension services at Baystate Medical Center, and assistant professor of medicine at the University of Massachusetts, Springfield, considers perspectives from both sides of the issue.
“Daylight saving time promotes active lifestyles as people engage in more outdoor activities after work and school, [and] daylight saving time produces economic and safety benefits to society as retail revenues are higher and crimes are lower,” Dr. Farmer said. “Alternatively, moving the clocks forward is a cost burden to the U.S. economy when health issues, decreased productivity, and workplace injuries are considered.”
If one time system is permanently established, Dr. Farmer anticipates divided opinions from patients with sleep issues, regardless of which system is chosen.
“I can tell you, I have a cohort of sleep patients who prefer more evening light and look forward to the spring time change to daylight saving time,” she said. “However, they would not want the sun coming up at 9:00 a.m. in the winter months if we stayed on daylight saving time year-round. Similarly, patients would not want the sun coming up at 4:00 a.m. on the longest day of the year if we stayed on standard time all year round.”
Dr. Farmer called for more research before a decision is made.
“I suggest we need more information about the dangers of staying on daylight saving or standard time year-round because perhaps the current strategy of keeping morning light consistent is not so bad,” she said.
Time for a Congressional hearing?
According to Dr. Ramar, the time is now for a Congressional hearing, as lawmakers and the public need to be adequately informed when considering new legislation.
“There are public misconceptions about daylight saving time and standard time,” Dr. Ramar said. “People often like the idea of daylight saving time because they think it provides more light, and they dislike the concept of standard time because they think it provides more darkness. The reality is that neither time system provides more light or darkness than the other; it is only the timing that changes.”
Until new legislation is introduced, Dr. Ramar offered some practical advice for navigating seasonal time shifts.
“Beginning 2-3 days before the time change, it can be helpful to gradually adjust sleep and wake times, as well as other daily routines such as meal times,” he said. “After the time change, going outside for some morning light can help adjust the timing of your internal body clock.”
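As a concrete illustration of that advice, the minimal sketch below stages the adjustment before a 1-hour spring-forward change; the 3-night window follows Dr. Ramar's comment, while the 20-minute-per-night step size is an assumption chosen for illustration, not a specific AASM prescription.

```python
from datetime import datetime, timedelta

def gradual_shift_schedule(usual_bedtime: str, nights_before: int = 3,
                           total_shift_minutes: int = 60):
    """Illustrative only: spread a one-hour spring-forward adjustment over the
    nights leading up to the time change by moving bedtime earlier in equal
    steps (step size and window are assumptions, not AASM guidance)."""
    base = datetime.strptime(usual_bedtime, "%H:%M")
    step = total_shift_minutes // nights_before  # e.g., 60 / 3 = 20 minutes per night
    schedule = []
    for nights_left in range(nights_before, 0, -1):       # 3, 2, 1 nights before the change
        shift = step * (nights_before - nights_left + 1)  # 20, 40, 60 minutes earlier
        bedtime = base - timedelta(minutes=shift)
        schedule.append(f"{nights_left} night(s) before change: bedtime {bedtime:%H:%M}")
    return schedule

# Example: a usual 23:00 bedtime becomes 22:40, then 22:20, then 22:00.
for line in gradual_shift_schedule("23:00"):
    print(line)
```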
The investigators reported no conflicts of interest.
Adherence to antireflux lifestyle factors shows benefit in women
Antireflux lifestyle factors may significantly reduce the risk of gastroesophageal reflux disease (GERD), according to an analysis involving almost 43,000 women.
Even alongside therapy with a proton-pump inhibitor (PPI) and/or a histamine-receptor antagonist (H2RA), adherence to five antireflux lifestyle factors had a meaningful impact on risk for GERD symptoms, possibly preventing nearly 40% of cases with weekly GERD symptoms, reported lead author Raaj S. Mehta, MD, of Massachusetts General Hospital and Harvard Medical School, both in Boston, and colleagues.
“Clinicians recommend dietary and lifestyle modifications to prevent GERD symptoms, but no prospective data are available to inform these recommendations,” Dr. Mehta and colleagues wrote in JAMA Internal Medicine.
To address this gap, the investigators turned to the Nurses’ Health Study II, a nationwide, prospective study involving 116,671 women. The study, which has a follow-up rate exceeding 90%, began in 1989 and is ongoing. Participants complete biennial questionnaires that include a variety of health and lifestyle factors. In 2005, 2009, 2013, and 2017, the questionnaire inquired about heartburn or acid reflux.
The present analysis included data from 42,955 women aged 42-62 years. Participants were excluded at baseline if they had cancer, lacked dietary data, were lost to follow-up, already had GERD symptoms at least weekly, or used a PPI and/or H2RA on a regular basis. The final dataset included 392,215 person-years of follow-up, with 9,291 incident cases of GERD symptoms.
For each participant, the presence of five antireflux lifestyle factors was summed into a score ranging from 0 to 5: no more than two cups of soda, tea, or coffee per day; never smoking; normal body weight (BMI ≥18.5 and <25.0 kg/m²); a “prudent” diet, defined as the top 40% of the dietary pattern score; and at least 30 minutes of moderate to vigorous physical activity each day.
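As a rough illustration of how such a score is assembled, the minimal sketch below counts how many of the five binary criteria a participant meets, using the thresholds described above; the field names and data layout are assumptions for illustration, not the study's actual code.

```python
def antireflux_lifestyle_score(participant: dict) -> int:
    """Count how many of the five antireflux factors are met (score 0-5).
    Thresholds follow the factors described in the article; the dictionary
    keys are hypothetical."""
    criteria = [
        participant["soda_tea_coffee_cups_per_day"] <= 2,   # beverage intake
        not participant["ever_smoker"],                      # never smoking
        18.5 <= participant["bmi"] < 25.0,                    # normal body weight
        participant["prudent_diet_top_40_percent"],           # "prudent" diet
        participant["moderate_vigorous_activity_min_per_day"] >= 30,  # exercise
    ]
    return sum(criteria)

# Example: a never-smoker with BMI 23, one coffee a day, a prudent diet,
# and 45 minutes of daily exercise scores 5 of 5.
example = {
    "soda_tea_coffee_cups_per_day": 1,
    "ever_smoker": False,
    "bmi": 23.0,
    "prudent_diet_top_40_percent": True,
    "moderate_vigorous_activity_min_per_day": 45,
}
print(antireflux_lifestyle_score(example))  # -> 5
```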
Multivariable modeling showed that women who reported all five antireflux lifestyle factors had a 50% decreased risk of GERD symptoms (hazard ratio, 0.50; 95% confidence interval, 0.42-0.59), compared with women who adhered to none of them. Further analysis suggested that the collective effect of all five factors could reduce GERD symptom case volume by 37% (95% CI, 28%-46%).
Nonadherence to each antireflux lifestyle factor was independently associated with an increased risk of GERD symptoms. After mutual adjustment for other variables, BMI was associated with the highest population-attributable risk (19%), followed by physical activity (8%), food intake (7%), beverage intake (4%), and nonsmoker status (3%).
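For context, a population-attributable risk of this kind is conventionally estimated from the prevalence of the exposure (here, nonadherence to a given factor) and the relative risk it carries. The sketch below applies the standard formula with made-up prevalence and risk values purely for illustration; they are not the study's inputs.

```python
def population_attributable_fraction(prevalence_exposed: float,
                                     relative_risk: float) -> float:
    """Levin's formula: the share of cases in the population that would be
    avoided if the exposure (e.g., nonadherence to a lifestyle factor) were
    eliminated. Inputs here are illustrative, not taken from the study."""
    excess = prevalence_exposed * (relative_risk - 1)
    return excess / (1 + excess)

# Hypothetical example: if 40% of women were nonadherent to a factor and
# nonadherence carried a relative risk of 1.6, roughly 19% of GERD symptom
# cases would be attributable to that factor.
paf = population_attributable_fraction(0.40, 1.6)
print(f"{paf:.0%}")  # -> 19%
```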
Dr. Mehta and colleagues also explored the relationship between GERD symptoms, antireflux medications, and lifestyle factors. Presence of all five antireflux factors was associated with a 53% decreased risk of GERD symptoms or initiation of PPI and/or H2RA therapy (HR, 0.47; 95% CI, 0.41-0.54). Among a group of 3,625 women who reported regular use of a PPI and/or H2RA and were free of GERD symptoms at baseline, adherence to all five lifestyle factors reduced risk of GERD symptoms by 68% (HR, 0.32; 95% CI, 0.18-0.57).
One limitation of the study was that its population consisted primarily of White women; however, the authors cited research suggesting that GERD is more common in White women aged 30-60 years.
“Adherence to an antireflux lifestyle, even among regular users of PPIs and/or H2RAs, was associated with a decreased risk of GERD symptoms,” the investigators concluded.
Lifestyle matters
According to Ronnie Fass, MD, medical director of the Digestive Health Center at Case Western Reserve University, Cleveland, “This is the first study to show the incremental effect and thus the benefit of lifestyle factors in reducing the risk of GERD symptoms. While only five lifestyle factors were assessed in this study, potentially others may further decrease the risk for symptoms.”
Dr. Fass suggested that the self-reported nature of the data and the entirely female study population should inform interpretation of the findings.
“While nonerosive reflux disease is relatively more common in women, erosive esophagitis and Barrett’s esophagus are more common in men,” he said. “Furthermore, male gender is associated with more severe GERD and GERD complications.”
Yet Dr. Fass concluded by again emphasizing the merit of the analysis: “This is an important study that further supports the value of certain lifestyle factors in reducing the risk of GERD symptoms,” he said. “What is challenging for practicing physicians is to get patients to follow these lifestyle factors long term.”
The study was funded by the National Institutes of Health and by a Stuart and Suzanne Steele Massachusetts General Hospital Research Scholar Award. The investigators and Dr. Fass disclosed no conflicts of interest.
FROM JAMA INTERNAL MEDICINE
AAP issues five recommendations for common dermatologic problems
The American Academy of Pediatrics recently issued five recommendations for the most common dermatologic problems in primary care pediatrics.
Topics include diagnostic and management strategies for a variety of conditions, including atopic dermatitis, fungal infections, and autoimmune conditions.
The AAP Section on Dermatology created the recommendations, which were then reviewed and approved by “more than a dozen relevant AAP committees, councils, and sections,” before final approval by the AAP executive committee and board of directors.
The final list represents a collaborative effort with the Choosing Wisely initiative of the American Board of Internal Medicine Foundation, which aims “to promote conversations between clinicians and patients by helping patients choose care that is supported by evidence, not duplicative of other tests or procedures already received, free from harm, [and] truly necessary.”
Lawrence Eichenfield, MD, professor of dermatology and pediatrics at the University of California, San Diego, and chief of pediatric and adolescent dermatology at Rady Children’s Hospital, San Diego, said that the recommendations are “a fine set of suggestions to help health care providers with some of their pediatric dermatology issues.”
• To begin, the AAP recommended against the use of combination topical steroid/antifungal products for candida skin infections, diaper dermatitis, and tinea corporis, despite approvals for these indications.
“Many providers are unaware that the combination products contain a relatively high-potency topical steroid,” the AAP wrote, noting that “combination products are also often expensive and not covered by pharmacy plans.”
Diaper dermatitis responds best to barrier creams and ointments alone, according to the AAP. If needed, a topical, low-potency steroid may be used no more than twice a day, and tapered with improvement. Similarly, the AAP recommended a separate, low-potency steroid for tinea corporis if pruritus is severe.
• In contrast with this call for minimal treatment intensity, the AAP recommended a more intensive approach to tinea capitis, advising against topical medications alone.
“Topical treatments cannot penetrate the hair shaft itself, which is where the infection lies; thus, monotherapy with topical medications is insufficient to effectively treat the infection,” the AAP wrote. “This insufficient treatment can lead to increased health care costs resulting from multiple visits and the prescribing of ineffective medications.”
While medicated shampoos may still be used as adjunctive treatments for tinea capitis, the AAP recommended primary therapy with either griseofulvin or terbinafine, slightly favoring terbinafine because of its adequate efficacy, lower cost, and shorter treatment course.
According to Dr. Eichenfield, a more thorough workup should also be considered.
“Consider culturing possible tinea capitis, so that oral antifungals can be used judiciously and not used for other scaling scalp diagnoses,” he said.
• For most cases of atopic dermatitis, the AAP advised against oral or injected corticosteroids, despite their rapid efficacy, because of the potential for adverse events such as adrenal suppression, growth retardation, and disease worsening upon discontinuation. Instead, the academy recommended topical therapies, “good skin care practices,” and, if necessary, “phototherapy and/or steroid-sparing systemic agents.”
“Systemic corticosteroids should only be prescribed for severe flares once all other treatment options have been exhausted and should be limited to a short course for the purpose of bridging to a steroid-sparing agent,” the AAP wrote.
Dr. Eichenfield emphasized this point, noting that new therapies have expanded treatment options.
“Be aware of the advances in atopic dermatitis,” he said, “with newer topical medications and with a new systemic biologic agent approved for moderate to severe refractory atopic dermatitis for ages 6 and older.”
• Turning to diagnostic strategies, the AAP recommended against routine laboratory testing for associated autoimmune diseases among patients with vitiligo, unless clinical signs and/or symptoms of such diseases are present.
“There is no convincing evidence that extensive workups in the absence of specific clinical suspicion improves outcomes for patients and may in fact beget additional costs and harms,” the AAP wrote. “Although many studies suggest ordering these tests, it is based largely on the increased cosegregation of vitiligo and thyroid disease and not on improved outcomes from having identified an abnormal laboratory test result.”
• Similarly, the AAP advised practitioners to avoid routinely testing patients with alopecia areata for other diseases if relevant symptoms and signs aren’t present.
“As in the case of vitiligo, it is more common to find thyroid autoantibodies or subclinical hypothyroidism than overt thyroid disease, unless there are clinically suspicious findings,” the AAP wrote. “Patients identified as having subclinical hypothyroidism are not currently treated and may even have resolution of the abnormal TSH.”
Before drawing blood, Dr. Eichenfield suggested that clinicians first ask the right questions.
“Be comfortable with screening questions about growth, weight, or activity changes to assist with decisions for thyroid screening in a patient with vitiligo or alopecia areata,” he said.
Choosing Wisely is an initiative of the American Board of Internal Medicine. The AAP and Dr. Eichenfield reported no conflicts of interest.
The American Academy of Pediatrics recently issued five recommendations for the most common dermatologic problems in primary care pediatrics.
Topics include diagnostic and management strategies for a variety of conditions, including atopic dermatitis, fungal infections, and autoimmune conditions.
The AAP Section on Dermatology created the recommendations, which were then reviewed and approved by “more than a dozen relevant AAP committees, councils, and sections,” before final approval by the AAP executive committee and board of directors.
The final list represents a collaborative effort with the Choosing Wisely initiative of the American Board of Internal Medicine Foundation, which aims “to promote conversations between clinicians and patients by helping patients choose care that is supported by evidence, not duplicative of other tests or procedures already received, free from harm, [and] truly necessary.”
Lawrence Eichenfield, MD, professor of dermatology and pediatrics at the University of California, San Diego, and chief of pediatric and adolescent dermatology at Rady Children’s Hospital, San Diego, said that the recommendations are “a fine set of suggestions to help health care providers with some of their pediatric dermatology issues.”
• To begin, the AAP recommended against use of combination topical steroid antifungals for candida skin infections, diaper dermatitis, and tinea corporis, despite approvals for these indications.
“Many providers are unaware that the combination products contain a relatively high-potency topical steroid,” the AAP wrote, noting that “combination products are also often expensive and not covered by pharmacy plans.”
Diaper dermatitis responds best to barrier creams and ointments alone, according to the AAP. If needed, a topical, low-potency steroid may be used no more than twice a day, and tapered with improvement. Similarly, the AAP recommended a separate, low-potency steroid for tinea corporis if pruritus is severe.
• In contrast with this call for minimal treatment intensity, the AAP recommended a more intensive approach to tinea capitis, advising against topical medications alone.
“Topical treatments cannot penetrate the hair shaft itself, which is where the infection lies; thus, monotherapy with topical medications is insufficient to effectively treat the infection,” the AAP wrote. “This insufficient treatment can lead to increased health care costs resulting from multiple visits and the prescribing of ineffective medications.”
While medicated shampoos may still be used as adjunctive treatments for tinea capitis, the AAP recommended primary therapy with either griseofulvin or terbinafine, slightly favoring terbinafine because of adequate efficacy, lesser expense, and shorter regimen.
According to Dr. Eichenfield, a more thorough workup should also be considered.
“Consider culturing possible tinea capitis, so that oral antifungals can be used judiciously and not used for other scaling scalp diagnoses,” he said.
• For most cases of atopic dermatitis, the AAP advised against oral or injected corticosteroids, despite rapid efficacy, because of potential for adverse events, such as adrenal suppression, growth retardation, and disease worsening upon discontinuation. Instead, they recommended topical therapies, “good skin care practices,” and if necessary, “phototherapy and/or steroid-sparing systemic agents.”
“Systemic corticosteroids should only be prescribed for severe flares once all other treatment options have been exhausted and should be limited to a short course for the purpose of bridging to a steroid-sparing agent,” the AAP wrote.
Dr. Eichenfield emphasized this point, noting that new therapies have expanded treatment options.
“Be aware of the advances in atopic dermatitis,” he said, “with newer topical medications and with a new systemic biologic agent approved for moderate to severe refractory atopic dermatitis for ages 6 and older.”
• Turning to diagnostic strategies, the AAP recommended against routine laboratory testing for associated autoimmune diseases among patients with vitiligo, unless clinical signs and/or symptoms of such diseases are present.
“There is no convincing evidence that extensive workups in the absence of specific clinical suspicion improves outcomes for patients and may in fact beget additional costs and harms,” the AAP wrote. “Although many studies suggest ordering these tests, it is based largely on the increased cosegregation of vitiligo and thyroid disease and not on improved outcomes from having identified an abnormal laboratory test result.”
• Similarly, the AAP advised practitioners to avoid routinely testing patients with alopecia areata for other diseases if relevant symptoms and signs aren’t present.
“As in the case of vitiligo, it is more common to find thyroid autoantibodies or subclinical hypothyroidism than overt thyroid disease, unless there are clinically suspicious findings,” the AAP wrote. “Patients identified as having subclinical hypothyroidism are not currently treated and may even have resolution of the abnormal TSH.”
Before drawing blood, Dr. Eichenfield suggested that clinicians first ask the right questions.
“Be comfortable with screening questions about growth, weight, or activity changes to assist with decisions for thyroid screening in a patient with vitiligo or alopecia areata,” he said.
Choosing Wisely is an initiative of the American Board of Internal Medicine. The AAP and Dr. Eichenfield reported no conflicts of interest.
The American Academy of Pediatrics recently issued five recommendations for the most common dermatologic problems in primary care pediatrics.
Topics include diagnostic and management strategies for a variety of conditions, including atopic dermatitis, fungal infections, and autoimmune conditions.
The AAP Section on Dermatology created the recommendations, which were then reviewed and approved by “more than a dozen relevant AAP committees, councils, and sections,” before final approval by the AAP executive committee and board of directors.
The final list represents a collaborative effort with the Choosing Wisely initiative of the American Board of Internal Medicine Foundation, which aims “to promote conversations between clinicians and patients by helping patients choose care that is supported by evidence, not duplicative of other tests or procedures already received, free from harm, [and] truly necessary.”
Lawrence Eichenfield, MD, professor of dermatology and pediatrics at the University of California, San Diego, and chief of pediatric and adolescent dermatology at Rady Children’s Hospital, San Diego, said that the recommendations are “a fine set of suggestions to help health care providers with some of their pediatric dermatology issues.”
• To begin, the AAP recommended against the use of combination topical steroid-antifungal products for Candida skin infections, diaper dermatitis, and tinea corporis, despite their approval for these indications.
“Many providers are unaware that the combination products contain a relatively high-potency topical steroid,” the AAP wrote, noting that “combination products are also often expensive and not covered by pharmacy plans.”
Diaper dermatitis responds best to barrier creams and ointments alone, according to the AAP. If needed, a topical, low-potency steroid may be used no more than twice a day, and tapered with improvement. Similarly, the AAP recommended a separate, low-potency steroid for tinea corporis if pruritus is severe.
• In contrast with this call for minimal treatment intensity, the AAP recommended a more intensive approach to tinea capitis, advising against topical medications alone.
“Topical treatments cannot penetrate the hair shaft itself, which is where the infection lies; thus, monotherapy with topical medications is insufficient to effectively treat the infection,” the AAP wrote. “This insufficient treatment can lead to increased health care costs resulting from multiple visits and the prescribing of ineffective medications.”
While medicated shampoos may still be used as adjunctive treatments for tinea capitis, the AAP recommended primary therapy with either griseofulvin or terbinafine, slightly favoring terbinafine because of its adequate efficacy, lower cost, and shorter treatment course.
According to Dr. Eichenfield, a more thorough workup should also be considered.
“Consider culturing possible tinea capitis, so that oral antifungals can be used judiciously and not used for other scaling scalp diagnoses,” he said.
• For most cases of atopic dermatitis, the AAP advised against oral or injected corticosteroids, despite rapid efficacy, because of potential for adverse events, such as adrenal suppression, growth retardation, and disease worsening upon discontinuation. Instead, they recommended topical therapies, “good skin care practices,” and if necessary, “phototherapy and/or steroid-sparing systemic agents.”
“Systemic corticosteroids should only be prescribed for severe flares once all other treatment options have been exhausted and should be limited to a short course for the purpose of bridging to a steroid-sparing agent,” the AAP wrote.
Dr. Eichenfield emphasized this point, noting that new therapies have expanded treatment options.
“Be aware of the advances in atopic dermatitis,” he said, “with newer topical medications and with a new systemic biologic agent approved for moderate to severe refractory atopic dermatitis for ages 6 and older.”
• Turning to diagnostic strategies, the AAP recommended against routine laboratory testing for associated autoimmune diseases among patients with vitiligo, unless clinical signs and/or symptoms of such diseases are present.
“There is no convincing evidence that extensive workups in the absence of specific clinical suspicion improves outcomes for patients and may in fact beget additional costs and harms,” the AAP wrote. “Although many studies suggest ordering these tests, it is based largely on the increased cosegregation of vitiligo and thyroid disease and not on improved outcomes from having identified an abnormal laboratory test result.”
• Similarly, the AAP advised practitioners to avoid routinely testing patients with alopecia areata for other diseases if relevant symptoms and signs aren’t present.
“As in the case of vitiligo, it is more common to find thyroid autoantibodies or subclinical hypothyroidism than overt thyroid disease, unless there are clinically suspicious findings,” the AAP wrote. “Patients identified as having subclinical hypothyroidism are not currently treated and may even have resolution of the abnormal TSH.”
Before drawing blood, Dr. Eichenfield suggested that clinicians first ask the right questions.
“Be comfortable with screening questions about growth, weight, or activity changes to assist with decisions for thyroid screening in a patient with vitiligo or alopecia areata,” he said.
Choosing Wisely is an initiative of the American Board of Internal Medicine. The AAP and Dr. Eichenfield reported no conflicts of interest.
FROM CHOOSING WISELY AND THE AAP
Liver stiffness predicts hepatic events in NAFLD
Among patients with nonalcoholic fatty liver disease (NAFLD) and compensated advanced chronic liver disease, liver stiffness measurements (LSMs) are associated with risks of hepatic events, according to a retrospective analysis of more than 1,000 patients.
“[N]oninvasive markers that can predict liver disease severity and outcomes in patients with NAFLD and advanced fibrosis are a major unmet need,” wrote lead author Salvatore Petta, MD, of the University of Palermo, Italy, and colleagues. Their report is in Clinical Gastroenterology and Hepatology. “Data about the accuracy of LSM in the prediction of events in NAFLD, and especially in patients with NAFLD and F3-F4 fibrosis, are scarce.”
To address this knowledge gap, the investigators retrospectively analyzed data from 1,039 consecutive patients with NAFLD who had baseline LSMs of more than 10 kPa and/or histologically diagnosed F3-F4 fibrosis. Patients were prospectively recruited at 10 centers in 6 countries, then followed for a median of 35 months, ranging from 19 to 63 months.
All patients had their liver stiffness measured with an M or XL probe at baseline. In addition, approximately half of the patients (n = 533) had a follow-up measurement using the same method, generating a subgroup with changes in liver stiffness. “Improved” liver stiffness was defined as a decrease in LSM greater than 20% from baseline, “impaired” liver stiffness was defined as an increase in LSM greater than 20% from baseline, and “stable” liver stiffness was defined as a change falling between 20% lower and 20% higher than baseline.
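For readers who want to apply these definitions, the 20% thresholds reduce to a simple classification rule. The following is a minimal illustrative sketch in Python; the function name and interface are not from the study:

```python
def classify_lsm_change(baseline_kpa: float, followup_kpa: float) -> str:
    """Classify a change in liver stiffness using the >20% thresholds described above.

    Illustrative only; the function name and interface are not from the study.
    """
    change = (followup_kpa - baseline_kpa) / baseline_kpa
    if change < -0.20:
        return "improved"   # LSM fell by more than 20% from baseline
    if change > 0.20:
        return "impaired"   # LSM rose by more than 20% from baseline
    return "stable"         # change within +/-20% of baseline

# Example: 22 kPa at baseline, 16 kPa at follow-up (about a 27% decrease) -> "improved"
print(classify_lsm_change(22.0, 16.0))
```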
At baseline, mean LSM was 17.6 kPa. Cox regression analysis revealed that baseline LSM was independently associated with HCC (hazard ratio, 1.03; 95% confidence interval, 1.00-1.04; P = .003), liver decompensation (HR, 1.03; 95% CI, 1.02-1.04; P < .001), and liver-related death (HR, 1.02; 95% CI, 1.00-1.03; P = .005), but not extrahepatic events.
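Because these hazard ratios are presumably expressed per 1-kPa increase in baseline LSM (the usual convention for a continuous predictor, though the unit is not stated in this summary), the apparently small ratios compound across clinically meaningful differences in stiffness, as the illustrative arithmetic below shows:

```python
# Illustrative arithmetic, assuming the reported hazard ratios are per 1-kPa
# increase in baseline LSM (a common convention; not stated explicitly above).
hr_per_kpa_decompensation = 1.03

# Relative hazard for a patient whose baseline LSM is 10 kPa higher than another's:
relative_hazard = hr_per_kpa_decompensation ** 10
print(f"{relative_hazard:.2f}")  # ~1.34, i.e., roughly a 34% higher hazard of decompensation
```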
The association between LSM at baseline and risk of liver decompensation was maintained after adjustment for the severity of liver disease and for surrogate markers of portal hypertension, the investigators noted. Furthermore, patients with a baseline LSM of at least 21 kPa – which indicates high risk of clinically significant portal hypertension (CSPH) – were at greater risk of liver decompensation than were those with an LSM less than 21 kPa (HR, 3.71; 95% CI, 1.89-6.78; P = .04).
In the subgroup with follow-up measurements, approximately half of the patients had an improved LSM (53.3%), while 27.2% had a stable LSM and 19.5% had an impaired LSM; LSM impairment was significantly associated with diabetes at baseline (P = .01).
“These data agree with the available literature identifying diabetes as a risk factor for liver disease progression and liver-related complications,” the investigators wrote.
Cox regression showed that, among those with follow-up LSM, changes in LSM were independently associated with HCC (HR, 1.72; 95% CI, 1.01-3.02; P = .04), liver decompensation (HR, 1.56; 95% CI, 1.05-2.51; P = .04), liver-related mortality (HR, 1.96; 95% CI, 1.10-3.38; P = .02), and mortality of any cause (HR, 1.73; 95% CI, 1.11-2.69; P = .01).
These risks could be further stratified by level of change in liver stiffness, with greater impairment predicting greater risk: The crude rate of liver decompensation was 14.4% among those with impaired LSM, compared with 6.2% among those with stable LSM and 3.8% among those with improved LSM. The categories of LSM change were not predictive of decompensation among patients at high risk of CSPH at baseline, but they remained predictive among those at low risk of CSPH at baseline.
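Combining the reported crude rates with the classification sketch above gives a rough sense of how follow-up stiffness might be used to stratify risk; the snippet below simply restates the crude decompensation rates quoted here and is not a validated risk model:

```python
# Crude decompensation rates by LSM-change category, as reported above.
# Not a validated risk model; as noted, the categories were not predictive
# in patients already at high risk of CSPH at baseline.
CRUDE_DECOMPENSATION_RATE = {
    "impaired": 0.144,   # >20% increase in LSM from baseline
    "stable": 0.062,     # change within +/-20% of baseline
    "improved": 0.038,   # >20% decrease in LSM from baseline
}

category = "impaired"  # e.g., the output of the classify_lsm_change sketch above
print(f"Crude decompensation rate for {category}: {CRUDE_DECOMPENSATION_RATE[category]:.1%}")
```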
“[T]his study … showed that an integrated assessment of baseline LSM or [changes in LSM] can help in stratifying the risk of development of liver-related complications and of both hepatic and overall mortality,” the investigators concluded. “These data, if further validated, could help personalize prognosis and follow-up in NAFLD with [compensated advanced chronic liver disease].”
The investigators disclosed relationships with AbbVie, Novo Nordisk, Gilead, and others.
As the prevalence of nonalcoholic fatty liver disease (NAFLD) continues to rise, risk-stratifying those who will develop liver-related complications remains a major challenge. Although progression of liver fibrosis is a key risk factor for developing liver-related complications, the clinical application of noninvasive fibrosis markers for prognostication has been largely unexplored in NAFLD.
This study by Dr. Petta and colleagues highlights the potential of liver stiffness measurements (LSMs) as a noninvasive prognostic tool. Patients with an LSM suggestive of clinically significant portal hypertension (>21 kPa) had a nearly fourfold risk of hepatic decompensation. Furthermore, a longitudinal increase in LSM of greater than 20% was associated with a greater than 50% increase in risk for hepatic decompensation, hepatocellular carcinoma, and death.
Transient elastography is a widely available and accurate tool for the noninvasive assessment of liver fibrosis in NAFLD in routine clinical practice. Routine serial measurement of LSM with transient elastography during clinic visits can provide clinicians with important information for the management of NAFLD, aiding in treatment decisions, assessment of response to therapy, and monitoring of disease progression.
Further research is needed to validate these findings and to evaluate how longitudinal changes in LSM and other noninvasive fibrosis markers can prognosticate outcomes in NAFLD.
George Cholankeril, MD, MS, is an assistant professor in the section of gastroenterology & hepatology of the department of medicine and in the division of abdominal transplantation of the department of surgery at Baylor College of Medicine in Houston. He reported having no conflicts of interest.
FROM CLINICAL GASTROENTEROLOGY AND HEPATOLOGY
Mitochondrial DNA variant increases gallstone risk
A mitochondrial DNA variant may increase the risk of gallstone disease more than fourfold, according to investigators.
Mitochondrial DNA 827A>G disrupts mitochondrial function and leads to abnormal cholesterol transport, which increases gallstone development, reported Dayan Sun, of Fudan University, Shanghai, China, and colleagues.
The investigators noted that the findings add support to a genetic role in gallstone development, which could allow for identification of at-risk individuals and implementation of preventive measures.
“The etiology of gallstone disease is multifactorial; age, sex, pregnancy, diet (macronutrients, alcohol, and coffee), and other factors are involved,” the investigators wrote in Cellular and Molecular Gastroenterology and Hepatology. “Moreover, the significant familial predisposition and ethnic differences in prevalence of this disease indicate the potential influences of genetic factors.”
In 2002, Nakeeb and colleagues reported that at least 30% of gallstone disease cases stemmed from genetic factors. And genetics may play an even greater role in certain populations, such as Native Americans, among whom more than 70% of women have gallstone disease, based on a study by Everhart and colleagues.
According to Ms. Sun and colleagues, a variety of genetic drivers of gallstone disease have been identified, such as ABCG8, which at least one study identified as the most common genetic risk factor, as well as rarer mutations, including one affecting CFTR that leads to altered bile composition.
Based on previous research linking mitochondrial DNA variants with metabolic defects and, more specifically, aberrations in lipid metabolism, as well as an observed “maternal bias in the maternal transmission of gallstone disease” that suggests mitochondrial influence, the investigators looked specifically for patterns in mitochondrial DNA variants among patients with gallstones.
The study enrolled 104 probands with confirmed gallstone disease and 300 unrelated controls. After collecting DNA samples from all participants, the investigators sequenced mitochondrial DNA HVS1 regions. A comparison of haplogroups showed that B4b’d’e’j was more common among patients with gallstone disease than among controls (odds ratio, 4.428; P = .00012), and further analysis pinpointed 827A>G, a variant in 12S ribosomal RNA.
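The haplogroup comparison is, at its core, a 2x2 case-control contrast. The sketch below shows how such an odds ratio and P value are conventionally computed; the carrier counts are hypothetical and are not the study's data:

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table purely for illustration; these are NOT the study's counts.
# Rows: gallstone cases (n = 104) and controls (n = 300);
# columns: B4b'd'e'j carriers and non-carriers.
table = [[15, 89],    # cases: carriers, non-carriers
         [13, 287]]   # controls: carriers, non-carriers

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, P = {p_value:.5f}")
```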
“During the evolutionary history of modern humans, haplogroup B4 might have originated in East Asia approximately 40,000 years ago,” the investigators wrote, noting that B2, a subhaplogroup of B4, “was a founder haplogroup and expanded in the Americas after the Last Glacial Maximum (approximately 20,000 years ago).”
According to the investigators, this may explain why Native Americans have a higher prevalence of gallstones than East Asians (14%-35% vs. 3%-12%): they are more often carriers of B4 (14%-44% vs. 2%-8%).
The investigators sought to characterize the impact of the 827A>G variant on mitochondrial function and found effects including lower respiratory chain complex activity, diminished mitochondrial function, activation of mitochondrial protein quality control and retrograde signaling pathways, abnormal lipid metabolism, and abnormal cholesterol transport.
For example, the investigators assessed respiratory chain complex activity by constructing two sister-branch haplogroup cell models, comprising six cybrids carrying 827A and six carrying 827G; the 827G cybrids showed lower complex activity. They corroborated this finding by measuring OXPHOS function in the 827A and 827G cybrids to evaluate overall mitochondrial function.
“In summary, our study demonstrates a potential link between mitochondrial DNA 827A>G and gallstone disease,” the investigators wrote. “Our findings provide a significant biological basis for the clinical diagnosis and prevention of gallstone disease in the future.”
The study was funded by the National Natural Science Foundation of China, the 111 Project, the Shanghai Municipal Science and Technology Major Project, the Scientific and Technology Committee of Shanghai Municipality, and the CAMS Innovation Fund for Medical Sciences. The investigators reported no conflicts of interest.
Cholesterol gallstone disease results from imbalances in cholesterol metabolism. Other than the well-known lifestyle risk factors, there is also a strong genetic predisposition to gallstone formation. This study by Sun and colleagues examined the possible association between mitochondrial DNA (mtDNA) variants and cholesterol gallstone development because of the importance of the mitochondria in cellular metabolism and the increased maternal transmission of gallstone disease.
This study highlighted gallstone disease as a multifactorial condition that results from complex interaction between genetic and environmental factors. Interestingly, the allele frequency of the 827A>G mtDNA variant was noted to be higher in Native Americans, which may partially explain the high prevalence of gallstones in this population. Further studies are needed to identify additional genetic risk factors in ethnic groups that also have a significant burden of cholelithiasis.
Xiao Zhao, MD, is an assistant professor of medicine in the division of digestive diseases, department of medicine, at Columbia University, New York. She reported having no conflicts of interest.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY