Associations Between Prescreening Dietary Patterns and Longitudinal Colonoscopy Outcomes in Veterans

Screening for colorectal cancer (CRC) with colonoscopy enables the identification and removal of CRC precursors (colonic adenomas) and has been associated with reduced CRC incidence and mortality.1-3 Furthermore, there is consensus that diet and lifestyle may be associated with forestalling CRC pathogenesis at the intermediate adenoma stages.4-7 However, studies have shown that US veterans have poorer diet quality and a higher risk for neoplasia compared with nonveterans, reinforcing the need for tailored clinical approaches.8,9 Combining screening with conversations about modifiable environmental and lifestyle risk factors, such as poor diet, is a highly relevant and potentially easily leveraged prevention strategy for those at high risk. However, there is limited evidence regarding which dietary patterns or dietary features are most important over time.7

Several dietary components have been shown to be associated with CRC risk,10 either as potentially chemopreventive (fiber, fruits and vegetables,11 dairy,12 supplemental vitamin D,13 calcium,14 and multivitamins15) or carcinogenic (red meat16 and alcohol17). Previous studies of veterans have similarly shown that higher intakes of fiber and vitamin D are associated with reduced risk, and higher red meat intake with increased risk, of finding CRC precursors during colonoscopy.18 However, these dietary categories are often analyzed in isolation. Studying healthy dietary patterns in aggregate may be more clinically relevant and easier to implement for prevention of CRC and its precursors.19-21 Healthy dietary patterns, such as the US Dietary Guidelines for Americans as represented by the Healthy Eating Index (HEI), the Mediterranean diet (MD), and the Dietary Approaches to Stop Hypertension (DASH) diet, have been associated with lower risk for chronic disease.22-24 Despite the extant literature, no known studies have compared these dietary patterns for associations with the risk of developing CRC or its precursors among US veterans undergoing long-term screening and follow-up after a baseline colonoscopy.

The objective of this study was to test for associations between baseline scores of healthy dietary patterns and the most severe colonoscopy finding (MSCF) over ≥ 10 years following a baseline screening colonoscopy in veterans.

Methods

Participants in the Cooperative Studies Program (CSP) #380 cohort study included 3121 asymptomatic veterans aged 50 to 75 years at baseline who had consented to initial screening colonoscopy between 1994 and 1997, with subsequent follow-up and surveillance.25 Prior to their colonoscopy, all participants completed a baseline study survey that included questions about cancer risk factors including family history of CRC, diet, physical activity, and medication use.

Included in this cross-sectional analysis were data from a sample of veteran participants of the CSP #380 cohort with 1 baseline colonoscopy, follow-up surveillance through 2009, a cancer risk factor survey collected at baseline, and complete demographic and clinical indicator data. Excluded from the analysis were 67 participants with insufficient responses to the dietary food frequency questionnaire (FFQ) and 31 participants with missing body mass index (BMI) data, yielding a final analytic sample of 3023 veterans.

Measures

MSCF. The outcome of interest in this study was the MSCF recorded across all participant colonoscopies during the study period. MSCF was categorized as (1) no neoplasia; (2) ≤ 2 nonadvanced adenomas, that is, small adenomas (diameter < 10 mm) with tubular histology; or (3) advanced neoplasia (AN), characterized by adenomas ≥ 10 mm in diameter, adenomas with villous histology or high-grade dysplasia, or CRC.
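
To make the hierarchy concrete, a minimal SAS sketch of this categorization follows; the dataset and variable names (findings, adenoma_count, max_diam_mm, villous, hgd, crc) are hypothetical and not drawn from the CSP #380 data dictionary.

    /* Hypothetical sketch: derive the MSCF category for each participant
       from pooled worst findings across colonoscopies. Names are illustrative. */
    data mscf;
      set findings; /* one record per participant, pooled worst findings */
      if crc = 1 or hgd = 1 or villous = 1 or max_diam_mm >= 10 then mscf = 3; /* advanced neoplasia */
      else if 1 <= adenoma_count <= 2 then mscf = 2; /* nonadvanced adenoma(s): < 10 mm, tubular */
      else mscf = 1; /* no neoplasia */
    run;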

Dietary patterns. The independent variables were dietary pattern scores representing dietary quality, calculated using the HEI (based on recommendations of the US Dietary Guidelines for Americans), the MD, and the DASH diet.26-28 These 3 dietary patterns were chosen for their hypothesized relationship with CRC risk, but each weighs food categories differently (Appendix 1).22-24,29 Dietary pattern scores were calculated using the CSP #380 self-reported responses to 129 baseline survey questions adapted from a well-established and previously validated semiquantitative FFQ.30 In the original validation study, the form was administered by mail twice, at baseline and at 1 year, to a sample of 127 participants; during this interval, the men completed 1-week diet records twice, spaced about 6 months apart. Mean values for intake of most nutrients assessed by the 2 methods were similar. Intraclass correlation coefficients for the baseline and 1-year FFQ-assessed nutrient intakes ranged from 0.47 for vitamin E (without supplements) to 0.80 for vitamin C (with supplements). Correlation coefficients were also computed between the energy-adjusted nutrient intakes measured by the diet records and by the 1-year FFQ, which asked about diet during the year encompassing the diet records. Higher raw and percent scores indicated better alignment with the recommendations of each respective dietary pattern. Percent scores were calculated as a standardizing method and were used in analyses for ease of comparing the dietary patterns; scoring details are in Appendix 2.

[Appendix 1 and Appendix 2]
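
As an illustrative sketch of the percent standardization (not the study's actual scoring code), each raw total can be divided by its pattern's maximum possible score; the raw-score variable names and the MD and DASH maxima shown here are assumptions for illustration only.

    /* Hypothetical sketch: convert raw dietary pattern totals to percent of
       each pattern's maximum possible score so the 3 patterns are comparable. */
    data scores_pct;
      set scores_raw;                   /* illustrative input: one record per participant */
      hei_pct  = 100 * hei_raw  / 100;  /* HEI totals conventionally range 0 to 100 */
      md_pct   = 100 * md_raw   / 18;   /* assumed MD maximum, for illustration only */
      dash_pct = 100 * dash_raw / 80;   /* assumed DASH maximum, for illustration only */
    run;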

Demographic characteristics and clinical indicators. Demographic characteristics included age category, sex, and race/ethnicity. Clinical indicators included BMI, the number of comorbid conditions used to calculate the Charlson Comorbidity Index, family history of CRC in first-degree relatives, number of follow-up colonoscopies across the study period, and food-based vitamin D intake.31 These variables were chosen based on their applicability in previous CSP #380 cohort studies.18,32,33 Self-reported race and ethnicity were collapsed into a single variable due to small numbers in some groups. The authors acknowledge that race and ethnicity are distinct concepts and that this variable has limited utility beyond controlling for the effects of systemic racism in the model.

Statistical Analyses

Descriptive statistics were used to characterize the distributions of all variables, including demographics, clinical indicators, colonoscopy results, and dietary patterns. Pairwise correlations between the total dietary pattern scores and between food category scores were calculated with the Pearson correlation coefficient (r).
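
A minimal sketch of these pairwise correlations in SAS, assuming the percent-standardized scores sit in the hypothetical scores_pct dataset sketched above:

    /* Hypothetical sketch: pairwise Pearson correlations among the
       percent-standardized dietary pattern scores. */
    proc corr data=scores_pct pearson;
      var hei_pct md_pct dash_pct;
    run;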

Multinomial logistic regression models were created using the SAS LOGISTIC procedure with the categorical MSCF (no neoplasia, nonadvanced adenoma, or AN) as the outcome.34 A model was created for each independent predictor variable of interest (ie, the HEI, MD, or DASH percentage-standardized dietary pattern score and each food category comprising each dietary pattern score). All models were adjusted for age, sex, race/ethnicity, BMI, number of comorbidities, family history of CRC, number of follow-up colonoscopies, and estimated daily food-derived vitamin D intake. These demographic and clinical indicators were included in the models because they are known to be associated with CRC risk.18 The number of colonoscopies was included to control for surveillance intensity, presuming that risk for AN is reduced as polyps are removed. Because colonoscopy findings from an initial screening have unique clinical implications compared with follow-up and surveillance, MSCF was examined in 2 ways in sensitivity analyses: (1) baseline findings only and (2) aggregate follow-up and surveillance findings only, excluding baseline findings.
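
A minimal sketch of one such model, assuming hypothetical dataset and covariate names; the generalized logit link (link=glogit) fits the 3-level MSCF outcome, here coded 1 to 3 with 1 (no neoplasia) as the reference:

    /* Hypothetical sketch: multinomial (generalized logit) model for MSCF,
       adjusted for the covariates named in the text. Names are illustrative. */
    proc logistic data=analysis;
      class sex race_eth fam_hx / param=ref;
      model mscf(ref='1') = hei_pct age sex race_eth bmi
            n_comorbid fam_hx n_colonoscopies vitd_food / link=glogit;
    run;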

Adjusted odds ratios (aORs) and 95% CIs are presented for each of the MSCF outcomes, with no neoplasia as the reference finding. We chose not to adjust for multiple comparisons across the different dietary patterns given the correlation between dietary pattern total and category scores, but we did adjust for multiple comparisons for dietary categories within each dietary pattern. Tests for statistical significance used α = .05 for the dietary pattern total scores, and P values for the dietary category scores within each dietary pattern were controlled for false discovery rate using the SAS MULTTEST procedure.35 All data manipulations and analyses were performed using SAS version 9.4.
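
A minimal sketch of the within-pattern adjustment, assuming the raw category P values from the models above have been collected into a hypothetical dataset:

    /* Hypothetical sketch: false discovery rate adjustment of the food
       category P values within one dietary pattern. The dataset category_pvals
       and its P-value variable raw_p are illustrative. */
    proc multtest inpvalues(raw_p)=category_pvals fdr;
    run;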

Results

The study included 3023 patients. All were aged 50 to 75 years; 2923 (96.7%) were male, and 2532 (83.8%) were non-Hispanic White (Table 1). Most participants were overweight or obese (n = 2535 [83.8%]), 2024 (67.0%) had ≤ 2 comorbidities, and 2602 (86.1%) had no family history of CRC. The MSCF was no neoplasia for 1628 patients (53.9%) and nonadvanced adenoma for 966 patients (32.0%); 429 patients (14.2%) had AN.

[Table 1]

Mean percent scores were 58.5% for HEI, 38.2% for MD, and 63.1% for the DASH diet, with higher percentages indicating greater alignment with the recommendations for each diet (Table 2). All 3 dietary pattern scores standardized to percentages were strongly and significantly correlated in pairwise comparisons: HEI:MD, r = 0.62 (P < .001); HEI:DASH, r = 0.60 (P < .001); and MD:DASH, r = 0.72 (P < .001). Likewise, food category scores were significantly correlated across dietary patterns. For example, whole grain and fiber values from each dietary score were strongly correlated in pairwise comparisons: HEI Whole Grain:MD Grain, r = 0.64 (P < .001); HEI Whole Grain:DASH Fiber, r = 0.71 (P < .001); and MD Grain:DASH Fiber, r = 0.70 (P < .001).

[Table 2]

Associations between individual participants' dietary pattern scores and their pooled MSCF from baseline screening and ≥ 10 years of surveillance are presented in Table 3. For each single-point increase in dietary pattern score (reflecting better dietary quality), aORs for nonadvanced adenoma vs no neoplasia were slightly lower but not statistically significant: HEI, aOR, 1.00 (95% CI, 0.99-1.01); MD, aOR, 0.98 (95% CI, 0.94-1.02); and DASH, aOR, 0.99 (95% CI, 0.99-1.00). aORs for AN vs no neoplasia were slightly lower for each dietary pattern assessed, and only the MD and DASH scores were significantly different from 1.00: HEI, aOR, 1.00 (95% CI, 0.99-1.01); MD, aOR, 0.95 (95% CI, 0.90-1.00); and DASH, aOR, 0.99 (95% CI, 0.98-1.00).

[Table 3]

We observed lower odds for nonadvanced adenoma and AN across all 3 dietary patterns when there was greater alignment with the recommended intake of whole grains and fiber. In separate models using the food categories comprising the dietary patterns as independent variables, and after correcting for multiple tests, higher scores for the HEI Refined Grain category were associated with higher odds for nonadvanced adenoma (aOR, 1.03 [95% CI, 1.01-1.05]; P = .01) and AN (aOR, 1.05 [95% CI, 1.02-1.08]; P < .001). Higher scores for the HEI Whole Grain category were associated with lower odds for nonadvanced adenoma (aOR, 0.97 [95% CI, 0.95-0.99]; P = .01) and AN (aOR, 0.96 [95% CI, 0.93-0.99]; P = .01). Higher scores for the MD Grain category were significantly associated with lower odds for nonadvanced adenoma (aOR, 0.44 [95% CI, 0.26-0.75]; P = .002) and AN (aOR, 0.29 [95% CI, 0.14-0.62]; P = .001). The DASH Grains category was also significantly associated with lower odds for AN (aOR, 0.86 [95% CI, 0.78-0.95]; P = .002).

Discussion

In this study of 3023 veterans undergoing first-time screening colonoscopy and ≥ 10 years of surveillance, we found that healthy dietary patterns, as assessed by the MD and DASH diet scores, were significantly associated with lower risk of AN. Additionally, we identified lower odds for AN and nonadvanced adenoma compared with no neoplasia with higher grain scores for all the dietary patterns studied. Other food categories comprising the dietary pattern scores had mixed associations with the MSCF outcomes. Several other studies have examined associations between dietary patterns and risk for CRC, but to our knowledge, none have explored these associations among US veterans.

These results also indicate that study participants had better than average (based on a 50% threshold) dietary quality according to the HEI and DASH scoring methods we used, but poor dietary quality according to the MD scoring method. The mean HEI score in the present study was higher than that in a US Department of Agriculture study by Dong et al, which compared dietary quality between veterans and nonveterans using the HEI and estimated veterans' expected HEI score at 45.6 of 100.8 This could be explained by the fact that participants needed to be healthy to be eligible, and those with healthier behaviors overall may have self-selected into the study due to motivation for screening during a time when screening was not yet commonplace.36 Similarly, participants in the present study had higher adherence to the DASH diet (63.1%) than adolescents with diabetes in a study by Günther et al.28 Conversely, firefighters who were coached to follow a Mediterranean-style dietary pattern had higher adherence to the MD than did participants in this study.27

A closer examination of the specific food category component scores comprising the 3 distinct dietary patterns revealed mixed results from the multinomial modeling, which may have to do with the guideline thresholds used to calculate the dietary scores. When analyzed separately in the logistic regression models for their associations with nonadvanced adenomas and AN compared with no neoplasia, higher MD and DASH fruit scores (but not HEI fruit scores) were significantly associated with the outcomes. Other studies have had mixed findings when attempting to test for associations of fruit intake with adenoma recurrence.10,37

This study had some unexpected findings. Vegetable intake was not associated with nonadvanced adenoma or AN risk, although studies of food categories have consistently found vegetable intake (specifically cruciferous vegetables) to be linked with lower odds for cancers.38 Likewise, the red meat category, which was a unique food category only in the MD score, was not associated with nonadvanced adenomas or AN. Despite consistent literature suggesting higher intake of red meat and processed meats increases CRC risk, in 2019 the nutritional recommendations (NutriRECS) consortium indicated that the evidence was weak.39,40 This study showed that higher DASH scores for low-fat dairy, which were maximized when participants reported at least 50% of their daily dairy servings as low-fat, were associated with lower odds for AN. Yet the MD scores for low-fat dairy, calculated from the total number of servings per week, had no association with either outcome. This difference in findings suggests the fat intake ratio may be more relevant to CRC risk than intake quantity.

The literature is mixed regarding fatty acid intake and CRC risk, which may be relevant to both dairy and meat intake. One systematic review and meta-analysis found that dietary fat and types of fatty acid intake had no association with CRC risk.41 However, a more recent meta-analysis that assessed both dietary intake and plasma levels of fatty acids did find statistically significant associations between various types of fatty acids and CRC risk.42

The findings in the present study that grain intake is associated with lower odds for more severe colonoscopy findings among veterans are notable.43 Lieberman et al, using the CSP #380 data, found that cereal fiber intake was associated with lower odds for AN compared with hyperplastic polyps (OR, 0.98 [95% CI, 0.96-1.00]).18 Similarly, Hullings et al determined that older adults in the highest quintile of cereal fiber intake had significantly lower odds of CRC than those in the lowest quintile (OR, 0.89 [95% CI, 0.83-0.96]; P < .001).44 These findings support existing guidance that prioritizes whole grains as a key source of dietary fiber for CRC prevention.

A recent literature review on fiber, fat, and CRC risk suggested a consensus regarding one protective mechanism: dietary fiber from grains modulates the gut microbiota by promoting butyrate synthesis.45 Butyrate is a short-chain fatty acid that supports energy production in colonocytes and has tumor-suppressing properties.46 Our findings suggest there could be more to learn about the relationship between butyrate production and reduced CRC risk through metabolomic studies that measure plasma butyrate. Such studies could examine associations with not just a single food or food category but with food patterns that include fruits, vegetables, nuts and seeds, and whole grains known to promote butyrate production and plasma butyrate.47

Improved understanding of mechanisms and risk-modifying lifestyle factors such as dietary patterns may enhance prevention strategies. Identifying the collective chemopreventive characteristics of a specific dietary pattern (eg, MD) would help clinicians and health care staff promote healthy eating to reduce cancer risk. More studies are needed to understand whether such promotion is more clinically applicable and effective for patients than advice to eat more or less of specific foods (eg, more whole grains, less red meat). Furthermore, considering important environmental factors collectively, beyond dietary patterns alone, may offer a way to better tailor screening and implement a variety of lifestyle interventions. In the literature, the screening encounter is often described as a teachable moment, when patients' attention is captured and they may be more receptive to guidance.48

Limitations

This study has several important limitations that leave opportunities for future studies exploring the role of dietary patterns in AN or CRC risk. First, the FFQ data used to calculate the dietary pattern scores were captured only at baseline, and the study period spans nearly 3 decades. However, the diets of older adults are widely assumed to remain stable over time, an assumption that is appropriate given that our sample was aged 50 to 75 years when the baseline FFQ data were collected.49-51 Additionally, while the HEI is a well-documented, standard scoring method for dietary quality, there are many scoring approaches for the MD and DASH diets.23,52,53 Finally, findings from this sample of veterans may not be generalizable to a broader population. Future longitudinal studies that test for a clinically significant change threshold are warranted.

Conclusion

Results of this study suggest future research should further explore the effects of dietary patterns, particularly intake of specific food groups in combination, as opposed to individual nutrients or food items, on AN and CRC risk. Future studies might explore the mechanistic role of these dietary patterns in altering microbiome metabolism, which may influence CRC outcomes; incorporate diet into a more comprehensive, holistic risk score for predicting colonic neoplasia risk; or assess the effects of dietary changes on long-term CRC prevention in intervention studies. We suggest there are differences in people's dietary intake patterns that might be important to consider when implementing tailored approaches to CRC risk mitigation.

References
  1. Zauber AG, Winawer SJ, O’Brien MJ, et al. Colonoscopic polypectomy and long-term prevention of colorectal-cancer deaths. N Engl J Med. 2012;366(8):687-696. doi:10.1056/NEJMoa1100370
  2. Nishihara R, Wu K, Lochhead P, et al. Long-term colorectal-cancer incidence and mortality after lower endoscopy. N Engl J Med. 2013;369(12):1095-1105. doi:10.1056/NEJMoa1301969
  3. Bretthauer M, Løberg M, Wieszczy P, et al. Effect of colonoscopy screening on risks of colorectal cancer and related death. N Engl J Med. 2022;387(17):1547-1556. doi:10.1056/NEJMoa2208375
  4. Cottet V, Bonithon-Kopp C, Kronborg O, et al. Dietary patterns and the risk of colorectal adenoma recurrence in a European intervention trial. Eur J Cancer Prev. 2005;14(1):21.
  5. Miller PE, Lesko SM, Muscat JE, Lazarus P, Hartman TJ. Dietary patterns and colorectal adenoma and cancer risk: a review of the epidemiological evidence. Nutr Cancer. 2010;62(4):413-424. doi:10.1080/01635580903407114
  6. Godos J, Bella F, Torrisi A, Sciacca S, Galvano F, Grosso G. Dietary patterns and risk of colorectal adenoma: a systematic review and meta-analysis of observational studies. J Hum Nutr Diet. 2016;29(6):757-767. doi:10.1111/jhn.12395
  7. Haggar FA, Boushey RP. Colorectal cancer epidemiology: incidence, mortality, survival, and risk factors. Clin Colon Rectal Surg. 2009;22(4):191-197. doi:10.1055/s-0029-1242458
  8. Dong D, Stewart H, Carlson AC. An Examination of Veterans’ Diet Quality. U.S. Department of Agriculture, Economic Research Service; 2019:32.
  9. El-Halabi MM, Rex DK, Saito A, Eckert GJ, Kahi CJ. Defining adenoma detection rate benchmarks in average-risk male veterans. Gastrointest Endosc. 2019;89(1):137-143. doi:10.1016/j.gie.2018.08.021
  10. Alberts DS, Hess LM, eds. Fundamentals of Cancer Prevention. Springer International Publishing; 2019. doi:10.1007/978-3-030-15935-1
  11. Dahm CC, Keogh RH, Spencer EA, et al. Dietary fiber and colorectal cancer risk: a nested case-control study using food diaries. J Natl Cancer Inst. 2010;102(9):614-626. doi:10.1093/jnci/djq092
  12. Aune D, Lau R, Chan DSM, et al. Dairy products and colorectal cancer risk: a systematic review and meta-analysis of cohort studies. Ann Oncol. 2012;23(1):37-45. doi:10.1093/annonc/mdr269
  13. Lee JE, Li H, Chan AT, et al. Circulating levels of vitamin D and colon and rectal cancer: the Physicians’ Health Study and a meta-analysis of prospective studies. Cancer Prev Res (Phila). 2011;4(5):735-743. doi:10.1158/1940-6207.CAPR-10-0289
  14. Carroll C, Cooper K, Papaioannou D, Hind D, Pilgrim H, Tappenden P. Supplemental calcium in the chemoprevention of colorectal cancer: a systematic review and meta-analysis. Clin Ther. 2010;32(5):789-803. doi:10.1016/j.clinthera.2010.04.024
  15. Park Y, Spiegelman D, Hunter DJ, et al. Intakes of vitamins A, C, and E and use of multiple vitamin supplements and risk of colon cancer: a pooled analysis of prospective cohort studies. Cancer Causes Control. 2010;21(11):1745-1757. doi:10.1007/s10552-010-9549-y
  16. Alexander DD, Weed DL, Miller PE, Mohamed MA. Red meat and colorectal cancer: a quantitative update on the state of the epidemiologic science. J Am Coll Nutr. 2015;34(6):521-543. doi:10.1080/07315724.2014.992553
  17. Park SY, Wilkens LR, Setiawan VW, Monroe KR, Haiman CA, Le Marchand L. Alcohol intake and colorectal cancer risk in the multiethnic cohort study. Am J Epidemiol. 2019;188(1):67-76. doi:10.1093/aje/kwy208
  18. Lieberman DA. Risk factors for advanced colonic neoplasia and hyperplastic polyps in asymptomatic individuals. JAMA. 2003;290(22):2959-2967. doi:10.1001/jama.290.22.2959
  19. Archambault AN, Jeon J, Lin Y, et al. Risk stratification for early-onset colorectal cancer using a combination of genetic and environmental risk scores: an international multi-center study. J Natl Cancer Inst. 2022;114(4):528-539. doi:10.1093/jnci/djac003
  20. Carr PR, Weigl K, Edelmann D, et al. Estimation of absolute risk of colorectal cancer based on healthy lifestyle, genetic risk, and colonoscopy status in a population-based study. Gastroenterology. 2020;159(1):129-138.e9. doi:10.1053/j.gastro.2020.03.016
  21. Sullivan BA, Qin X, Miller C, et al. Screening colonoscopy findings are associated with noncolorectal cancer mortality. Clin Transl Gastroenterol. 2022;13(4):e00479. doi:10.14309/ctg.0000000000000479
  22. Erben V, Carr PR, Holleczek B, Stegmaier C, Hoffmeister M, Brenner H. Dietary patterns and risk of advanced colorectal neoplasms: A large population based screening study in Germany. Prev Med. 2018;111:101-109. doi:10.1016/j.ypmed.2018.02.025
  23. Donovan MG, Selmin OI, Doetschman TC, Romagnolo DF. Mediterranean diet: prevention of colorectal cancer. Front Nutr. 2017;4:59. doi:10.3389/fnut.2017.00059
  24. Mohseni R, Mohseni F, Alizadeh S, Abbasi S. The Association of Dietary Approaches to Stop Hypertension (DASH) diet with the risk of colorectal cancer: a meta-analysis of observational studies. Nutr Cancer. 2020;72(5):778-790. doi:10.1080/01635581.2019.1651880
  25. Lieberman DA, Weiss DG, Bond JH, Ahnen DJ, Garewal H, Chejfec G. Use of colonoscopy to screen asymptomatic adults for colorectal cancer. Veterans Affairs Cooperative Study Group 380. N Engl J Med. 2000;343(3):162-168. doi:10.1056/NEJM200007203430301
  26. Developing the Healthy Eating Index (HEI) | EGRP/DCCPS/NCI/NIH. Accessed July 22, 2025. https://epi.grants.cancer.gov/hei/developing.html#2015c
  27. Reeve E, Piccici F, Feairheller DL. Validation of a Mediterranean diet scoring system for intervention based research. J Nutr Med Diet Care. 2021;7(1):053. doi:10.23937/2572-3278/1510053
  28. Günther AL, Liese AD, Bell RA, et al. Association between the dietary approaches to hypertension (DASH) diet and hypertension in youth with diabetes. Hypertension. 2009;53(1):6-12. doi:10.1161/HYPERTENSIONAHA.108.116665
  29. Buckland G, Agudo A, Luján L, et al. Adherence to a Mediterranean diet and risk of gastric adenocarcinoma within the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort study. Am J Clin Nutr. 2010;91(2):381-390. doi:10.3945/ajcn.2009.28209
  30. Rimm EB, Giovannucci EL, Stampfer MJ, Colditz GA, Litin LB, Willett WC. Reproducibility and validity of an expanded self-administered semiquantitative food frequency questionnaire among male health professionals. Am J Epidemiol. 1992;135(10):1114-1126. doi:10.1093/oxfordjournals.aje.a116211
  31. Charlson ME, Pompei P, Ales KL, MacKenzie CR. A new method of classifying prognostic comorbidity in longitudinal studies: development and validation. J Chronic Dis. 1987;40(5):373-383. doi:10.1016/0021-9681(87)90171-8
  32. Lieberman DA, Weiss DG, Harford WV, et al. Five-year colon surveillance after screening colonoscopy. Gastroenterology. 2007;133(4):1077-1085. doi:10.1053/j.gastro.2007.07.006
  33. Lieberman D, Sullivan BA, Hauser ER, et al. Baseline colonoscopy findings associated with 10-year outcomes in a screening cohort undergoing colonoscopy surveillance. Gastroenterology. 2020;158(4):862-874.e8. doi:10.1053/j.gastro.2019.07.052
  34. PROC LOGISTIC: PROC LOGISTIC Statement : SAS/STAT(R) 9.22 User’s Guide. Accessed July 22, 2025. https://support.sas.com/documentation/cdl/en/statug/63347/HTML/default/viewer.htm#statug_logistic_sect004.htm
  35. PROC MULTTEST: PROC MULTTEST Statement : SAS/STAT(R) 9.22 User’s Guide. Accessed July 22, 2025. https://support.sas.com/documentation/cdl/en/statug/63347/HTML/default/viewer.htm#statug_multtest_sect005.htm
  36. Elston DM. Participation bias, self-selection bias, and response bias. J Am Acad Dermatol. Published online June 18, 2021. doi:10.1016/j.jaad.2021.06.025
  37. Sansbury LB, Wanke K, Albert PS, et al. The effect of strict adherence to a high-fiber, high-fruit and -vegetable, and low-fat eating pattern on adenoma recurrence. Am J Epidemiol. 2009;170(5):576-584. doi:10.1093/aje/kwp169
  38. Borgas P, Gonzalez G, Veselkov K, Mirnezami R. Phytochemically rich dietary components and the risk of colorectal cancer: A systematic review and meta-analysis of observational studies. World J Clin Oncol. 2021;12(6):482-499. doi:10.5306/wjco.v12.i6.482
  39. Papadimitriou N, Markozannes G, Kanellopoulou A, et al. An umbrella review of the evidence associating diet and cancer risk at 11 anatomical sites. Nat Commun. 2021;12(1):4579. doi:10.1038/s41467-021-24861-8
  40. Johnston BC, Zeraatkar D, Han MA, et al. Unprocessed red meat and processed meat consumption: dietary guideline recommendations from the nutritional recommendations (NutriRECS) Consortium. Ann Intern Med. 2019;171(10):756-764. doi:10.7326/M19-1621
  41. Kim M, Park K. Dietary fat intake and risk of colorectal cancer: a systematic review and meta-analysis of prospective studies. Nutrients. 2018;10(12):1963. doi:10.3390/nu10121963
  42. Lu Y, Li D, Wang L, et al. Comprehensive investigation on associations between dietary intake and blood levels of fatty acids and colorectal cancer risk. Nutrients. 2023;15(3):730. doi:10.3390/nu15030730
  43. Gherasim A, Arhire LI, Niță O, Popa AD, Graur M, Mihalache L. The relationship between lifestyle components and dietary patterns. Proc Nutr Soc. 2020;79(3):311-323. doi:10.1017/S0029665120006898
  44. Hullings AG, Sinha R, Liao LM, Freedman ND, Graubard BI, Loftfield E. Whole grain and dietary fiber intake and risk of colorectal cancer in the NIH-AARP Diet and Health Study cohort. Am J Clin Nutr. 2020;112(3):603-612. doi:10.1093/ajcn/nqaa161
  45. Ocvirk S, Wilson AS, Appolonia CN, Thomas TK, O’Keefe SJD. Fiber, fat, and colorectal cancer: new insight into modifiable dietary risk factors. Curr Gastroenterol Rep. 2019;21(11):62. doi:10.1007/s11894-019-0725-2
  46. O’Keefe SJD. Diet, microorganisms and their metabolites, and colon cancer. Nat Rev Gastroenterol Hepatol. 2016;13(12):691-706. doi:10.1038/nrgastro.2016.165
  47. The health benefits and side effects of butyrate. Cleveland Clinic. July 11, 2022. Accessed July 22, 2025. https://health.clevelandclinic.org/butyrate-benefits/
  48. Knudsen MD, Wang L, Wang K, et al. Changes in lifestyle factors after endoscopic screening: a prospective study in the United States. Clin Gastroenterol Hepatol. 2022;20(6):e1240-e1249. doi:10.1016/j.cgh.2021.07.014
  49. Thorpe MG, Milte CM, Crawford D, McNaughton SA. Education and lifestyle predict change in dietary patterns and diet quality of adults 55 years and over. Nutr J. 2019;18(1):67. doi:10.1186/s12937-019-0495-6
  50. Chapman K, Ogden J. How do people change their diet?: an exploration into mechanisms of dietary change. J Health Psychol. 2009;14(8):1229-1242. doi:10.1177/1359105309342289
  51. Djoussé L, Petrone AB, Weir NL, et al. Repeated versus single measurement of plasma omega-3 fatty acids and risk of heart failure. Eur J Nutr. 2014;53(6):1403-1408. doi:10.1007/s00394-013-0642-3
  52. Bach-Faig A, Berry EM, Lairon D, et al. Mediterranean diet pyramid today. Science and cultural updates. Public Health Nutr. 2011;14(12A):2274-2284. doi:10.1017/S1368980011002515
  53. Miller PE, Cross AJ, Subar AF, et al. Comparison of 4 established DASH diet indexes: examining associations of index scores and colorectal cancer. Am J Clin Nutr. 2013;98(3):794-803. doi:10.3945/ajcn.113.063602
  54. Krebs-Smith SM, Pannucci TE, Subar AF, et al. Update of the Healthy Eating Index: HEI-2015. J Acad Nutr Diet. 2018;118(9):1591-1602. doi:10.1016/j.jand.2018.05.021
  55. Pehrsson PR, Cutrufelli RL, Gebhardt SE, et al. USDA Database for the Added Sugars Content of Selected Foods. USDA; 2005. www.ars.usda.gov/nutrientdata
Author and Disclosure Information

April R. Williams, PhD, MSa; Thomas S. Redding IV, MSb; Brian A. Sullivan, MD, MHSb,c; Xuejun Qin, PhDb,c; Belinda Ear, MPHb; Kellie J. Sims, PhDb; Elizabeth R. Hauser, PhD, MSb,c; Christina D. Williams, PhD, MPHb,c; Jason A. Dominitz, MD, MHSd,e; David Lieberman, MDf,g

Author affiliations
aVeterans Affairs Boston Healthcare System, Massachusetts
bVeterans Affairs Durham Health Care System, North Carolina
cDuke University, Durham, North Carolina
dUniversity of Washington, Seattle
eVeterans Health Administration, Washington, DC
fVeterans Affairs Portland Health Care System, Oregon
gOregon Health & Science University, Portland

Author disclosures
The authors report no actual or potential conflicts of interest with regard to this article.

Correspondence: April Williams (april.williams9@va.gov)

Fed Pract. 2025;42(suppl 3):S30-S39. Published online August 15, 2025. doi:10.12788/fp.0609

Issue
Federal Practitioner - 42(suppl 3)
Publications
Topics
Page Number
S30-S39
Sections
Author and Disclosure Information

April R. Williams, PhD, MSa; Thomas S. Redding IV, MSb; Brian A. Sullivan, MD, MHSb,c; Xuejun Qin, PhDb,c; Belinda Ear, MPHb; Kellie J. Sims, PhDb; Elizabeth R. Hauser, PhD, MSb,c; Christina D. Williams, PhD, MPHb,c; Jason A. Dominitz, MD, MHSd,e; David Lieberman, MDf,g

Author affiliations
aVeterans Affairs Boston Healthcare System, Massachusetts
bVeterans Affairs Durham Health Care System, North Carolina
cDuke University, Durham, North Carolina
dUniversity of Washington, Seattle
eVeterans Health Administration, Washington, DC
fVeterans Affairs Portland Health Care System, Oregon
gOregon Health & Science University, Portland

Author disclosures
The authors report no actual or potential conflicts of interest with regard to this article.

Correspondence: April Williams (april.williams9@va.gov)

Fed Pract. 2025;42(suppl 3). Published online August 15. doi:10.12788/fp.0609

Author and Disclosure Information

April R. Williams, PhD, MSa; Thomas S. Redding IV, MSb; Brian A. Sullivan, MD, MHSb,c; Xuejun Qin, PhDb,c; Belinda Ear, MPHb; Kellie J. Sims, PhDb; Elizabeth R. Hauser, PhD, MSb,c; Christina D. Williams, PhD, MPHb,c; Jason A. Dominitz, MD, MHSd,e; David Lieberman, MDf,g

Author affiliations
aVeterans Affairs Boston Healthcare System, Massachusetts
bVeterans Affairs Durham Health Care System, North Carolina
cDuke University, Durham, North Carolina
dUniversity of Washington, Seattle
eVeterans Health Administration, Washington, DC
fVeterans Affairs Portland Health Care System, Oregon
gOregon Health & Science University, Portland

Author disclosures
The authors report no actual or potential conflicts of interest with regard to this article.

Correspondence: April Williams (april.williams9@va.gov)

Fed Pract. 2025;42(suppl 3). Published online August 15. doi:10.12788/fp.0609

Article PDF
Article PDF

Screening for colorectal cancer (CRC) with colonoscopy enables the identification and removal of CRC precursors (colonic adenomas) and has been associated with reduced risk of CRC incidence and mortality.1-3 Furthermore, there is consensus that diet and lifestyle may be associated with forestalling CRC pathogenesis at the intermediate adenoma stages.4-7 However, studies have shown that US veterans have poorer diet quality and a higher risk for neoplasia compared with nonveterans, reinforcing the need for tailored clinical approaches.8,9 Combining screening with conversations about modifiable environmental and lifestyle risk factors, such as poor diet, is a highly relevant and possibly easily leveraged prevention for those at high risk. However, there is limited evidence for any particular dietary patterns or dietary features that are most important over time.7

Several dietary components have been shown to be associated with CRC risk,10 either as potentially chemopreventive (fiber, fruits and vegetables,11 dairy,12 supplemental vitamin D,13 calcium,14 and multivitamins15) or carcinogenic (red meat16 and alcohol17). Previous studies of veterans have similarly shown that higher intake of fiber and vitamin D reduced risk, and red meat is associated with an increased risk for finding CRC precursors during colonoscopy.18 However, these dietary categories are often analyzed in isolation. Studying healthy dietary patterns in aggregate may be more clinically relevant and easier to implement for prevention of CRC and its precursors.19-21 Healthy dietary patterns, such as the US Dietary Guidelines for Americans represented by the Healthy Eating Index (HEI), the Mediterranean diet (MD), and the Dietary Approaches to Stop Hypertension (DASH) diet, have been associated with lower risk for chronic disease.22-24 Despite the extant literature, no known studies have compared these dietary patterns for associations with risk of CRC precursor or CRC development among US veterans undergoing long-term screening and follow-up after a baseline colonoscopy.

The objective of this study was to test for associations between baseline scores of healthy dietary patterns and the most severe colonoscopy findings (MSCFs) over ≥ 10 years following a baseline screening colonoscopy in veterans.

Methods

Participants in the Cooperative Studies Program (CSP) #380 cohort study included 3121 asymptomatic veterans aged 50 to 75 years at baseline who had consented to initial screening colonoscopy between 1994 and 1997, with subsequent follow-up and surveillance.25 Prior to their colonoscopy, all participants completed a baseline study survey that included questions about cancer risk factors including family history of CRC, diet, physical activity, and medication use.

Included in this cross-sectional analysis were data from a sample of veteran participants of the CSP #380 cohort with 1 baseline colonoscopy, follow-up surveillance through 2009, a cancer risk factor survey collected at baseline, and complete demographic and clinical indicator data. Excluded from the analysis were 67 participants with insufficient responses to the dietary food frequency questionnaire (FFQ) and 31 participants with missing body mass index (BMI), 3023 veterans.

Measures

MSCF. The outcome of interest in this study was the MSCF recorded across all participant colonoscopies during the study period. MSCF was categorized as either (1) no neoplasia; (2) < 2 nonadvanced adenomas, including small adenomas (diameter < 10 mm) with tubular histology; or (3) advanced neoplasia (AN), which is characterized by adenomas > 10 mm in diameter, with villous histology, with high-grade dysplasia, or CRC.

Dietary patterns. Dietary pattern scores representing dietary quality and calculated based on recommendations of the US Dietary Guidelines for Americans using the HEI, MD, and DASH diets were independent variables.26-28 These 3 dietary patterns were chosen for their hypothesized relationship with CRC risk, but each weighs food categories differently (Appendix 1).22-24,29 Dietary pattern scores were calculated using the CSP #380 self-reported responses to 129 baseline survey questions adapted from a well-established and previously validated semiquantitative FFQ.30 The form was administered by mail twice to a sample of 127 participants at baseline and at 1 year. During this interval, men completed 1-week diet records twice, spaced about 6 months apart. Mean values for intake of most nutrients assessed by the 2 methods were similar. Intraclass correlation coefficients for the baseline and 1-year FFQ-assessed nutrient intakes that ranged from 0.47 for vitamin E (without supplements) to 0.80 for vitamin C (with supplements). Correlation coefficients between the energy-adjusted nutrient intakes were measured by diet records and the 1-year FFQ, which asked about diet during the year encompassing the diet records. Higher raw and percent scores indicated better alignment with recommendations from each respective dietary pattern. Percent scores were calculated as a standardizing method and used in analyses for ease of comparing the dietary patterns. Scoring can be found in Appendix 2.

0825FED-AVAHO-COLON-A10825FED-AVAHO-COLON-A2

Demographic characteristics and clinical indicators. Demographic characteristics included age categories, sex, and race/ethnicity. Clinical indicators included BMI, the number of comorbid conditions used to calculate the Charlson Comorbidity Index, family history of CRC in first-degree relatives, number of follow-up colonoscopies across the study period, and food-based vitamin D intake.31 These variables were chosen for their applicability found in previous CSP #380 cohort studies.18,32,33 Self-reported race and ethnicity were collapsed due to small numbers in some groups. The authors acknowledge these are distinct concepts and the variable has limited utility other than for controlling for systemic racism in the model.

Statistical Analyses

Descriptive statistics were used to describe distributional assumptions for all variables, including demographics, clinical indicators, colonoscopy results, and dietary patterns. Pairwise correlations between the total dietary pattern scores and food category scores were calculated with Pearson correlation (r).

Multinomial logistic regression models were created using SAS procedure LOGISTIC with the outcome of the categorical MSCF (no neoplasia, nonadvanced adenoma, or AN).34 A model was created for each independent predictor variable of interest (ie, the HEI, MD, or DASH percentage-standardized dietary pattern score and each food category comprising each dietary pattern score). All models were adjusted for age, sex, race/ethnicity, BMI, number of comorbidities, family history of CRC, number of follow-up colonoscopies, and estimated daily food-derived vitamin D intake. The demographic and clinical indicators were included in the models as they are known to be associated with CRC risk.18 The number of colonoscopies was included to control for surveillance intensity presuming risk for AN is reduced as polyps are removed. Because colonoscopy findings from an initial screening have unique clinical implications compared with follow- up and surveillance, MSCF was observed in 2 ways in sensitivity analyses: (1) baseline and (2) aggregate follow-up and surveillance only, excluding baseline findings.

Adjusted odds ratios (aORs) and 95% CIs for each of the MSCF outcomes with a reference finding of no neoplasia for the models are presented. We chose not to adjust for multiple comparisons across the different dietary patterns given the correlation between dietary pattern total and category scores but did adjust for multiple comparisons for dietary categories within each dietary pattern. Tests for statistical significance used α= .05 for the dietary pattern total scores and P values for the dietary category scores for each dietary pattern controlled for false discovery rate using the MULTTEST SAS procedure.35 All data manipulations and analyses were performed using SAS version 9.4.

Results

The study included 3023 patients. All were aged 50 to 75 years, 2923 (96.7%) were male and 2532 (83.8%) were non-Hispanic White (Table 1). Most participants were overweight or obese (n = 2535 [83.8%]), 2024 (67.0%) had ≤ 2 comorbidities, and 2602 (86.1%) had no family history of CRC. The MSCF for 1628 patients (53.9%) was no neoplasia, 966 patients (32.0%) was nonadvanced adenoma, and 429 participants (14.2%) had AN.

0825FED-AVAHO-COLON-T1

Mean percent scores were 58.5% for HEI, 38.2% for MD, and 63.1% for the DASH diet, with higher percentages indicating greater alignment with the recommendations for each diet (Table 2). All 3 dietary patterns scores standardized to percentages were strongly and significantly correlated in pairwise comparisons: HEI:MD, r = 0.62 (P < .001); HEI:DASH, r = 0.60 (P < .001); and MD:DASH, r = 0.72 (P < .001). Likewise, food category scores were significantly correlated across dietary patterns. For example, whole grain and fiber values from each dietary score were strongly correlated in pairwise comparisons: HEI Whole Grain:MD Grain, r = 0.64 (P < .001); HEI Whole Grain:DASH Fiber, r = 0.71 (P < .001); and MD Grain:DASH Fiber, r = 0.70 (P < .001).

0825FED-AVAHO-COLON-T2

Associations between individual participants' dietary pattern scores and the outcome of their pooled MSCF from baseline screening and ≥ 10 years of surveillance are presented in Table 3. For each single-point increases in dietary pattern scores (reflecting better dietary quality), aORs for nonadvanced adenoma vs no neoplasia were slightly lower but not statistically significantly: HEI, aOR, 1.00 (95% CI, 0.99-1.01); MD, aOR, 0.98 (95% CI, 0.94-1.02); and DASH, aOR, 0.99 (95% CI, 0.99-1.00). aORs for AN vs no neoplasia were slightly lower for each dietary pattern assessed, and only the MD and DASH scores were significantly different from 1.00: HEI, aOR, 1.00 (95% CI, 0.99-1.01); MD, aOR, 0.95 (95% CI, 0.90-1.00); and DASH, aOR, 0.99 (95% CI, 0.98-1.00).

0825FED-AVAHO-COLON-T3

We observed lower odds for nonadvanced adenoma and AN among all these dietary patterns when there was greater alignment with the recommended intake of whole grains and fiber. In separate models conducted using food categories comprising the dietary patterns as independent variables and after correcting for multiple tests, higher scores for the HEI Refined Grain category were associated with higher odds for nonadvanced adenoma (aOR, 1.03 [95% CI, 1.01-1.05]; P = .01) and AN (aOR, 1.05 [95% CI, 1.02-1.08]; P < .001). Higher scores for the HEI Whole Grain category were associated with lower odds for nonadvanced adenoma (aOR, 0.97 [95% CI, 0.95-0.99]; P = .01) and AN (aOR, 0.96 [95% CI, 0.93-0.99]; P = .01). Higher scores for the MD Grain category were significantly associated with lower odds for nonadvanced adenoma (aOR, 0.44 [95% CI, 0.26-0.75]; P = .002) and AN (aOR, 0.29 [95% CI, 0.14-0.62]; P = .001). The DASH Grains category also was significantly associated with lower odds for AN (aOR, 0.86 [95% CI, 0.78-0.95]; P = .002).

Discussion

In this study of 3023 veterans undergoing first-time screening colonoscopy and ≥ 10 years of surveillance, we found that healthy dietary patterns, as assessed by the MD and DASH diet, were significantly associated with lower risk of AN. Additionally, we identified lower odds for AN and nonadvanced adenoma compared with no neoplasia for higher grain scores for all the dietary patterns studied. Other food categories that comprise the dietary pattern scores had mixed associations with the MSCF outcomes. Several other studies have examined associations between dietary patterns and risk for CRC but to our knowledge, no studies have explored these associations among US veterans.

These results also indicate study participants had better than average (based on a 50% threshold) dietary quality according to the HEI and DASH diet scoring methods we used, but poor dietary quality according to the MD scoring method. The mean HEI scores for the present study were higher than a US Department of Agriculture study by Dong et al that compared dietary quality between veterans and nonveterans using the HEI, for which veterans’ expected HEI score was 45.6 of 100.8 This could be explained by the fact that the participants needed to be healthy to be eligible and those with healthier behaviors overall may have self-selected into the study due to motivation for screening during a time when screening was not yet commonplace. 36 Similarly, participants of the present study had higher adherence to the DASH diet (63.1%) than adolescents with diabetes in a study by Günther et al. Conversely, firefighters who were coached to use a Mediterranean-style dietary pattern and dietary had higher adherence to MD than did participants in this study.27

A closer examination of specific food category component scores that comprise the 3 distinct dietary patterns revealed mixed results from the multinomial modeling, which may have to do with the guideline thresholds used to calculate the dietary scores. When analyzed separately in the logistic regression models for their associations with nonadvanced adenomas and AN compared with no neoplasia, higher MD and DASH fruit scores (but not HEI fruit scores) were found to be significant. Other studies have had mixed findings when attempting to test for associations of fruit intake with adenoma recurrence.10,37

This study had some unexpected findings. Vegetable intake was not associated with nonadvanced adenomas or AN risk. Studies of food categories have consistently found vegetable (specifically cruciferous ones) intake to be linked with lower odds for cancers.38 Likewise, the red meat category, which was only a unique food category in the MD score, was not associated with nonadvanced adenomas or AN. Despite consistent literature suggesting higher intake of red meat and processed meats increases CRC risk, in 2019 the Nutritional Recommendations Consortium indicated that the evidence was weak.39,40 This study showed higher DASH diet scores for low-fat dairy, which were maximized when participants reported at least 50% of their dairy servings per day as being low-fat, had lower odds for AN. Yet, the MD scores for low-fat dairy had no association with either outcome; their calculation was based on total number of servings per week. This difference in findings suggests the fat intake ratio may be more relevant to CRC risk than intake quantity.

The literature is mixed regarding fatty acid intake and CRC risk, which may be relevant to both dairy and meat intake. One systematic review and meta-analysis found dietary fat and types of fatty acid intake had no association with CRC risk.41 However, a more recent meta-analysis that assessed both dietary intake and plasma levels of fatty acids did find some statistically significant differences for various types of fatty acids and CRC risk.42

The findings in the present study that grain intake is associated with lower odds for more severe colonoscopy findings among veterans are notable.43 Lieberman et al, using the CSP #380 data, found that cereal fiber intake was associated with a lower odds for AN compared with hyperplastic polyps (OR, 0.98 [95% CI, 0.96- 1.00]).18 Similarly, Hullings et al determined that older adults in the highest quintile of cereal fiber intake had significantly lower odds of CRC than those in lower odds for CRC when compared with lowest quintile (OR, 0.89 [95% CI, 0.83- 0.96]; P < .001).44 These findings support existing guidance that prioritizes whole grains as a key source of dietary fiber for CRC prevention.

A recent literature review on fiber, fat, and CRC risk suggested a consensus regarding one protective mechanism: dietary fiber from grains modulates the gut microbiota by promoting butyrate synthesis.45 Butyrate is a short-chain fatty acid that supports energy production in colonocytes and has tumor-suppressing properties.46 Our findings suggest there could be more to learn about the relationship between butyrate production and reduction of CRC risk through metabolomic studies that use measurements of plasma butyrate. These studies may examine associations between not just a singular food or food category, but rather food patterns that include fruits, vegetables, nuts and seeds, and whole grains known to promote butyrate production and plasma butyrate.47

Improved understanding of mechanisms and risk-modifying lifestyle factors such as dietary patterns may enhance prevention strategies. Identifying the collective chemopreventive characteristics of a specific dietary pattern (eg, MD) will be helpful to clinicians and health care staff to promote healthy eating to reduce cancer risk. More studies are needed to understand whether such promotion is more clinically applicable and effective for patients, as compared with eating more or less of specific foods (eg, more whole grains, less red meat). Furthermore, considering important environmental factors collectively beyond dietary patterns may offer a way to better tailor screening and implement a variety of lifestyle interventions. In the literature, this is often referred to as a teachable moment when patients’ attentions are captured and may position them to be more receptive to guidance.48

Limitations

This study has several important limitations and leaves opportunities for future studies that explore the role of dietary patterns and AN or CRC risk. First, the FFQ data used to calculate dietary pattern scores used in analysis were only captured at baseline, and there are nearly 3 decades across the study period. However, it is widely assumed that the diets of older adults, like those included in this study, remain stable over time which is appropriate given our sample population was aged 50 to 75 years when the baseline FFQ data were collected.49-51 Additionally, while the HEI is a well-documented, standard scoring method for dietary quality, there are multitudes of dietary pattern scoring approaches for MD and DASH.23,52,53 Finally, findings from this study using the sample of veterans may not be generalizable to a broader population. Future longitudinal studies that test for a clinically significant change threshold are warranted.

Conclusion

Results of this study suggest future research should further explore the effects of dietary patterns, particularly intake of specific food groups in combination, as opposed to individual nutrients or food items, on AN and CRC risk. Possible studies might explore these dietary patterns for their mechanistic role in altering the microbiome metabolism, which may influence CRC outcomes or include diet in a more comprehensive, holistic risk score that could be used to predict colonic neoplasia risk or in intervention studies that assess the effects of dietary changes on long-term CRC prevention. We suggest there are differences in people’s dietary intake patterns that might be important to consider when implementing tailored approaches to CRC risk mitigation.

Screening for colorectal cancer (CRC) with colonoscopy enables the identification and removal of CRC precursors (colonic adenomas) and has been associated with reduced risk of CRC incidence and mortality.1-3 Furthermore, there is consensus that diet and lifestyle may be associated with forestalling CRC pathogenesis at the intermediate adenoma stages.4-7 However, studies have shown that US veterans have poorer diet quality and a higher risk for neoplasia compared with nonveterans, reinforcing the need for tailored clinical approaches.8,9 Combining screening with conversations about modifiable environmental and lifestyle risk factors, such as poor diet, is a highly relevant and possibly easily leveraged prevention for those at high risk. However, there is limited evidence for any particular dietary patterns or dietary features that are most important over time.7

Several dietary components have been shown to be associated with CRC risk,10 either as potentially chemopreventive (fiber, fruits and vegetables,11 dairy,12 supplemental vitamin D,13 calcium,14 and multivitamins15) or carcinogenic (red meat16 and alcohol17). Previous studies of veterans have similarly shown that higher intake of fiber and vitamin D reduced risk, and red meat is associated with an increased risk for finding CRC precursors during colonoscopy.18 However, these dietary categories are often analyzed in isolation. Studying healthy dietary patterns in aggregate may be more clinically relevant and easier to implement for prevention of CRC and its precursors.19-21 Healthy dietary patterns, such as the US Dietary Guidelines for Americans represented by the Healthy Eating Index (HEI), the Mediterranean diet (MD), and the Dietary Approaches to Stop Hypertension (DASH) diet, have been associated with lower risk for chronic disease.22-24 Despite the extant literature, no known studies have compared these dietary patterns for associations with risk of CRC precursor or CRC development among US veterans undergoing long-term screening and follow-up after a baseline colonoscopy.

The objective of this study was to test for associations between baseline scores of healthy dietary patterns and the most severe colonoscopy findings (MSCFs) over ≥ 10 years following a baseline screening colonoscopy in veterans.

Methods

Participants in the Cooperative Studies Program (CSP) #380 cohort study included 3121 asymptomatic veterans aged 50 to 75 years at baseline who had consented to initial screening colonoscopy between 1994 and 1997, with subsequent follow-up and surveillance.25 Prior to their colonoscopy, all participants completed a baseline study survey that included questions about cancer risk factors including family history of CRC, diet, physical activity, and medication use.

Included in this cross-sectional analysis were data from a sample of veteran participants of the CSP #380 cohort with 1 baseline colonoscopy, follow-up surveillance through 2009, a cancer risk factor survey collected at baseline, and complete demographic and clinical indicator data. Excluded from the analysis were 67 participants with insufficient responses to the dietary food frequency questionnaire (FFQ) and 31 participants with missing body mass index (BMI), 3023 veterans.

Measures

MSCF. The outcome of interest in this study was the MSCF recorded across all participant colonoscopies during the study period. MSCF was categorized as either (1) no neoplasia; (2) < 2 nonadvanced adenomas, including small adenomas (diameter < 10 mm) with tubular histology; or (3) advanced neoplasia (AN), which is characterized by adenomas > 10 mm in diameter, with villous histology, with high-grade dysplasia, or CRC.

Dietary patterns. Dietary pattern scores representing dietary quality and calculated based on recommendations of the US Dietary Guidelines for Americans using the HEI, MD, and DASH diets were independent variables.26-28 These 3 dietary patterns were chosen for their hypothesized relationship with CRC risk, but each weighs food categories differently (Appendix 1).22-24,29 Dietary pattern scores were calculated using the CSP #380 self-reported responses to 129 baseline survey questions adapted from a well-established and previously validated semiquantitative FFQ.30 The form was administered by mail twice to a sample of 127 participants at baseline and at 1 year. During this interval, men completed 1-week diet records twice, spaced about 6 months apart. Mean values for intake of most nutrients assessed by the 2 methods were similar. Intraclass correlation coefficients for the baseline and 1-year FFQ-assessed nutrient intakes that ranged from 0.47 for vitamin E (without supplements) to 0.80 for vitamin C (with supplements). Correlation coefficients between the energy-adjusted nutrient intakes were measured by diet records and the 1-year FFQ, which asked about diet during the year encompassing the diet records. Higher raw and percent scores indicated better alignment with recommendations from each respective dietary pattern. Percent scores were calculated as a standardizing method and used in analyses for ease of comparing the dietary patterns. Scoring can be found in Appendix 2.


Demographic characteristics and clinical indicators. Demographic characteristics included age categories, sex, and race/ethnicity. Clinical indicators included BMI, the number of comorbid conditions used to calculate the Charlson Comorbidity Index, family history of CRC in first-degree relatives, number of follow-up colonoscopies across the study period, and food-based vitamin D intake.31 These variables were chosen based on their applicability in previous CSP #380 cohort studies.18,32,33 Self-reported race and ethnicity were collapsed into a single variable due to small numbers in some groups. The authors acknowledge that race and ethnicity are distinct concepts and that this variable has limited utility beyond controlling for the effects of systemic racism in the model.

Statistical Analyses

Descriptive statistics were used to summarize the distributions of all variables, including demographics, clinical indicators, colonoscopy results, and dietary patterns. Pairwise correlations between the total dietary pattern scores and food category scores were calculated with the Pearson correlation coefficient (r).
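
For transparency, a minimal sketch of how the pairwise Pearson correlations could be produced in SAS is shown below; the data set and variable names are assumptions carried over from the hypothetical scoring step above.

```
* Hypothetical sketch: pairwise Pearson correlations among the
  percent-standardized total dietary pattern scores;
proc corr data=scores_pct pearson;
  var hei_pct md_pct dash_pct;
run;
```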

Multinomial logistic regression models were created using the SAS LOGISTIC procedure with the categorical MSCF (no neoplasia, nonadvanced adenoma, or AN) as the outcome.34 A model was created for each independent predictor variable of interest (ie, the HEI, MD, or DASH percentage-standardized dietary pattern score and each food category comprising each dietary pattern score). All models were adjusted for age, sex, race/ethnicity, BMI, number of comorbidities, family history of CRC, number of follow-up colonoscopies, and estimated daily food-derived vitamin D intake. The demographic and clinical indicators were included in the models because they are known to be associated with CRC risk.18 The number of colonoscopies was included to control for surveillance intensity, presuming that risk for AN is reduced as polyps are removed. Because colonoscopy findings from an initial screening have unique clinical implications compared with follow-up and surveillance, MSCF was examined in 2 ways in sensitivity analyses: (1) baseline only and (2) aggregate follow-up and surveillance only, excluding baseline findings.
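
As an illustration of the modeling step, a minimal SAS sketch of one such model (here for the HEI percent score) is shown below. The data set and variable names are hypothetical, and the generalized logit link is inferred from the described multinomial outcome rather than taken from the study's code.

```
* Hypothetical sketch: multinomial (generalized logit) model of MSCF;
* mscf: 0 = no neoplasia (reference), 1 = nonadvanced adenoma, 2 = advanced neoplasia;
proc logistic data=analysis;
  class age_cat sex race_eth famhx_crc / param=ref;
  model mscf(ref='0') = hei_pct age_cat sex race_eth bmi n_comorbid famhx_crc
                        n_followup_colons vitd_food / link=glogit;
run;
```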

Adjusted odds ratios (aORs) and 95% CIs are presented for each MSCF outcome, with no neoplasia as the reference finding. We chose not to adjust for multiple comparisons across the different dietary patterns, given the correlation between dietary pattern total and category scores, but did adjust for multiple comparisons for dietary categories within each dietary pattern. Tests for statistical significance used α = .05 for the dietary pattern total scores, and P values for the dietary category scores within each dietary pattern were controlled for false discovery rate using the SAS MULTTEST procedure.35 All data manipulations and analyses were performed using SAS version 9.4.
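
A minimal sketch of the false discovery rate step follows; the input data set of raw P values (pvals, with a variable named raw_p, which the procedure expects by default) is an assumption.

```
* Hypothetical sketch: Benjamini-Hochberg false discovery rate adjustment
  of the raw P values from the within-pattern food category models;
proc multtest inpvalues=pvals fdr;
run;
```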

Results

The study included 3023 patients. All were aged 50 to 75 years; 2923 (96.7%) were male and 2532 (83.8%) were non-Hispanic White (Table 1). Most participants were overweight or obese (n = 2535 [83.8%]), 2024 (67.0%) had ≤ 2 comorbidities, and 2602 (86.1%) had no family history of CRC. The MSCF was no neoplasia for 1628 participants (53.9%), nonadvanced adenoma for 966 (32.0%), and AN for 429 (14.2%).


Mean percent scores were 58.5% for HEI, 38.2% for MD, and 63.1% for the DASH diet, with higher percentages indicating greater alignment with the recommendations for each diet (Table 2). All 3 dietary pattern scores standardized to percentages were strongly and significantly correlated in pairwise comparisons: HEI:MD, r = 0.62 (P < .001); HEI:DASH, r = 0.60 (P < .001); and MD:DASH, r = 0.72 (P < .001). Likewise, food category scores were significantly correlated across dietary patterns. For example, whole grain and fiber values from each dietary score were strongly correlated in pairwise comparisons: HEI Whole Grain:MD Grain, r = 0.64 (P < .001); HEI Whole Grain:DASH Fiber, r = 0.71 (P < .001); and MD Grain:DASH Fiber, r = 0.70 (P < .001).


Associations between individual participants' dietary pattern scores and their pooled MSCF from baseline screening and ≥ 10 years of surveillance are presented in Table 3. For each single-point increase in dietary pattern score (reflecting better dietary quality), aORs for nonadvanced adenoma vs no neoplasia were slightly lower but not statistically significant: HEI, aOR, 1.00 (95% CI, 0.99-1.01); MD, aOR, 0.98 (95% CI, 0.94-1.02); and DASH, aOR, 0.99 (95% CI, 0.99-1.00). aORs for AN vs no neoplasia were at or slightly below 1.00 for each dietary pattern assessed, and only the MD and DASH scores differed significantly from 1.00: HEI, aOR, 1.00 (95% CI, 0.99-1.01); MD, aOR, 0.95 (95% CI, 0.90-1.00); and DASH, aOR, 0.99 (95% CI, 0.98-1.00).
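
For context, an illustrative calculation (not a reported result): because these aORs are per single percentage point and the model is linear on the log-odds scale, the per-point estimate compounds multiplicatively over larger score differences. Taking the MD point estimate for AN,

$$\mathrm{aOR}_{10\text{-point}} = 0.95^{10} \approx 0.60,$$

a 10-point higher MD score would correspond to roughly 40% lower adjusted odds of AN, although the upper confidence limit at 1.00 warrants caution in extrapolating.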


We observed lower odds for nonadvanced adenoma and AN across all 3 dietary patterns when there was greater alignment with the recommended intake of whole grains and fiber. In separate models using the food categories comprising the dietary patterns as independent variables, and after correcting for multiple tests, higher scores for the HEI Refined Grain category were associated with higher odds for nonadvanced adenoma (aOR, 1.03 [95% CI, 1.01-1.05]; P = .01) and AN (aOR, 1.05 [95% CI, 1.02-1.08]; P < .001). Higher scores for the HEI Whole Grain category were associated with lower odds for nonadvanced adenoma (aOR, 0.97 [95% CI, 0.95-0.99]; P = .01) and AN (aOR, 0.96 [95% CI, 0.93-0.99]; P = .01). Higher scores for the MD Grain category were significantly associated with lower odds for nonadvanced adenoma (aOR, 0.44 [95% CI, 0.26-0.75]; P = .002) and AN (aOR, 0.29 [95% CI, 0.14-0.62]; P = .001). The DASH Grains category also was significantly associated with lower odds for AN (aOR, 0.86 [95% CI, 0.78-0.95]; P = .002).

Discussion

In this study of 3023 veterans undergoing first-time screening colonoscopy and ≥ 10 years of surveillance, we found that healthy dietary patterns, as assessed by the MD and DASH diet, were significantly associated with lower risk of AN. Additionally, we identified lower odds for AN and nonadvanced adenoma (compared with no neoplasia) with higher grain scores for all the dietary patterns studied. Other food categories comprising the dietary pattern scores had mixed associations with the MSCF outcomes. Several other studies have examined associations between dietary patterns and risk for CRC, but to our knowledge, none has explored these associations among US veterans.

These results also indicate study participants had better than average (based on a 50% threshold) dietary quality according to the HEI and DASH diet scoring methods we used, but poor dietary quality according to the MD scoring method. The mean HEI score in the present study was higher than in a US Department of Agriculture study by Dong et al that compared dietary quality between veterans and nonveterans using the HEI, in which veterans’ expected HEI score was 45.6 of 100.8 This could be explained by the fact that participants needed to be healthy to be eligible, and those with healthier behaviors overall may have self-selected into the study due to motivation for screening at a time when screening was not yet commonplace.36 Similarly, participants in the present study had higher adherence to the DASH diet (63.1%) than adolescents with diabetes in a study by Günther et al.28 Conversely, firefighters who were coached to follow a Mediterranean-style dietary pattern had higher adherence to the MD than did participants in this study.27

A closer examination of the specific food category component scores that comprise the 3 distinct dietary patterns revealed mixed results from the multinomial modeling, which may reflect the guideline thresholds used to calculate the dietary scores. When analyzed separately in the logistic regression models for their associations with nonadvanced adenoma and AN compared with no neoplasia, higher MD and DASH fruit scores (but not HEI fruit scores) showed significant associations. Other studies have had mixed findings when testing for associations of fruit intake with adenoma recurrence.10,37

This study had some unexpected findings. Vegetable intake was not associated with nonadvanced adenoma or AN risk, although studies of food categories have consistently found vegetable intake (cruciferous vegetables in particular) to be linked with lower odds for cancers.38 Likewise, the red meat category, which was a unique food category only in the MD score, was not associated with nonadvanced adenoma or AN. Despite consistent literature suggesting higher intake of red meat and processed meats increases CRC risk, in 2019 the Nutritional Recommendations (NutriRECS) Consortium indicated that the evidence was weak.39,40 This study showed that higher DASH diet scores for low-fat dairy, which were maximized when participants reported at least 50% of their daily dairy servings as low-fat, were associated with lower odds for AN. Yet the MD scores for low-fat dairy, which were calculated from the total number of servings per week, had no association with either outcome. This difference in findings suggests the proportion of low-fat dairy intake may be more relevant to CRC risk than intake quantity.

The literature is mixed regarding fatty acid intake and CRC risk, which may be relevant to both dairy and meat intake. One systematic review and meta-analysis found dietary fat and types of fatty acid intake had no association with CRC risk.41 However, a more recent meta-analysis that assessed both dietary intake and plasma levels of fatty acids did find some statistically significant differences for various types of fatty acids and CRC risk.42

The finding in the present study that grain intake is associated with lower odds for more severe colonoscopy findings among veterans is notable.43 Lieberman et al, using the CSP #380 data, found that cereal fiber intake was associated with lower odds for AN compared with hyperplastic polyps (OR, 0.98 [95% CI, 0.96-1.00]).18 Similarly, Hullings et al determined that older adults in the highest quintile of cereal fiber intake had significantly lower odds of CRC than those in the lowest quintile (OR, 0.89 [95% CI, 0.83-0.96]; P < .001).44 These findings support existing guidance that prioritizes whole grains as a key source of dietary fiber for CRC prevention.

A recent literature review on fiber, fat, and CRC risk suggested a consensus regarding one protective mechanism: dietary fiber from grains modulates the gut microbiota by promoting butyrate synthesis.45 Butyrate is a short-chain fatty acid that supports energy production in colonocytes and has tumor-suppressing properties.46 Our findings suggest there could be more to learn about the relationship between butyrate production and reduction of CRC risk through metabolomic studies that use measurements of plasma butyrate. These studies may examine associations between not just a singular food or food category, but rather food patterns that include fruits, vegetables, nuts and seeds, and whole grains known to promote butyrate production and plasma butyrate.47

Improved understanding of mechanisms and risk-modifying lifestyle factors such as dietary patterns may enhance prevention strategies. Identifying the collective chemopreventive characteristics of a specific dietary pattern (eg, MD) will help clinicians and health care staff promote healthy eating to reduce cancer risk. More studies are needed to understand whether such promotion is more clinically applicable and effective for patients than advice to eat more or less of specific foods (eg, more whole grains, less red meat). Furthermore, considering important environmental factors collectively, beyond dietary patterns alone, may offer a way to better tailor screening and implement a variety of lifestyle interventions. In the literature, this is often referred to as a teachable moment, when patients’ attention is captured and they may be more receptive to guidance.48

Limitations

This study has several important limitations that leave opportunities for future studies exploring the role of dietary patterns in AN or CRC risk. First, the FFQ data used to calculate the dietary pattern scores were captured only at baseline, and nearly 3 decades have passed since the start of the study period. However, the diets of older adults, like those included in this study, are widely assumed to remain stable over time, an assumption that is reasonable given that our sample was aged 50 to 75 years when the baseline FFQ data were collected.49-51 Additionally, while the HEI is a well-documented, standard scoring method for dietary quality, there are many scoring approaches for the MD and DASH diets.23,52,53 Finally, findings from this sample of veterans may not be generalizable to a broader population. Future longitudinal studies that test for a clinically significant change threshold are warranted.

Conclusion

Results of this study suggest future research should further explore the effects of dietary patterns, particularly intake of specific food groups in combination rather than individual nutrients or food items, on AN and CRC risk. Future studies might explore the mechanistic role of these dietary patterns in altering microbiome metabolism, which may influence CRC outcomes. Diet could also be incorporated into a more comprehensive, holistic risk score used to predict colonic neoplasia risk, or into intervention studies that assess the effects of dietary changes on long-term CRC prevention. We suggest there are differences in people’s dietary intake patterns that might be important to consider when implementing tailored approaches to CRC risk mitigation.

References
  1. Zauber AG, Winawer SJ, O’Brien MJ, et al. Colonoscopic polypectomy and long-term prevention of colorectal-cancer deaths. N Engl J Med. 2012;366(8):687-696. doi:10.1056/NEJMoa1100370
  2. Nishihara R, Wu K, Lochhead P, et al. Long-term colorectal-cancer incidence and mortality after lower endoscopy. N Engl J Med. 2013;369(12):1095-1105. doi:10.1056/NEJMoa1301969
  3. Bretthauer M, Løberg M, Wieszczy P, et al. Effect of colonoscopy screening on risks of colorectal cancer and related death. N Engl J Med. 2022;387(17):1547-1556. doi:10.1056/NEJMoa2208375
  4. Cottet V, Bonithon-Kopp C, Kronborg O, et al. Dietary patterns and the risk of colorectal adenoma recurrence in a European intervention trial. Eur J Cancer Prev. 2005;14(1):21.
  5. Miller PE, Lesko SM, Muscat JE, Lazarus P, Hartman TJ. Dietary patterns and colorectal adenoma and cancer risk: a review of the epidemiological evidence. Nutr Cancer. 2010;62(4):413-424. doi:10.1080/01635580903407114
  6. Godos J, Bella F, Torrisi A, Sciacca S, Galvano F, Grosso G. Dietary patterns and risk of colorectal adenoma: a systematic review and meta-analysis of observational studies. J Hum Nutr Diet. 2016;29(6):757-767. doi:10.1111/jhn.12395
  7. Haggar FA, Boushey RP. Colorectal cancer epidemiology: incidence, mortality, survival, and risk factors. Clin Colon Rectal Surg. 2009;22(4):191-197. doi:10.1055/s-0029-1242458
  8. Dong D, Stewart H, Carlson AC. An Examination of Veterans’ Diet Quality. U.S. Department of Agriculture, Economic Research Service; 2019:32.
  9. El-Halabi MM, Rex DK, Saito A, Eckert GJ, Kahi CJ. Defining adenoma detection rate benchmarks in average-risk male veterans. Gastrointest Endosc. 2019;89(1):137-143. doi:10.1016/j.gie.2018.08.021
  10. Alberts DS, Hess LM, eds. Fundamentals of Cancer Prevention. Springer International Publishing; 2019. doi:10.1007/978-3-030-15935-1
  11. Dahm CC, Keogh RH, Spencer EA, et al. Dietary fiber and colorectal cancer risk: a nested case-control study using food diaries. J Natl Cancer Inst. 2010;102(9):614-626. doi:10.1093/jnci/djq092
  12. Aune D, Lau R, Chan DSM, et al. Dairy products and colorectal cancer risk: a systematic review and meta-analysis of cohort studies. Ann Oncol. 2012;23(1):37-45. doi:10.1093/annonc/mdr269
  13. Lee JE, Li H, Chan AT, et al. Circulating levels of vitamin D and colon and rectal cancer: the Physicians’ Health Study and a meta-analysis of prospective studies. Cancer Prev Res (Phila). 2011;4(5):735-743. doi:10.1158/1940-6207.CAPR-10-0289
  14. Carroll C, Cooper K, Papaioannou D, Hind D, Pilgrim H, Tappenden P. Supplemental calcium in the chemoprevention of colorectal cancer: a systematic review and meta-analysis. Clin Ther. 2010;32(5):789-803. doi:10.1016/j.clinthera.2010.04.024
  15. Park Y, Spiegelman D, Hunter DJ, et al. Intakes of vitamins A, C, and E and use of multiple vitamin supplements and risk of colon cancer: a pooled analysis of prospective cohort studies. Cancer Causes Control. 2010;21(11):1745-1757. doi:10.1007/s10552-010-9549-y
  16. Alexander DD, Weed DL, Miller PE, Mohamed MA. Red meat and colorectal cancer: a quantitative update on the state of the epidemiologic science. J Am Coll Nutr. 2015;34(6):521-543. doi:10.1080/07315724.2014.992553
  17. Park SY, Wilkens LR, Setiawan VW, Monroe KR, Haiman CA, Le Marchand L. Alcohol intake and colorectal cancer risk in the multiethnic cohort study. Am J Epidemiol. 2019;188(1):67-76. doi:10.1093/aje/kwy208
  18. Lieberman DA. Risk factors for advanced colonic neoplasia and hyperplastic polyps in asymptomatic individuals. JAMA. 2003;290(22):2959. doi:10.1001/jama.290.22.2959
  19. Archambault AN, Jeon J, Lin Y, et al. Risk stratification for early-onset colorectal cancer using a combination of genetic and environmental risk scores: an international multi-center study. J Natl Cancer Inst. 2022;114(4):528-539. doi:10.1093/jnci/djac003
  20. Carr PR, Weigl K, Edelmann D, et al. Estimation of absolute risk of colorectal cancer based on healthy lifestyle, genetic risk, and colonoscopy status in a population-based study. Gastroenterology. 2020;159(1):129-138.e9. doi:10.1053/j.gastro.2020.03.016
  21. Sullivan BA, Qin X, Miller C, et al. Screening colonoscopy findings are associated with noncolorectal cancer mortality. Clin Transl Gastroenterol. 2022;13(4):e00479. doi:10.14309/ctg.0000000000000479
  22. Erben V, Carr PR, Holleczek B, Stegmaier C, Hoffmeister M, Brenner H. Dietary patterns and risk of advanced colorectal neoplasms: A large population based screening study in Germany. Prev Med. 2018;111:101-109. doi:10.1016/j.ypmed.2018.02.025
  23. Donovan MG, Selmin OI, Doetschman TC, Romagnolo DF. Mediterranean diet: prevention of colorectal cancer. Front Nutr. 2017;4:59. doi:10.3389/fnut.2017.00059
  24. Mohseni R, Mohseni F, Alizadeh S, Abbasi S. The association of Dietary Approaches to Stop Hypertension (DASH) diet with the risk of colorectal cancer: a meta-analysis of observational studies. Nutr Cancer. 2020;72(5):778-790. doi:10.1080/01635581.2019.1651880
  25. Lieberman DA, Weiss DG, Bond JH, Ahnen DJ, Garewal H, Chejfec G. Use of colonoscopy to screen asymptomatic adults for colorectal cancer. Veterans Affairs Cooperative Study Group 380. N Engl J Med. 2000;343(3):162-168. doi:10.1056/NEJM200007203430301
  26. Developing the Healthy Eating Index (HEI). EGRP/DCCPS/NCI/NIH. Accessed July 22, 2025. https://epi.grants.cancer.gov/hei/developing.html#2015c
  27. Reeve E, Piccici F, Feairheller DL. Validation of a Mediterranean diet scoring system for intervention based research. J Nutr Med Diet Care. 2021;7(1):053. doi:10.23937/2572-3278/1510053
  28. Günther AL, Liese AD, Bell RA, et al. Association between the dietary approaches to hypertension (DASH) diet and hypertension in youth with diabetes. Hypertension. 2009;53(1):6-12. doi:10.1161/HYPERTENSIONAHA.108.116665
  29. Buckland G, Agudo A, Luján L, et al. Adherence to a Mediterranean diet and risk of gastric adenocarcinoma within the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort study. Am J Clin Nutr. 2010;91(2):381-390. doi:10.3945/ajcn.2009.28209
  30. Rimm EB, Giovannucci EL, Stampfer MJ, Colditz GA, Litin LB, Willett WC. Reproducibility and validity of an expanded self-administered semiquantitative food frequency questionnaire among male health professionals. Am J Epidemiol. 1992;135(10):1114-1126. doi:10.1093/oxfordjournals.aje.a116211
  31. Charlson ME, Pompei P, Ales KL, MacKenzie CR. A new method of classifying prognostic comorbidity in longitudinal studies: development and validation. J Chronic Dis. 1987;40(5):373-383. doi:10.1016/0021-9681(87)90171-8
  32. Lieberman DA, Weiss DG, Harford WV, et al. Five-year colon surveillance after screening colonoscopy. Gastroenterology. 2007;133(4):1077-1085. doi:10.1053/j.gastro.2007.07.006
  33. Lieberman D, Sullivan BA, Hauser ER, et al. Baseline colonoscopy findings associated with 10-year outcomes in a screening cohort undergoing colonoscopy surveillance. Gastroenterology. 2020;158(4):862-874.e8. doi:10.1053/j.gastro.2019.07.052
  34. PROC LOGISTIC: PROC LOGISTIC Statement. SAS/STAT(R) 9.22 User’s Guide. Accessed July 22, 2025. https://support.sas.com/documentation/cdl/en/statug/63347/HTML/default/viewer.htm#statug_logistic_sect004.htm
  35. PROC MULTTEST: PROC MULTTEST Statement. SAS/STAT(R) 9.22 User’s Guide. Accessed July 22, 2025. https://support.sas.com/documentation/cdl/en/statug/63347/HTML/default/viewer.htm#statug_multtest_sect005.htm
  36. Elston DM. Participation bias, self-selection bias, and response bias. J Am Acad Dermatol. Published online June 18, 2021. doi:10.1016/j.jaad.2021.06.025
  37. Sansbury LB, Wanke K, Albert PS, et al. The effect of strict adherence to a high-fiber, high-fruit and -vegetable, and low-fat eating pattern on adenoma recurrence. Am J Epidemiol. 2009;170(5):576-584. doi:10.1093/aje/kwp169
  38. Borgas P, Gonzalez G, Veselkov K, Mirnezami R. Phytochemically rich dietary components and the risk of colorectal cancer: a systematic review and meta-analysis of observational studies. World J Clin Oncol. 2021;12(6):482-499. doi:10.5306/wjco.v12.i6.482
  39. Papadimitriou N, Markozannes G, Kanellopoulou A, et al. An umbrella review of the evidence associating diet and cancer risk at 11 anatomical sites. Nat Commun. 2021;12(1):4579. doi:10.1038/s41467-021-24861-8
  40. Johnston BC, Zeraatkar D, Han MA, et al. Unprocessed red meat and processed meat consumption: dietary guideline recommendations from the nutritional recommendations (NutriRECS) Consortium. Ann Intern Med. 2019;171(10):756-764. doi:10.7326/M19-1621
  41. Kim M, Park K. Dietary fat intake and risk of colorectal cancer: a systematic review and meta-analysis of prospective studies. Nutrients. 2018;10(12):1963. doi:10.3390/nu10121963
  42. Lu Y, Li D, Wang L, et al. Comprehensive investigation on associations between dietary intake and blood levels of fatty acids and colorectal cancer risk. Nutrients. 2023;15(3):730. doi:10.3390/nu15030730
  43. Gherasim A, Arhire LI, Niță O, Popa AD, Graur M, Mihalache L. The relationship between lifestyle components and dietary patterns. Proc Nutr Soc. 2020;79(3):311-323. doi:10.1017/S0029665120006898
  44. Hullings AG, Sinha R, Liao LM, Freedman ND, Graubard BI, Loftfield E. Whole grain and dietary fiber intake and risk of colorectal cancer in the NIH-AARP Diet and Health Study cohort. Am J Clin Nutr. 2020;112(3):603-612. doi:10.1093/ajcn/nqaa161
  45. Ocvirk S, Wilson AS, Appolonia CN, Thomas TK, O’Keefe SJD. Fiber, fat, and colorectal cancer: new insight into modifiable dietary risk factors. Curr Gastroenterol Rep. 2019;21(11):62. doi:10.1007/s11894-019-0725-2
  46. O’Keefe SJD. Diet, microorganisms and their metabolites, and colon cancer. Nat Rev Gastroenterol Hepatol. 2016;13(12):691-706. doi:10.1038/nrgastro.2016.165
  47. The health benefits and side effects of butyrate. Cleveland Clinic. July 11, 2022. Accessed July 22, 2025. https://health.clevelandclinic.org/butyrate-benefits/
  48. Knudsen MD, Wang L, Wang K, et al. Changes in lifestyle factors after endoscopic screening: a prospective study in the United States. Clin Gastroenterol Hepatol. 2022;20(6):e1240-e1249. doi:10.1016/j.cgh.2021.07.014
  49. Thorpe MG, Milte CM, Crawford D, McNaughton SA. Education and lifestyle predict change in dietary patterns and diet quality of adults 55 years and over. Nutr J. 2019;18(1):67. doi:10.1186/s12937-019-0495-6
  50. Chapman K, Ogden J. How do people change their diet?: an exploration into mechanisms of dietary change. J Health Psychol. 2009;14(8):1229-1242. doi:10.1177/1359105309342289
  51. Djoussé L, Petrone AB, Weir NL, et al. Repeated versus single measurement of plasma omega-3 fatty acids and risk of heart failure. Eur J Nutr. 2014;53(6):1403-1408. doi:10.1007/s00394-013-0642-3
  52. Bach-Faig A, Berry EM, Lairon D, et al. Mediterranean diet pyramid today. Science and cultural updates. Public Health Nutr. 2011;14(12A):2274-2284. doi:10.1017/S1368980011002515
  53. Miller PE, Cross AJ, Subar AF, et al. Comparison of 4 established DASH diet indexes: examining associations of index scores and colorectal cancer. Am J Clin Nutr. 2013;98(3):794-803. doi:10.3945/ajcn.113.063602
  54. Krebs-Smith SM, Pannucci TE, Subar AF, et al. Update of the Healthy Eating Index: HEI-2015. J Acad Nutr Diet. 2018;118(9):1591-1602. doi:10.1016/j.jand.2018.05.021
  55. Pehrsson PR, Cutrufelli RL, Gebhardt SE, et al. USDA Database for the Added Sugars Content of Selected Foods. USDA; 2005. www.ars.usda.gov/nutrientdata
Issue
Federal Practitioner - 42(suppl 3)
Page Number
S30-S39

Earlier Vaccinations Helped Limit Marine Adenovirus Outbreak

Article Type
Changed
Tue, 08/05/2025 - 14:33
Display Headline

Earlier Vaccinations Helped Limit Marine Adenovirus Outbreak

During an adenovirus (AdV) outbreak among recruits and staff at the Marine Corps Recruit Depot (MCRD) in San Diego, an investigation revealed that the earlier individuals at the site received vaccination, the better the outcomes. The clinical team found that accelerating the vaccination schedule could help prevent further outbreaks, medical separations, and training disruption.

From July 1, 2024, through September 23, 2024, a total of 212 trainees and staff developed AdV infection and 28 were hospitalized. Nine patients were hospitalized with AdV pneumonia within a 2-week period; 3 were admitted to the intensive care unit. Outpatient acute respiratory disease (ARD) cases also increased, with recruits accounting for nearly 97% of the AdV outbreak cases.

AdV is a frequent cause of illness among military recruits. Research has found that up to 80% of cases of febrile ARD in recruits are due to AdV, and 20% result in hospitalization. 

The military developed and implemented a live, oral vaccine against AdV serotypes 4 and 7 (most common in recruits) starting in the 1970s, reducing febrile respiratory illness in recruit training sites by 50% and AdV infection by > 90%. However, the manufacturer halted production of the vaccine in 1995. By 1999, vaccine supply was depleted, and ARD cases rose. A replacement vaccine introduced in 2011 proved 99% effective, leading to a dramatic 100-fold decline in AdV disease among military trainees. 

While the vaccine is effective, outbreaks are still possible among closely congregating groups like military trainees. AdV pneumonia cases spiked as the virus spread through the training companies and into new companies when they arrived at the MCRD in early July 2024. Most new infections were in recruits who had missed the AdV vaccination day.

Early symptoms of AdV may be very mild, and some recruits were likely already symptomatic when vaccinated. Aggressive environmental cleaning, separation of sick and well recruits, masking, and other nonpharmaceutical interventions did not slow the spread.

The preventive medicine and public health teams noted that AdV vaccination was being administered 11 days postarrival to allow for pregnancy testing and for assessing vaccine titers. US Department of Defense regulations do not dictate precise vaccination schedules, and implementation varies among military training sites.

After reviewing other training sites’ vaccine timing schedules (most required vaccination by day 6 postarrival) and determining the time required for immunity, the medical teams at MCRD recommended shifting AdV vaccine administration, along with other standard vaccines, from day 11 to day 1 postarrival. Two weeks after the schedule change, overall incidence began declining rapidly.

Nearly 75% of patients had coinfections with other respiratory pathogens, most notably seasonal coronaviruses, COVID-19, and rhinovirus/enterovirus, suggesting that infection with AdV may increase susceptibility to other viruses, a finding that has not been identified in previous AdV outbreaks. Newly increased testing sensitivity associated with multiplex respiratory pathogen PCR availability may have been a factor in coinfection identification during this outbreak.

AdV is a significant medical threat to military recruits. Early vaccination, the investigators advise, should remain “a central tenet for prevention and control of communicable diseases in these high-risk, congregate settings.”


Alarming Rise in Early-Onset GI Cancers Calls for Early Screening, Lifestyle Change

Article Type
Changed
Fri, 08/08/2025 - 15:30

Early-onset gastrointestinal (GI) cancers diagnosed before age 50 are rising at alarming rates worldwide, underscoring the need for enhanced prevention strategies and early detection, said the authors of a JAMA review.

In the US, early-onset GI cancers are increasing faster than any other type of early-onset cancer, including breast cancer. The trend is not limited to colorectal cancer (CRC): gastric, pancreatic, and esophageal cancers, as well as many biliary tract and appendix cancers, are also on the rise in young adults, Kimmie Ng, MD, MPH, and Thejus Jayakrishnan, MD, both with Dana-Farber Cancer Institute, Boston, noted in their article.


The increase in early-onset GI cancers follows a “birth cohort effect,” with generational variation in risk, suggesting a potential association with changes in environmental exposures, Ng explained in an accompanying JAMA podcast.

All these GI cancers link strongly to multiple modifiable risk factors, and it is a “top area of investigation to determine exactly what environmental exposures are at play,” Ng added.

For many of these GI cancers, obesity has been the “leading hypothesis” given that rising rates seem to parallel the increase in incidence of these early-onset GI cancers, Ng explained.

“But we also have evidence, particularly strong for colorectal cancer, that dietary patterns, such as consuming a Western diet, as well as sedentary behavior and lifestyles seem to be associated with a significantly higher risk of developing these cancers at an age under 50,” Ng said.

 

Rising Incidence 

Globally, among early-onset GI cancers reported in 2022, CRC was the most common (54%), followed by gastric cancer (24%), esophageal cancer (13%), and pancreatic cancer (9%).

In the US in 2022, 20,805 individuals were diagnosed with early-onset CRC, 2689 with early-onset gastric cancer, 2657 with early-onset pancreatic cancer, and 875 with early-onset esophageal cancer.

Since the mid-1990s, CRC incidence among adults of all ages in the US has declined by 1.3%-4.2% annually, but early-onset CRC has increased by roughly 2% per year in both men and women and currently makes up about 14% of all CRC cases.

Early-onset pancreatic cancer and esophageal cancer each currently make up about 5% of all cases of these cancers in the US.

Between 2010 and 2019, the number of newly diagnosed cases of early-onset GI cancers rose by about 15%, with women and individuals of Black, Hispanic, or Indigenous ancestry disproportionately affected, Ng and coauthors noted in a related review published in the British Journal of Surgery.

 

Modifiable and Nonmodifiable Risk Factors 

Along with obesity and poor diet, other modifiable risk factors for early-onset GI cancers include sedentary lifestyle, cigarette smoking, and alcohol consumption.

Nonmodifiable risk factors include family history, hereditary cancer syndromes such as Lynch syndrome, and inflammatory bowel disease.

Roughly 15%-30% of early-onset GI cancers have pathogenic germline variants in genes such as DNA mismatch repair genes and BRCA1/2.

All individuals with early-onset GI cancers should undergo germline and somatic genetic testing to guide treatment, screen for other cancers (eg, endometrial cancer in Lynch syndrome), and assess familial risk, Ng and Jayakrishnan advised.

 

Treatment Challenges

Treatment for early-onset GI cancers is generally similar to that for later-onset GI cancers, and prognosis for patients with early-onset GI cancers is “similar to or worse” than that for patients with later-onset GI cancers, highlighting the need for improved methods of prevention and early detection, the authors said.

Ng noted that younger cancer patients often face more challenges after diagnosis than older patients and benefit from multidisciplinary care, including referral for fertility counseling and preservation if appropriate, and psychosocial support.

“It is very difficult and challenging to receive a cancer diagnosis no matter what age you are, but when a person is diagnosed in their 20s, 30s, or 40s, there are unique challenges,” Ng said.

Studies have documented “much higher levels of psychosocial distress, depression and anxiety” in early-onset cancer patients, “and they also often experience more financial toxicity, disruptions in their education as well as their career and there may be fertility concerns,” Ng added.

 

Diagnostic Delays and Screening

Currently, screening is not recommended for most early-onset GI cancers. The exception is CRC, for which screening is recommended for average-risk adults in the US starting at age 45.

Yet, despite this recommendation, fewer than 1 in 5 (19.7%) US adults aged 45-49 years were screened in 2021, indicating a significant gap in early detection efforts.

High-risk individuals, such as those with Lynch syndrome, a first-degree relative with CRC, or advanced colorectal adenoma, should begin CRC screening earlier, at an age determined by the specific risk factor.

“Studies have shown significant delays in diagnosis among younger patients. It’s important that prompt diagnosis happens so that these patients do not end up being diagnosed with advanced or metastatic stages of cancer, as they often are,” Ng said.

“Screening adherence is absolutely critical,” co-author Jayakrishnan added in a news release.

“We have strong evidence that colorectal cancer screening saves lives by reducing both the number of people who develop colorectal cancer and the number of people who die from it. Each missed screening is a lost opportunity to detect cancer early when it is more treatable, or to prevent cancer altogether by identifying and removing precancerous polyps,” Jayakrishnan said.

This research had no funding. Ng reported receipt of nonfinancial support from Pharmavite, institutional grants from Janssen, and personal fees from Bayer, Seagen, GlaxoSmithKline, Pfizer, CytomX, Jazz Pharmaceuticals, Revolution Medicines, Redesign Health, AbbVie, Etiome, and CRICO. Ng is an associate editor of JAMA but was not involved in any of the decisions regarding review of the manuscript or its acceptance. Jayakrishnan had no disclosures.

A version of this article appeared on Medscape.com.


Sterile Water Bottles Deemed Unnecessary for Endoscopy


Like diners saving on drinks, endoscopists can safely forgo sterile water in favor of tap, reducing both environmental and financial costs, according to a recent narrative review.

“No direct evidence supports the recommendation and widespread use of sterile water during gastrointestinal endoscopy procedures,” lead author Deepak Agrawal, MD, chief of gastroenterology & hepatology at the Dell Medical School, The University of Texas at Austin, and colleagues wrote in Gastro Hep Advances. “Guidelines recommending sterile water during endoscopy are based on limited evidence and mostly expert opinions.”

Dr. Deepak Agrawal



After reviewing the literature back to 1975, Dr. Agrawal and colleagues considered the use of sterile water in endoscopy via three frameworks: medical evidence and guidelines, environmental and broader health effects, and financial costs.

Only 2 studies – both from the 1990s – directly compared sterile and tap water use in endoscopy. Neither showed an increased risk of infection from tap water. In fact, some cultures from allegedly sterile water bottles grew pathogenic bacteria, while no patient complications were reported in either study.

“The recommendations for sterile water contradict observations in other medical care scenarios, for example, for the irrigation of open wounds,” Dr. Agrawal and colleagues noted. “Similarly, there is no benefit in using sterile water for enteral feeds in immunosuppressed patients, and tap water enemas are routinely acceptable for colon cleansing before sigmoidoscopies in all patients, irrespective of immune status.”

Current guidelines, including the 2021 US multisociety guideline on reprocessing flexible GI endoscopes and accessories, recommend sterile water for procedures involving mucosal penetration but acknowledge low-quality supporting evidence. These recommendations are based on outdated studies, some unrelated to GI endoscopy, Dr. Agrawal and colleagues pointed out, and rely heavily on cross-referenced opinion statements rather than clinical data.

They went on to suggest a concerning possibility: all those plastic bottles may actually cause more health problems than prevent them. The review estimates that the production and transportation of sterile water bottles contributes over 6,000 metric tons of emissions per year from US endoscopy units alone. What’s more, as discarded bottles break down, they release greenhouse gases and microplastics, the latter of which have been linked to cardiovascular disease, inflammatory bowel disease, and endocrine disruption.

Dr. Agrawal and colleagues also underscored the financial toxicity of sterile water bottles. Considering a 1-liter bottle of sterile water costs $3-10, an endoscopy unit performing 30 procedures per day spends approximately $1,000-3,000 per month on bottled water alone. Scaled nationally, the routine use of sterile water costs tens of millions of dollars each year, not counting indirect expenses associated with stocking and waste disposal.
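
As a rough illustration of how that estimate scales (not a calculation from the review itself), the monthly figure follows from three inputs: procedures per day, bottles used per procedure, and cost per bottle. The parameter values below are assumptions chosen only to bracket the review’s cited range.

```python
# Back-of-the-envelope calculator for monthly sterile water bottle spend.
# All parameter values are illustrative assumptions, not figures from the review.

def monthly_bottle_cost(procedures_per_day: float,
                        bottles_per_procedure: float,
                        cost_per_bottle: float,
                        working_days_per_month: int = 21) -> float:
    """Estimated monthly spend on sterile water bottles for one endoscopy unit."""
    bottles_used = procedures_per_day * bottles_per_procedure * working_days_per_month
    return bottles_used * cost_per_bottle

# Assuming one bottle shared across two procedures, 30 procedures/day, and the
# $3-$10 per-bottle range cited in the review:
print(f"low end:  ${monthly_bottle_cost(30, 0.5, 3.0):,.0f}/month")   # ~$945
print(f"high end: ${monthly_bottle_cost(30, 0.5, 10.0):,.0f}/month")  # ~$3,150
```

Under those assumptions the output lands roughly on the review’s $1,000-3,000 monthly figure; a unit opening a fresh bottle for every procedure would spend proportionally more.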

Weighing the dubious clinical upside against the apparent environmental and financial downsides, Dr. Agrawal and colleagues urged endoscopy units to rethink routine sterile water use.

They proposed a pragmatic model: start the day with a new sterile or reusable bottle, refill with tap water for subsequent cases, and recycle the bottle at day’s end. Institutions should ensure their tap water meets safety standards, they added, such as those outlined in the Joint Commission’s 2022 R3 Report on standards for water management.

Dr. Agrawal and colleagues also called on GI societies to revise existing guidance to reflect today’s clinical and environmental realities. Until strong evidence supports the need for sterile water, they wrote, the smarter, safer, and more sustainable option may be simply turning on the tap.

The investigators disclosed relationships with Guardant, Exact Sciences, Freenome, and others.
 

‘Back to Basics’ on Water

In an editorial accompanying the study and in comments to GI & Hepatology News, Seth A. Gross, MD, of NYU Langone Health urged gastroenterologists to reconsider the use of sterile water in endoscopy.

Dr. Seth A. Gross

While the rationale for bottled water has centered on infection prevention, Gross argued that the evidence does not hold up, noting that this practice contradicts modern values around sustainability and evidence-based care.



The two relevant clinical studies comparing sterile versus tap water in endoscopy are almost 30 years old, he said, and neither detected an increased risk of infection with tap water, leading both to conclude that tap water is “safe and practical” for routine endoscopy.



Gross also pointed out the inconsistency of sterile water use in medical practice, noting that tap water is acceptable in procedures with higher infection risk than endoscopy.



“Lastly,” he added, “most people drink tap water and not sterile water on a daily basis without outbreaks of gastroenteritis from bacterial infections.”



Gross’s comments went beyond the data to emphasize the obvious but overlooked environmental impacts of sterile water bottles. He offered several ambitious suggestions for making medicine more ecofriendly, like reducing travel to conferences, increasing the availability of telehealth, and choosing reusable devices over disposables.



But “what’s hiding in plain sight,” he said, “is our use of sterile water.”



While acknowledging that some patients, like those who are immunocompromised, might still warrant sterile water, Gross supported the review’s recommendation to use tap water instead. He called on GI societies and regulatory bodies to re-examine current policy and pursue updated guidance.



“Sometimes going back to the basics,” he concluded, “could be the most innovative strategy with tremendous impact.”



 

Seth A. Gross, MD, AGAF, is clinical chief in the Division of Gastroenterology & Hepatology at NYU Langone Health, and professor at the NYU Grossman School of Medicine, both in New York City. He reported no conflicts of interest.


Cirrhosis Mortality Prediction Boosted by Machine Learning


Among hospitalized patients with cirrhosis, a machine learning (ML) model enhanced mortality prediction compared with traditional methods and was consistent across country income levels in a large global study.

“This highly inclusive, representative, and globally derived model has been externally validated,” Jasmohan Bajaj, MD, AGAF, professor of medicine at Virginia Commonwealth University in Richmond, Virginia, told GI & Hepatology News. “This gives us a crystal ball. It helps hospital teams, transplant centers, gastroenterology and intensive care unit services triage and prioritize patients more effectively.”

Dr. Jasmohan Bajaj



The study supporting the model, which Bajaj said “could be used at this stage,” was published online in Gastroenterology. The model is available for download at https://silveys.shinyapps.io/app_cleared/.

 

CLEARED Cohort Analyzed

Wide variations across the world regarding available resources, outpatient services, reasons for admission, and etiologies of cirrhosis can influence patient outcomes, according to Bajaj and colleagues. Therefore, they sought to use ML approaches to improve prognostication for all countries.

They analyzed admission-day data from the prospective Chronic Liver Disease Evolution And Registry for Events and Decompensation (CLEARED) consortium, which includes inpatients with cirrhosis enrolled from six continents. The analysis compared ML approaches with logistic regression to predict inpatient mortality.

The researchers performed internal validation (75/25 split) and subdivision by World Bank income status: low/low-middle (L-LMIC), upper middle (UMIC), and high (HIC). They determined that the ML model with the best area under the curve (AUC) would be externally validated in a US veteran cirrhosis inpatient population.
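
For readers who want the shape of that comparison, here is a minimal sketch using scikit-learn: a stratified 75/25 split, then test-set AUC for logistic regression versus a random forest. The synthetic data merely stands in for the CLEARED admission-day covariates; nothing here reproduces the study’s variables or results.

```python
# Sketch of the 75/25 validation and AUC comparison described above.
# Synthetic data only; the real analysis used CLEARED admission-day covariates.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# ~11% positive class, mirroring the cohort's inpatient mortality rate
X, y = make_classification(n_samples=7239, n_features=67, weights=[0.89],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          stratify=y, random_state=0)

for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("random forest", RandomForestClassifier(n_estimators=500,
                                                             random_state=0))]:
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: test AUC = {auc:.3f}")
```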

The CLEARED cohort included 7239 cirrhosis inpatients (mean age, 56 years; 64% men; median MELD-Na, 25) from 115 centers globally; 22.5% of centers belonged to L-LMICs, 41% to UMICs, and 34% to HICs.

A total of 808 patients (11.1%) died in the hospital.

Random forest analysis showed the best AUC (0.815) with high calibration. This was significantly better than parametric logistic regression (AUC, 0.774) and LASSO (AUC, 0.787) models.

The random forest model also outperformed logistic regression regardless of country income level: HIC (AUC, 0.806), UMIC (AUC, 0.867), and L-LMIC (AUC, 0.768).

Of the top 15 variables selected by the random forest, admission for acute kidney injury, hepatic encephalopathy, high MELD-Na and white blood cell count, and not being in a high-income country were the most predictive of mortality.

In contrast, higher albumin, hemoglobin, diuretic use on admission, viral etiology, and being in a high-income country were most protective.

The random forest model was validated in 28,670 veterans (mean age, 67 years; 96% men; median MELD-Na, 15), with an inpatient mortality of 4% (1158 patients).

The final random forest model, using 48 of the 67 original covariates, attained a strong AUC of 0.859. A refit version using only the top 15 variables achieved a comparable AUC of 0.851.
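
The top-15 refit follows the usual pattern of ranking features by importance and retraining on the truncated set. Continuing the synthetic sketch above (this snippet reuses X_tr, X_te, y_tr, and y_te from it), a hedged illustration:

```python
# Rank features by random-forest importance, keep the top 15, and refit.
# Assumes X_tr, X_te, y_tr, y_te from the previous synthetic sketch.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

rf_full = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
top15 = np.argsort(rf_full.feature_importances_)[::-1][:15]  # most important first

rf_top15 = RandomForestClassifier(n_estimators=500, random_state=0)
rf_top15.fit(X_tr[:, top15], y_tr)
auc = roc_auc_score(y_te, rf_top15.predict_proba(X_te[:, top15])[:, 1])
print(f"top-15 refit: test AUC = {auc:.3f}")
```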

 

Clinical Relevance

“Cirrhosis and resultant organ failures remain a dynamic and multidisciplinary problem,” Bajaj noted. “Machine learning techniques are one part of a multi-faceted management strategy that is required in this population.”

If patients fall into the high-risk category, he said, “careful consultation with patients, families, and clinical teams is needed before providing information, including where this model was derived from. The results of these discussions could be instructive regarding decisions for transfer, more aggressive monitoring/ICU transfer, palliative care or transplant assessments.”

Meena B. Bansal, MD, system chief, Division of Liver Diseases, Mount Sinai Health System in New York City, called the tool “very promising.” However, she told GI & Hepatology News, “it was validated on a VA [Veterans Affairs] cohort, which is a bit different than the cohort of patients seen at Mount Sinai. Therefore, validation in more academic tertiary care medical centers with high volume liver transplant would be helpful.”

Dr. Meena B. Bansal

 

Furthermore, said Bansal, who was not involved in the study, “they excluded those receiving a liver transplant, and while only a small number, this is an important limitation.”

Nevertheless, she added, “Artificial intelligence has great potential in predictive risk models and will likely be a tool that assists for risk stratification, clinical management, and hopefully improved clinical outcomes.”

This study was partly supported by a VA Merit review to Bajaj and the National Center for Advancing Translational Sciences, National Institutes of Health. No conflicts of interest were reported by any author.

A version of this article appeared on Medscape.com.


Colonoscopy Costs Rise When Private Equity Acquires GI Practices, but Quality Does Not


Private equity (PE) acquisition of gastroenterology (GI) practices led to higher colonoscopy prices, utilization, and spending with no commensurate effect on quality, an economic analysis found. Price increases ranged from about 5% to about 7%.

In view of the growing trend to such acquisitions, policy makers should monitor the impact of PE investment in medical practices, according to researchers led by health economist Daniel R. Arnold, PhD, of the Department of Health Services, Policy & Practice in the School of Public Health at Brown University in Providence, Rhode Island. “In a previous study of ours, gastroenterology stood out as a particularly attractive specialty to private equity,” Arnold told GI & Hepatology News.

Dr. Daniel R. Arnold



Published in JAMA Health Forum, the economic evaluation of more than 1.1 million patients and 1.3 million colonoscopies concluded that PE acquisitions of GI sites are difficult to justify.

 

The Study

This difference-in-differences event study and economic evaluation analyzed data from US GI practices acquired by PE firms from 2015 to 2021. Commercial insurance claims covering more than 50 million enrollees were used to calculate price, spending, utilization, and quality measures from 2012 to 2021, with all data analyzed from April to September 2024.
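
To make the design concrete, a difference-in-differences event study of this kind is often estimated as a two-way fixed-effects regression of log price on a treated-by-post indicator. The sketch below is a schematic under invented data, not the authors’ specification; the simulated 0.05 log-point effect is a placeholder chosen to echo a roughly 5% price increase.

```python
# Schematic two-way fixed-effects difference-in-differences regression.
# Invented placeholder data; not the study's data or exact model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
sites, years = range(200), range(2012, 2022)
df = pd.DataFrame([(s, t) for s in sites for t in years], columns=["site", "year"])
df["pe_acquired"] = (df["site"] < 60).astype(int)   # sites bought by PE
df["post"] = (df["year"] >= 2017).astype(int)       # after acquisition
df["log_price"] = (6.0 + 0.05 * df["pe_acquired"] * df["post"]
                   + rng.normal(0, 0.1, len(df)))   # ~5% effect built in

# Site and year fixed effects absorb the main effects; the interaction term is
# the DiD estimate. Standard errors are clustered at the site level.
fit = smf.ols("log_price ~ pe_acquired:post + C(site) + C(year)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["site"]})
print(fit.params["pe_acquired:post"])  # recovers roughly 0.05
```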

The main outcomes were price, spending per physician, number of colonoscopies per physician, number of unique patients per physician, and quality, as defined by polyp detection, incomplete colonoscopies, and four adverse event measures: cardiovascular, serious and nonserious GI events, and any other adverse events.

The mean age of patients was 47.1 years, and 47.8% were men. The sample included 718,851 colonoscopies conducted by 1494 physicians in 590,900 patients across 1240 PE-acquired practice sites and 637,990 control colonoscopies conducted by 2550 physicians in 527,380 patients across 2657 independent practice sites.

Among the findings:

  • Colonoscopy prices at PE-acquired sites increased by 4.5% (95% CI, 2.5-6.6; P < .001) vs independent practices. That increase was much lower than that reported by Singh and colleagues.
  • The estimated price increase was 6.7% (95% CI, 4.2-9.3; P < .001) when only colonoscopies at PE practices with market shares above the 75th percentile (24.4%) in 2021 were considered. Both increases were in line with other research, Arnold said.
  • Colonoscopy spending per physician increased by 16.0% (95% CI, 8.4%-24.0%; P < .001), while the number of colonoscopies and the number of unique patients per physician increased by 12.1% (95% CI, 5.3-19.4; P < .001) and 11.3% (95% CI, 4.4%-18.5%; P < .001), respectively. These measures, however, were already increasing before PE acquisition.
  • No statistically significant associations were detected for the six quality measures analyzed.

Could such cost-raising acquisitions potentially be blocked by concerned regulators? 

“No. Generally the purchases are at prices below what would require notification to federal authorities,” Arnold said. “The Department of Justice/Federal Trade Commission hinted at being willing to look at serial acquisitions in their 2023 Merger Guidelines, but until that happens, these will likely continue to fly under the radar.”

Still, as evidence of PE-associated poorer quality outcomes as well as clinician burnout continues to emerge, Arnold added, “I would advise physicians who get buyout offers from PE to educate themselves on what could happen to patients and staff if they choose to sell.”

Offering an outsider’s perspective on the study, health economist Atul Gupta, PhD, an assistant professor of healthcare management in the Wharton School at the University of Pennsylvania in Philadelphia, called it an “excellent addition to the developing literature examining the effects of private equity ownership of healthcare providers.” Very few studies have examined the effects on prices and quality for the same set of deals and providers. “This is important because we want to be able to do an apples-to-apples comparison of the effects on both outcomes before judging PE ownership,” he told GI & Hepatology News.

Atul Gupta



In an accompanying editorial, primary care physician Jane M. Zhu, MD, an associate professor of medicine at Oregon Health & Science University in Portland, Oregon, who was not involved in the commercial-insurance-based study, said one interpretation of the findings may be that PE acquisition focuses on reducing inefficiencies, improving access by expanding practice capacity, and increasing throughput. “Another interpretation may be that PE acquisition is focused on the strategic exploitation of market and pricing power. The latter may have less of an impact on clinical measures like quality of care, but potentially, both strategies could be at play.”

Since this analysis focused on the commercial population, understanding how patient demographics may change after PE acquisition is a future avenue for exploration. “For instance, a potential explanation for both the price and utilization shifts might be if payer mix shifted toward more commercially insured patients at the expense of Medicaid or Medicare patients,” she wrote.

Zhu added that the impact of PE on prices and spending, by now replicated across different settings and specialties, is far clearer than the effect of PE on access and quality. “The analysis by Arnold et al is a welcome addition to the literature, generating important questions for future study and transparent monitoring as investor-owners become increasingly influential in healthcare.”

Going forward, said Gupta, an open question is whether the harmful effects of PE ownership of practices are differentially worse than those of other corporate entities such as insurers and hospital systems.

Dr. Jane M. Zhu



“There are reasons to believe that PE could be worse in theory. For example, their short-term investment horizon may force them to take measures that others will not as well as avoid investing into capital improvements that have a long-run payoff,” he said. “Their uniquely high dependence on debt and unbundling of real estate can severely hurt financial solvency of providers.” But high-quality evidence is lacking to compare the effects from these two distinct forms of corporatization.

The trend away from individual private practice is a reality, Arnold said. “The administrative burden on solo docs is becoming too much and physicians just seem to want to treat patients and not deal with it. So the options at this point really become selling to a hospital system or private equity.”

This study was funded by a grant from the philanthropic foundation Arnold Ventures (no family relation to Daniel Arnold). 

Arnold reported receiving grants from Arnold Ventures during the conduct of the study. Gupta had no competing interests to declare. Zhu reported receiving grants from the Agency for Healthcare Research and Quality during the submitted work and from the National Institutes of Health, National Institute for Health Care Management Foundation, and American Psychological Association, as well as personal fees from Cambia outside the submitted work.

A version of this article appeared on Medscape.com.

Publications
Topics
Sections

Private equity (PE) acquisition of gastroenterology (GI) practices led to higher colonoscopy prices, utilization, and spending with no commensurate effect on quality, an economic analysis found. Price increases ranged from about 5% to about 7%.

In view of the growing trend to such acquisitions, policy makers should monitor the impact of PE investment in medical practices, according to researchers led by health economist Daniel R. Arnold, PhD, of the Department of Health Services, Policy & Practice in the School of Public Health at Brown University in Providence, Rhode Island. “In a previous study of ours, gastroenterology stood out as a particularly attractive specialty to private equity,” Arnold told GI & Hepatology News.

David R. Arnold



Published in JAMA Health Forum, the economic evaluation of more than 1.1 million patients and 1.3 million colonoscopies concluded that PE acquisitions of GI sites are difficult to justify.

 

The Study

This difference-in-differences event study and economic evaluation analyzed data from US GI practices acquired by PE firms from 2015 to 2021. Commercial insurance claims covering more than 50 million enrollees were used to calculate price, spending, utilization, and quality measures from 2012 to 2021, with all data analyzed from April to September 2024.

The main outcomes were price, spending per physician, number of colonoscopies per physician, number of unique patients per physician, and quality, as defined by polyp detection, incomplete colonoscopies, and four adverse event measures: cardiovascular, serious and nonserious GI events, and any other adverse events.

The mean age of patients was 47.1 years, and 47.8% were men. The sample included 718, 851 colonoscopies conducted by 1494 physicians in 590, 900 patients across 1240 PE-acquired practice sites and 637, 990 control colonoscopies conducted by 2550 physicians in 527,380 patients across 2657 independent practice sites.

Among the findings:

  • Colonoscopy prices at PE-acquired sites increased by 4.5% (95% CI, 2.5-6.6; P < .001) vs independent practices. That increase was much lower than that reported by Singh and colleagues for  .
  • The estimated price increase was 6.7% (95% CI, 4.2-9.3; P < .001) when only colonoscopies at PE practices with market shares above the 75th percentile (24.4%) in 2021 were considered. Both increases were in line with other research, Arnold said.
  • Colonoscopy spending per physician increased by 16.0% (95% CI, 8.4%-24.0%; P < .001), while the number of colonoscopies and the number of unique patients per physician increased by 12.1% (95% CI, 5.3-19.4; P < .001) and 11.3% (95% CI, 4.4%-18.5%; P < .001), respectively. These measures, however, were already increasing before PE acquisition.
  • No statistically significant associations were detected for the six quality measures analyzed.

Could such cost-raising acquisitions potentially be blocked by concerned regulators? 

“No. Generally the purchases are at prices below what would require notification to federal authorities,” Arnold said. “The Department of Justice/Federal Trade Commission hinted at being willing to look at serial acquisitions in their 2023 Merger Guidelines, but until that happens, these will likely continue to fly under the radar.”

Still, as evidence of PE-associated poorer quality outcomes as well as clinician burnout continues to emerge, Arnold added, “I would advise physicians who get buyout offers from PE to educate themselves on what could happen to patients and staff if they choose to sell.”

Offering an outsider’s perspective on the study, health economist Atul Gupta, PhD, an assistant professor of healthcare management in the Wharton School at the University of Pennsylvania in Philadelphia, called it an “excellent addition to the developing literature examining the effects of private equity ownership of healthcare providers.” Very few studies have examined the effects on prices and quality for the same set of deals and providers. “This is important because we want to be able to do an apples-to-apples comparison of the effects on both outcomes before judging PE ownership,” he told GI & Hepatology News.

In an accompanying editorial, primary care physician Jane M. Zhu, MD, an associate professor of medicine at Oregon Health & Science University in Portland, Oregon, who was not involved in the commercial-insurance-based study, said one interpretation of the findings may be that PE acquisition focuses on reducing inefficiencies, improving access by expanding practice capacity, and increasing throughput. “Another interpretation may be that PE acquisition is focused on the strategic exploitation of market and pricing power. The latter may have less of an impact on clinical measures like quality of care, but potentially, both strategies could be at play.”

Since this analysis focused on the commercial population, understanding how patient demographics may change after PE acquisition is a future avenue for exploration. “For instance, a potential explanation for both the price and utilization shifts might be if payer mix shifted toward more commercially insured patients at the expense of Medicaid or Medicare patients,” she wrote.

Zhu added that the impact of PE on prices and spending, by now replicated across different settings and specialties, is far clearer than the effect of PE on access and quality. “The analysis by Arnold et al is a welcome addition to the literature, generating important questions for future study and transparent monitoring as investor-owners become increasingly influential in healthcare.”

Going forward, said Gupta, an open question is whether the harmful effects of PE ownership of practices are differentially worse than those of other corporate entities such as insurers and hospital systems.

“There are reasons to believe that PE could be worse in theory. For example, their short-term investment horizon may force them to take measures that others will not as well as avoid investing into capital improvements that have a long-run payoff,” he said. “Their uniquely high dependence on debt and unbundling of real estate can severely hurt financial solvency of providers.” But high-quality evidence is lacking to compare the effects from these two distinct forms of corporatization.

The trend away from individual private practice is a reality, Arnold said. “The administrative burden on solo docs is becoming too much and physicians just seem to want to treat patients and not deal with it. So the options at this point really become selling to a hospital system or private equity.”

This study was funded by a grant from the philanthropic foundation Arnold Ventures (no family relation to Daniel Arnold). 

Arnold reported receiving grants from Arnold Ventures during the conduct of the study. Gupta had no competing interests to declare. Zhu reported receiving grants from the Agency for Healthcare Research and Quality during the submitted work and from the National Institutes of Health, National Institute for Health Care Management Foundation, and American Psychological Association, as well as personal fees from Cambia outside the submitted work.

A version of this article appeared on Medscape.com.


Less Invasive Sponge Test Stratifies Risk in Patients With Barrett’s Esophagus


Capsule sponge testing could be used in lieu of endoscopy for surveillance of low-risk Barrett’s esophagus (BE), a prospective multisite UK study found. A biomarker risk panel collected with the panesophageal Cytosponge-on-a-string device in more than 900 UK patients identified those at highest risk for dysplasia or cancer who needed endoscopy, and the approach appeared safe for following low-risk patients without endoscopy.

Endoscopic surveillance is the clinical standard for BE, but its effectiveness is inconsistent, wrote Rebecca C. Fitzgerald, MD, AGAF, professor in the Early Cancer Institute at the University of Cambridge in Cambridge, England, and colleagues in The Lancet.

“It is often performed by nonspecialists, and recent trials show that around 10% of cases of dysplasia and cancer are missed, which means some patients re-present within a year of their surveillance procedure with a symptomatic cancer that should have been diagnosed earlier,” Fitzgerald told GI & Hepatology News.

Moreover, repeated endoscopy monitoring is stressful. “A simple nonendoscopic capsule sponge test done nearer to home is less scary and could be less operator-dependent. By reducing the burden of endoscopy in patients at very low risk we can focus more on the patients at higher risk,” she said.

In 2022, her research group had reported that the capsule sponge test, coupled with a centralized lab test for p53 and atypia, can risk-stratify patients into low-, moderate-, and high-risk groups. “In the current study, we wanted to check this risk stratification capsule sponge test in the real world. Our main aim was to see if we could confirm the 2022 results with the hypothesis that the low-risk patients — more than 50% of patients in surveillance — would have a risk of high-grade dysplasia or cancer that was sufficiently low — that is, less than 3% — and could therefore have follow-up with the capsule sponge without requiring endoscopy.”

The investigators hypothesized that the 15% at high risk would have a significant chance of dysplasia warranting endoscopy in a specialist center.

“Our results showed that in the low-risk group the risk of high-grade dysplasia or cancer was 0.4%, suggesting these patients could be offered follow-up with the capsule sponge test,” Fitzgerald said.

The high-risk group, positive for both biomarkers (p53 and atypia), had an 85% risk for dysplasia or cancer. “We call this a tier 1 or ultra-high risk, and this suggests these cases merit a specialist endoscopy in a center that could treat the dysplasia/cancer,” she said.

Study Details

Adult participants (n = 910) were recruited from August 2020 to December 2024 in two multicenter, prospective, pragmatic implementation studies at 13 hospitals. Patients with nondysplastic BE on their most recent endoscopy underwent a capsule sponge test.

Patient risk was assigned as low (clinical and capsule sponge biomarkers negative), moderate (capsule sponge biomarkers negative, clinical biomarkers positive: age, sex, and segment length), or high (p53 abnormality, glandular atypia, or both, regardless of clinical biomarkers). The primary outcome was a diagnosis of high-grade dysplasia or cancer necessitating treatment, according to risk group.

In the cohort, 138 (15%) were classified as having high risk, 283 (31%) had moderate risk, and 489 (54%) had low risk.

The positive predictive value for any dysplasia or worse in the high-risk group was 37.7% (95% CI, 29.7-46.4). Patients with both atypia and aberrant p53 had the highest risk for high-grade dysplasia or cancer with a relative risk of 135.8 (95% CI, 32.7-564.0) vs the low-risk group. 

The prevalence of high-grade dysplasia or cancer in the low-risk group was, as mentioned, just 0.4% (95% CI, 0.1-1.6), while the negative predictive value for any dysplasia or cancer was 97.8% (95% CI, 95.9-98.8). Applying a machine learning algorithm reduced the proportion needing p53 pathology review to 32% without missing any positive cases.

Offering a US perspective on the study, Nicholas J. Shaheen, MD, MPH, AGAF, professor of medicine and director of the NC Translational & Clinical Sciences Institute at the University of North Carolina School of Medicine in Chapel Hill, called the findings “very provocative.”

“We have known for some time that nonendoscopic techniques could be used to screen for Barrett’s esophagus and esophageal cancer, allowing us to screen larger groups of patients in a more cost-effective manner compared to traditional upper endoscopy,” he told GI & Hepatology News. “This study suggests that, in addition to case-finding for Barrett’s [esophagus], a nonendoscopic sponge-based technique can also help us stratify risk, finding cases that either already harbor cancer or are at high risk to do so.”

Shaheen said these cases deserve immediate attention since they are most likely to benefit from timely endoscopic intervention. “The study also suggests that a nonendoscopic result could someday be used to decide subsequent follow-up, with low-risk patients undergoing further nonendoscopic surveillance, while higher-risk patients would move on to endoscopy. Such a paradigm could unburden our endoscopy units from low-risk patients unlikely to benefit from endoscopy as well as increase the numbers of patients who are able to be screened.”

Fitzgerald added, “The GI community is realizing that we need a better approach to managing patients with Barrett’s [esophagus]. In the UK this evidence is being considered by our guideline committee, and it would influence the upcoming guidelines in 2025 with a requirement to continue to audit the results. Outside of the UK we hope this will pave the way for nonendoscopic approaches to Barrett’s [esophagus] surveillance.”

One ongoing goal is to optimize the biomarkers, Fitzgerald said. “For patients with longer segments we would like to add additional genomic biomarkers to refine the risk predictions,” she said. “We need a more operator-independent, consistent method for monitoring Barrett’s [esophagus]. This large real-world study is highly encouraging for a more personalized and patient-friendly approach to Barrett’s [esophagus] surveillance.”

This study was funded by Innovate UK, Cancer Research UK, and the National Health Service England Cancer Alliance. Cytosponge technology is licensed by the Medical Research Council to Medtronic. Fitzgerald declared holding patents related to this test and reported being a shareholder in Cyted Health.

Shaheen reported receiving research funding from Lucid Diagnostics and Cyted Health, both of which are manufacturers of nonendoscopic screening devices for BE.

A version of this article appeared on Medscape.com.


Sleep Changes in IBD Could Signal Inflammation, Flareups


Changes in sleep metrics detected with wearable technology could serve as an inflammation marker and potentially predict inflammatory bowel disease (IBD) flareups, regardless of whether a patient has symptoms, an observational study suggested.

Sleep data from 101 study participants over a mean duration of about 228 days revealed that altered sleep architecture was only apparent when inflammation was present — symptoms alone did not impact sleep cycles or signal inflammation.

“We thought symptoms might have an impact on sleep, but interestingly, our data showed that measurable changes like reduced rapid eye movement (REM) sleep and increased light sleep only occurred during periods of active inflammation,” Robert Hirten, MD, associate professor of Medicine (Gastroenterology), and Artificial Intelligence and Human Health, at the Icahn School of Medicine at Mount Sinai, New York City, told GI & Hepatology News.

“It was also interesting to see distinct patterns in sleep metrics begin to shift over the 45 days before a flare, suggesting the potential for sleep to serve as an early indicator of disease activity,” he added.

“Sleep is often overlooked in the management of IBD, but it may provide valuable insights into a patient’s underlying disease state,” he said. “While sleep monitoring isn’t yet a standard part of IBD care, this study highlights its potential as a noninvasive window into disease activity, and a promising area for future clinical integration.”

The study was published online in Clinical Gastroenterology and Hepatology.

Less REM Sleep, More Light Sleep

Researchers assessed the impact of inflammation and symptoms on sleep architecture in IBD by analyzing data from 101 individuals who answered daily disease activity surveys and wore a wearable device.

The mean age of participants was 41 years and 65.3% were women. Sixty-three participants (62.4%) had Crohn’s disease (CD) and 38 (37.6%) had ulcerative colitis (UC).

Forty participants (39.6%) used an Apple Watch; 50 (49.5%) used a Fitbit; and 11 (10.9%) used an Oura ring. Sleep architecture, sleep efficiency, and total hours asleep were collected from the devices. Participants were encouraged to wear their devices for at least 4 days per week and 8 hours per day and were not required to wear them at night. Participants provided data by linking their devices to ehive, Mount Sinai’s custom app.

Daily clinical disease activity was assessed using the UC or CD Patient Reported Outcome-2 survey. Participants were asked to answer at least four daily surveys each week.

Sleep metrics during symptomatic flares, inflammatory flares, and combinations of symptomatic and inflammatory activity were compared with sleep metrics during symptomatic and inflammatory remission.

Furthermore, researchers explored the rate of change in sleep metrics for 45 days before and after inflammatory and symptomatic flares.

Participants contributed a mean duration of 228.16 nights of wearable data. During active inflammation, they spent a lower percentage of sleep time in REM (20% vs 21.59%) and a greater percentage of sleep time in light sleep (62.23% vs 59.95%) than during inflammatory remission. No differences were observed in the mean percentage of time in deep sleep, sleep efficiency, or total time asleep.

During symptomatic flares, there were no differences in the percentage of sleep time in REM, deep, or light sleep, or in sleep efficiency, compared with periods of inflammatory remission. However, participants slept less overall during symptomatic flares than during symptomatic remission.

During asymptomatic but inflamed periods, participants spent a lower percentage of time in REM sleep and more time in light sleep than during asymptomatic, uninflamed periods; however, there were no differences in sleep efficiency or total time asleep.

Similarly, participants had more light sleep and less REM sleep during symptomatic and inflammatory flares than during asymptomatic and uninflamed periods, but there were no differences in the percentage of time spent in deep sleep, sleep efficiency, or total time asleep.

Symptomatic flares alone, without inflammation, did not impact sleep metrics, the researchers concluded. However, periods with active inflammation were associated with a significantly smaller percentage of sleep time in REM sleep and a greater percentage of sleep time in light sleep.

The team also performed longitudinal mapping of sleep patterns before, during, and after disease exacerbations by analyzing sleep data for 6 weeks before and 6 weeks after flare episodes.

They found that sleep disturbances significantly worsened leading up to inflammatory flares and improved afterward, suggesting that sleep changes may signal upcoming increases in disease activity. When the team evaluated the intersection of inflammatory and symptomatic flares, altered sleep architecture was evident only when inflammation was present.

“These findings raise important questions about whether intervening on sleep can actually impact inflammation or disease trajectory in IBD,” Hirten said. “Next steps include studying whether targeted sleep interventions can improve both sleep and IBD outcomes.”

While this research is still in the early stages, he said, “it suggests that sleep may have a relationship with inflammatory activity in IBD. For patients, it reinforces the value of paying attention to sleep changes.”

The findings also show the potential of wearable devices to guide more personalized monitoring, he added. “More work is needed before sleep metrics can be used routinely in clinical decision-making.”

Validates the Use of Wearables

Commenting on the study for GI & Hepatology News, Michael Mintz, MD, a gastroenterologist at Weill Cornell Medicine and NewYork-Presbyterian in New York City, observed, “Gastrointestinal symptoms often do not correlate with objective disease activity in IBD, creating a diagnostic challenge for gastroenterologists. Burdensome, expensive, and/or invasive testing, such as colonoscopies, stool tests, or imaging, are frequently required to monitor disease activity.” 

“This study is a first step in objectively monitoring inflammation in a patient-centric way that does not create undue burden to our patients,” he said. “It also provides longitudinal data that suggests changes in sleep patterns can pre-date disease flares, which ideally can lead to earlier intervention to prevent disease complications.”

Like Hirten, he noted that clinical decisions, such as changing IBD therapy, should not be based on the results of this study. “Rather this provides validation that wearable technology can provide useful objective data that correlates with disease activity.”

Furthermore, he said, it is not clear whether analyzing sleep data is a cost-effective way of monitoring IBD disease activity, or whether that data should be used alone or in combination with other objective disease markers, to influence clinical decision-making.

“This study provides proof of concept that there is a relationship between sleep characteristics and objective inflammation, but further studies are needed,” he said. “I am hopeful that this technology will give us another tool that we can use in clinical practice to monitor disease activity and improve outcomes in a way that is comfortable and convenient for our patients.”

This study was supported by a grant to Hirten from the US National Institutes of Health. Hirten reported receiving consulting fees from Bristol Myers Squibb and AbbVie; stock options from Salvo Health; and research support from Janssen, Intralytix, EnLiSense, and the Crohn’s & Colitis Foundation. Mintz declared no competing interests.

A version of this article appeared on Medscape.com.
