Belly Fat Beats BMI in Predicting Colorectal Cancer Risk
TOPLINE:
METHODOLOGY:
- General obesity, often measured using BMI, is a recognized risk factor for colorectal cancer, but how much of this association is due to central obesity is unclear.
- Researchers assessed the associations between BMI, waist-to-hip ratio (WHR), and waist circumference (WC) with colorectal cancer risk and the degree of independence among these associations in patients aged 40-69 years recruited in the UK Biobank cohort study from 2006 to 2010.
- Anthropometric measurements were performed using standardized methods.
- Cancer registry and hospital data linkage identified colorectal cancer cases in the UK Biobank.
TAKEAWAY:
- Researchers included 460,784 participants (mean age, 56.3 years; 46.7% men), of whom 67.1% had overweight or obesity, 49.4% had a high or very high WHR, and 60.5% had a high or very high WC.
- During the median 12.5-year follow-up period, 5977 participants developed colorectal cancer.
- Each SD increase in WHR showed a stronger association with colorectal cancer risk (hazard ratio [HR], 1.18) than each SD increase in BMI (HR, 1.10).
- After adjustment for BMI, the association between WHR and colorectal cancer risk was slightly attenuated but remained robust (HR, 1.15); in contrast, after adjustment for WHR, the association between BMI and colorectal cancer risk was substantially weakened (HR, 1.04). (An illustrative sketch of this type of mutually adjusted analysis follows this list.)
- WHR showed strong, statistically significant associations with colorectal cancer risk across all BMI categories, whereas associations of BMI with colorectal cancer risk were weak and not statistically significant across all WHR categories.
- Central obesity demonstrated consistent associations with both colon and rectal cancer risks in both sexes before and after adjustment for BMI, whereas BMI showed no significant association with colorectal cancer risk in women or with rectal cancer risk after WHR adjustment.
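For readers unfamiliar with per-SD hazard ratios and mutual adjustment, the sketch below shows how such estimates are typically obtained with a Cox proportional hazards model. It is not the authors' analysis code: the data are synthetic, and all variable names, effect sizes, and the censoring horizon are assumptions chosen only for illustration.

```python
# Illustrative sketch only -- NOT the study's analysis code.
# Shows how an HR per SD of WHR, mutually adjusted for BMI, is typically
# estimated with a Cox proportional hazards model (synthetic data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 5000
bmi = rng.normal(27, 4, n)                               # hypothetical BMI values
whr = 0.9 + 0.02 * (bmi - 27) + rng.normal(0, 0.07, n)   # WHR correlated with BMI

# Standardize so each coefficient is a per-SD increment
z_bmi = (bmi - bmi.mean()) / bmi.std()
z_whr = (whr - whr.mean()) / whr.std()

# Simulated follow-up with risk driven mainly by central obesity
hazard = 0.01 * np.exp(0.17 * z_whr + 0.03 * z_bmi)
t_raw = rng.exponential(1 / hazard)
event = (t_raw < 12.5).astype(int)                       # administrative censoring
time = np.minimum(t_raw, 12.5)

df = pd.DataFrame({"time": time, "event": event, "z_whr": z_whr, "z_bmi": z_bmi})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")      # mutually adjusted model
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```

Standardizing WHR and BMI before fitting makes each exp(coef) directly interpretable as a hazard ratio per SD increase, with each estimate adjusted for the other covariate.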
IN PRACTICE:
“[The study] results also underline the importance of integrating additional anthropometric measures such as WHR alongside BMI into routine clinical practice for more effective prevention and management of obesity, whose prevalence is steadily increasing in many countries worldwide, in order to limit the global burden of colorectal cancer and many other obesity-related adverse health outcomes,” the authors wrote.
SOURCE:
The study was led by Fatemeh Safizadeh, German Cancer Research Center (DKFZ), Heidelberg, Germany. It was published online in the International Journal of Obesity.
LIMITATIONS:
This study relied on a single baseline measurement of the anthropometric measures, without accounting for previous lifetime history of overweight and obesity or for changes during follow-up. Additionally, WHR and WC may not be the most accurate measures of central obesity, as WC includes both visceral and subcutaneous adipose tissue. The study population also showed evidence of healthy volunteer bias, with more health-conscious and socioeconomically advantaged participants somewhat overrepresented.
DISCLOSURES:
The authors declared no competing interests.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.
Fibrosis Risk High in Young Adults With Both Obesity and T2D
TOPLINE:
METHODOLOGY:
- Researchers aimed to assess the prevalence of hepatic steatosis and clinically significant fibrosis (stage ≥ 2) in young adults without a history of metabolic dysfunction–associated steatotic liver disease (MASLD), hypothesizing that the rates would be comparable with those in older adults, especially in the presence of cardiometabolic risk factors.
- Overall, 1420 participants aged 21-79 years with or without type 2 diabetes (T2D) (63% and 37%, respectively) were included from outpatient clinics at the University of Florida, Gainesville, Florida, and divided into two age groups: < 45 years (n = 243) and ≥ 45 years (n = 1177).
- All the participants underwent assessment of liver stiffness via transient elastography, with magnetic resonance elastography (MRE) or liver biopsy recommended when indicated.
- Participants also underwent a medical history review, physical examination, and fasting blood tests to rule out secondary causes of liver disease.
TAKEAWAY:
- Overall, 52% of participants had hepatic steatosis, and 9.5% had clinically significant fibrosis.
- No significant differences in the frequencies of hepatic steatosis (50.2% vs 52.7%; P = .6) or clinically significant hepatic fibrosis (7.5% vs 9.9%; P = .2) were observed between young and older adults.
- The presence of either T2D or obesity was linked to an increased prevalence of both hepatic steatosis and fibrosis in both age groups (P < .01).
- In both young and older adults, the combination of T2D and obesity was associated with the highest rates of hepatic steatosis and clinically significant fibrosis, with the latter rate being statistically similar between the age groups (15.7% vs 17.3%; P = .2).
- T2D and obesity were the strongest risk factors for hepatic fibrosis in young adults (odds ratios, 4.33 and 1.16, respectively; P < .05 for both).
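As a rough illustration of how odds ratios like those above are typically derived, the sketch below fits a logistic regression of clinically significant fibrosis on T2D and obesity status. The data, prevalences, and coefficients are invented for demonstration and are not taken from the study.

```python
# Illustrative sketch only -- not the study's code. Shows how odds ratios for
# fibrosis by T2D and obesity status are usually obtained (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1400
t2d = rng.binomial(1, 0.63, n)      # ~63% with T2D, as in the cohort
obese = rng.binomial(1, 0.5, n)     # hypothetical obesity prevalence

# Assumed log-odds of fibrosis: intercept plus contributions from T2D and obesity
logit = -3.0 + 1.4 * t2d + 0.15 * obese
fibrosis = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(pd.DataFrame({"t2d": t2d, "obese": obese}))
fit = sm.Logit(fibrosis, X).fit(disp=False)
print(np.exp(fit.params))           # odds ratios for T2D and obesity
print(np.exp(fit.conf_int()))       # 95% CIs on the OR scale
```

Exponentiating the fitted coefficients converts log-odds into odds ratios, which is the usual way values such as the reported 4.33 and 1.16 are read.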
IN PRACTICE:
“The clinical implication is that young adults with obesity and T2D carry a high risk of future cirrhosis, possibly as high as older adults, and must be aggressively screened at the first visit and carefully followed,” the authors wrote.
SOURCE:
This study, led by Anu Sharma, University of Florida College of Medicine, Gainesville, was published online in Obesity.
LIMITATIONS:
The diagnosis of clinically significant hepatic fibrosis was confirmed via MRE and/or liver biopsy in only 30% of all participants. The study population included a slightly higher proportion of young adults with obesity, T2D, and other cardiometabolic risk factors than that in national averages, which may have limited its generalizability. Genetic variants associated with MASLD were not included in this study.
DISCLOSURES:
This study was funded partly by grants from the National Institutes of Health and Echosens. One author disclosed receiving research support and serving as a consultant for various pharmaceutical companies.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Sitting for More Than 10 Hours Daily Ups Heart Disease Risk
TOPLINE:
Sedentary time exceeding 10.6 h/d is linked to an increased risk for atrial fibrillation, heart failure, myocardial infarction, and cardiovascular (CV) mortality, researchers found. The risk persists even in individuals who meet recommended physical activity levels.
METHODOLOGY:
- Researchers used a validated machine learning approach to investigate the relationships between sedentary behavior and the future risks for CV illness and mortality in 89,530 middle-aged and older adults (mean age, 62 years; 56% women) from the UK Biobank.
- Participants provided data from a wrist-worn triaxial accelerometer that recorded their movements over a period of 7 days.
- Machine learning algorithms classified accelerometer signals into four classes of activities: sleep, sedentary behavior, light physical activity, and moderate to vigorous physical activity.
- Participants were followed up for a median of 8 years through linkage to national health-related datasets in England, Scotland, and Wales.
- The median sedentary time was 9.4 h/d.
TAKEAWAY:
- During the follow-up period, 3638 individuals (4.9%) experienced incident atrial fibrillation, 1854 (2.09%) developed incident heart failure, 1610 (1.84%) experienced incident myocardial infarction, and 846 (0.94%) died from cardiovascular causes.
- The risks for atrial fibrillation and myocardial infarction increased steadily with an increase in sedentary time, with sedentary time greater than 10.6 h/d showing a modest increase in risk for atrial fibrillation (hazard ratio [HR], 1.11; 95% CI, 1.01-1.21).
- The risks for heart failure and CV mortality were low until sedentary time surpassed approximately 10.6 h/d, after which they rose by 45% (HR, 1.45; 95% CI, 1.28-1.65) and 62% (HR, 1.62; 95% CI, 1.34-1.96), respectively.
- Among individuals who met recommended physical activity levels but were sedentary for more than 10.6 h/d, the associations were attenuated but remained significant for CV mortality (HR, 1.33; 95% CI, 1.07-1.64). Reallocating 30 minutes of sedentary time to other activities reduced the risk for heart failure (HR, 0.93; 95% CI, 0.90-0.96) among those sedentary for more than 10.6 h/d.
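The "reallocating 30 minutes" result reflects an isotemporal-substitution style of analysis. The sketch below, on synthetic data, shows the general idea: when all daily time-use components except sedentary time enter a Cox model and total time is fixed, each coefficient can be read as exchanging 30 minutes of that activity for 30 minutes of sitting. It is not the study's code, and every number in it is an assumption.

```python
# Illustrative sketch only -- not the study's code. Isotemporal substitution:
# sedentary time is left out of the model, so each included activity's HR is
# interpreted as swapping 30 minutes of sitting for 30 minutes of that activity.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 20000
sleep = rng.normal(480, 60, n)              # min/day, hypothetical
mvpa = rng.gamma(2.0, 15.0, n)              # moderate-to-vigorous physical activity
light = rng.normal(300, 60, n)
sedentary = 1440 - sleep - mvpa - light     # remainder of the 24-hour day

# Synthetic heart-failure hazard that worsens with sedentary time
hazard = 0.002 * np.exp(0.0015 * (sedentary - sedentary.mean()))
t_raw = rng.exponential(1 / hazard)
event = (t_raw < 8.0).astype(int)           # ~8-year follow-up
time = np.minimum(t_raw, 8.0)

df = pd.DataFrame({
    "time": time, "event": event,
    # Sedentary time is deliberately omitted; with the day fixed at 1440 min,
    # a +1 unit (30 min) change in any included activity implies -30 min sitting.
    "sleep30": sleep / 30, "mvpa30": mvpa / 30, "light30": light / 30,
})
cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
print(cph.summary["exp(coef)"])             # HR per 30-min reallocation away from sitting
```

In this setup, an exp(coef) below 1 for, say, light activity corresponds to a lower hazard when 30 sedentary minutes are traded for 30 minutes of light activity, mirroring how the reported HR of 0.93 for heart failure is interpreted.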
IN PRACTICE:
The study “highlights a complex interplay between sedentary behavior and physical activity, ultimately suggesting that sedentary behavior remains relevant for CV disease risk even among individuals meeting sufficient” levels of activity, the researchers reported.
“Individuals should move more and be less sedentary to reduce CV risk. ... Being a ‘weekend warrior’ and meeting guideline levels of [moderate to vigorous physical activity] of 150 minutes/week will not completely abolish the deleterious effects of extended sedentary time of > 10.6 hours per day,” Charles B. Eaton, MD, MS, of the Warren Alpert Medical School of Brown University in Providence, Rhode Island, wrote in an editorial accompanying the journal article.
SOURCE:
The study was led by Ezimamaka Ajufo, MD, of Brigham and Women’s Hospital in Boston. It was published online on November 15, 2024, in the Journal of the American College of Cardiology.
LIMITATIONS:
Wrist-based accelerometers cannot assess specific contexts for sedentary behavior and may misclassify standing time as sedentary time, and these limitations may have affected the findings. Physical activity was measured for 1 week only, which might not have fully represented habitual activity patterns. The sample included predominantly White participants and was enriched for health and socioeconomic status, which may have limited the generalizability of the findings.
DISCLOSURES:
The authors disclosed receiving research support, grants, and research fellowships and collaborations from various institutions and pharmaceutical companies, as well as serving on their advisory boards.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Does Light-Intensity Walking Benefit Blood Glucose?
TOPLINE:
METHODOLOGY:
- Researchers conducted a randomized crossover trial with 16 young adults aged 18-34 years with body mass index (BMI) ≥ 25 in Bangkok, Thailand, to examine the effects of different light-intensity walking patterns on postprandial cardiometabolic responses.
- Participants (mean age, 25 years; mean BMI, 29.8) completed four 7-hour experimental conditions, each involving a different activity: uninterrupted sitting, 30 minutes of continuous light-intensity walking, 3 minutes of light-intensity walking every 30 minutes, or a combination of both walking regimens, with a 7- to 20-day washout period between conditions.
- Baseline and 6-hour postprandial concentrations of glucose, insulin, triglycerides, and blood pressure were measured.
- Incremental areas under the curve (iAUC) for each outcome and average blood pressure were compared between sitting and walking conditions.
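As background on the iAUC outcome, the short sketch below shows one common way to compute an incremental area under the curve: the trapezoidal area of the postprandial excursion above the fasting baseline, ignoring dips below it. The function name, sampling times, and glucose values are hypothetical, and the study may have used a different iAUC convention.

```python
# Illustrative sketch only -- one conventional iAUC calculation, not the authors' code.
import numpy as np

def incremental_auc(times_h, values, baseline=None):
    """Trapezoidal iAUC of `values` above `baseline` over `times_h` (hours)."""
    t = np.asarray(times_h, dtype=float)
    v = np.asarray(values, dtype=float)
    base = v[0] if baseline is None else baseline
    excess = np.clip(v - base, 0.0, None)     # keep only the area above baseline
    widths = np.diff(t)
    return float(np.sum(widths * (excess[:-1] + excess[1:]) / 2.0))

# Hypothetical 6-hour postprandial glucose profile (mmol/L), sampled hourly
times = [0, 1, 2, 3, 4, 5, 6]
glucose = [5.2, 7.8, 7.1, 6.3, 5.8, 5.4, 5.2]
print(f"glucose iAUC = {incremental_auc(times, glucose):.2f} mmol/L x h")
```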
TAKEAWAY:
- All the walking interventions reduced postprandial glucose concentrations and diastolic blood pressure compared with uninterrupted sitting.
- Continuous 30-minute light-intensity walking alone or combined with brief 3-minute bouts also attenuated postprandial insulin concentrations.
- No significant differences in triglyceride iAUC or systolic blood pressure were found among the four experimental conditions.
IN PRACTICE:
“These findings support the notion that engaging in light-intensity walking, regardless of the pattern, provides benefits to glycemic control. Moreover, the timing and patterns of light-intensity physical activity may be an important factor in reducing postprandial insulin concentrations,” the authors wrote.
SOURCE:
The study, led by Waris Wongpipit, PhD, Division of Health and Physical Education, Chulalongkorn University in Bangkok, Thailand, was published online in The Journal of Clinical Endocrinology & Metabolism.
LIMITATIONS:
The study’s small sample size of 16 participants may limit the generalizability of the findings. The short duration of the study (7-hour experimental conditions) may not reflect long-term effects. The prescribed activities and dietary profiles, along with the controlled laboratory setting, may not accurately represent real-world conditions. The lack of objective physical activity/sedentary behavior measurement to confirm compliance between conditions is a limitation.
DISCLOSURES:
This study was supported by grants from the Office of the Permanent Secretary, Ministry of Higher Education, Science, Research and Innovation, Thailand Science Research and Innovation, and Chulalongkorn University. Wongpipit received grant support from these organizations. Paddy C. Dempsey is supported by a National Health and Medical Research Council of Australia research fellowship. The other authors had no disclosures.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.
Can Plant-Based Diet Deliver Ample Protein for Older Adults?
TOPLINE:
Replacing animal-based protein sources with plant-based alternatives in older adults reduced both the quality and quantity of protein intake only when all animal-based foods were eliminated in a vegan scenario, according to a simulation study suggesting that a switch to 60% plant-based protein is safe.
METHODOLOGY:
- For environmental and health reasons, the Dutch Health Council advises shifting to a 40:60 ratio of animal-based to plant-based protein; however, older adults need adequate protein intake to prevent muscle loss and maintain health, and it is uncertain whether they can meet their protein requirements through a more sustainable diet.
- This simulation study evaluated the impact of more sustainable eating patterns on protein quantity and quality by using data of 607 community-dwelling older adults aged 65-79 years from the Dutch National Food Consumption Survey 2019-2021.
- Data on food consumption were collected via two 24-hour dietary recalls per participant on nonconsecutive days and calculated as three main meals and four in-between moments each day.
- In the simulation, selected food products in the original diet were replaced with items from a list of similar plant-based alternatives, using a random number generator, to create five scenarios: two flexitarian diets (40% and 80% of meat and fish replaced), a pescetarian diet (meat replaced, but not fish or other animal-based products), a vegetarian diet (meat and fish replaced, but not other animal-based products), and a vegan diet (meat, fish, and all other animal-based products replaced); a minimal sketch of this replacement procedure appears after this list.
- Protein intake was calculated in three ways for each meal moment, including by total protein intake (quantity) and by the proportion of indispensable amino acids that must be eaten together within a limited timeframe (quality).
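To make the replacement procedure concrete, the sketch below mimics it on a made-up day of eating: each animal-based item is swapped, via a random number generator, for an item drawn from a list of plant-based alternatives, and total protein is recomputed. All foods, protein values, and the replacement fraction are hypothetical and are not taken from the Dutch survey data.

```python
# Illustrative sketch only -- not the study's simulation model.
import random

# Hypothetical replacement lists: animal-based food -> plant-based options (protein g/serving)
ANIMAL_TO_PLANT = {
    "beef": [("soy burger", 18), ("lentils", 12)],
    "milk": [("soy milk", 7), ("oat milk", 2)],
    "cheese": [("hummus", 5), ("nut spread", 6)],
}
# One invented day of eating as (food, protein g) pairs
DAY = [("beef", 26), ("milk", 8), ("cheese", 7), ("bread", 8), ("yogurt", 6)]

def simulate_day(day, replace_fraction, seed=0):
    """Return (original_protein, simulated_protein) for one replacement scenario."""
    rng = random.Random(seed)
    original = sum(protein for _, protein in day)
    simulated = 0
    for food, protein in day:
        # Replace an animal-based item with probability `replace_fraction`
        if food in ANIMAL_TO_PLANT and rng.random() < replace_fraction:
            _, protein = rng.choice(ANIMAL_TO_PLANT[food])
        simulated += protein
    return original, simulated

print(simulate_day(DAY, replace_fraction=0.8))  # e.g., an 80% "flexitarian" swap
```

Scaled up across all recall days, and combined with a per-meal scoring of indispensable amino acids, this is the general shape of the simulation described above.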
TAKEAWAY:
- In the reference diet, the total daily plant-based protein intake was 39.0% in men and 37.7% in women, while in the vegetarian scenario, it was 59.1% in men and 54.2% in women.
- In the flexitarian, pescetarian, and vegetarian scenarios, the usable protein intake was comparable; in the vegan scenario, both total protein intake and usable protein intake were lower, leading to nearly 50% less usable protein than in the original diet.
- In the original diet, 7.5% of men and 11.1% of women did not meet the estimated average requirements (EARs) for utilizable protein; in the vegan scenario, 83.3% of both sexes had a protein intake below the EAR.
- The loss in protein intake (quantity) in all scenarios was mainly observed at dinner; the loss in protein quality was greatest at breakfast and lunch, especially in lysine (found in beans or soy milk).
IN PRACTICE:
“Changing protein intake to 60% plant-based protein seems to be safe for older adults in terms of protein intake. In contrast, a vegan pattern was associated with a substantial decline in protein availability, leading to a majority of older adults not reaching the recommended protein levels,” the authors wrote.
SOURCE:
The study was led by Jos W. Borkent, HAN University of Applied Sciences, Nijmegen, the Netherlands. It was published online in The Journal of Nutrition, Health and Aging.
LIMITATIONS:
Study limitations included the use of a simulation model, which may not fully reflect real-life dietary practices. The strict timeframe for assessing protein quality (optimal combinations of indispensable amino acids within one meal moment) may have led to an underestimation of protein availability, especially in the vegan scenario. Additionally, the processed meat replacements chosen for the vegan scenario may not have represented the highest-quality protein sources available. Higher protein quality per meal in the vegan scenario is possible when smart combinations are made across multiple meal components.
DISCLOSURES:
The study was partly funded by a grant from the Taskforce for Applied Research SIA, which is part of the Netherlands Organisation for Scientific Research and financed by the Dutch Ministry of Education, Culture and Science and by a fund of the Dutch Dairy Association. The authors declared that they had no conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.
Vitamin D May Lower Blood Pressure in Seniors With Overweight
TOPLINE:
Supplementation with vitamin D and calcium can reduce systolic and diastolic blood pressure in older individuals with overweight, particularly in those with a body mass index (BMI) > 30 and those diagnosed with hypertension.
METHODOLOGY:
- Large cohort data have provided epidemiologic evidence linking vitamin D deficiency to a higher risk for cardiovascular disorders, including hypertension; however, evidence on the beneficial effects of vitamin D supplementation on blood pressure outcomes remains inconclusive.
- A post hoc analysis of a randomized controlled trial was conducted to investigate the effect of two doses of cholecalciferol (vitamin D3) on blood pressure in individuals aged 65 years or older with a BMI > 25 and serum vitamin D levels of 10-30 ng/mL.
- A total of 221 participants were recruited through outpatient departments, clinics, and advertisements in the greater Beirut area and received calcium supplementation in combination with either a low dose (600 IU/d, as recommended by the Institute of Medicine [IOM]) or a high dose (3750 IU/d) of vitamin D3.
- Blood pressure measurements were taken at baseline, 6 months, and 12 months using a SureSigns VS3 monitor.
- Participants were also stratified by BMI and hypertension status to assess the effects of vitamin D and calcium on blood pressure.
TAKEAWAY:
- Systolic and diastolic blood pressures were significantly reduced with vitamin D supplementation in the overall cohort (mean difference, 3.5 and 2.8 mm Hg, respectively; P = .005 and P = .002, respectively), with the effect more prominent in those in the high-dose vitamin D group.
- Participants with a BMI > 30 experienced reductions in both systolic and diastolic blood pressures in the overall cohort (P < .0001 and P = .01, respectively); although the systolic blood pressure was significantly reduced with both high- and low-dose vitamin D, the diastolic blood pressure decreased in the high-dose group only.
- Patients with hypertension benefited from all doses of vitamin D, regardless of the BMI.
- Systolic blood pressure at 6 and 12 months was significantly predicted by BMI and baseline systolic blood pressure measurements, although not by the dose of vitamin D received.
IN PRACTICE:
“Our study found vitamin D supplementation may decrease blood pressure in specific subgroups such as older people, people with obesity, and possibly those with low vitamin D levels,” study author Ghada El-Hajj Fuleihan, MD, MPH, of the American University of Beirut Medical Center in Beirut, Lebanon, said in a news release. “High vitamin D doses compared to the IOM’s recommended daily dose did not provide additional health benefits.”
SOURCE:
This study was led by Maya Rahme, Department of Internal Medicine, Division of Endocrinology, Calcium Metabolism and Osteoporosis Program, World Health Organization Collaborating Center for Metabolic Bone Disorders, American University of Beirut Medical Center in Beirut, Lebanon. It was published online in the Journal of the Endocrine Society.
LIMITATIONS:
This study’s limitations included the exploratory nature of the analyses and the low power of the subgroup analyses. Additionally, the study focused on older individuals who were sedentary and had overweight, many of whom had prediabetes — conditions known to influence blood pressure. The possible effect of calcium alone on blood pressure reduction was also unclear.
DISCLOSURES:
This study was supported by grants from the American University of Beirut, St Joseph University, and the Lebanese Council for National Scientific Research. No relevant conflicts of interest were disclosed by the authors.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication.
A version of this article appeared on Medscape.com.
Sustained Benefits With TransCon PTH in Hypoparathyroidism
TOPLINE:
Long-term treatment with TransCon parathyroid hormone (PTH), a replacement therapy for hypoparathyroidism, demonstrated sustained efficacy and safety over 52 weeks, with 95% of participants able to discontinue conventional therapy.
METHODOLOGY:
- Conventional therapy for hypoparathyroidism (active vitamin D and elemental calcium) alleviates symptoms of hypocalcemia, but it does not improve insufficient PTH levels and is linked to long-term complications, such as nephrocalcinosis, nephrolithiasis, and renal dysfunction.
- This phase 3 (PaTHway) trial aimed to investigate the long-term efficacy, safety, and tolerability of TransCon PTH (palopegteriparatide) in adults with hypoparathyroidism.
- Overall, 82 patients with chronic hypoparathyroidism (mean age, 48.6 years; 78% women; 93% White) were randomly assigned to receive TransCon PTH or placebo, both coadministered with conventional therapy for 26 weeks.
- At the 26-week visit, all patients who completed the blinded treatment (n = 79) went on to receive TransCon PTH along with conventional therapy in an ongoing 156-week open-label extension.
- For this analysis at week 52, the main efficacy endpoint was the proportion of patients (n = 78) with serum calcium levels in the normal range (8.3-10.6 mg/dL) and independence from conventional therapy (active vitamin D and therapeutic doses of calcium); safety assessments included serum chemistries, 24-hour urine calcium excretion, and treatment-emergent adverse events. A schematic check of this composite endpoint is sketched after this list.
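As a rough illustration of how that composite endpoint can be evaluated for each participant, the sketch below applies the stated calcium range and a simple independence rule to hypothetical patient records. The field names and the cutoff for a "therapeutic" calcium dose are assumptions for illustration, not the trial's protocol definitions.

```python
# Hypothetical sketch of a week-52 composite endpoint check.
# The 600 mg/d calcium cutoff is an assumed threshold, used here only for illustration.
from dataclasses import dataclass

@dataclass
class PatientWeek52:
    serum_calcium_mg_dl: float
    on_active_vitamin_d: bool
    daily_calcium_mg: float

def meets_composite_endpoint(p: PatientWeek52,
                             calcium_range=(8.3, 10.6),
                             max_supplemental_calcium_mg=600.0) -> bool:
    """Normal serum calcium AND independence from conventional therapy."""
    normocalcemic = calcium_range[0] <= p.serum_calcium_mg_dl <= calcium_range[1]
    independent = (not p.on_active_vitamin_d
                   and p.daily_calcium_mg <= max_supplemental_calcium_mg)
    return normocalcemic and independent

# Example: calcium 9.4 mg/dL, no active vitamin D, 500 mg/d supplemental calcium
print(meets_composite_endpoint(PatientWeek52(9.4, False, 500)))  # True
```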
TAKEAWAY:
- At week 52, the majority of patients receiving TransCon PTH achieved serum calcium levels within the normal range (86%) and independence from conventional therapy (95%); none required active vitamin D.
- In secondary endpoints, patients receiving TransCon PTH showed sustained improvement in Hypoparathyroidism Patient Experience Scale scores, reflecting better symptom management, enhanced functioning, and overall well-being through week 52.
- At week 52, the mean 24-hour urine calcium excretion among patients first randomized to TransCon PTH was 185.1 mg/d, well below the upper limit of normal (≤ 250 mg/d); among patients first randomized to placebo, the mean fell to 223.1 mg/d after they switched to TransCon PTH in the open-label extension.
- TransCon PTH was well-tolerated, with most treatment-emergent adverse events being mild or moderate and none leading to treatment discontinuation.
IN PRACTICE:
“These results suggest that TransCon PTH may improve outcomes and advance the standard of care for adults living with hypoparathyroidism,” the authors wrote.
SOURCE:
The study was led by Bart L. Clarke, MD, Mayo Clinic, Rochester, Minnesota. It was published online in The Journal of Clinical Endocrinology & Metabolism.
LIMITATIONS:
The study’s limitations included the open-label design during the extension period, which may have introduced bias in patient-reported outcomes. Additionally, the study population was predominantly women and White, which may limit the generalizability of the findings. Further research is needed to assess the long-term effects of TransCon PTH on renal complications. One patient died of cardiac arrest that was deemed unrelated to the study drug.
DISCLOSURES:
The study was funded by Ascendis Pharma A/S. Seven authors declared being current or former employees of Ascendis Pharma. The other authors declared receiving grants, research funding, honoraria, serving as consultants, advisory board members, study investigators, and other ties with Ascendis Pharma and multiple other pharmaceutical companies.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Onset of Rheumatoid Arthritis Presaged by Changes in Gut Microbiome
TOPLINE:
Individuals at an increased risk of developing rheumatoid arthritis (RA) have a distinctive gut microbial composition, characterized by a notable increase in certain Prevotellaceae strains. These changes begin approximately 10 months before the onset of RA.
METHODOLOGY:
- In this cross-sectional and longitudinal observational study, researchers aimed to identify microbial associations in the early stages of RA, focusing specifically on Prevotellaceae strains.
- The cross-sectional analysis assessed the gut microbiome profiles of 124 individuals at risk of developing RA, 7 patients with newly diagnosed RA, and 22 healthy control individuals free of musculoskeletal symptoms at five different time points over a period of 15 months; 30 patients progressed to RA during the study period.
- The longitudinal analysis was performed in 19 individuals at risk of developing RA, of whom 5 progressed to the condition.
- The risk of developing RA was identified by the presence of anti–cyclic citrullinated protein (anti-CCP) antibodies and the onset of musculoskeletal pain in the preceding 3 months.
- Gut microbiome taxonomic alterations were investigated using 16S rRNA amplicon sequencing and confirmed with shotgun metagenomic DNA sequencing of 49 samples.
TAKEAWAY:
- Gut microbial diversity, particularly alpha diversity, was notably reduced in CCP+ individuals at risk of developing RA vs healthy control individuals (P = .012). Recognized risk factors for RA development, such as the presence of rheumatoid factor antibodies and the human leukocyte antigen shared epitope, were significantly linked to diminished gut microbial diversity, as was steroid use. (A minimal example of how alpha diversity is computed from sequencing counts appears after this list.)
- A specific Prevotellaceae strain (ASV2058) was found to be overabundant in CCP+ individuals at risk of developing RA and in those newly diagnosed with the condition but not in healthy control individuals. Further analysis showed that enrichment and depletion of three and five strains of Prevotellaceae, respectively, were associated with the progression to RA in CCP+ individuals.
- CCP+ individuals who progressed to RA showed substantial fluctuations in gut microbiome profiles around 10 months before clinical diagnosis; these profiles were relatively stable 10-15 months before onset, suggesting that the microbiome changes occur relatively late in the transition to clinical disease.
- Patients with new-onset RA were found to have distinct metabolic shifts, particularly in pathways related to amino acid and energy metabolism.
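Alpha diversity summarizes how many taxa are present in a sample and how evenly they are distributed. The snippet below is a minimal illustration of a Shannon-index calculation from an amplicon sequence variant (ASV) count table; the counts are made up, and plain NumPy stands in for whatever bioinformatics pipeline the authors actually used.

```python
# Minimal Shannon alpha-diversity calculation from ASV counts (illustrative only).
import numpy as np

def shannon_index(counts):
    """Shannon diversity H = -sum(p_i * ln(p_i)) over nonzero taxon proportions."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

# Made-up ASV counts for two samples: one dominated by a single taxon,
# one with a more even community (which yields higher diversity).
dominated = [950, 20, 15, 10, 5]
even = [210, 200, 205, 190, 195]
print(round(shannon_index(dominated), 2))  # lower value
print(round(shannon_index(even), 2))       # higher value
```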
IN PRACTICE:
“Individuals at risk of RA harbor a distinctive gut microbial composition, including but not limited to an overabundance of Prevotellaceae species. This microbial signature is consistent and correlates with traditional RA risk factors,” the authors wrote.
SOURCE:
The study was led by Christopher M. Rooney, MD, PhD, University of Leeds in England. It was published online in Annals of the Rheumatic Diseases.
LIMITATIONS:
The small longitudinal sample size and lack of a 1:1 longitudinal comparison between CCP+ individuals at risk for RA and healthy control individuals were major limitations of this study. The new-onset RA cohort was heterogeneous, reflecting the practical constraints of recruitment from standard care clinics. Integrated transcriptomic or metabolomic data were unavailable, restricting interpretation to potential rather than confirmed metabolic activity.
DISCLOSURES:
This study was funded by personal fellowships received by the lead author from Versus Arthritis, Leeds Cares, and a National Institute for Health Research Clinical Lectureship. Some authors disclosed receiving grants, funding, consulting fees, or honoraria from various pharmaceutical companies.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
No Link Between PPI Use and Risk for Cardiovascular Events
TOPLINE:
A meta-analysis shows no significant association between the use of proton pump inhibitors (PPIs) and the risk for cardiovascular events. However, a slight excess of cardiovascular events with PPI use was observed among patients with gastroesophageal reflux disease (GERD).
METHODOLOGY:
- PPIs are commonly used gastric acid suppressants; however, they have pleiotropic effects, some of which have been hypothesized to augment cardiovascular disorders.
- Researchers conducted a meta-analysis of randomized clinical trials with at least 100 patients and treatment durations > 30 days, which compared groups receiving PPIs to those on placebo or other active treatments.
- The primary outcome was a composite of nonfatal myocardial infarctions, nonfatal strokes, fatal cardiovascular adverse events, coronary revascularizations, and hospitalizations for unstable angina.
TAKEAWAY:
- Researchers included data from 52 placebo-controlled trials, with 14,988 patients and 8323 patients randomized to receive a PPI or placebo, respectively; the mean treatment duration was 0.45 person-years for those treated with PPIs and 0.32 person-years for those treated with placebo.
- Among placebo-controlled trials, 24 were conducted in patients with GERD.
- Researchers also included 61 active-controlled trials that compared PPIs with histamine-2 receptor antagonists (51 trials) or other active treatments.
- The incidence rate ratio for the primary outcome was 0.72 when comparing PPI to placebo, indicating no significant association between PPI and cardiovascular events.
- Among patients with GERD, cardiovascular events occurred only in those treated with PPIs, corresponding to approximately one excess cardiovascular event per 100 person-years of PPI treatment relative to placebo; an illustrative version of this rate arithmetic appears after this list.
- Researchers found no association between PPI treatment and the risk for cardiovascular events in trials comparing PPIs with other active treatments.
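To make the rate arithmetic concrete, the sketch below computes an incidence rate ratio and an absolute rate difference from event counts and person-years. The numbers are invented placeholders, not the meta-analysis data, and the calculation ignores the confidence intervals that underpin the "no significant association" conclusion.

```python
# Illustrative incidence-rate arithmetic (invented numbers, not study data).
def incidence_rate(events: int, person_years: float) -> float:
    return events / person_years

def rate_stats(events_tx, py_tx, events_ctrl, py_ctrl):
    rate_tx = incidence_rate(events_tx, py_tx)
    rate_ctrl = incidence_rate(events_ctrl, py_ctrl)
    irr = rate_tx / rate_ctrl if rate_ctrl > 0 else float("inf")
    excess_per_100py = (rate_tx - rate_ctrl) * 100
    return irr, excess_per_100py

# Hypothetical GERD-trial-like scenario: events only in the treated arm.
irr, excess = rate_stats(events_tx=7, py_tx=700, events_ctrl=0, py_ctrl=300)
print(irr)     # undefined/infinite when the comparator arm has zero events
print(excess)  # about 1 excess event per 100 person-years in this toy example
```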
IN PRACTICE:
“We found no association of cardiovascular events with PPI treatment,” the authors wrote. “Cardiovascular events appeared more frequent with PPI treatment in GERD trials, but results from this subgroup should be interpreted with the limitations of the analysis in mind.”
SOURCE:
The study, led by Andrew D. Mosholder, MD, MPH, Division of Epidemiology, US Food and Drug Administration Center for Drug Evaluation and Research, Silver Spring, Maryland, was published online in The American Journal of Gastroenterology.
LIMITATIONS:
This study lacked individual patient data, which precluded a time-to-event analysis or an analysis accounting for patient characteristics such as age or sex. The mean duration of PPI treatment in these trials was a few months, limiting the assessment of cardiovascular risk with extended use. The risk estimates were influenced the most by data on omeprazole and esomeprazole.
DISCLOSURES:
This study did not receive any funding. The authors declared no conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Treating Obesity May Reduce Pelvic Organ Prolapse Risk
TOPLINE:
People with central obesity (CO), characterized by excess fat around the abdomen, are at greater risk for pelvic organ prolapse (POP), particularly those younger than 60 years or without a history of hysterectomy. Women with overweight but without CO are also at increased risk.
METHODOLOGY:
- Researchers conducted a prospective cohort study to estimate the association between CO and general obesity and the risk for POP in individuals using the UK Biobank.
- A total of 251,143 participants (median age, 57 years) without preexisting POP were included, of whom 60.9% were postmenopausal and 17.2% had undergone hysterectomy before enrollment.
- Participants were followed for a median duration of 13.8 years, and POP cases were identified using International Classification of Diseases, 10th Revision (ICD-10) codes.
- Waist circumference, height, and body weight were measured at enrollment for the calculation of waist/height ratio and body mass index (BMI); CO was defined as a waist/height ratio ≥ 0.5.
- The relative risk of POP for the various combinations of waist/height ratio and BMI was evaluated against the reference group (waist/height ratio < 0.5; BMI < 25) using Cox proportional hazards models, as sketched in the example after this list.
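As a sketch of how such exposure groups can be built and fed into a Cox model, the example below uses simulated data and the Python lifelines package; the study does not state which software it used, and the covariates, follow-up times, and event indicators here are placeholders.

```python
# Illustrative exposure grouping and Cox model (simulated data; lifelines is an
# assumed tool, not necessarily what the study used).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 5000
waist_cm = rng.normal(88, 12, n)
height_cm = rng.normal(163, 7, n)
bmi = rng.normal(27, 4, n)

central_obesity = (waist_cm / height_cm) >= 0.5   # study definition of CO
overweight = bmi >= 25

df = pd.DataFrame({
    "time": rng.exponential(10, n),                              # simulated follow-up (years)
    "event": rng.integers(0, 2, n),                              # simulated POP indicator
    "co": central_obesity.astype(int),                           # CO regardless of BMI
    "overweight_no_co": (overweight & ~central_obesity).astype(int),
})
# Reference group = neither indicator set (waist/height < 0.5 and BMI < 25).

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()  # exp(coef) gives hazard ratios vs the reference group
```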
TAKEAWAY:
- During the follow-up period, 9781 cases of POP were identified, of which 71.2% occurred in a single pelvic compartment.
- Approximately 21.7% of all POP cases were attributable to CO, and 2% were attributable to overweight without CO.
- The risk for POP was 48% higher in individuals with CO regardless of BMI (hazard ratio [HR], 1.48; 95% CI, 1.41-1.56) and 23% higher in those who had overweight without CO (HR, 1.23; 95% CI, 1.14-1.34).
- The association between CO and POP was stronger in individuals younger than 60 years and in those without a history of hysterectomy.
IN PRACTICE:
“We found that waist/height ratio combined with BMI could help differentiate individuals with varying risks of prolapse more accurately. Among individuals within the same BMI category, waist/height ratio can vary, with those having a higher ratio generally facing a greater risk of POP, compared with those with a normal ratio. Therefore, they should not be grouped together based solely on a single measure of obesity. In addition, this combination can help identify more individuals at high risk for POP, compared with using either alone,” the study authors wrote.
SOURCE:
This study was led by Keyi Si, PhD, of Tongji University in Shanghai, China, and was published online in Obstetrics & Gynecology.
LIMITATIONS:
Differences in healthcare-seeking behavior could have biased the association between obesity and the risk for POP, as individuals with obesity may have been less likely to notice or report symptoms of POP. POP was diagnosed on the basis of ICD-10 codes rather than physical examination, which may have affected accuracy. Other limitations included missing data on delivery mode and history of constipation.
DISCLOSURES:
This study was supported by grants from the National Natural Science Foundation of China, the Science and Technology Commission of Shanghai Municipality, the Shanghai Hospital Development Center, and the Shanghai First Maternity and Infant Hospital. The authors reported no conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.